Sample records for evaluation general approach

  1. A Rationale for Participant Evaluation

    ERIC Educational Resources Information Center

    Boody, Robert M.

    2009-01-01

    There are many different models or approaches to doing program evaluation. Fitzpatrick, Sanders, and Worthen classify them into five general approaches: (a) objectives oriented, (b) management oriented, (c) consumer oriented, (d) expertise oriented, and (e) participant oriented. Within each of these general categories, of course, reside many…

  2. A Survey of Model Evaluation Approaches with a Tutorial on Hierarchical Bayesian Methods

    ERIC Educational Resources Information Center

    Shiffrin, Richard M.; Lee, Michael D.; Kim, Woojae; Wagenmakers, Eric-Jan

    2008-01-01

    This article reviews current methods for evaluating models in the cognitive sciences, including theoretically based approaches, such as Bayes factors and minimum description length measures; simulation approaches, including model mimicry evaluations; and practical approaches, such as validation and generalization measures. This article argues…

  3. An assessment of national risk: General concepts and overall approach. [Carbon fiber utilization in commercial aviation]

    NASA Technical Reports Server (NTRS)

    Kalelkar, A. S.

    1979-01-01

    The analysis of risk presented by carbon fiber utilization in commercial aviation is reported. The discussion is presented in three parts: (1) general concepts; (2) overall approach; and (3) risk evaluation and perspective.

  4. A Nonrandomized Evaluation of a Brief Nurtured Heart Approach Parent Training Program

    ERIC Educational Resources Information Center

    Brennan, Alison L.; Hektner, Joel M.; Brotherson, Sean E.; Hansen, Tracy M.

    2016-01-01

    Background: Parent training programs are increasingly being offered to the general public with little formal evaluation of their effects. One such program, the Nurtured Heart Approach to parenting (NHA; Glasser and Easley in "Transforming the difficult child: The Nurtured Heart Approach," Vaughan Printing, Nashville, 2008), contains…

  5. [Qualitative Research in Health Services Research - Discussion Paper, Part 3: Quality of Qualitative Research].

    PubMed

    Stamer, M; Güthlin, C; Holmberg, C; Karbach, U; Patzelt, C; Meyer, T

    2015-12-01

    The third and final discussion paper of the German Network of Health Services Research's (DNVF) "Qualitative Methods Working Group" addresses the evaluation and quality of qualitative research in health services research. We discuss approaches to evaluating qualitative studies, including an orientation to the general principles of empirical research, approach-specific courses of action, and procedures based on research-process- and criteria-oriented approaches. The central focus of the paper is an extensive examination of the process- and criteria-oriented approaches, divided into general and specific aspects to be considered when evaluating the quality of a qualitative study. The general aspects include the participation of relevant groups in the research process as well as ethical aspects of the research and data protection issues. The more specific aspects include considerations about the research interest, the research questions, and the selection of data collection methods and types of analysis. The questions formulated are intended to guide reviewers and researchers in evaluating and developing qualitative research projects appropriately. The intention of this discussion paper is to foster a transparent research culture and to reflect on and discuss the methodological approach of qualitative studies in health services research. With this paper we aim to initiate a discussion on the high-quality evaluation of qualitative health services research. © Georg Thieme Verlag KG Stuttgart · New York.

  6. Effect of Guided Collaboration on General and Special Educators' Perceptions of Collaboration and Student Achievement

    ERIC Educational Resources Information Center

    Laine, Sandra

    2013-01-01

    This study investigated the effects of a guided collaboration approach during professional learning community (PLC) meetings on the perceptions of general and special educators, as well as the effect on student performance as measured by benchmark evaluation. A mixed-methodology approach was used to collect data through surveys, weekly…

  7. A Design Taxonomy Utilizing Ten Major Evaluation Strategies.

    ERIC Educational Resources Information Center

    Willis, Barry

    This paper discusses ten evaluation strategies selected on the basis of their general acceptance and their relatively unique approach to the field: (1) Stake, "Countenance of Evaluation"; (2) Stufflebeam, "Decision Centered Evaluation (CIPP)"; (3) Provus, "Discrepancy Evaluation"; (4) Scriven, "Goal Free Evaluation"; (5) Scriven, "Formative and…

  8. Sediment Toxicity Identification Evaluation

    EPA Science Inventory

    An approach combining chemical manipulations and aquatic toxicity testing, generally with whole organisms, to systematically characterize, identify, and confirm the substances causing toxicity in whole sediments and sediment interstitial waters. The approach is divided into three...

  9. A 2D MTF approach to evaluate and guide dynamic imaging developments.

    PubMed

    Chao, Tzu-Cheng; Chung, Hsiao-Wen; Hoge, W Scott; Madore, Bruno

    2010-02-01

    As the number and complexity of partially sampled dynamic imaging methods continue to increase, reliable strategies to evaluate performance may prove most useful. In the present work, an analytical framework to evaluate given reconstruction methods is presented. A perturbation algorithm allows the proposed evaluation scheme to perform robustly without requiring knowledge about the inner workings of the method being evaluated. A main output of the evaluation process consists of a two-dimensional modulation transfer function, an easy-to-interpret visual rendering of a method's ability to capture all combinations of spatial and temporal frequencies. Approaches to evaluate noise properties and artifact content at all spatial and temporal frequencies are also proposed. One fully sampled phantom and three fully sampled cardiac cine datasets were subsampled (R = 4 and 8) and reconstructed with the different methods tested here. A hybrid method, which combines the main advantageous features observed in our assessments, was proposed and tested in a cardiac cine application, with acceleration factors of 3.5 and 6.3 (skip factors of 4 and 8, respectively). This approach combines features from methods such as k-t sensitivity encoding, unaliasing by Fourier encoding the overlaps in the temporal dimension-sensitivity encoding, generalized autocalibrating partially parallel acquisition, sensitivity profiles from an array of coils for encoding and reconstruction in parallel, self, hybrid referencing with unaliasing by Fourier encoding the overlaps in the temporal dimension and generalized autocalibrating partially parallel acquisition, and generalized autocalibrating partially parallel acquisition-enhanced sensitivity maps for sensitivity encoding reconstructions.
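
The record's central output is a modulation transfer function (MTF). As a hedged, one-dimensional sketch of the underlying idea (not the authors' perturbation algorithm): an MTF can be estimated as the Fourier magnitude of a system's point-spread (impulse) response, normalized to one at zero frequency. The blur kernel and all names below are illustrative assumptions.

```python
import cmath
import math

def dft(signal):
    """Naive discrete Fourier transform (O(N^2)), stdlib only."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * f * t / n)
                for t in range(n))
            for f in range(n)]

def mtf(psf):
    """Modulation transfer function: magnitude of the DFT of the
    point-spread function, normalized to 1 at zero frequency."""
    spectrum = [abs(c) for c in dft(psf)]
    dc = spectrum[0]
    return [s / dc for s in spectrum]

# A smoothing (blurring) point-spread function; an ideal system would
# have psf = [1, 0, 0, ...] and a flat MTF of 1 at every frequency.
psf = [0.25, 0.5, 0.25] + [0.0] * 13   # length-16 impulse response
curve = mtf(psf)
print(round(curve[0], 3))  # → 1.0 at DC by construction
```

For this kernel the curve falls off toward the Nyquist frequency, which is exactly the kind of spatial/temporal-frequency attenuation the 2D version of the tool visualizes.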

  10. Training Evaluation: An Analysis of the Stakeholders' Evaluation Needs

    ERIC Educational Resources Information Center

    Guerci, Marco; Vinante, Marco

    2011-01-01

    Purpose: In recent years, the literature on program evaluation has examined multi-stakeholder evaluation, but training evaluation models and practices have not generally taken this problem into account. The aim of this paper is to fill this gap. Design/methodology/approach: This study identifies intersections between methodologies and approaches…

  11. The approach of general surgeons to oncoplastic and reconstructive breast surgery in Turkey: a survey of practice patterns.

    PubMed

    Emiroğlu, Mustafa; Sert, İsmail; İnal, Abdullah; Karaali, Cem; Peker, Kemal; İlhan, Enver; Gülcelik, Mehmet; Erol, Varlık; Güngör, Hilmi; Can, Didem; Aydın, Cengiz

    2014-12-01

    Background: Oncoplastic Breast Surgery (OBS), which is a combination of oncological procedures and plastic surgery techniques, has recently gained widespread use. Aims: To assess the experiences, practice patterns and preferred approaches to Oncoplastic and Reconstructive Breast Surgery (ORBS) undertaken by general surgeons specializing in breast surgery in Turkey. Study Design: Cross-sectional study. Methods: Between December 2013 and February 2014, an eleven-question survey was distributed among 208 general surgeons specializing in breast surgery. The questions focused on the attitudes of general surgeons toward performing oncoplastic breast surgery (OBS), the role of the general surgeon in OBS and their training for it, as well as their approaches to evaluating cosmetic outcomes in Breast Conserving Surgery (BCS) and informing patients about ORBS preoperatively. Results: Responses from all 208 surgeons indicated that 79.8% evaluated the cosmetic outcomes of BCS, while 94.2% informed their patients preoperatively about ORBS. 52.5% performed BCS (31.3% themselves, 21.1% together with a plastic surgeon). 53.8% emphasized that general surgeons should carry out OBS themselves. 36.1% of respondents suggested that OBS training should be included within mainstream surgical training, whereas 27.4% believed this training should be conducted by specialised centres. Conclusion: Although OBS procedure rates are low in Turkey, it is encouraging to see general surgeons practicing ORBS themselves. The survey demonstrates that our general surgeons aspire to learn and utilize OBS techniques.

  12. Application of SIGGS to Project PRIME: A General Systems Approach to Evaluation of Mainstreaming.

    ERIC Educational Resources Information Center

    Frick, Ted

    The use of the systems approach in educational inquiry is not new, and the models of input/output, input/process/product, and cybernetic systems have been widely used. The general systems model is an extension of all these, adding the dimension of environmental influence on the system as well as system influence on the environment. However, if the…

  13. Should the capability approach be applied in health economics?

    PubMed

    Coast, Joanna; Smith, Richard; Lorgelly, Paula

    2008-06-01

    This editorial questions the implications of the capability approach for health economics. Two specific issues are considered: the evaluative space of capabilities (as opposed to health or utility) and the decision-making principle of maximisation. The paper argues that the capability approach can provide a richer evaluative space enabling improved evaluation of many interventions. It also argues that more thought is needed about decision-making principles both within the capability approach and within health economics more generally. Specifically, researchers should analyse equity-oriented principles such as equalisation and a 'decent minimum' of capability, rather than presuming that the goal must be the maximisation of capability.

  14. 77 FR 19678 - Cooperative Research and Development Agreement: Asset Tracking and Reporting Technology

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ... Cooperative Research and Development Agreement (CRADA) with General Dynamics C4 Systems, Inc. (General Dynamics), to test, evaluate, and document the strengths and weaknesses of at least one technical approach... Guard is currently considering partnering with General Dynamics, we are soliciting public comment on the...

  15. Assessing safety of extractables from materials and leachables in pharmaceuticals and biologics - Current challenges and approaches.

    PubMed

    Broschard, Thomas H; Glowienke, Susanne; Bruen, Uma S; Nagao, Lee M; Teasdale, Andrew; Stults, Cheryl L M; Li, Kim L; Iciek, Laurie A; Erexson, Greg; Martin, Elizabeth A; Ball, Douglas J

    2016-11-01

    Leachables from pharmaceutical container closure systems can present potential safety risks to patients. Extractables studies may be performed as a risk mitigation activity to identify potential leachables for dosage forms with a high degree of concern associated with the route of administration. To address safety concerns, approaches to toxicological safety evaluation of extractables and leachables have been developed and applied by pharmaceutical and biologics manufacturers. Details of these approaches may differ depending on the nature of the final drug product, including its application, formulation, route of administration, and length of use. Current regulatory guidelines and industry standards provide general guidance on compound-specific safety assessments but do not provide a comprehensive approach to safety evaluations of leachables and/or extractables. This paper provides a perspective on approaches to safety evaluations by reviewing and applying general concepts and integrating key steps in the toxicological evaluation of individual extractables or leachables. These include application of structure-activity relationship studies, development of permitted daily exposure (PDE) values, and use of safety threshold concepts. Case studies are provided. The concepts presented seek to encourage discussion in the scientific community, and are not intended to represent a final opinion or "guidelines." Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Comparing Methods for UAV-Based Autonomous Surveillance

    NASA Technical Reports Server (NTRS)

    Freed, Michael; Harris, Robert; Shafto, Michael

    2004-01-01

    We describe an approach to evaluating algorithmic and human performance in directing UAV-based surveillance. Its key elements are a decision-theoretic framework for measuring the utility of a surveillance schedule and an evaluation testbed consisting of 243 scenarios covering a well-defined space of possible missions. We apply this approach to two example UAV-based surveillance methods, a TSP-based algorithm and a human-directed approach, then compare them to identify general strengths and weaknesses of each method.

  17. A stochastic approach for automatic generation of urban drainage systems.

    PubMed

    Möderl, M; Butler, D; Rauch, W

    2009-01-01

    Typically, performance evaluation of newly developed methodologies is based on one or more case studies. The investigation of multiple real-world case studies is tedious and time-consuming, and extrapolating conclusions from individual investigations to a general basis is debatable and sometimes even wrong. In this article a stochastic approach is presented to evaluate newly developed methodologies on a broader basis. For this approach the Matlab tool "Case Study Generator" was developed, which automatically generates a variety of virtual urban drainage systems using boundary conditions (e.g., length of the urban drainage system, slope of the catchment surface) as input. The layout of the sewer system is based on an adapted Galton-Watson branching process. The subcatchments are allocated using a digital terrain model, and sewer system components are designed according to standard values. In total, 10,000 different virtual case studies of urban drainage systems were generated and simulated, and the simulation results were evaluated using a performance indicator for surface flooding. Comparison between results for the virtual systems and two real-world case studies indicates the promise of the method. The novelty of the approach is that it yields more general conclusions than traditional evaluations based on a few case studies.
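
The record's layout generator builds on a Galton-Watson branching process. A minimal stdlib sketch of that idea, with a hypothetical offspring distribution (the paper's calibrated parameters are not given here); node 0 plays the role of the network outlet:

```python
import random

def galton_watson_tree(offspring_probs, max_nodes=200, seed=1):
    """Generate a random tree via a Galton-Watson branching process.

    offspring_probs: dict mapping offspring count -> probability.
    Returns a list of (parent, child) edges; node 0 is the root/outlet.
    """
    rng = random.Random(seed)
    counts = list(offspring_probs.keys())
    weights = list(offspring_probs.values())
    edges, frontier, next_id = [], [0], 1
    while frontier and next_id < max_nodes:
        node = frontier.pop(0)               # expand breadth-first
        k = rng.choices(counts, weights=weights)[0]
        for _ in range(k):
            if next_id >= max_nodes:
                break
            edges.append((node, next_id))
            frontier.append(next_id)
            next_id += 1
    return edges

# Hypothetical offspring distribution: most pipes continue (1 child),
# some branch (2), some terminate (0). The subcritical mean (0.9)
# keeps generated trees finite with probability one.
edges = galton_watson_tree({0: 0.3, 1: 0.5, 2: 0.2})
print(len(edges))  # number of pipes in this virtual network
```

Repeating this with different seeds is how one would mass-produce virtual case studies in the spirit of the paper's generator.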

  18. The Ongoing Evaluation of Protein-Energy Malnutrition.

    ERIC Educational Resources Information Center

    Hennart, Philippe; And Others

    1984-01-01

    This report describes an approach for the evaluation of nutritional status implemented by the CEMUBAC medical mission to Zaire. Introductory remarks provide a brief, general discussion of the evaluation of individual, familial, and community nutritional status, as well as the evaluation of nutritional status and priorities. Section I focuses on…

  19. Two-stage Bayesian model to evaluate the effect of air pollution on chronic respiratory diseases using drug prescriptions.

    PubMed

    Blangiardo, Marta; Finazzi, Francesco; Cameletti, Michela

    2016-08-01

    Exposure to high levels of air pollutant concentration is known to be associated with respiratory problems which can translate into higher morbidity and mortality rates. The link between air pollution and population health has mainly been assessed using air quality and hospitalisation or mortality data. However, this approach limits the analysis to individuals with severe conditions. In this paper we evaluate the link between air pollution and respiratory diseases using general-practice drug prescriptions for chronic respiratory diseases, which allow conclusions to be drawn for the general population. We propose a two-stage statistical approach: in the first stage we specify a space-time model to estimate monthly NO2 concentration, integrating several data sources with different spatio-temporal resolutions; in the second stage we link the concentration to the β2-agonists prescribed monthly by general practices in England and model the prescription rates through a small-area approach. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Alternative approaches to the taxation of heavy vehicles

    DOT National Transportation Integrated Search

    1998-01-01

    This report contains recommendations that are applicable to federal and state governments for evaluating alternatives to the taxation of heavy vehicles. An evaluation procedure and general assessments and recommendations on future activities are pres...

  1. Cultural Competence Approaches to Evaluation in Tribal Communities.

    ERIC Educational Resources Information Center

    Running Wolf, Paulette; Soler, Robin; Manteuffel, Brigitte; Sondheimer, Diane; Santiago, Rolando L.; Erickson, Jill Shephard

    Disability research and program evaluation have generally been viewed with suspicion in Indian country because research designs, procedures, instruments, and interpretation and dissemination of outcomes have often ignored potential cultural conflicts. This paper explores a national evaluation in eight tribal communities that established systems of…

  2. The Approach of General Surgeons to Oncoplastic and Reconstructive Breast Surgery in Turkey: A Survey of Practice Patterns

    PubMed Central

    Emiroğlu, Mustafa; Sert, İsmail; İnal, Abdullah; Karaali, Cem; Peker, Kemal; İlhan, Enver; Gülcelik, Mehmet; Erol, Varlık; Güngör, Hilmi; Can, Didem; Aydın, Cengiz

    2014-01-01

    Background: Oncoplastic Breast Surgery (OBS), which is a combination of oncological procedures and plastic surgery techniques, has recently gained widespread use. Aims: To assess the experiences, practice patterns and preferred approaches to Oncoplastic and Reconstructive Breast Surgery (ORBS) undertaken by general surgeons specializing in breast surgery in Turkey. Study Design: Cross-sectional study. Methods: Between December 2013 and February 2014, an eleven-question survey was distributed among 208 general surgeons specializing in breast surgery. The questions focused on the attitudes of general surgeons toward performing oncoplastic breast surgery (OBS), the role of the general surgeon in OBS and their training for it as well as their approaches to evaluating cosmetic outcomes in Breast Conserving Surgery (BCS) and informing patients about ORBS preoperatively. Results: Responses from all 208 surgeons indicated that 79.8% evaluated the cosmetic outcomes of BCS, while 94.2% informed their patients preoperatively about ORBS. 52.5% performed BCS (31.3% themselves, 21.1% together with a plastic surgeon). 53.8% emphasized that general surgeons should carry out OBS themselves. 36.1% of respondents suggested that OBS training should be included within mainstream surgical training, whereas 27.4% believed this training should be conducted by specialised centres. Conclusion: Although OBS procedure rates are low in Turkey, it is encouraging to see general surgeons practicing ORBS themselves. The survey demonstrates that our general surgeons aspire to learn and utilize OBS techniques. PMID:25667784

  3. Real-time simulation of biological soft tissues: a PGD approach.

    PubMed

    Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F

    2013-05-01

    We introduce here a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at the kilohertz feedback rates necessary for haptic rendering. The approach is based on proper generalized decomposition (PGD) techniques, a generalization of proper orthogonal decomposition (POD). PGD can be considered a means of a priori model order reduction and provides a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase, in which results are obtained in real time. Results are provided that show the potential of the proposed technique, together with benchmark tests that show the accuracy of the method. Copyright © 2013 John Wiley & Sons, Ltd.

  4. Evaluating Environmental Education in Schools. A Practical Guide for Teachers. Environmental Education Series 12.

    ERIC Educational Resources Information Center

    Bennett, Dean B.

    A general approach to environmental education evaluation and practical knowledge about the area of educational evaluation are offered in this teacher's guide. An introductory section explains both the use of the guide and use of a four step evaluation process. Practical aspects of evaluation are highlighted in six chapters through specific…

  5. Evaluation of natural language processing systems: Issues and approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guida, G.; Mauri, G.

    This paper encompasses two main topics: a broad and general analysis of the issue of performance evaluation of NLP systems, and a report on a specific approach developed by the authors and tried on a sample test case. More precisely, it first presents a brief survey of the major works in the area of NLP system evaluation. Then, after introducing the notion of the life cycle of an NLP system, it focuses on the concept of performance evaluation and analyzes the scope and major problems of the investigation. The tools generally used within computer science to assess the quality of a software system are briefly reviewed, and their applicability to the task of evaluating NLP systems is discussed. Particular attention is devoted to the concepts of efficiency, correctness, reliability, and adequacy, and to how all of them basically fail to capture the peculiar features of performance evaluation of an NLP system. Two main approaches to performance evaluation are then introduced, namely black-box and model-based, and their most important characteristics are presented. Finally, a specific model for performance evaluation proposed by the authors is illustrated, and the results of an experiment with a sample application are reported. The paper concludes with a discussion of research perspectives, open problems, and the importance of performance evaluation to industrial applications.

  6. Evaluating Training and Implementation of the Individualized Meaning-Centered Approach to Teaching Braille Literacy

    ERIC Educational Resources Information Center

    Durando, Julie A.; Wormsley, Diane P.

    2009-01-01

    This study investigated the effectiveness of training workshops in braille literacy for teachers of students who are visually impaired and have additional disabilities. Participants in the training workshops in the Individualized Meaning-Centered Approach indicated general satisfaction with the training. Most reported using the approach with…

  7. Pour Savoir ou vous en etes. Evaluez vous-meme vos connaissances (Knowing Where You Are. Evaluate Your Knowledge by Yourself).

    ERIC Educational Resources Information Center

    Aiello, Angelo; And Others

    1986-01-01

    A form is presented for language teacher self-evaluation concerning attitudes and knowledge about learning theories, general linguistics, sociolinguistics, pragmatics, discourse analysis, teaching methodology, the communicative approach, class activities, class management, instructional support, and evaluation. (MSE)

  8. Evaluation of environmental aspects significance in ISO 14001.

    PubMed

    Põder, Tõnis

    2006-05-01

    The methodological framework set by the ISO 14001 and ISO 14004 standards gives only general principles for the assessment of environmental aspects, which is regarded as one of the most critical stages of implementing an environmental management system. In Estonia, about 100 organizations have been certified to ISO 14001. Experience from numerous companies has demonstrated that limited transparency and reproducibility of the assessment process is a common shortcoming. Despite the rather complicated assessment schemes sometimes used, the evaluation procedures have been largely based on subjective judgments because of ill-defined and inadequate assessment criteria. A comparison with similar studies in other countries indicates the general nature of the observed inconsistencies. The diversity of approaches to aspect assessment in the conceptual literature, and the related problems, are discussed. The general structure of basic assessment criteria, compatible with environmental impact assessment and environmental risk analysis, is also outlined. Based on this general structure, the article presents a tiered approach to help organize the assessment in a more consistent manner.

  9. Comparative Properties of Collaborative Optimization and Other Approaches to MDO

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    1999-01-01

    We discuss criteria by which one can classify, analyze, and evaluate approaches to solving multidisciplinary design optimization (MDO) problems. Central to our discussion is the often overlooked distinction between questions of formulating MDO problems and solving the resulting computational problem. We illustrate our general remarks by comparing several approaches to MDO that have been proposed.

  10. Comparative Properties of Collaborative Optimization and other Approaches to MDO

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    1999-01-01

    We discuss criteria by which one can classify, analyze, and evaluate approaches to solving multidisciplinary design optimization (MDO) problems. Central to our discussion is the often overlooked distinction between questions of formulating MDO problems and solving the resulting computational problem. We illustrate our general remarks by comparing several approaches to MDO that have been proposed.

  11. Can Rational Prescribing Be Improved by an Outcome-Based Educational Approach? A Randomized Trial Completed in Iran

    ERIC Educational Resources Information Center

    Esmaily, Hamideh M.; Silver, Ivan; Shiva, Shadi; Gargani, Alireza; Maleki-Dizaji, Nasrin; Al-Maniri, Abdullah; Wahlstrom, Rolf

    2010-01-01

    Introduction: An outcome-based education approach has been proposed to develop more effective continuing medical education (CME) programs. We have used this approach in developing an outcome-based educational intervention for general physicians working in primary care (GPs) and evaluated its effectiveness compared with a concurrent CME program in…

  12. Gold-standard evaluation of a folksonomy-based ontology learning model

    NASA Astrophysics Data System (ADS)

    Djuana, E.

    2018-03-01

    Folksonomy, a product of the collaborative tagging process, has been acknowledged for its potential to improve the categorization and searching of web resources. However, folksonomies contain ambiguities such as synonymy and polysemy, as well as differing levels of abstraction (the generality problem). To maximize their potential, methods have been proposed for associating folksonomy tags with semantics and structural relationships, such as ontology learning. This paper evaluates our previous work on ontology learning using a gold-standard evaluation approach, in comparison with a notable state-of-the-art work and several baselines. The results show that our method is comparable to the state-of-the-art work, which further validates our approach, previously validated using a task-based evaluation approach.
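
Gold-standard evaluation of a learned ontology is commonly scored by comparing the learned relations against a reference set with precision, recall, and F1. A minimal sketch of that generic metric (the tag relations below are hypothetical, and this is not necessarily the exact protocol of the paper):

```python
def precision_recall_f1(learned, gold):
    """Score a learned set of taxonomic relations against a gold standard.
    Relations are (narrower_term, broader_term) pairs."""
    true_positives = len(learned & gold)
    precision = true_positives / len(learned) if learned else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical tag-hierarchy fragments: learned from a folksonomy vs. gold.
gold = {("jazz", "music"), ("rock", "music"), ("python", "programming")}
learned = {("jazz", "music"), ("python", "programming"), ("python", "snake")}
p, r, f = precision_recall_f1(learned, gold)
print(p, r, f)  # each is 2/3 here: 2 correct of 3 learned, 2 of 3 gold found
```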

  13. Predicting general criminal recidivism in mentally disordered offenders using a random forest approach.

    PubMed

    Pflueger, Marlon O; Franke, Irina; Graf, Marc; Hachtel, Henning

    2015-03-29

    Psychiatric expert opinions are expected to assess an accused individual's risk of reoffending on a valid scientific foundation. In contrast to specific recidivism, general recidivism has received little attention in Continental Europe; we therefore aimed to develop a valid instrument for assessing the risk of general criminal recidivism in mentally ill offenders. Data from 259 mentally ill offenders with a median time at risk of 107 months were analyzed and combined with the individuals' criminal records. We derived risk factors for general criminal recidivism and classified re-offences using a random forest approach. In our sample, 51% were reconvicted. The most important predictive factors were: number of prior convictions, age, type of index offence, diversity of criminal history, and substance abuse. With our statistical approach we were able to correctly identify 58-95% of all reoffenders and 65-97% of all committed offences (AUC = .90). Our study presents a new statistical approach to forensic-psychiatric risk assessment, allowing experts to evaluate the general risk of reoffending in mentally disordered individuals, with a special focus on high-risk groups. This approach might serve not only for expert opinions in court but also for risk-management strategies and therapeutic interventions.
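
The record reports discrimination as an AUC of .90. As a hedged illustration of how such a figure is computed from predicted risk scores: the ROC AUC equals the Mann-Whitney probability that a randomly chosen positive case outranks a randomly chosen negative one. The labels and scores below are invented, not the study's data:

```python
def roc_auc(labels, scores):
    """ROC AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case scores higher than a randomly chosen
    negative one. Ties count as half a win."""
    positives = [s for y, s in zip(labels, scores) if y == 1]
    negatives = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n)
               for p in positives for n in negatives)
    return wins / (len(positives) * len(negatives))

# Hypothetical risk scores for reoffenders (1) and non-reoffenders (0).
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
auc = roc_auc(labels, scores)
print(auc)  # 8 of 9 positive/negative pairs are ranked correctly
```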

  14. Safety and Suitability for Service Assessment Testing for Surface and Underwater Launched Munitions

    DTIC Science & Technology

    2014-12-05

    test efficiency that tend to associate the Analytical S3 Test Approach with large, complex munition systems and the Empirical S3 Test Approach with...the smaller, less complex munition systems . 8.1 ANALYTICAL S3 TEST APPROACH. The Analytical S3 test approach, as shown in Figure 3, evaluates...assets than the Analytical S3 Test approach to establish the safety margin of the system . This approach is generally applicable to small munitions

  15. All-paths graph kernel for protein-protein interaction extraction with evaluation of cross-corpus learning.

    PubMed

    Airola, Antti; Pyysalo, Sampo; Björne, Jari; Pahikkala, Tapio; Ginter, Filip; Salakoski, Tapio

    2008-11-19

    Automated extraction of protein-protein interactions (PPI) is an important and widely studied task in biomedical text mining. We propose a graph-kernel-based approach for this task. In contrast to earlier approaches to PPI extraction, the introduced all-paths graph kernel can make use of full, general dependency graphs representing sentence structure. We evaluate the proposed method on five publicly available PPI corpora, providing the most comprehensive evaluation done for a machine-learning-based PPI-extraction system. We additionally perform a detailed evaluation of the effects of training and testing on different resources, providing insight into the challenges involved in applying a system beyond the data it was trained on. Our method achieves state-of-the-art performance with respect to comparable evaluations, with a 56.4 F-score and 84.8 AUC on the AImed corpus. We show that the graph kernel approach performs at a state-of-the-art level in PPI extraction, and we note a possible extension to the task of extracting complex interactions. Cross-corpus results provide further insight into how the learning generalizes beyond individual corpora. Further, we identify several pitfalls that can make evaluations of PPI-extraction systems incomparable, or even invalid, including incorrect cross-validation strategies and problems related to comparing F-score results achieved on different evaluation resources. Recommendations for avoiding these pitfalls are provided.
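
A graph kernel compares two graphs through shared substructures. A simplified sketch in the spirit of a path-based kernel: count the label sequences along all simple paths up to a fixed length in each graph, then take the dot product of the two count vectors. This is an unweighted toy variant, not the authors' all-paths kernel, and the dependency-graph fragments are hypothetical:

```python
from collections import Counter

def path_features(adjacency, labels, max_len=3):
    """Count label sequences along all simple directed paths of up to
    max_len nodes. adjacency: node -> list of successors."""
    features = Counter()
    def walk(node, path):
        features[tuple(labels[n] for n in path)] += 1
        if len(path) < max_len:
            for nxt in adjacency.get(node, []):
                if nxt not in path:          # simple paths only
                    walk(nxt, path + [nxt])
    for start in labels:
        walk(start, [start])
    return features

def path_kernel(g1, g2, max_len=3):
    """Kernel value = dot product of the two graphs' path-count vectors."""
    f1 = path_features(*g1, max_len)
    f2 = path_features(*g2, max_len)
    return sum(f1[key] * f2[key] for key in f1.keys() & f2.keys())

# Two tiny dependency-graph fragments with hypothetical token labels.
g1 = ({0: [1], 1: [2]}, {0: "PROT1", 1: "binds", 2: "PROT2"})
g2 = ({0: [1], 1: [2]}, {0: "PROT1", 1: "binds", 2: "PROT2"})
k = path_kernel(g1, g2)
print(k)  # identical graphs share all 6 path features, so k = 6
```

In a real PPI system the kernel values would feed a kernelized classifier such as an SVM; here only the similarity computation is sketched.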

  16. A topological approach unveils system invariances and broken symmetries in the brain.

    PubMed

    Tozzi, Arturo; Peters, James F

    2016-05-01

    Symmetries are widespread invariances underscoring countless systems, including the brain. A symmetry break occurs when the symmetry is present at one level of observation but is hidden at another level. In such a general framework, a concept from algebraic topology, namely, the Borsuk-Ulam theorem (BUT), comes into play and sheds new light on the general mechanisms of nervous symmetries. The BUT tells us that we can find, on an n-dimensional sphere, a pair of opposite points that have the same encoding on an n - 1 sphere. This mapping makes it possible to describe both antipodal points with a single real-valued vector on a lower dimensional sphere. Here we argue that this topological approach is useful for the evaluation of hidden nervous symmetries. This means that symmetries can be found when evaluating the brain in a proper dimension, although they disappear (are hidden or broken) when we evaluate the same brain only one dimension lower. In conclusion, we provide a topological methodology for the evaluation of the most general features of brain activity, i.e., the symmetries, cast in a physical/biological fashion that has the potential to be operationalized. © 2016 Wiley Periodicals, Inc.
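    The n = 1 case of the BUT can be checked numerically. The sketch below (our illustration, with an arbitrary function standing in for a "signal" on the circle, not the authors' method) finds an antipodal pair with equal values by bisection, using the identity g(t + π) = −g(t) for g(t) = f(t) − f(t + π).

```python
# Numerical illustration of the n = 1 Borsuk-Ulam theorem: a continuous
# real-valued function on a circle takes equal values at some antipodal
# pair. g(t) = f(t) - f(t + pi) satisfies g(t + pi) = -g(t), so g changes
# sign on [0, pi] and bisection locates a root, i.e. an antipodal match.
import math

def f(theta):
    # arbitrary continuous "signal" on the circle
    return math.sin(theta + 0.7) + 0.5 * math.cos(2 * theta + 0.3)

def antipodal_pair(f, tol=1e-10):
    g = lambda t: f(t) - f(t + math.pi)
    lo, hi = 0.0, math.pi
    if g(lo) == 0.0:
        return lo
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

t = antipodal_pair(f)
print(t, f(t), f(t + math.pi))  # the two values agree
```

    For this particular f the matched pair sits at t = π − 0.7 and its antipode.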

  17. Evaluating Social Programs at the State and Local Level. The JTPA Evaluation Design Project.

    ERIC Educational Resources Information Center

    Blalock, Ann Bonar, Ed.; And Others

    This book on evaluating social programs is an outcome of the Job Training Partnership Act (JTPA) Evaluation Design Project, which produced a set of 10 guides for the evaluation of state and local JTPA programs. This book distills ideas from these guides and applies them to a larger context. Part 1 presents a general approach to program evaluation…

  18. A systems modeling methodology for evaluation of vehicle aggressivity in the automotive accident environment

    DOT National Transportation Integrated Search

    2001-03-05

    A systems modeling approach is presented for assessment of harm in the automotive accident environment. The methodology is presented in general form and then applied to evaluate vehicle aggressivity in frontal crashes. The methodology consists of par...

  19. Analytic evaluation of the weighting functions for remote sensing of blackbody planetary atmospheres : the case of limb viewing geometry

    NASA Technical Reports Server (NTRS)

    Ustinov, Eugene A.

    2006-01-01

    In a recent publication (Ustinov, 2002), we proposed an analytic approach to evaluation of radiative and geophysical weighting functions for remote sensing of a blackbody planetary atmosphere, based on general linearization approach applied to the case of nadir viewing geometry. In this presentation, the general linearization approach is applied to the limb viewing geometry. The expressions, similar to those obtained in (Ustinov, 2002), are obtained for weighting functions with respect to the distance along the line of sight. Further on, these expressions are converted to the expressions for weighting functions with respect to the vertical coordinate in the atmosphere. Finally, the numerical representation of weighting functions in the form of matrices of partial derivatives of grid limb radiances with respect to the grid values of atmospheric parameters is used for a convolution with the finite field of view of the instrument.

  20. Issues in evaluation: evaluating assessments of elderly people using a combination of methods.

    PubMed

    McEwan, R T

    1989-02-01

    In evaluating a health service, individuals will give differing accounts of its performance, according to their experiences of the service, and the evaluative perspective they adopt. The value of a service may also change through time, and according to the particular part of the service studied. Traditional health care evaluations have generally not accounted for this variability because of the approaches used. Studies evaluating screening or assessment programmes for the elderly have focused on programme effectiveness and efficiency, using relatively inflexible quantitative methods. Evaluative approaches must reflect the complexity of health service provision, and methods must vary to suit the particular research objective. Under these circumstances, this paper presents the case for the use of multiple triangulation in evaluative research, where differing methods and perspectives are combined in one study. Emphasis is placed on the applications and benefits of subjectivist approaches in evaluation. An example of combined methods is provided in the form of an evaluation of the Newcastle Care Plan for the Elderly.

  1. An accurate evaluation of the performance of asynchronous DS-CDMA systems with zero-correlation-zone coding in Rayleigh fading

    NASA Astrophysics Data System (ADS)

    Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.

    2010-04-01

    An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance for generalized asynchronous DS-CDMA systems, in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work has been contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions, with non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system modeling was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence; while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2); where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be carried out over the full range of binary coding, with minimal computational complexity.

  2. AIBS Education Review, Volume 2 Number 2.

    ERIC Educational Resources Information Center

    Creager, Joan G., Ed.

    A review of some recent educational and research activities is presented in this publication. Major articles compiled in this review include: An Innovative Approach to Laboratory Instruction; An Evaluation of the Mastery Strategy for General Biology Students; Food Science as a General Education Course in Biological Science; The Phase Achievement…

  3. Evaluation of DNA mixtures from database search.

    PubMed

    Chung, Yuk-Ka; Hu, Yue-Qing; Fung, Wing K

    2010-03-01

    With the aim of bridging the gap between DNA mixture analysis and DNA database search, a novel approach is proposed to evaluate the forensic evidence of DNA mixtures when the suspect is identified by the search of a database of DNA profiles. General formulae are developed for the calculation of the likelihood ratio for a two-person mixture under general situations including multiple matches and imperfect evidence. The influence of the prior probabilities on the weight of evidence under the scenario of multiple matches is demonstrated by a numerical example based on Hong Kong data. Our approach is shown to be capable of presenting the forensic evidence of DNA mixtures in a comprehensive way when the suspect is identified through database search.
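    The role of the prior probabilities can be sketched with a plain Bayes update (illustrative numbers only; the paper's general formulae for two-person mixtures and the Hong Kong data are not reproduced here):

```python
# Hedged sketch of how priors shift the weight of evidence when a database
# search yields multiple matches: posterior odds = likelihood ratio (LR)
# times prior odds. The LR and priors below are invented for illustration.

def posterior_prob(lr, prior):
    """Posterior probability that the matched suspect is a contributor."""
    prior_odds = prior / (1.0 - prior)
    post_odds = lr * prior_odds
    return post_odds / (1.0 + post_odds)

lr = 1.0e4  # assumed evidential strength of the mixture for one suspect
for prior in (0.5, 0.01, 1.0e-4):  # smaller priors, e.g. many candidate matches
    print(prior, posterior_prob(lr, prior))
```

    With LR = 10^4, a 0.5 prior yields a posterior near certainty, while a 10^-4 prior leaves it near 0.5 — illustrating how strongly the prior matters under multiple matches.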

  4. 48 CFR 5215.605 - Evaluation factors.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... appropriate premiums for measured increments of quality. When a cost/benefit approach is used, cost must carry... recent Forward Pricing Rate Agreements (FPRAs)). (iv) Cost realism evaluation generally will be performed... realism data are required, the contracting officer shall not request a formal field pricing report but...

  5. A Theoretical and Methodological Evaluation of Leadership Research.

    ERIC Educational Resources Information Center

    Lashbrook, Velma J.; Lashbrook, William B.

    This paper isolates some of the strengths and weaknesses of leadership research by evaluating it from both a theoretical and methodological perspective. The seven theories or approaches examined are: great man, trait, situational, style, functional, social influence, and interaction positions. General theoretical, conceptual, and measurement…

  6. Economic evaluation of HCV testing approaches in low and middle income countries.

    PubMed

    Morgan, Jake R; Servidone, Maria; Easterbrook, Philippa; Linas, Benjamin P

    2017-11-01

    Hepatitis C virus (HCV) infection represents a major public health burden with diverse epidemics worldwide, but at present, only a minority of infected persons have been tested and are aware of their diagnosis. The advent of highly effective direct acting antiviral (DAA) therapy, which is becoming available at increasingly lower costs in low and middle income countries (LMICs), represents a major opportunity to expand access to testing and treatment. However, there is uncertainty as to the optimal testing approaches and who to prioritize for testing. We undertook a narrative review of the cost-effectiveness literature on different testing approaches for chronic hepatitis C infection to inform decision-making and formulation of recommendations in the 2017 World Health Organization (WHO) viral hepatitis testing guidelines. We undertook a focused search and narrative review of the literature for cost-effectiveness studies of testing approaches in three main groups: 1) focused testing of specific high-risk groups (defined as those who are part of a population with higher seroprevalence or who have a history of exposure or high-risk behaviours); 2) "birth cohort" testing among easily identified age groups (i.e. specific birth cohorts) known to have a high prevalence of HCV infection; and 3) routine testing in the general population. Articles included were those indexed in PubMed, written in English, and published after 2000. We identified 26 eligible studies. Twenty-four of them were from Europe (n = 14) or the United States (n = 10). There was only one study from an LMIC (Egypt), and this evaluated general population testing. Thirteen studies evaluated focused testing among specific groups at high risk for HCV infection: nine among persons who inject drugs (PWID), five among people in prison, and one among HIV-infected men who have sex with men (MSM). Eight studies evaluated birth cohort testing, and five evaluated testing in the general population. Most studies were based on a one-time testing intervention, but in one study testing was undertaken every 5 years, and in another, among HIV-infected MSM, testing was more frequent. Comparators were generally either: 1) no testing, 2) the status quo, or 3) multiple different strategies. Overall, we found broad agreement that focused testing of high-risk groups such as persons who inject drugs and men who have sex with men was cost-effective, as was birth cohort testing. Key drivers of cost-effectiveness were the prevalence of HCV infection in these groups, efficacy and cost of treatment, stage of disease and linkage to care. The evidence for routine population testing was mixed, and the cost-effectiveness depends largely on the prevalence of HCV. The evidence base for different HCV testing approaches in LMICs is limited, minimizing the contribution of cost-effectiveness data alone to decision-making and recommendations on testing approaches in the 2017 WHO viral hepatitis testing guidelines. Overall, the guidelines recommended focused testing in high-risk groups, particularly PWID, prisoners, and men who have sex with men; with consideration of two other approaches: birth cohort testing in those countries with epidemiological evidence of a significant birth cohort effect; and routine access to testing across the general population in those countries with a high HCV seroprevalence above 2%-5%. Further implementation research on different testing approaches is needed in order to help guide national policy planning.

  7. Innovative Approaches to Assessment of Results of Higher School Students Training

    ERIC Educational Resources Information Center

    Vaganova, Olga I.; Medvedeva, Tatiana Yu.; Kirdyanova, Elena R.; Kazantseva, Galina A.; Karpukova, Albina A.

    2016-01-01

    The basis for selecting assessment tools to control and evaluate training results, in accordance with the requirements of the modular-competence approach, is presented. Experience with implementing these assessment tools in the "General and professional pedagogy" course is described. The objective of the study is rationale…

  8. Least Squares Method for Equating Logistic Ability Scales: A General Approach and Evaluation. Iowa Testing Programs Occasional Papers, Number 30.

    ERIC Educational Resources Information Center

    Haebara, Tomokazu

    When several ability scales in item response models are separately derived from different test forms administered to different samples of examinees, these scales must be equated to a common scale because their units and origins are arbitrarily determined and generally different from scale to scale. A general method for equating logistic ability…
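    A simplified version of such a least squares linking criterion can be sketched as follows (our toy instantiation, not the paper's procedure: synthetic 2PL items, a known true transform, and a grid search in place of a proper optimizer):

```python
# Sketch of scale linking in a least squares spirit: find the linear
# transform theta_Y = A*theta_X + B that makes item characteristic curves
# (ICCs) from two separate calibrations agree. All numbers are invented.
import math

def icc(theta, a, b):
    # two-parameter logistic item characteristic curve
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# hypothetical items calibrated on scale X ...
items_x = [(1.0, -0.5), (1.5, 0.3), (0.8, 1.1)]

# ... and the same items calibrated on scale Y, generated with a "true"
# transform that the equating procedure is supposed to recover:
A_true, B_true = 1.2, 0.5
items_y = [(a / A_true, A_true * b + B_true) for a, b in items_x]

thetas = [-3 + 0.25 * i for i in range(25)]  # ability quadrature grid

def loss(A, B):
    # summed squared ICC differences after mapping Y parameters to scale X
    s = 0.0
    for (ax, bx), (ay, by) in zip(items_x, items_y):
        for t in thetas:
            s += (icc(t, ax, bx) - icc(t, ay * A, (by - B) / A)) ** 2
    return s

# crude grid search standing in for a proper least squares optimizer
best = min(((loss(A, B), A, B)
            for A in [0.8 + 0.05 * i for i in range(17)]
            for B in [0.05 * j for j in range(21)]),
           key=lambda x: x[0])
print(best)  # loss near zero at A close to 1.2, B close to 0.5
```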

  9. Evaluation of image quality

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    This presentation outlines in viewgraph format a general approach to the evaluation of display system quality for aviation applications. This approach is based on the assumption that it is possible to develop a model of the display which captures most of the significant properties of the display. The display characteristics should include spatial and temporal resolution, intensity quantizing effects, spatial sampling, delays, etc. The model must be sufficiently well specified to permit generation of stimuli that simulate the output of the display system. The first step in the evaluation of display quality is an analysis of the tasks to be performed using the display. Thus, for example, if a display is used by a pilot during a final approach, the aesthetic aspects of the display may be less relevant than its dynamic characteristics. The opposite task requirements may apply to imaging systems used for displaying navigation charts. Thus, display quality is defined with regard to one or more tasks. Given a set of relevant tasks, there are many ways to approach display evaluation. The range of evaluation approaches includes visual inspection, rapid evaluation, part-task simulation, and full mission simulation. The work described is focused on two complementary approaches to rapid evaluation. The first approach is based on a model of the human visual system. A model of the human visual system is used to predict the performance of the selected tasks. The model-based evaluation approach permits very rapid and inexpensive evaluation of various design decisions. The second rapid evaluation approach employs specifically designed critical tests that embody many important characteristics of actual tasks. These are used in situations where a validated model is not available. These rapid evaluation tests are being implemented in a workstation environment.

  10. Participatory Design, User Involvement and Health IT Evaluation.

    PubMed

    Kushniruk, Andre; Nøhr, Christian

    2016-01-01

    End user involvement and input into the design and evaluation of information systems has been recognized as being a critical success factor in the adoption of information systems. Nowhere is this need more critical than in the design of health information systems. Consistent with evidence from the general software engineering literature, the degree of user input into design of complex systems has been identified as one of the most important factors in the success or failure of complex information systems. The participatory approach goes beyond user-centered design and co-operative design approaches to include end users as more active participants in design ideas and decision making. Proponents of participatory approaches argue for greater end user participation in both design and evaluative processes. Evidence regarding the effectiveness of increased user involvement in design is explored in this contribution in the context of health IT. The contribution will discuss several approaches to including users in design and evaluation. Challenges in IT evaluation during participatory design will be described and explored along with several case studies.

  11. Practical Evaluation and Management of Atrophic Acne Scars

    PubMed Central

    2011-01-01

    Atrophic acne scarring is an unfortunate, permanent complication of acne vulgaris, which may be associated with significant psychological distress. General dermatologists are frequently presented with the challenge of evaluating and providing treatment recommendations to patients with acne scars. This article reviews a practical, step-by-step approach to evaluating the patient with atrophic acne scars. An algorithm for providing treatment options is presented, along with pitfalls to avoid. A few select procedures that may be incorporated into a general dermatology practice are reviewed in greater detail, including filler injections, skin needling, and the punch excision. PMID:21909457

  12. The Number of Feedbacks Needed for Reliable Evaluation. A Multilevel Analysis of the Reliability, Stability and Generalisability of Students' Evaluation of Teaching

    ERIC Educational Resources Information Center

    Rantanen, Pekka

    2013-01-01

    A multilevel analysis approach was used to analyse students' evaluation of teaching (SET). The low value of inter-rater reliability stresses that any solid conclusions on teaching cannot be made on the basis of single feedbacks. To assess a teacher's general teaching effectiveness, one needs to evaluate four randomly chosen course implementations.…
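    One classical way to reason about how many feedbacks are needed is the Spearman-Brown formula (our back-of-envelope sketch, not the paper's multilevel model; the single-feedback reliability below is an assumed value):

```python
# Spearman-Brown: reliability of the mean of k feedbacks, given an
# assumed single-feedback reliability r (e.g. an intraclass correlation).

def reliability_of_mean(r, k):
    return k * r / (1.0 + (k - 1) * r)

def feedbacks_needed(r, target):
    k = 1
    while reliability_of_mean(r, k) < target:
        k += 1
    return k

r_single = 0.5  # assumed inter-rater reliability of a single course's SET
print(feedbacks_needed(r_single, 0.8))  # with r = 0.5, four feedbacks suffice
```

    Under this assumed r, four course implementations reach a reliability of 0.8 for the mean rating; the paper's own estimate rests on its multilevel analysis, not on this formula.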

  13. Counselors' Evaluation of Rogers-Perls-Ellis's Relationship Skills

    ERIC Educational Resources Information Center

    Woodward, Wallace S.; And Others

    1975-01-01

    Participants (12 employment counselors and 10 counselor supervisors) attending a three-week workshop on enhancing relationship skills, evaluated the Rogers, Perls, Ellis film, Three Approaches to Psychotherapy, on 15 skills. Results indicate there was general agreement between the counselors and the supervisors when judging levels of therapist…

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Rij, Jennifer A; Yu, Yi-Hsiang; Guo, Yi

    This study explores and verifies the generalized body-modes method for evaluating the structural loads on a wave energy converter (WEC). Historically, WEC design methodologies have focused primarily on accurately evaluating hydrodynamic loads, while methodologies for evaluating structural loads have yet to be fully considered and incorporated into the WEC design process. As wave energy technologies continue to advance, however, it has become increasingly evident that an accurate evaluation of the structural loads will enable an optimized structural design, as well as the potential utilization of composites and flexible materials, and hence reduce WEC costs. Although there are many computational fluid dynamics, structural analysis, and fluid-structure-interaction (FSI) codes available, the application of these codes is typically too computationally intensive to be practical in the early stages of the WEC design process. The generalized body-modes method, however, is a reduced-order, linearized, frequency-domain FSI approach, performed in conjunction with the linear hydrodynamic analysis, with computation times that could realistically be incorporated into the WEC design process. The objective of this study is to verify the generalized body-modes approach in comparison to high-fidelity FSI simulations to accurately predict structural deflections and stress loads in a WEC. Two verification cases are considered, a free-floating barge and a fixed-bottom column. Details for both the generalized body-modes models and FSI models are first provided. Results for each of the models are then compared and discussed. Finally, based on the verification results obtained, future plans for incorporating the generalized body-modes method into the WEC simulation tool, WEC-Sim, and the overall WEC design process are discussed.

  15. Priority screening of toxic chemicals and industry sectors in the U.S. toxics release inventory: a comparison of the life cycle impact-based and risk-based assessment tools developed by U.S. EPA.

    PubMed

    Lim, Seong-Rin; Lam, Carl W; Schoenung, Julie M

    2011-09-01

    Life Cycle Impact Assessment (LCIA) and Risk Assessment (RA) employ different approaches to evaluate toxic impact potential for their own general applications. LCIA is often used to evaluate toxicity potentials for corporate environmental management and RA is often used to evaluate a risk score for environmental policy in government. This study evaluates the cancer, non-cancer, and ecotoxicity potentials and risk scores of chemicals and industry sectors in the United States on the basis of the LCIA- and RA-based tools developed by U.S. EPA, and compares the priority screening of toxic chemicals and industry sectors identified with each method to examine whether the LCIA- and RA-based results lead to the same prioritization schemes. The Tool for the Reduction and Assessment of Chemical and other environmental Impacts (TRACI) is applied as an LCIA-based screening approach with a focus on air and water emissions, and the Risk-Screening Environmental Indicator (RSEI) is applied in equivalent fashion as an RA-based screening approach. The U.S. Toxic Release Inventory is used as the dataset for this analysis, because of its general applicability to a comprehensive list of chemical substances and industry sectors. Overall, the TRACI and RSEI results do not agree with each other in part due to the unavailability of characterization factors and toxic scores for select substances, but primarily because of their different evaluation approaches. Therefore, TRACI and RSEI should be used together both to support a more comprehensive and robust approach to screening of chemicals for environmental management and policy and to highlight substances that are found to be of concern from both perspectives. Copyright © 2011 Elsevier Ltd. All rights reserved.
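    The kind of disagreement the study reports can be quantified with a rank correlation. The sketch below (hypothetical scores, not actual TRACI or RSEI output; no tied scores assumed) computes Spearman's rho between two priority rankings:

```python
# Compare two screening tools by ranking the same chemicals under each
# score and measuring rank agreement with Spearman's rho (no ties).

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i], reverse=True)
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

lcia_score = [9.1, 4.0, 7.5, 1.2, 6.3]   # hypothetical TRACI-style scores
risk_score = [2.0, 8.8, 7.0, 1.0, 9.5]   # hypothetical RSEI-style scores
print(spearman(lcia_score, risk_score))  # low value: the priorities disagree
```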

  16. A method for evaluating models that use galaxy rotation curves to derive the density profiles

    NASA Astrophysics Data System (ADS)

    de Almeida, Álefe O. F.; Piattella, Oliver F.; Rodrigues, Davi C.

    2016-11-01

    There are some approaches, either based on General Relativity (GR) or modified gravity, that use galaxy rotation curves to derive the matter density of the corresponding galaxy, and this procedure would either indicate a partial or a complete elimination of dark matter in galaxies. Here we review these approaches, clarify the difficulties on this inverted procedure, present a method for evaluating them, and use it to test two specific approaches that are based on GR: the Cooperstock-Tieu (CT) and the Balasin-Grumiller (BG) approaches. Using this new method, we find that neither of the tested approaches can satisfactorily fit the observational data without dark matter. The CT approach results can be significantly improved if some dark matter is considered, while for the BG approach no usual dark matter halo can improve its results.

  17. Using Art For Health Promotion: Evaluating an In-School Program Through Student Perspectives.

    PubMed

    McKay, Fiona H; McKenzie, Hayley

    2017-09-01

    The value of incorporating arts-based approaches into health promotion programs has long been recognized as useful in affecting change. Such approaches have been used in many schools across Australia and have been found to promote general well-being and mental health. Despite these positive findings, few programs have used or evaluated an integrated arts-based approach to achieve health and well-being goals. This article presents the findings of an evaluation of an integrated arts-based program focused on creativity and improving well-being in students. The findings of this evaluation suggest that students who took part in the program were more interested in art and music at the end of the program and had gained an overall increase in awareness and mindfulness and a positivity toward leisure activities. This evaluation provides some evidence to suggest that this type of program is a promising way to promote well-being in schools.

  18. Kinetic Rate Kernels via Hierarchical Liouville-Space Projection Operator Approach.

    PubMed

    Zhang, Hou-Dao; Yan, YiJing

    2016-05-19

    Kinetic rate kernels in general multisite systems are formulated on the basis of a nonperturbative quantum dissipation theory, the hierarchical equations of motion (HEOM) formalism, together with the Nakajima-Zwanzig projection operator technique. The present approach exploits the HEOM-space linear algebra. The quantum non-Markovian site-to-site transfer rate can be faithfully evaluated via projected HEOM dynamics. The developed method is exact, as evident by the comparison to the direct HEOM evaluation results on the population evolution.

  19. The Strategic Evaluation of Regional Development in Higher Education

    ERIC Educational Resources Information Center

    Kettunen, Juha

    2004-01-01

    The study analyses the role of regional development in higher education using the approach of the balanced scorecard, which provides a framework for organizations to describe and communicate their strategy. It turns out that the balanced scorecard is not only an approach for implementing the strategy, but it also provides a general framework for…

  20. An evaluation of risk estimation procedures for mixtures of carcinogens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hwang, J.S.; Chen, J.J.

    1999-12-01

    The estimation of health risks from exposure to a mixture of chemical carcinogens is generally based on the combination of information from several available single-compound studies. The current practice of directly summing the upper bound risk estimates of individual carcinogenic components as an upper bound on the total risk of a mixture is known to be generally too conservative. Gaylor and Chen (1996, Risk Analysis) proposed a simple procedure to compute an upper bound on the total risk using only the upper confidence limits and central risk estimates of individual carcinogens. The Gaylor-Chen procedure was derived based on an underlying assumption of normality for the distributions of individual risk estimates. In this paper the authors evaluated the Gaylor-Chen approach in terms of the coverage of the upper confidence limits on the true risks of individual carcinogens. In general, if the coverage probabilities for the individual carcinogens are all approximately equal to the nominal level, then the Gaylor-Chen approach should perform well. However, the Gaylor-Chen approach can be conservative or anti-conservative if some or all individual upper confidence limit estimates are conservative or anti-conservative.
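    As we read the abstract, the Gaylor-Chen bound replaces the direct sum of upper confidence limits (UCLs) with the sum of central estimates plus the UCL half-widths combined in quadrature, which is valid under the normality assumption. A sketch with invented numbers:

```python
# Hedged sketch of the Gaylor-Chen style combination; all risk values
# below are illustrative, not from any study.
import math

central = [1e-6, 5e-7, 2e-6]  # central risk estimates, one per carcinogen
ucl     = [4e-6, 2e-6, 6e-6]  # individual upper confidence limits

naive_bound = sum(ucl)        # current practice: generally too conservative
gc_bound = (sum(central)
            + math.sqrt(sum((u - c) ** 2 for u, c in zip(ucl, central))))
print(naive_bound, gc_bound)  # the quadrature bound is tighter
```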

  1. Determining Safety Inspection Thresholds for Employee Incentives Programs on Construction Sites

    PubMed Central

    Sparer, Emily; Dennerlein, Jack

    2017-01-01

    The goal of this project was to evaluate approaches of determining the numerical value of a safety inspection score that would activate a reward in an employee safety incentive program. Safety inspections are a reflection of the physical working conditions at a construction site and provide a safety score that can be used in incentive programs to reward workers. Yet it is unclear what level of safety should be used when implementing this kind of program. This study explored five ways of grouping safety inspection data collected during 19 months at Harvard University-owned construction projects. Each approach grouped the data by one of the following: owner, general contractor, project, trade, or subcontractor. The median value for each grouping provided the threshold score. These five approaches were then applied to data from a completed project in order to calculate the frequency and distribution of rewards in a monthly safety incentive program. The application of each approach was evaluated qualitatively for consistency, competitiveness, attainability, and fairness. The owner-specific approach resulted in a threshold score of 96.3% and met all of the qualitative evaluation goals. It had the most competitive reward distribution (only 1/3 of the project duration) yet it was also attainable. By treating all workers equally and maintaining the same value throughout the project duration, this approach was fair and consistent. The owner-based approach for threshold determination can be used by owners or general contractors when creating leading indicator incentives programs and by researchers in future studies on incentive program effectiveness. PMID:28638178
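    The owner-specific thresholding described above can be sketched in a few lines (scores and grouping are invented for illustration):

```python
# Group monthly safety inspection scores by owner, take the group median
# as the incentive threshold, then count months that earn a reward.
from statistics import median

# (owner, safety score %) pairs from monthly inspections -- hypothetical
inspections = [("owner_a", 97.1), ("owner_a", 95.4), ("owner_a", 96.3),
               ("owner_a", 98.0), ("owner_a", 94.9)]

scores = [s for owner, s in inspections if owner == "owner_a"]
threshold = median(scores)
rewards = sum(1 for s in scores if s > threshold)
print(threshold, rewards)  # months above the owner-wide median get a reward
```

    The same grouping logic applies to the other four approaches (general contractor, project, trade, subcontractor) by changing the key used to filter the pairs.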

  2. Intelligent interface design and evaluation

    NASA Technical Reports Server (NTRS)

    Greitzer, Frank L.

    1988-01-01

    Intelligent interface concepts and systematic approaches to assessing their functionality are discussed. Four general features of intelligent interfaces are described: interaction efficiency, subtask automation, context sensitivity, and use of an appropriate design metaphor. Three evaluation methods are discussed: Functional Analysis, Part-Task Evaluation, and Operational Testing. Design and evaluation concepts are illustrated with examples from a prototype expert system interface for environmental control and life support systems for manned space platforms.

  3. Generalized ray tracing method for the calculation of the peripheral refraction induced by an ophthalmic lens

    NASA Astrophysics Data System (ADS)

    Rojo, Pilar; Royo, Santiago; Caum, Jesus; Ramírez, Jorge; Madariaga, Ines

    2015-02-01

    Peripheral refraction, the refractive error present outside the main direction of gaze, has lately attracted interest due to its alleged relationship with the progression of myopia. The ray tracing procedures involved in its calculation need to follow an approach different from those used in conventional ophthalmic lens design, where refractive errors are compensated only in the main direction of gaze. We present a methodology for the evaluation of the peripheral refractive error in ophthalmic lenses, adapting the conventional generalized ray tracing approach to the requirements of the evaluation of peripheral refraction. The nodal point of the eye and a retinal conjugate surface will be used to evaluate the three-dimensional distribution of refractive error around the fovea. The proposed approach enables us to calculate the three-dimensional peripheral refraction induced by any ophthalmic lens at any direction of gaze and to personalize the lens design to the requirements of the user. The complete evaluation process for a given user prescribed with a -5.76D ophthalmic lens for foveal vision is detailed, and comparative results are presented for cases in which the geometry of the lens is modified and in which the central refractive error is over- or undercorrected. The methodology is also applied to an emmetropic eye to show its application for refractive errors other than myopia.

  4. Towards a Generalizable Time Expression Model for Temporal Reasoning in Clinical Notes

    PubMed Central

    Velupillai, Sumithra; Mowery, Danielle L.; Abdelrahman, Samir; Christensen, Lee; Chapman, Wendy W.

    2015-01-01

    Accurate temporal identification and normalization is imperative for many biomedical and clinical tasks such as generating timelines and identifying phenotypes. A major natural language processing challenge is developing and evaluating a generalizable temporal modeling approach that performs well across corpora and institutions. Our long-term goal is to create such a model. We initiate our work on reaching this goal by focusing on temporal expression (TIMEX3) identification. We present a systematic approach to 1) generalize existing solutions for automated TIMEX3 span detection, and 2) assess similarities and differences by various instantiations of TIMEX3 models applied on separate clinical corpora. When evaluated on the 2012 i2b2 and the 2015 Clinical TempEval challenge corpora, our conclusion is that our approach is successful – we achieve competitive results for automated classification, and we identify similarities and differences in TIMEX3 modeling that will be informative in the development of a simplified, general temporal model. PMID:26958265
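    The TIMEX3 span-detection step can be pictured with a toy rule-based sketch. The patterns below are illustrative assumptions for a few clinical time expressions, not the models evaluated in the paper:

```python
import re

# Illustrative (hypothetical) patterns for a few clinical time expressions.
TIMEX_PATTERNS = [
    r"\b\d{1,2}/\d{1,2}/\d{2,4}\b",                       # e.g. 03/15/2012
    r"\b(?:January|February|March|April|May|June|July|August|"
    r"September|October|November|December)\s+\d{1,2},?\s+\d{4}\b",
    r"\b(?:yesterday|today|tomorrow)\b",
    r"\b\d+\s+(?:day|week|month|year)s?\s+ago\b",         # e.g. 2 weeks ago
]
TIMEX_RE = re.compile("|".join(TIMEX_PATTERNS), re.IGNORECASE)

def find_timex_spans(text):
    """Return (start, end, surface form) for each matched time expression."""
    return [(m.start(), m.end(), m.group()) for m in TIMEX_RE.finditer(text)]
```

    Real TIMEX3 systems combine many such rules (or learned models) and then normalize each span to a calendar value; the sketch covers only span detection.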

  5. Learning Grasp Context Distinctions that Generalize

    NASA Technical Reports Server (NTRS)

    Platt, Robert; Grupen, Roderic A.; Fagg, Andrew H.

    2006-01-01

    Control-based approaches to grasp synthesis create grasping behavior by sequencing and combining control primitives. In the absence of any other structure, these approaches must evaluate a large number of feasible control sequences as a function of object shape, object pose, and task. This work explores a new approach to grasp synthesis that limits consideration to variations on a generalized localize-reach-grasp control policy. A new learning algorithm, known as schema structured learning, is used to learn which instantiations of the generalized policy are most likely to lead to a successful grasp in different problem contexts. Two experiments are described where Dexter, a bimanual upper torso, learns to select an appropriate grasp strategy as a function of object eccentricity and orientation. In addition, it is shown that grasp skills learned in this way can generalize to new objects. Results are presented showing that after learning how to grasp a small, representative set of objects, the robot's performance quantitatively improves for similar objects that it has not experienced before.

  6. Psychotherapy approaches for adult survivors of childhood sexual abuse: an integrative review of outcomes research.

    PubMed

    Martsolf, Donna S; Draucker, Claire B

    2005-10-01

    This review synthesized results of 26 outcomes research studies and two meta-analyses that evaluated abuse-focused psychotherapy techniques for survivors of childhood sexual abuse. Different therapeutic approaches delivered in individual, group, or combination formats were evaluated with pre/post-test, quasi-experimental, or randomized controlled designs. Accumulated research findings suggest that abuse-focused psychotherapy for adults sexually abused as children is generally beneficial in reducing psychiatric distress, depression, and trauma-specific symptoms. No one therapeutic approach was demonstrated to be superior. There was little evidence about the effectiveness of individual versus group therapy or the optimal treatment duration.

  7. On the Formulation of Anisotropic-Polyaxial Failure Criteria: A Comparative Study

    NASA Astrophysics Data System (ADS)

    Parisio, Francesco; Laloui, Lyesse

    2018-02-01

    The correct representation of the failure of geomaterials that feature strength anisotropy and polyaxiality is crucial for many applications. In this contribution, we propose and evaluate through a comparative study a generalized framework that covers both features. Polyaxiality of strength is modeled with a modified Van Eekelen approach, while the anisotropy is modeled using a fabric tensor approach of the Pietruszczak and Mroz type. Both approaches share the same philosophy as they can be applied to simpler failure surfaces, allowing great flexibility in model formulation. The new failure surface is tested against experimental data and its performance compared against classical failure criteria commonly used in geomechanics. Our study finds that the global error between predictions and data is generally smaller for the proposed framework compared to other classical approaches.

  8. Evaluation Framework for Telemedicine Using the Logical Framework Approach and a Fishbone Diagram

    PubMed Central

    2015-01-01

    Objectives Technological advances using telemedicine and telehealth are growing in healthcare fields, but the evaluation framework for them is inconsistent and limited. This paper suggests a comprehensive evaluation framework for telemedicine system implementation and will support related stakeholders' decision-making by promoting general understanding, and resolving arguments and controversies. Methods This study focused on developing a comprehensive evaluation framework by summarizing themes across the range of evaluation techniques and organizing foundational evaluation frameworks generally applicable across studies and cases of diverse telemedicine. Evaluation factors related to aspects of information technology, the satisfaction of service providers and consumers, cost, quality, and information security are organized using the fishbone diagram. Results It was not easy to develop a monitoring and evaluation framework for telemedicine since evaluation frameworks for telemedicine are very complex with many potential inputs, activities, outputs, outcomes, and stakeholders. A conceptual framework was developed that incorporates the key dimensions that need to be considered in the evaluation of telehealth implementation for a formal structured approach to the evaluation of a service. The suggested framework consists of six major dimensions and the subsequent branches for each dimension. Conclusions To implement telemedicine and telehealth services, stakeholders should make decisions based on sufficient evidence in quality and safety measured by the comprehensive evaluation framework. Further work would be valuable in applying more comprehensive evaluations to verify and improve the comprehensive framework across a variety of contexts with more factors and participant group dimensions. PMID:26618028

  9. Evaluation Framework for Telemedicine Using the Logical Framework Approach and a Fishbone Diagram.

    PubMed

    Chang, Hyejung

    2015-10-01

    Technological advances using telemedicine and telehealth are growing in healthcare fields, but the evaluation framework for them is inconsistent and limited. This paper suggests a comprehensive evaluation framework for telemedicine system implementation and will support related stakeholders' decision-making by promoting general understanding, and resolving arguments and controversies. This study focused on developing a comprehensive evaluation framework by summarizing themes across the range of evaluation techniques and organizing foundational evaluation frameworks generally applicable across studies and cases of diverse telemedicine. Evaluation factors related to aspects of information technology, the satisfaction of service providers and consumers, cost, quality, and information security are organized using the fishbone diagram. It was not easy to develop a monitoring and evaluation framework for telemedicine since evaluation frameworks for telemedicine are very complex with many potential inputs, activities, outputs, outcomes, and stakeholders. A conceptual framework was developed that incorporates the key dimensions that need to be considered in the evaluation of telehealth implementation for a formal structured approach to the evaluation of a service. The suggested framework consists of six major dimensions and the subsequent branches for each dimension. To implement telemedicine and telehealth services, stakeholders should make decisions based on sufficient evidence in quality and safety measured by the comprehensive evaluation framework. Further work would be valuable in applying more comprehensive evaluations to verify and improve the comprehensive framework across a variety of contexts with more factors and participant group dimensions.

  10. Evaluating a Policing Strategy Intended to Disrupt an Illicit Street-Level Drug Market

    ERIC Educational Resources Information Center

    Corsaro, Nicholas; Brunson, Rod K.; McGarrell, Edmund F.

    2010-01-01

    The authors examined a strategic policing initiative that was implemented in a high crime Nashville, Tennessee neighborhood by utilizing a mixed-methodological evaluation approach in order to provide (a) a descriptive process assessment of program fidelity; (b) an interrupted time-series analysis relying upon generalized linear models; (c)…

  11. Design, Implementation, and Evaluation of a Flipped Format General Chemistry Course

    ERIC Educational Resources Information Center

    Weaver, Gabriela C.; Sturtevant, Hannah G.

    2015-01-01

    Research has consistently shown that active problem-solving in a collaborative environment supports more effective learning than the traditional lecture approach. In this study, a flipped classroom format was implemented and evaluated in the chemistry majors' sequence at Purdue University over a period of three years. What was formerly lecture…

  12. The child with developmental delay: An approach to etiology

    PubMed Central

    Meschino, Wendy S

    2003-01-01

    OBJECTIVE: To describe an approach to history, physical examination and investigation for the developmentally delayed child. METHODS: A review of electronic databases from 1997 to 2001 was done searching for articles relating to the approach to or investigations of children with developmental delay. Five studies, including a review of a consensus conference on evaluation of mental retardation, were chosen because of their general approaches to developmental delay and/or mental retardation, or specific evaluations of a particular laboratory investigation. CONCLUSIONS: A diagnosis or cause of mental retardation can be identified in 20% to 60% of cases. Evaluation of the developmentally delayed child should include a detailed history and physical examination, taking special care to record a three-generation pedigree, as well as to look for dysmorphic features. If no other cause is apparent, routine investigations should include a chromosome study and fragile X studies. Further investigations are warranted depending on the clinical features. PMID:20011550

  13. About the Complexities of Video-Based Assessments: Theoretical and Methodological Approaches to Overcoming Shortcomings of Research on Teachers' Competence

    ERIC Educational Resources Information Center

    Kaiser, Gabriele; Busse, Andreas; Hoth, Jessica; König, Johannes; Blömeke, Sigrid

    2015-01-01

    Research on the evaluation of the professional knowledge of mathematics teachers (comprising for example mathematical content knowledge, mathematics pedagogical content knowledge and general pedagogical knowledge) has become prominent in the last decade; however, the development of video-based assessment approaches is a more recent topic. This…

  14. Three Approaches to Using Lengthy Ordinal Scales in Structural Equation Models: Parceling, Latent Scoring, and Shortening Scales

    ERIC Educational Resources Information Center

    Yang, Chongming; Nay, Sandra; Hoyle, Rick H.

    2010-01-01

    Lengthy scales or testlets pose certain challenges for structural equation modeling (SEM) if all the items are included as indicators of a latent construct. Three general approaches to modeling lengthy scales in SEM (parceling, latent scoring, and shortening) have been reviewed and evaluated. A hypothetical population model is simulated containing…
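    Of the three approaches, parceling is the easiest to illustrate: items are grouped into composites that then serve as the latent construct's indicators. A minimal sketch, assuming a respondents-by-items score matrix; round-robin assignment is one common heuristic and the function name is hypothetical:

```python
import numpy as np

def parcel_items(items, n_parcels):
    """Assign items (columns) to parcels round-robin and return the
    mean score per parcel for each respondent (row)."""
    n_items = items.shape[1]
    parcels = []
    for p in range(n_parcels):
        cols = list(range(p, n_items, n_parcels))   # round-robin column assignment
        parcels.append(items[:, cols].mean(axis=1))
    return np.column_stack(parcels)
```

    The resulting parcel scores, rather than the raw items, would then be used as indicators in the SEM.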

  15. NAAHE Special Report: An Annotated Bibliography of Research Relevant to Humane Education.

    ERIC Educational Resources Information Center

    DeRosa, William

    Entries in this annotated bibliography are presented in four sections. The first section includes studies which attempt to evaluate various approaches to humane education, providing educators with general guidelines on which of the approaches have been found to be the most and least effective for teaching young people about animals and animal…

  16. Using a Process Social Skills Training Approach with Adolescents with Mild Intellectual Disabilities in a High School Setting.

    ERIC Educational Resources Information Center

    O'Reilly, Mark F.; Glynn, Dawn

    1995-01-01

    A process social skills training approach was implemented and evaluated with two high school students having mild intellectual disabilities and social skills deficits. The intervention package was successful in promoting generalization of targeted social skills from the training setting to the classroom for both students. Participants had…

  17. Going for gold: the health promoting general practice.

    PubMed

    Watson, Michael

    2008-01-01

    The World Health Organization's Ottawa Charter for Health Promotion has been influential in guiding the development of 'settings' based health promotion. Over the past decade, settings such as schools have flourished and there has been a considerable amount of academic literature produced, including theoretical papers, descriptive studies and evaluations. However, despite its central importance, the health-promoting general practice has received little attention. This paper discusses: the significance of this setting for health promotion; how a health promoting general practice can be created; effective health promotion approaches; the nursing contribution; and some challenges that need to be resolved. In order to become a health promoting general practice, the staff must undertake a commitment to fulfil the following conditions: create a healthy working environment; integrate health promotion into practice activities; and establish alliances with other relevant institutions and groups within the community. The health promoting general practice is the gold standard for health promotion. Settings that have developed have had the support of local, national and European networks. Similar assistance and advocacy will be needed in general practice. This paper recommends that a series of rigorously evaluated, high-quality pilot sites need to be established to identify and address potential difficulties, and to ensure that this innovative approach yields tangible health benefits for local communities. It also suggests that government support is critical to the future development of health promoting general practices. This will be needed both directly and in relation to the capacity and resourcing of public health in general.

  18. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
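    The simplest of the four methods, the output ratio approach, scales simulated energy use by a single measured-to-simulated ratio. A minimal sketch with made-up numbers (not the study's implementation):

```python
def output_ratio_calibration(simulated_monthly, measured_monthly):
    """Scale simulated monthly energy use by the annual measured/simulated
    ratio -- the simplest of the calibration methods compared in the study."""
    ratio = sum(measured_monthly) / sum(simulated_monthly)
    return ratio, [ratio * s for s in simulated_monthly]
```

    Unlike the optimization-based methods, this approach cannot change which inputs are wrong; it only rescales the output, which is why its predicted retrofit savings can be less accurate.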

  19. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Joseph; Polly, Ben; Collis, Jon

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  20. Editorial: Cognitive Architectures, Model Comparison and AGI

    NASA Astrophysics Data System (ADS)

    Lebiere, Christian; Gonzalez, Cleotilde; Warwick, Walter

    2010-12-01

    Cognitive Science and Artificial Intelligence share compatible goals of understanding and possibly generating broadly intelligent behavior. In order to determine if progress is made, it is essential to be able to evaluate the behavior of complex computational models, especially those built on general cognitive architectures, and compare it to benchmarks of intelligent behavior such as human performance. Significant methodological challenges arise, however, when trying to extend approaches used to compare model and human performance from tightly controlled laboratory tasks to complex tasks involving more open-ended behavior. This paper describes a model comparison challenge built around a dynamic control task, the Dynamic Stocks and Flows. We present and discuss distinct approaches to evaluating performance and comparing models. Lessons drawn from this challenge are discussed in light of the challenge of using cognitive architectures to achieve Artificial General Intelligence.

  1. Generalized approach for identification and evaluation of technology-insertion options for military avionics systems

    NASA Astrophysics Data System (ADS)

    Harkness, Linda L.; Sjoberg, Eric S.

    1996-06-01

    The Georgia Tech Research Institute, sponsored by the Warner Robins Air Logistics Center, has developed an approach for efficiently postulating and evaluating methods for extending the life of radars and other avionics systems. The technique identifies specific assemblies for potential replacement and evaluates the system-level impact, including performance, reliability, and life-cycle cost, of each action. The initial impetus for this research was the increasing obsolescence of integrated circuits contained in the AN/APG-63 system. The operational life of military electronics is typically in excess of twenty years, which encompasses several generations of IC technology. GTRI has developed a systems approach to inserting modern-technology components into older systems based upon identification of those functions which limit the system's performance or reliability and which are cost drivers. The presentation will discuss the above methodology and a technique for evaluating and ranking the different potential system upgrade options.

  2. A frequency-domain approach to improve ANNs generalization quality via proper initialization.

    PubMed

    Chaari, Majdi; Fekih, Afef; Seibi, Abdennour C; Hmida, Jalel Ben

    2018-08-01

    The ability to train a network without memorizing the input/output data, thereby allowing a good predictive performance when applied to unseen data, is paramount in ANN applications. In this paper, we propose a frequency-domain approach to evaluate the network initialization in terms of quality of training, i.e., generalization capabilities. As an alternative to the conventional time-domain methods, the proposed approach eliminates the approximate nature of network validation using an excess of unseen data. The benefits of the proposed approach are demonstrated using two numerical examples, where two trained networks performed similarly on the training and the validation data sets, yet they revealed a significant difference in prediction accuracy when tested using a different data set. This observation is of utmost importance in modeling applications requiring a high degree of accuracy. The efficiency of the proposed approach is further demonstrated on a real-world problem, where unlike other initialization methods, a more conclusive assessment of generalization is achieved. On the practical front, subtle methodological and implementational facets are addressed to ensure reproducibility and pinpoint the limitations of the proposed approach. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. A generalized estimating equations approach for resting-state functional MRI group analysis.

    PubMed

    D'Angelo, Gina M; Lazar, Nicole A; Eddy, William F; Morris, John C; Sheline, Yvette I

    2011-01-01

    An Alzheimer's fMRI study has motivated us to evaluate inter-regional correlations between groups. The overall objective is to assess inter-regional correlations at a resting-state with no stimulus or task. We propose using a generalized estimating equation (GEE) transition model and a GEE marginal model to model the within-subject correlation for each region. Residuals calculated from the GEE models are used to correlate brain regions and assess between group differences. The standard pooling approach of group averages of the Fisher-z transformation assuming temporal independence is a typical approach used to compare group correlations. The GEE approaches and standard Fisher-z pooling approach are demonstrated with an Alzheimer's disease (AD) connectivity study in a population of AD subjects and healthy control subjects. We also compare these methods using simulation studies and show that the transition model may have better statistical properties.
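    The "standard pooling approach" mentioned above can be sketched directly: each subject's inter-regional correlation is Fisher-z transformed, the z values are averaged within a group, and the average is back-transformed. A minimal illustration with hypothetical correlations:

```python
import math

def fisher_z(r):
    """Fisher-z transform of a correlation coefficient."""
    return 0.5 * math.log((1 + r) / (1 - r))

def pooled_group_correlation(rs):
    """Average subject-level correlations on the Fisher-z scale,
    then back-transform to the correlation scale."""
    zbar = sum(fisher_z(r) for r in rs) / len(rs)
    return math.tanh(zbar)   # inverse of the Fisher-z transform
```

    The GEE approaches in the paper go further by modeling the within-subject temporal correlation rather than assuming temporal independence, which this pooling shortcut ignores.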

  4. Piloted simulation study of an ILS approach of a twin-pusher business/commuter turboprop aircraft configuration

    NASA Technical Reports Server (NTRS)

    Riley, Donald R.; Brandon, Jay M.; Glaab, Louis J.

    1994-01-01

    A six-degree-of-freedom nonlinear simulation of a twin-pusher, turboprop business/commuter aircraft configuration representative of the Cessna ATPTB (Advanced Turboprop Test Bed) was developed for use in piloted studies with the Langley General Aviation Simulator. The math models developed are provided, simulation predictions are compared with Cessna flight-test data for validation purposes, and results of a handling-quality study during simulated ILS (instrument landing system) approaches and missed approaches are presented. Simulated flight trajectories, task performance measures, and pilot evaluations are presented for the ILS approach and missed-approach tasks conducted with the vehicle in the presence of moderate turbulence, varying horizontal winds, and engine-out conditions. Six test subjects consisting of two research pilots, a Cessna test pilot, and three general aviation pilots participated in the study. This effort was undertaken in cooperation with the Cessna Aircraft Company.

  5. Recovering Intrinsic Fragmental Vibrations Using the Generalized Subsystem Vibrational Analysis.

    PubMed

    Tao, Yunwen; Tian, Chuan; Verma, Niraj; Zou, Wenli; Wang, Chao; Cremer, Dieter; Kraka, Elfi

    2018-05-08

    Normal vibrational modes are generally delocalized over the molecular system, which makes it difficult to assign certain vibrations to specific fragments or functional groups. We introduce a new approach, the Generalized Subsystem Vibrational Analysis (GSVA), to extract the intrinsic fragmental vibrations of any fragment/subsystem from the whole system via the evaluation of the corresponding effective Hessian matrix. The retention of the curvature information with regard to the potential energy surface for the effective Hessian matrix endows our approach with a concrete physical basis and enables the normal vibrational modes of different molecular systems to be legitimately comparable. Furthermore, the intrinsic fragmental vibrations act as a new link between the Konkoli-Cremer local vibrational modes and the normal vibrational modes.

  6. Models for forecasting hospital bed requirements in the acute sector.

    PubMed Central

    Farmer, R D; Emami, J

    1990-01-01

    STUDY OBJECTIVE--The aim was to evaluate the current approach to forecasting hospital bed requirements. DESIGN--The study was a time series and regression analysis. The time series for mean duration of stay for general surgery in the age group 15-44 years (1969-1982) was used in the evaluation of different methods of forecasting future values of mean duration of stay and its subsequent use in the estimation of hospital bed requirements. RESULTS--It has been suggested that the simple trend fitting approach suffers from model specification error and imposes unjustified restrictions on the data. The time series approach (Box-Jenkins method) was shown to be a more appropriate way of modelling the data. CONCLUSION--The simple trend fitting approach is inferior to the time series approach in modelling hospital bed requirements. PMID:2277253
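    The contrast drawn here, simple trend fitting versus a Box-Jenkins-style time series model, can be sketched with two toy forecasters. The AR(1) below is only a minimal stand-in for a full Box-Jenkins model, and the data are hypothetical:

```python
import numpy as np

def linear_trend_forecast(y, steps):
    """Fit y_t = a + b*t by least squares and extrapolate the trend."""
    t = np.arange(len(y))
    b, a = np.polyfit(t, y, 1)            # polyfit returns slope first
    return a + b * np.arange(len(y), len(y) + steps)

def ar1_forecast(y, steps):
    """Fit y_t = c + phi*y_{t-1} by least squares (a minimal AR(1),
    the simplest member of the Box-Jenkins family) and iterate forward."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    out, last = [], y[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return np.array(out)
```

    For a mean-reverting series such as declining length of stay, the trend line keeps falling indefinitely, while the AR(1) forecast levels off toward the series mean, which is the kind of restriction-free behavior the authors argue for.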

  7. An Exact, Compressible One-Dimensional Riemann Solver for General, Convex Equations of State

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, James Russell

    2015-03-05

    This note describes an algorithm with which to compute numerical solutions to the one-dimensional, Cartesian Riemann problem for compressible flow with general, convex equations of state. While high-level descriptions of this approach are to be found in the literature, this note contains most of the necessary details required to write software for this problem. This explanation corresponds to the approach used in the source code that evaluates solutions for the 1D, Cartesian Riemann problem with a JWL equation of state in the ExactPack package [16, 29]. Numerical examples are given with the proposed computational approach for a polytropic equation of state and for the JWL equation of state.
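    For the polytropic (ideal gas) special case, the heart of such a solver is a Newton iteration for the star-region pressure. The sketch below follows the standard Toro-style formulation under an ideal-gas EOS assumption; it is not the ExactPack code, and it uses a numerical derivative for brevity:

```python
import math

def f_side(p, rho, pk, gamma):
    """Pressure function for one side of the Riemann problem:
    shock branch for p > pk, rarefaction branch otherwise."""
    a = math.sqrt(gamma * pk / rho)                  # sound speed on this side
    if p > pk:  # shock
        A = 2.0 / ((gamma + 1) * rho)
        B = (gamma - 1) / (gamma + 1) * pk
        return (p - pk) * math.sqrt(A / (p + B))
    # rarefaction
    return 2 * a / (gamma - 1) * ((p / pk) ** ((gamma - 1) / (2 * gamma)) - 1)

def star_pressure(rhoL, pL, uL, rhoR, pR, uR, gamma=1.4, tol=1e-10):
    """Newton-solve f_L(p) + f_R(p) + (uR - uL) = 0 for the star-region
    pressure of the ideal-gas Riemann problem."""
    p = 0.5 * (pL + pR)
    for _ in range(100):
        g = f_side(p, rhoL, pL, gamma) + f_side(p, rhoR, pR, gamma) + (uR - uL)
        dp = 1e-8 * p
        g2 = f_side(p + dp, rhoL, pL, gamma) + f_side(p + dp, rhoR, pR, gamma) + (uR - uL)
        p_new = max(p - g * dp / (g2 - g), tol)      # Newton step, kept positive
        if abs(p_new - p) < tol * p:
            return p_new
        p = p_new
    return p
```

    For the classic Sod shock tube (left state rho=1, p=1, u=0; right state rho=0.125, p=0.1, u=0) this converges to the well-known star pressure of about 0.30313. The general convex-EOS algorithm in the note replaces these closed-form branch functions with EOS-dependent integrals.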

  8. Merit Evaluation Of Competitors In Debate And Recitation Competitions By Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Mukherjee, Supratim; Bhattacharyya, Rupak; Chatterjee, Amitava; Kar, Samarjit

    2010-10-01

    Co-curricular activities are of great importance in students' lives, especially for developing their personality and communication skills. In the processes used to evaluate competitors in such competitions, crisp techniques are generally employed. In this paper, we introduce a new fuzzy-set-theory-based method for evaluating competitors in co-curricular activities such as debate and recitation competitions. The proposed method is illustrated by two examples.
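    One common fuzzy device for this kind of merit evaluation is a triangular membership function over linguistic grades, followed by a weighted defuzzification. The grade boundaries and weights below are illustrative assumptions, not the paper's:

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b] and falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_merit(scores):
    """Map the judges' average mark (0-10 scale assumed) to memberships in
    linguistic grades, then defuzzify to a single merit value."""
    grades = {"poor": (0, 2, 5), "average": (3, 5, 7), "good": (5, 8, 10)}
    grade_value = {"poor": 2.0, "average": 5.0, "good": 8.0}
    avg = sum(scores) / len(scores)
    mu = {g: tri(avg, *abc) for g, abc in grades.items()}
    total = sum(mu.values())
    if total == 0.0:        # average falls outside every grade's support
        return avg
    return sum(mu[g] * grade_value[g] for g in mu) / total
```

    Unlike a crisp cutoff, a mark near a grade boundary contributes partially to both adjacent grades, so small scoring differences do not cause abrupt rank changes.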

  9. Cookbook Versus Creative Chemistry

    ERIC Educational Resources Information Center

    Venkatachelam, Chaya; Rudolph, R. W.

    1974-01-01

    A new approach to a research-oriented general chemistry laboratory is described. Objectives for the laboratory program are specified, details are provided concerning the program design, and the results of an experiment to evaluate the program are reported. (DT)

  10. Computer-Assisted Performance Evaluation for Navy Anti-Air Warfare Training: Concepts, Methods, and Constraints.

    ERIC Educational Resources Information Center

    Chesler, David J.

    An improved general methodological approach for the development of computer-assisted evaluation of trainee performance in the computer-based simulation environment is formulated in this report. The report focuses on the Tactical Advanced Combat Direction and Electronic Warfare system (TACDEW) at the Fleet Anti-Air Warfare Training Center at San…

  11. A Conceptual Framework to Help Evaluate the Quality of Institutional Performance

    ERIC Educational Resources Information Center

    Kettunen, Juha

    2008-01-01

    Purpose: This study aims to present a general conceptual framework which can be used to evaluate quality and institutional performance in higher education. Design/methodology/approach: The quality of higher education is at the heart of the setting up of the European Higher Education Area. Strategic management is widely used in higher education…

  12. Improving Schools through Evaluation: The Experience of Catholic Schools in South Africa

    ERIC Educational Resources Information Center

    Potterton, Mark; Northmore, Colin

    2014-01-01

    This article addresses the development of quality assurance approaches in South Africa, with particular reference to Catholic schools. It also addresses questions of why whole school evaluation in general has failed to play any meaningful role in improving the quality of schools in South Africa. Reference is also made to specific school cases. The…

  13. An Assessment of the Nonparametric Approach for Evaluating the Fit of Item Response Models

    ERIC Educational Resources Information Center

    Liang, Tie; Wells, Craig S.; Hambleton, Ronald K.

    2014-01-01

    As item response theory has been more widely applied, investigating the fit of a parametric model becomes an important part of the measurement process. There is a lack of promising solutions to the detection of model misfit in IRT. Douglas and Cohen introduced a general nonparametric approach, RISE (Root Integrated Squared Error), for detecting…

  14. Combining Qualitative and Quantitative Approaches: Some Arguments for Mixed Methods Research

    ERIC Educational Resources Information Center

    Lund, Thorleif

    2012-01-01

    One purpose of the present paper is to elaborate 4 general advantages of the mixed methods approach. Another purpose is to propose a 5-phase evaluation design, and to demonstrate its usefulness for mixed methods research. The account is limited to research on groups in need of treatment, i.e., vulnerable groups, and the advantages of mixed methods…

  15. A practical assessment of physician biopsychosocial performance.

    PubMed

    Margalit, Alon Pa; Glick, Shimon M; Benbassat, Jochanan; Cohen, Ayala; Margolis, Carmi Z

    2007-10-01

    A biopsychosocial approach to care seems to improve patient satisfaction and health outcomes. Nevertheless, this approach is not widely practiced, possibly because its precepts have not been translated into observable skills. To identify the skill components of a biopsychosocial consultation and develop a tool for their evaluation, we approached three e-mail discussion groups of family physicians and pooled their responses to the question "What types of observed physician behavior would characterize a biopsychosocial consultation?" We received 35 responses describing 37 types of behavior, all of which seemed to cluster around one of three aspects: a patient-centered interview; a system-centered and family-centered approach to care; or a problem-solving orientation. Using these categories, we developed a nine-item evaluation tool. We used the evaluation tool to score videotaped encounters of patients with two types of doctors: family physicians who were identified by peer ratings as having a highly biopsychosocial orientation (n = 9) or a highly biomedical approach (n = 4); and 44 general practitioners, before and after they had participated in a program that taught a biopsychosocial approach to care. The evaluation tool was found to demonstrate high reliability (alpha = 0.90) and acceptable interobserver variability. The average scores of the physicians with a highly biopsychosocial orientation were significantly higher than those of physicians with a highly biomedical approach. There were significant differences between the scores of the teaching-program participants before and after the program. A biopsychosocial approach to patient care can be characterized using a valid and easy-to-apply evaluation tool.

  16. Graphical Evaluation of the Ridge-Type Robust Regression Estimators in Mixture Experiments

    PubMed Central

    Erkoc, Ali; Emiroglu, Esra

    2014-01-01

    In mixture experiments, estimation of the parameters is generally based on ordinary least squares (OLS). However, in the presence of multicollinearity and outliers, OLS can produce very poor estimates. In this case, effects of the combined outlier-multicollinearity problem can be reduced to a certain extent by using alternative approaches. One of these approaches is to use biased-robust regression techniques for parameter estimation. In this paper, we evaluate various ridge-type robust estimators for cases in which both multicollinearity and outliers occur during the analysis of mixture experiments. Also, for selection of the biasing parameter, we use fraction of design space plots to evaluate the effect of the ridge-type robust estimators with respect to the scaled mean squared error of prediction. The suggested graphical approach is illustrated on the Hald cement data set. PMID:25202738
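    For readers who want to reproduce the combined outlier-multicollinearity failure mode the abstract describes, scikit-learn's HuberRegressor pairs an outlier-resistant loss with an L2 (ridge) penalty, giving one concrete instance of a ridge-type robust estimator. A hedged sketch on synthetic data, not the Hald cement set; the coefficients, noise levels, and penalty strength are all invented:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, HuberRegressor

rng = np.random.default_rng(1)
n = 60
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)     # nearly a copy of x1: multicollinearity
X = np.column_stack([x1, x2])
y = 2 * x1 + 3 * x2 + 0.1 * rng.normal(size=n)
y[:5] += 15                             # a handful of gross outliers

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)      # biased (shrunk) but not robust
# Robust loss + L2 penalty: a ridge-type robust estimator
huber = HuberRegressor(alpha=1.0).fit(X, y)

for name, m in [("OLS", ols), ("ridge", ridge), ("huber+ridge", huber)]:
    print(name, np.round(m.coef_, 2), round(m.intercept_, 2))
```

    OLS absorbs the outliers into its fit and lets the near-collinear columns trade coefficient mass arbitrarily; the penalized robust fit stays close to the generating coefficients.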

  17. Graphical evaluation of the ridge-type robust regression estimators in mixture experiments.

    PubMed

    Erkoc, Ali; Emiroglu, Esra; Akay, Kadri Ulas

    2014-01-01

    In mixture experiments, estimation of the parameters is generally based on ordinary least squares (OLS). However, in the presence of multicollinearity and outliers, OLS can produce very poor estimates. In this case, effects of the combined outlier-multicollinearity problem can be reduced to a certain extent by using alternative approaches. One of these approaches is to use biased-robust regression techniques for parameter estimation. In this paper, we evaluate various ridge-type robust estimators for cases in which both multicollinearity and outliers occur during the analysis of mixture experiments. Also, for selection of the biasing parameter, we use fraction of design space plots to evaluate the effect of the ridge-type robust estimators with respect to the scaled mean squared error of prediction. The suggested graphical approach is illustrated on the Hald cement data set.

  18. A Unified Approach to Electromagnetic Wave Propagation in Turbulence and the Evaluation of Multiparameter Integrals

    DTIC Science & Technology

    1988-07-01

    The solutions in some cases have been made more general, as in the papers of Fried2 and Tyler3, by defining normalized quantities; the tabular and... generalized hypergeometric functions. For that case, he shows that the integral, which can be transformed into a Mellin-Barnes integral, can be expressed as a... finite sum of generalized hypergeometric functions which are equivalent to a Meijer's G-function. He briefly considers the case in which the

  19. How to Modify (Implicit) Evaluations of Fear-Related Stimuli: Effects of Feature-Specific Attention Allocation

    PubMed Central

    Vanaelst, Jolien; Spruyt, Adriaan; De Houwer, Jan

    2016-01-01

    We demonstrate that feature-specific attention allocation influences the way in which repeated exposure modulates implicit and explicit evaluations toward fear-related stimuli. During an exposure procedure, participants were encouraged to assign selective attention either to the evaluative meaning (i.e., Evaluative Condition) or a non-evaluative, semantic feature (i.e., Semantic Condition) of fear-related stimuli. The influence of the exposure procedure was captured by means of a measure of implicit evaluation, explicit evaluative ratings, and a measure of automatic approach/avoidance tendencies. As predicted, the implicit measure of evaluation revealed a reduced expression of evaluations in the Semantic Condition as compared to the Evaluative Condition. Moreover, this effect generalized toward novel objects that were never presented during the exposure procedure. The explicit measure of evaluation mimicked this effect, although it failed to reach conventional levels of statistical significance. No effects were found in terms of automatic approach/avoidance tendencies. Potential implications for the treatment of anxiety disorders are discussed. PMID:27242626

  20. How to Modify (Implicit) Evaluations of Fear-Related Stimuli: Effects of Feature-Specific Attention Allocation.

    PubMed

    Vanaelst, Jolien; Spruyt, Adriaan; De Houwer, Jan

    2016-01-01

    We demonstrate that feature-specific attention allocation influences the way in which repeated exposure modulates implicit and explicit evaluations toward fear-related stimuli. During an exposure procedure, participants were encouraged to assign selective attention either to the evaluative meaning (i.e., Evaluative Condition) or a non-evaluative, semantic feature (i.e., Semantic Condition) of fear-related stimuli. The influence of the exposure procedure was captured by means of a measure of implicit evaluation, explicit evaluative ratings, and a measure of automatic approach/avoidance tendencies. As predicted, the implicit measure of evaluation revealed a reduced expression of evaluations in the Semantic Condition as compared to the Evaluative Condition. Moreover, this effect generalized toward novel objects that were never presented during the exposure procedure. The explicit measure of evaluation mimicked this effect, although it failed to reach conventional levels of statistical significance. No effects were found in terms of automatic approach/avoidance tendencies. Potential implications for the treatment of anxiety disorders are discussed.

  1. Reliability Evaluation of V730 Transmission

    DOT National Transportation Integrated Search

    1982-10-01

    The Detroit Diesel Allison V730 transmission is a heavy duty, automatic, 3-speed, hydraulic transmission, currently installed in full size (35' and 40') transit buses with transverse mounted rear engines. This report presents the general approach and...

  2. A Study of General Education Astronomy Students' Understandings of Cosmology. Part II. Evaluating Four Conceptual Cosmology Surveys: A Classical Test Theory Approach

    ERIC Educational Resources Information Center

    Wallace, Colin S.; Prather, Edward E.; Duncan, Douglas K.

    2011-01-01

    This is the second of five papers detailing our national study of general education astronomy students' conceptual and reasoning difficulties with cosmology. This article begins our quantitative investigation of the data. We describe how we scored students' responses to four conceptual cosmology surveys, and we present evidence for the inter-rater…

  3. Simplified and refined finite element approaches for determining stresses and internal forces in geometrically nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Robinson, J. C.

    1979-01-01

    Two methods for determining stresses and internal forces in geometrically nonlinear structural analysis are presented. The simplified approach uses the mid-deformed structural position to evaluate strains when rigid body rotation is present. The important feature of this approach is that it can easily be used with a general-purpose finite-element computer program. The refined approach uses element intrinsic or corotational coordinates and a geometric transformation to determine element strains from joint displacements. Results are presented which demonstrate the capabilities of these potentially useful approaches for geometrically nonlinear structural analysis.

  4. Scheduling double round-robin tournaments with divisional play using constraint programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlsson, Mats; Johansson, Mikael; Larson, Jeffrey

    We study a tournament format that extends a traditional double round-robin format with divisional single round-robin tournaments. Elitserien, the top Swedish handball league, uses such a format for its league schedule. We present a constraint programming model that characterizes the general double round-robin plus divisional single round-robin format. This integrated model allows scheduling to be performed in a single step, as opposed to common multistep approaches that decompose scheduling into smaller problems and possibly miss optimal solutions. In addition to general constraints, we introduce Elitserien-specific requirements for its tournament. These general and league-specific constraints allow us to identify implicit and symmetry-breaking properties that reduce the time to solution from hours to seconds. A scalability study of the number of teams shows that our approach is reasonably fast for even larger league sizes. The experimental evaluation of the integrated approach takes considerably less computational effort to schedule Elitserien than does the previous decomposed approach.
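    The constraint model itself is not reproduced in the abstract, but the round-robin structure it schedules is easy to illustrate with the classic circle method: a single round-robin over n-1 rounds, then mirrored with venues swapped to obtain a double round-robin. A sketch; the team labels and the simple mirroring convention are illustrative, not Elitserien's actual format:

```python
def single_round_robin(teams):
    """Circle method: each pair meets exactly once over n-1 rounds (n even)."""
    n = len(teams)
    assert n % 2 == 0, "add a dummy team (bye) if n is odd"
    fixed, rest = teams[0], list(teams[1:])
    rounds = []
    for _ in range(n - 1):
        lineup = [fixed] + rest
        # Pair the i-th team from the front with the i-th team from the back
        rounds.append([(lineup[i], lineup[n - 1 - i]) for i in range(n // 2)])
        rest = rest[-1:] + rest[:-1]   # rotate everyone except the fixed team
    return rounds

srr = single_round_robin(list("ABCDEF"))
# Double round-robin: the single schedule plus its mirror with venues swapped
drr = srr + [[(b, a) for (a, b) in rnd] for rnd in srr]
print(len(drr), "rounds")   # 2*(n-1) = 10
```

    A constraint programming model adds the league-specific requirements (venue balance, divisional sub-tournaments, break minimization) on top of this basic pairing structure.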

  5. Ordinal optimization and its application to complex deterministic problems

    NASA Astrophysics Data System (ADS)

    Yang, Mike Shang-Yu

    1998-10-01

    We present in this thesis a new perspective for approaching a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve a high level of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the burden of high complexity remains, taxing the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model to a stochastic pseudo-model composed of a simple deterministic component and a white-noise-like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study of the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in computing cost.
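    The two ingredients named above, Ordinal Comparison (rank designs rather than estimate their values precisely) and Goal Softening (accept any of the top designs rather than insisting on the single best), can be demonstrated with a toy experiment: rank candidates by one cheap, very noisy evaluation each, then check whether the observed top set still intersects the true top set. The performance values and noise level below are invented for illustration:

```python
import random

random.seed(0)
N = 1000
true_perf = [i / N for i in range(N)]   # design i has true value i/N (higher is better)

def noisy_eval(i):
    # Stand-in for an expensive simulation: true value plus heavy noise
    return true_perf[i] + random.gauss(0, 0.3)

# Ordinal comparison: rank designs by a single noisy evaluation each
ranked = sorted(range(N), key=noisy_eval, reverse=True)

# Goal softening: any of the true top 5% counts as "good enough",
# and we keep the observed top 50 rather than chasing the single best
good_enough = set(range(N - N // 20, N))
selected = set(ranked[:50])
print(len(selected & good_enough), "good-enough designs captured")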

  6. Fair market value: taking a proactive approach.

    PubMed

    Romero, Richard A

    2008-04-01

    A valuation report assessing the fair market value of a contractual arrangement should include: A description of the company, entity, or circumstance being valued. Analysis of general economic conditions that are expected to affect the enterprise. Evaluation of economic conditions in the medical services industry. Explanation of the various valuation approaches that were considered. Documentation of key underlying assumptions, including revenue and expense projections, projected profit, and ROI.

  7. Incorporating Asymmetric Dependency Patterns in the Evaluation of IS/IT projects Using Real Option Analysis

    ERIC Educational Resources Information Center

    Burke, John C.

    2012-01-01

    The objective of my dissertation is to create a general approach to evaluating IS/IT projects using Real Option Analysis (ROA). This is an important problem because an IT Project Portfolio (ITPP) can represent hundreds of projects, millions of dollars of investment and hundreds of thousands of employee hours. Therefore, any advance in the…

  8. Historical range of variability in live and dead wood biomass: a regional-scale simulation study

    Treesearch

    Etsuko Nonaka; Thomas A. Spies; Michael C. Wimberly; Janet L. Ohmann

    2007-01-01

    The historical range of variability (HRV) in landscape structure and composition created by natural disturbance can serve as a general guide for evaluating ecological conditions of managed landscapes. HRV approaches to evaluating landscapes have been based on age classes or developmental stages, which may obscure variation in live and dead stand structure. Developing...

  9. The Effect of Instructional Objectives and General Objectives on Student Self-Evaluation of Psychomotor Performance in Power Mechanics.

    ERIC Educational Resources Information Center

    Janeczko, Robert John

    The major purpose of this study was to ascertain the relative effects of student exposure to instructional objectives upon student self-evaluation of psychomotor activities in a college-level power mechanics course. A randomized posttest-only control group design was used with two different approaches to the statement of the objectives. Four…

  10. Comparison of Predicted Thermoelectric Energy Conversion Efficiency by Cumulative Properties and Reduced Variables Approaches

    NASA Astrophysics Data System (ADS)

    Linker, Thomas M.; Lee, Glenn S.; Beekman, Matt

    2018-06-01

    The semi-analytical methods of thermoelectric energy conversion efficiency calculation based on the cumulative properties approach and reduced variables approach are compared for 21 high performance thermoelectric materials. Both approaches account for the temperature dependence of the material properties as well as the Thomson effect, thus the predicted conversion efficiencies are generally lower than those based on the conventional thermoelectric figure of merit ZT for nearly all of the materials evaluated. The two methods also predict material energy conversion efficiencies that are in very good agreement with each other, even for large temperature differences (average percent difference of 4% with maximum observed deviation of 11%). The tradeoff between obtaining a reliable assessment of a material's potential for thermoelectric applications and the complexity of implementation of the three models, and the advantages of using more accurate modeling approaches in evaluating new thermoelectric materials, are highlighted.
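    The "conventional thermoelectric figure of merit ZT" baseline the abstract refers to is the standard constant-property efficiency formula, eta_max = (dT/T_h) * (sqrt(1+ZT) - 1) / (sqrt(1+ZT) + T_c/T_h), with ZT taken at the mean temperature. A sketch; the operating temperatures and ZT value below are illustrative, not taken from the paper's 21 materials:

```python
import math

def eta_max_zt(t_hot, t_cold, zt):
    """Maximum conversion efficiency from the constant-property ZT formula."""
    carnot = (t_hot - t_cold) / t_hot      # Carnot limit for this temperature pair
    s = math.sqrt(1 + zt)                  # ZT evaluated at the mean temperature
    return carnot * (s - 1) / (s + t_cold / t_hot)

# e.g. ZT = 1 across a 300 K cold side / 800 K hot side leg
print(f"{eta_max_zt(800, 300, 1.0):.1%}")  # -> 14.5%
```

    Because this formula ignores the temperature dependence of the properties and the Thomson effect, it tends to overestimate efficiency relative to the cumulative-properties and reduced-variables methods compared in the paper.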

  11. Characterization of redox conditions in groundwater contaminant plumes

    NASA Astrophysics Data System (ADS)

    Christensen, Thomas H.; Bjerg, Poul L.; Banwart, Steven A.; Jakobsen, Rasmus; Heron, Gorm; Albrechtsen, Hans-Jørgen

    2000-10-01

    Evaluation of redox conditions in groundwater pollution plumes is often a prerequisite for understanding the behaviour of the pollutants in the plume and for selecting remediation approaches. Measurement of redox conditions in pollution plumes is, however, a fairly recent endeavour, and relatively few cases have been reported. No standardised or generally accepted approach exists. Slow electrode kinetics and the common lack of internal equilibrium of redox processes in pollution plumes make, with few exceptions, direct electrochemical measurement and rigorous interpretation of redox potentials dubious, if not erroneous. Several other approaches have been used to address redox conditions in pollution plumes: redox-sensitive compounds in groundwater samples, hydrogen concentrations in groundwater, concentrations of volatile fatty acids in groundwater, sediment characteristics, and microbial tools such as MPN counts, PLFA biomarkers, and redox bioassays. This paper reviews the principles behind the different approaches, summarizes the methods used, and evaluates the approaches based on experience from the reported applications.

  12. Geometric constraints in semiclassical initial value representation calculations in Cartesian coordinates: accurate reduction in zero-point energy.

    PubMed

    Issack, Bilkiss B; Roy, Pierre-Nicholas

    2005-08-22

    An approach for the inclusion of geometric constraints in semiclassical initial value representation calculations is introduced. An important aspect of the approach is that Cartesian coordinates are used throughout. We devised an algorithm for the constrained sampling of initial conditions through the use of multivariate Gaussian distribution based on a projected Hessian. We also propose an approach for the constrained evaluation of the so-called Herman-Kluk prefactor in its exact log-derivative form. Sample calculations are performed for free and constrained rare-gas trimers. The results show that the proposed approach provides an accurate evaluation of the reduction in zero-point energy. Exact basis set calculations are used to assess the accuracy of the semiclassical results. Since Cartesian coordinates are used, the approach is general and applicable to a variety of molecular and atomic systems.

  13. Proposed Risk-Informed Seismic Hazard Periodic Reevaluation Methodology for Complying with DOE Order 420.1C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kammerer, Annie

    Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities—which is a function not only of the level of the seismic hazard but also the capacity of the facility to withstand vibratory ground motions—the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order.
While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the "seismic hazard periodic review methodology," or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be applied directly to other types of natural hazards.

  14. Proposed Risk-Informed Seismic Hazard Periodic Reevaluation Methodology for Complying with DOE Order 420.1C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kammerer, Annie

    Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities—which is a function not only of the level of the seismic hazard but also the capacity of the facility to withstand vibratory ground motions—the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order.
While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the "seismic hazard periodic review methodology," or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be applied directly to other types of natural hazards.

  15. Covariate-free and Covariate-dependent Reliability.

    PubMed

    Bentler, Peter M

    2016-12-01

    Classical test theory reliability coefficients are said to be population specific. Reliability generalization, a meta-analysis method, is the main procedure for evaluating the stability of reliability coefficients across populations. A new approach is developed to evaluate the degree of invariance of reliability coefficients to population characteristics. Factor or common variance of a reliability measure is partitioned into parts that are, and are not, influenced by control variables, resulting in a partition of reliability into a covariate-dependent and a covariate-free part. The approach can be implemented in a single sample and can be applied to a variety of reliability coefficients.
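    The partition described, common variance split into a covariate-dependent and a covariate-free part, can be loosely illustrated by regressing a simulated common factor on a covariate. This is a toy sketch under invented parameters, not Bentler's actual estimator, which works from observed items rather than a directly observed latent factor:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 500, 6
z = rng.normal(size=n)                      # covariate, e.g. a population characteristic
f = 0.6 * z + 0.8 * rng.normal(size=n)      # common factor, partly covariate-driven
items = f[:, None] + 0.7 * rng.normal(size=(n, k))

# Split the common-factor variance into covariate-dependent / covariate-free parts
beta = np.cov(f, z, ddof=1)[0, 1] / np.var(z, ddof=1)   # regression of f on z
var_dep = beta**2 * np.var(z, ddof=1)       # part explained by the covariate
var_free = np.var(f, ddof=1) - var_dep      # part invariant to the covariate

total_var = np.var(items.sum(axis=1), ddof=1)
rel_dep = k**2 * var_dep / total_var        # covariate-dependent reliability
rel_free = k**2 * var_free / total_var      # covariate-free reliability
print(round(rel_dep, 2), round(rel_free, 2))
```

    The two parts sum to the usual factor-based reliability of the total score; the covariate-free part is the portion expected to generalize across populations that differ on the covariate.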

  16. Emulating the logic of monoterpenoid alkaloid biogenesis to access a skeletally diverse chemical library.

    PubMed

    Liu, Song; Scotti, John S; Kozmin, Sergey A

    2013-09-06

    We have developed a synthetic strategy that mimics the diversity-generating power of monoterpenoid indole alkaloid biosynthesis. Our general approach goes beyond diversification of a single natural product-like substructure and enables production of a highly diverse collection of small molecules. The reaction sequence begins with rapid and highly modular assembly of the tetracyclic indoloquinolizidine core, which can be chemoselectively processed into several additional skeletally diverse structural frameworks. The general utility of this approach was demonstrated by parallel synthesis of two representative chemical libraries containing 847 compounds with favorable physicochemical properties, enabling subsequent broad pharmacological evaluation.

  17. Evaluation of two cockpit display concepts for civil tiltrotor instrument operations on steep approaches

    NASA Technical Reports Server (NTRS)

    Decker, William A.; Bray, Richard S.; Simmons, Rickey C.; Tucker, George E.

    1993-01-01

    A piloted simulation experiment was conducted using the NASA Ames Research Center Vertical Motion Simulator to evaluate two cockpit display formats designed for manual control on steep instrument approaches for a civil transport tiltrotor aircraft. The first display included a four-cue (pitch, roll, power lever position, and nacelle angle movement prompt) flight director. The second display format provided instantaneous flight path angle information together with other symbols for terminal-area guidance. Pilots evaluated these display formats for an instrument approach task that required a level-flight conversion from airplane-mode flight to helicopter-mode flight while decelerating to the nominal approach airspeed. Pilots tracked glide slopes of 6, 9, 15, and 25 degrees, terminating in a hover for a vertical landing on a 150-foot-square vertipad. Approaches were conducted in low visibility and low ceilings, with crosswinds and turbulence, with all aircraft systems functioning normally, and were carried through to a landing. Desired approach and tracking performance was achieved with generally satisfactory handling qualities using either display format on glide slopes up through 15 degrees. Evaluations with both display formats on a 25-degree glide slope revealed serious problems with glide slope tracking at low airspeeds in crosswinds and the loss of the intended landing spot from the cockpit field of view.

  18. Management of venous leg ulcers in general practice - a practical guideline.

    PubMed

    Sinha, Sankar; Sreedharan, Sadhishaan

    2014-09-01

    Chronic venous leg ulcers are the most common wounds seen in general practice. Their management can be both challenging and time-consuming. To produce a short practical guideline incorporating the TIME concept and A2BC2D approach to help general practitioners and their practice nurses in delivering evidence-based initial care to patients with chronic venous leg ulcers. Most chronic venous leg ulcers can be managed effectively in the general practice setting by following the simple, evidence-based approach described in this article. Figure 1 provides a flow chart to aid in this process. Figure 2 illustrates the principles of management in general practice. Effective management of chronic ulcers involves the assessment of both the ulcer and the patient. The essential requirements of management are to debride the ulcer with appropriate precautions, choose dressings that maintain adequate moisture balance, apply graduated compression bandage after evaluation of the arterial circulation and address the patient's concerns, such as pain and offensive wound discharge.

  19. Generic approach to access barriers in dehydrogenation reactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Liang; Vilella, Laia; Abild-Pedersen, Frank

    The introduction of linear energy correlations, which explicitly relate adsorption energies of reaction intermediates and activation energies in heterogeneous catalysis, has proven to be a key component in the computational search for new and promising catalysts. A simple linear approach to estimate activation energies still requires a significant computational effort. To simplify this process and at the same time incorporate the need for enhanced complexity of reaction intermediates, we generalize a recently proposed approach that evaluates transition state energies based entirely on bond-order conservation arguments. Here, we show that similar variation of the local electronic structure along the reaction coordinate introduces a set of general functions that accurately define the transition state energy and are transferable to other reactions with similar bonding nature. With such an approach, more complex reaction intermediates can be targeted with an insignificant increase in computational effort and without loss of accuracy.

  20. Generic approach to access barriers in dehydrogenation reactions

    DOE PAGES

    Yu, Liang; Vilella, Laia; Abild-Pedersen, Frank

    2018-03-08

    The introduction of linear energy correlations, which explicitly relate adsorption energies of reaction intermediates and activation energies in heterogeneous catalysis, has proven to be a key component in the computational search for new and promising catalysts. A simple linear approach to estimate activation energies still requires a significant computational effort. To simplify this process and at the same time incorporate the need for enhanced complexity of reaction intermediates, we generalize a recently proposed approach that evaluates transition state energies based entirely on bond-order conservation arguments. Here, we show that similar variation of the local electronic structure along the reaction coordinate introduces a set of general functions that accurately define the transition state energy and are transferable to other reactions with similar bonding nature. With such an approach, more complex reaction intermediates can be targeted with an insignificant increase in computational effort and without loss of accuracy.

  1. Evaluation of methods to calculate a wetlands water balance.

    DOT National Transportation Integrated Search

    2000-08-01

    The development of a workable approach to estimating mitigation site water budgets is a high priority for VDOT and the wetlands research and design community in general as they attempt to create successful mitigation sites. Additionally, correct soil...

  2. Large deviation approach to the generalized random energy model

    NASA Astrophysics Data System (ADS)

    Dorlas, T. C.; Dukes, W. M. B.

    2002-05-01

    The generalized random energy model is a generalization of the random energy model introduced by Derrida to mimic the ultrametric structure of the Parisi solution of the Sherrington-Kirkpatrick model of a spin glass. It was solved exactly in two special cases by Derrida and Gardner. A complete solution for the thermodynamics in the general case was given by Capocaccia et al. Here we use large deviation theory to analyse the model in a very straightforward way. We also show that the variational expression for the free energy can be evaluated easily using the Cauchy-Schwarz inequality.

  3. Violent video game effects on children and adolescents. A review of the literature.

    PubMed

    Gentile, D A; Stone, W

    2005-12-01

    Studies of violent video game effects on children and adolescents were reviewed to: 1) determine the multiple effects; 2) offer critical observations about common strengths and weaknesses in the literature; and 3) provide a broader perspective for understanding research on the effects of video games. The review includes general theoretical and methodological considerations of media violence and a description of the general aggression model (GAM). The literature was evaluated in relation to the GAM. Published literature, including meta-analyses, is reviewed, as well as relevant unpublished material, such as conference papers and dissertations. Overall, the evidence supports hypotheses that violent video game play is related to aggressive affect, physiological arousal, aggressive cognitions, and aggressive behaviours. The effects of video game play on school performance are also evaluated, and the review concludes with a dimensional approach to video game effects. The dimensional approach evaluates video game effects in terms of amount, content, form, and mechanics, and appears to have many advantages for understanding and predicting the multiple types of effects demonstrated in the literature.

  4. International approaches to driving under the influence of cannabis: A review of evidence on impact.

    PubMed

    Watson, Tara Marie; Mann, Robert E

    2016-12-01

    There are knowledge gaps regarding the effectiveness of different approaches designed to prevent and deter driving under the influence of cannabis (DUIC). Policymakers are increasingly interested in evidence-based responses to DUIC as numerous jurisdictions worldwide have legally regulated cannabis or are debating such regulation. We contribute a comprehensive review of international literature on countermeasures that address DUIC, and identify where and how such measures have been evaluated. The following databases were systematically searched from 1995 to present: Medline, Embase, PsycINFO, CINAHL, Sociological Abstracts, and Criminal Justice Abstracts. Hand searching of relevant documents, internet searches for grey literature, and review of ongoing email alerts were conducted to capture any emerging literature and relevant trends. Numerous international jurisdictions have introduced a variety of measures designed to deter DUIC. Much interest has been generated regarding non-zero per se laws that set fixed legal limits for tetrahydrocannabinol and/or its metabolites detected in drivers. Other approaches include behavioural impairment laws, zero-tolerance per se laws, roadside drug testing, graduated licensing system restrictions, and remedial programs. However, very few evaluations have appeared in the literature. Although some promising results have been reported (e.g., roadside testing), it is premature to draw firm conclusions regarding the broader impacts of general deterrent approaches to DUIC. This review points to the need for a long-term commitment to rigorously evaluate, using multiple methods, the impact of general and specific deterrent DUIC countermeasures. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  5. A Spike Cocktail Approach to Improve Microbial Performance ...

    EPA Pesticide Factsheets

Water reuse, via either centralized treatment of traditional wastewater or decentralized treatment and on-site reuse, is becoming an increasingly important element of sustainable water management. Despite advances in waterborne pathogen detection methods, low and highly variable pathogen levels limit their utility for routine evaluation of health risks in water reuse systems. Therefore, there is a need to improve our understanding of the linkage between pathogens and more readily measured process indicators during treatment. This paper describes an approach for constructing spiking experiments to relate the behavior of viral, bacterial, and protozoan pathogens with relevant process indicators. General issues are reviewed, and the spiking protocol is applied as a case study example to improve microbial performance monitoring and health risk evaluation in a water reuse system. This approach provides a foundation for the development of novel approaches to improve real or near-real time performance monitoring of water recycling systems. This manuscript details an approach for developing a "spike cocktail," a mixture of microorganisms that can be used to evaluate the performance of engineered and natural systems.

  6. Analysis and flight evaluation of a small, fixed-wing aircraft equipped with hinged plate spoilers

    NASA Technical Reports Server (NTRS)

    Olcott, J. W.; Sackel, E.; Ellis, D. R.

    1981-01-01

    The results of a four phase effort to evaluate the application of hinged plate spoilers/dive brakes to a small general aviation aircraft are presented. The test vehicle was a single engine light aircraft modified with an experimental set of upper surface spoilers and lower surface dive brakes similar to the type used on sailplanes. The lift, drag, stick free stability, trim, and dynamic response characteristics of four different spoiler/dive brake configurations were determined. Tests also were conducted, under a wide range of flight conditions and with pilots of various experience levels, to determine the most favorable methods of spoiler control and to evaluate how spoilers might best be used during the approach and landing task. The effects of approach path angle, approach airspeed, and pilot technique using throttle/spoiler integrated control were investigated for day, night, VFR, and IFR approaches and landings. The test results indicated that spoilers offered significant improvements in the vehicle's performance and flying qualities for all elements of the approach and landing task, provided a suitable method of control was available.

  7. A Generalized Hybrid Multiscale Modeling Approach for Flow and Reactive Transport in Porous Media

    NASA Astrophysics Data System (ADS)

    Yang, X.; Meng, X.; Tang, Y. H.; Guo, Z.; Karniadakis, G. E.

    2017-12-01

Using emerging understanding of biological and environmental processes at fundamental scales to advance predictions of the larger system behavior requires the development of multiscale approaches, and there is strong interest in coupling models at different scales together in a hybrid multiscale simulation framework. A limited number of hybrid multiscale simulation methods have been developed for subsurface applications, mostly using application-specific approaches for model coupling. The proposed generalized hybrid multiscale approach is designed with minimal intrusiveness to the pre-selected at-scale simulators and provides a set of lightweight C++ scripts to manage a complex multiscale workflow utilizing a concurrent coupling approach. The workflow includes at-scale simulators (using the lattice-Boltzmann method, LBM, at the pore and Darcy scales, respectively), scripts for boundary treatment (coupling and kriging), and a multiscale universal interface (MUI) for data exchange. The current study aims to apply the generalized hybrid multiscale modeling approach to couple pore- and Darcy-scale models for flow and mixing-controlled reaction with precipitation/dissolution in heterogeneous porous media. The model domain is heterogeneously packed, so that the mixing front geometry is more complex and not known a priori. To address these challenges, the generalized hybrid multiscale modeling approach is further developed to 1) adaptively define the locations of pore-scale subdomains, 2) provide a suite of physical boundary coupling schemes, and 3) consider the dynamic change of the pore structures due to mineral precipitation/dissolution. The results are validated and evaluated by comparison with single-scale simulations in terms of velocities, reactive concentrations, and computing cost.

  8. A Study of General Education Astronomy Students' Understandings of Cosmology. Part III. Evaluating Four Conceptual Cosmology Surveys: An Item Response Theory Approach

    ERIC Educational Resources Information Center

    Wallace, Colin S.; Prather, Edward E.; Duncan, Douglas K.

    2012-01-01

    This is the third of five papers detailing our national study of general education astronomy students' conceptual and reasoning difficulties with cosmology. In this paper, we use item response theory to analyze students' responses to three out of the four conceptual cosmology surveys we developed. The specific item response theory model we use is…
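Item response theory models of the kind applied in this study can be illustrated with a minimal two-parameter logistic (2PL) sketch; the 2PL family is standard, but the paper's specific model and parameter values are not given in this excerpt:

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability that a student with
    ability theta answers correctly an item with discrimination a
    and difficulty b. At theta == b the probability is exactly 0.5."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Example: a moderately discriminating item (a = 1.2) of average
# difficulty (b = 0.0), evaluated for three ability levels.
probs = [p_correct(t, 1.2, 0.0) for t in (-1.0, 0.0, 1.0)]
```

Fitting such a model to survey responses (as the authors do) would additionally require an estimation routine; this sketch only shows the item characteristic curve itself.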

  9. Evaluating mobile phone applications for health behaviour change: A systematic review.

    PubMed

    McKay, Fiona H; Cheng, Christina; Wright, Annemarie; Shill, Jane; Stephens, Hugh; Uccellini, Mary

    2018-01-01

Introduction: Increasing smartphone access has allowed for increasing development and use of smartphone applications (apps). Mobile health interventions have previously relied on voice or text-based short message services (SMS); however, the increasing availability and ease of use of apps has allowed for significant growth of smartphone apps that can be used for health behaviour change. This review considers the current body of knowledge relating to the evaluation of apps for health behaviour change. The aim of this review is to investigate approaches to the evaluation of health apps to identify any current best-practice approaches. Method: A systematic review was conducted. Data were collected and analysed in September 2016. Thirty-eight articles were identified and included in this review. Results: Articles were published between 2011 and 2016; 36 were reviews or evaluations of apps related to one or more health conditions, and the remaining two reported on investigations of the usability of health apps. Studies investigated apps relating to the following areas: alcohol, asthma, breastfeeding, cancer, depression, diabetes, general health and fitness, headaches, heart disease, HIV, hypertension, iron deficiency/anaemia, low vision, mindfulness, obesity, pain, physical activity, smoking, weight management, and women's health. Conclusion: In order to harness the potential of mobile health apps for behaviour change and health, we need better ways to assess the quality and effectiveness of apps. This review is unable to suggest a single best-practice approach to evaluating mobile health apps. Few measures identified in this review included sufficient information or evaluation, leading to potentially incomplete and inaccurate information for consumers seeking the best app for their situation. This is further complicated by a lack of regulation in health promotion generally.

  10. Toward an agenda for evaluation of qualitative research.

    PubMed

    Stige, Brynjulf; Malterud, Kirsti; Midtgarden, Torjus

    2009-10-01

Evaluation is essential for research quality and development, but the diversity of traditions that characterize qualitative research suggests that general checklists or shared criteria for evaluation are problematic. We propose an approach to research evaluation that encourages reflexive dialogue through use of an evaluation agenda. In proposing an evaluation agenda we shift attention from rule-based judgment to reflexive dialogue. Unlike criteria, an agenda may embrace pluralism, and does not request consensus on ontological, epistemological, and methodological issues, only consensus on what themes warrant discussion. We suggest an evaluation agenda, EPICURE, with two dimensions communicated through use of two acronyms. The first, EPIC, refers to the challenge of producing rich and substantive accounts based on engagement, processing, interpretation, and (self-)critique. The second, CURE, refers to the challenge of dealing with preconditions and consequences of research, with a focus on (social) critique, usefulness, relevance, and ethics. The seven items of the composite agenda EPICURE are presented and exemplified. Features and implications of the agenda approach to research evaluation are then discussed.

  11. A real-world approach to Evidence-Based Medicine in general practice: a competency framework derived from a systematic review and Delphi process.

    PubMed

    Galbraith, Kevin; Ward, Alison; Heneghan, Carl

    2017-05-03

Evidence-Based Medicine (EBM) skills have been included in general practice curricula and competency frameworks. However, GPs experience numerous barriers to developing and maintaining EBM skills, and some GPs feel the EBM movement misunderstands and threatens their traditional role. We therefore need a new approach that acknowledges the constraints encountered in real-world general practice. The aim of this study was to synthesise from empirical research a real-world EBM competency framework for general practice, which could be applied in training, in the individual pursuit of continuing professional development, and in routine care. We sought to integrate evidence from the literature with evidence derived from the opinions of experts in the fields of general practice and EBM. We synthesised two sets of themes describing the meaning of EBM in general practice. One set of themes was derived from a mixed-methods systematic review of the literature; the other set was derived from the further development of those themes using a Delphi process among a panel of EBM and general practice experts. From these two sets of themes we constructed a real-world EBM competency framework for general practice. A simple competency framework was constructed that acknowledges the constraints of real-world general practice: (1) mindfulness - in one's approach towards EBM itself, and to the influences on decision-making; (2) pragmatism - in one's approach to finding and evaluating evidence; and (3) knowledge of the patient - as the most useful resource in effective communication of evidence. We present a clinical scenario to illustrate how a GP might demonstrate these competencies in their routine daily work. We have proposed a real-world EBM competency framework for general practice, derived from empirical research, which acknowledges the constraints encountered in modern general practice.
Further validation of these competencies is required, both as an educational resource and as a strategy for actual practice.

  12. Changing physician behavior: what works?

    PubMed

    Mostofian, Fargoi; Ruban, Cynthiya; Simunovic, Nicole; Bhandari, Mohit

    2015-01-01

    There are various interventions for guideline implementation in clinical practice, but the effects of these interventions are generally unclear. We conducted a systematic review to identify effective methods of implementing clinical research findings and clinical guidelines to change physician practice patterns, in surgical and general practice. Systematic review of reviews. We searched electronic databases (MEDLINE, EMBASE, and PubMed) for systematic reviews published in English that evaluated the effectiveness of different implementation methods. Two reviewers independently assessed eligibility for inclusion and methodological quality, and extracted relevant data. Fourteen reviews covering a wide range of interventions were identified. The intervention methods used include: audit and feedback, computerized decision support systems, continuing medical education, financial incentives, local opinion leaders, marketing, passive dissemination of information, patient-mediated interventions, reminders, and multifaceted interventions. Active approaches, such as academic detailing, led to greater effects than traditional passive approaches. According to the findings of 3 reviews, 71% of studies included in these reviews showed positive change in physician behavior when exposed to active educational methods and multifaceted interventions. Active forms of continuing medical education and multifaceted interventions were found to be the most effective methods for implementing guidelines into general practice. Additionally, active approaches to changing physician performance were shown to improve practice to a greater extent than traditional passive methods. Further primary research is necessary to evaluate the effectiveness of these methods in a surgical setting.

  13. Best practices for evaluating the capability of nondestructive evaluation (NDE) and structural health monitoring (SHM) techniques for damage characterization

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Annis, Charles; Sabbagh, Harold A.; Lindgren, Eric A.

    2016-02-01

A comprehensive approach to NDE and SHM characterization error (CE) evaluation is presented that follows the framework of the 'ahat-versus-a' regression analysis for POD assessment. Characterization capability evaluation is typically more complex with respect to current POD evaluations and thus requires engineering and statistical expertise in the model-building process to ensure all key effects and interactions are addressed. Justifying the statistical model choice with underlying assumptions is key. Several sizing case studies are presented with detailed evaluations of the most appropriate statistical model for each data set. The use of a model-assisted approach is introduced to help assess the reliability of NDE and SHM characterization capability under a wide range of part, environmental and damage conditions. Best practices of using models are presented for both an eddy current NDE sizing and vibration-based SHM case studies. The results of these studies highlight the general protocol feasibility, emphasize the importance of evaluating key application characteristics prior to the study, and demonstrate an approach to quantify the role of varying SHM sensor durability and environmental conditions on characterization performance.
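The 'ahat-versus-a' framework named in this abstract can be sketched as an ordinary regression of measured response (ahat) on true flaw size (a), from which a probability of detection follows under a normal-residual assumption; the data and decision threshold below are invented for illustration, not taken from the study:

```python
import math
import statistics

# Illustrative 'ahat-versus-a' data: true flaw sizes and measured responses.
a    = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
ahat = [0.9, 2.2, 2.8, 4.1, 5.2, 5.8, 7.1, 8.0]

# Ordinary least-squares fit ahat = beta0 + beta1 * a.
n = len(a)
mx, my = statistics.mean(a), statistics.mean(ahat)
beta1 = sum((x - mx) * (y - my) for x, y in zip(a, ahat)) / \
        sum((x - mx) ** 2 for x in a)
beta0 = my - beta1 * mx
resid = [y - (beta0 + beta1 * x) for x, y in zip(a, ahat)]
sigma = math.sqrt(sum(r * r for r in resid) / (n - 2))

def pod(size, threshold=2.0):
    """P(ahat > threshold | a = size), assuming normal residuals:
    the classic ahat-versus-a route from regression to a POD curve."""
    z = (threshold - (beta0 + beta1 * size)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))
```

The POD curve rises with flaw size because the fitted slope is positive; a characterization-error analysis would additionally examine the residuals themselves as sizing errors.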

  14. Corporate incentives for promoting safety belt use : rationale, guidelines, and examples

    DOT National Transportation Integrated Search

    1982-10-01

    This manual was designed to teach the corporate executive successful strategies for implementing and evaluating a successful industry-based program to motivate employee safety belt use. A rationale is given for the general approach; and specific guid...

  15. General Aviation Flight Test of Advanced Operations Enabled by Synthetic Vision

    NASA Technical Reports Server (NTRS)

Glaab, Louis J.; Hughes, Monica F.; Parrish, Russell V.; Takallu, Mohammad A.

    2014-01-01

A flight test was performed to compare the use of three advanced primary flight and navigation display concepts to a baseline, round-dial concept to assess the potential for advanced operations. The displays were evaluated during visual and instrument approach procedures, including an advanced instrument approach resembling a visual airport traffic pattern. Nineteen pilots from three pilot groups, reflecting the diverse piloting skills of the General Aviation pilot population, served as evaluation subjects. The experiment had two thrusts: 1) an examination of the capabilities of low-time (i.e., <400 hours), non-instrument-rated pilots to perform nominal instrument approaches, and 2) an exploration of potential advanced Visual Meteorological Conditions (VMC)-like approaches in Instrument Meteorological Conditions (IMC). Within this context, advanced display concepts are considered to include integrated navigation and primary flight displays with either aircraft attitude flight directors or Highway In The Sky (HITS) guidance, with and without a synthetic depiction of the external visuals (i.e., synthetic vision). Relative to the first thrust, the results indicate that, using an advanced display concept as tested herein, low-time, non-instrument-rated pilots can exhibit flight-technical performance, subjective workload, and situation awareness ratings as good as or better than those of high-time Instrument Flight Rules (IFR)-rated pilots using the baseline round dials for a nominal IMC approach. For the second thrust, the results indicate that advanced VMC-like approaches are feasible in IMC for all pilot groups tested, but only with the Synthetic Vision System (SVS) advanced display concept.

  16. Uranium disequilibrium in groundwater: An isotope dilution approach in hydrologic investigations

    USGS Publications Warehouse

    Osmond, J.K.; Rydell, H.S.; Kaufman, M.I.

    1968-01-01

    The distribution and environmental disequilibrium patterns of naturally occurring uranium isotopes (U234 and U238) in waters of the Floridan aquifer suggest that variations in the ratios of isotopic activity and concentrations can be used quantitatively to evaluate mixing proportions of waters from differing sources. Uranium is probably unique in its potential for this approach, which seems to have general usefulness in hydrologic investigations.
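The two-endmember mixing idea in this abstract can be sketched as simple mass balance; the functions and numbers below are illustrative, assuming conservative mixing of a uranium concentration and an activity ratio (e.g. U234/U238):

```python
def mixing_fraction(c1, c2, c_mix):
    """Fraction f of endmember 1 in a two-component mixture, solved
    from the conservative mass balance c_mix = f*c1 + (1 - f)*c2."""
    return (c_mix - c2) / (c1 - c2)

def mixture_activity_ratio(c1, r1, c2, r2, f):
    """Activity ratio of a mixture of two waters: ratios do not mix
    linearly, but are weighted by the uranium concentrations."""
    return (f * c1 * r1 + (1 - f) * c2 * r2) / (f * c1 + (1 - f) * c2)
```

Measuring both the concentration and the isotope ratio of a mixed groundwater thus over-determines f, which is what makes uranium disequilibrium useful as a quantitative tracer.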

  17. On the Role of Impact Evaluation of Quality Assurance from the Strategic Perspective of Quality Assurance Agencies in the European Higher Education Area

    ERIC Educational Resources Information Center

    Damian, Radu; Grifoll, Josep; Rigbers, Anke

    2015-01-01

    In this paper the current national legislations, the quality assurance approaches and the activities of impact analysis of three quality assurance agencies from Romania, Spain and Germany are described from a strategic perspective. The analysis shows that the general methodologies (comprising, for example, self-evaluation reports, peer reviews,…

  18. Evaluating the uncertainty of input quantities in measurement models

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. 
We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.
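Monte Carlo propagation of the kind standardized in GUM Supplement 1 (which the paper discusses) can be sketched in a few lines; the measurement model Y = X1/X2 and the uncertainties below are invented for illustration:

```python
import random
import statistics

# Propagate distributions through a measurement model Y = X1 / X2,
# with X1 ~ N(10, 0.1) and X2 ~ N(2, 0.05) (illustrative values).
random.seed(1)
N = 100_000
y = [random.gauss(10.0, 0.1) / random.gauss(2.0, 0.05) for _ in range(N)]

y_est = statistics.mean(y)   # estimate of the measurand
u_y = statistics.stdev(y)    # standard uncertainty of Y
```

Unlike first-order (law-of-propagation) GUM evaluation, the Monte Carlo route also yields the full output distribution, from which coverage intervals can be read off directly.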

  19. Contrasting motivational orientation and evaluative coding accounts: on the need to differentiate the effectors of approach/avoidance responses.

    PubMed

    Kozlik, Julia; Neumann, Roland; Lozo, Ljubica

    2015-01-01

    Several emotion theorists suggest that valenced stimuli automatically trigger motivational orientations and thereby facilitate corresponding behavior. Positive stimuli were thought to activate approach motivational circuits which in turn primed approach-related behavioral tendencies whereas negative stimuli were supposed to activate avoidance motivational circuits so that avoidance-related behavioral tendencies were primed (motivational orientation account). However, recent research suggests that typically observed affective stimulus-response compatibility phenomena might be entirely explained in terms of theories accounting for mechanisms of general action control instead of assuming motivational orientations to mediate the effects (evaluative coding account). In what follows, we explore to what extent this notion is applicable. We present literature suggesting that evaluative coding mechanisms indeed influence a wide variety of affective stimulus-response compatibility phenomena. However, the evaluative coding account does not seem to be sufficient to explain affective S-R compatibility effects. Instead, several studies provide clear evidence in favor of the motivational orientation account that seems to operate independently of evaluative coding mechanisms. Implications for theoretical developments and future research designs are discussed.

  1. A novel feature extraction approach for microarray data based on multi-algorithm fusion

    PubMed Central

    Jiang, Zhu; Xu, Rong

    2015-01-01

Feature extraction is one of the most important and effective methods for reducing dimensionality in data mining, particularly with the emergence of high-dimensional data such as microarray gene expression data. Feature extraction for gene selection mainly serves two purposes. One is to identify certain disease-related genes. The other is to find a compact set of discriminative genes to build a pattern classifier with reduced complexity and improved generalization capabilities. Depending on the purpose of gene selection, two types of feature extraction algorithms, ranking-based feature extraction and set-based feature extraction, are employed in microarray gene expression data analysis. In ranking-based feature extraction, features are evaluated on an individual basis, generally without considering inter-relationships between features, while set-based feature extraction evaluates features based on their role in a feature set by taking into account dependency between features. Just as with learning methods, feature extraction has a problem with generalization ability, that is, robustness. However, the issue of robustness is often overlooked in feature extraction. In order to improve the accuracy and robustness of feature extraction for microarray data, a novel approach based on multi-algorithm fusion is proposed. By fusing different types of feature extraction algorithms to select features from the sample set, the proposed approach is able to improve feature extraction performance. The new approach is tested against gene expression datasets including Colon cancer, CNS, DLBCL, and Leukemia data. The testing results show that the performance of this algorithm is better than existing solutions. PMID:25780277
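One simple form of multi-algorithm fusion, average-rank aggregation of per-feature scores from different extraction algorithms, can be sketched as follows; this is a generic illustration of the idea, not the authors' exact fusion scheme:

```python
def rank(scores):
    """Rank features by score, best (highest score) gets rank 1."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    ranks = [0] * len(scores)
    for position, i in enumerate(order, start=1):
        ranks[i] = position
    return ranks

def fuse(rankings):
    """Fuse several feature rankings by averaging the rank each
    feature receives; lower fused rank = more consistently selected."""
    n = len(rankings[0])
    return [sum(r[i] for r in rankings) / len(rankings) for i in range(n)]

# Two hypothetical scorers disagree on feature order; fusion favours
# the feature both scorers rate highly.
fused = fuse([rank([3.0, 1.0, 2.0]), rank([2.0, 1.0, 3.0])])
```

A robustness-oriented fusion could additionally resample the data and keep only features whose fused rank is stable across resamples.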

  3. Identifying the critical financial ratios for stocks evaluation: A fuzzy delphi approach

    NASA Astrophysics Data System (ADS)

    Mokhtar, Mazura; Shuib, Adibah; Mohamad, Daud

    2014-12-01

Stock evaluation has always been an interesting and challenging problem for both researchers and practitioners. Generally, the evaluation can be made based on a set of financial ratios. Nevertheless, there is a variety of financial ratios that can be considered, and if all ratios in the set are placed into the evaluation process, data collection would be more difficult and time-consuming. Thus, the objective of this paper is to identify the most important financial ratios on which to focus in order to evaluate a stock's performance. For this purpose, a survey was carried out using an approach based on expert judgement, namely the Fuzzy Delphi Method (FDM). The results of this study indicated that return on equity, return on assets, net profit margin, operating profit margin, earnings per share, and debt to equity are the most important ratios.
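A minimal Fuzzy Delphi aggregation step can be sketched with triangular fuzzy numbers; the min/geometric-mean/max aggregation and centroid defuzzification below are one common FDM convention, not necessarily the exact procedure used in this paper:

```python
def aggregate(tfns):
    """Aggregate expert ratings given as triangular fuzzy numbers
    (l, m, u): minimum of the lower bounds, geometric mean of the
    modes, maximum of the upper bounds (one common FDM convention)."""
    lowers, modes, uppers = zip(*tfns)
    m_geo = 1.0
    for m in modes:
        m_geo *= m
    m_geo **= 1.0 / len(modes)
    return (min(lowers), m_geo, max(uppers))

def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number; ratios
    whose crisp value exceeds a chosen threshold are retained."""
    l, m, u = tfn
    return (l + m + u) / 3.0
```

In an FDM screening, each financial ratio would get one aggregated fuzzy number from all expert ratings, and the ratio is kept if its defuzzified value passes an agreed cut-off.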

  4. Veterinary and human vaccine evaluation methods

    PubMed Central

    Knight-Jones, T. J. D.; Edmond, K.; Gubbins, S.; Paton, D. J.

    2014-01-01

    Despite the universal importance of vaccines, approaches to human and veterinary vaccine evaluation differ markedly. For human vaccines, vaccine efficacy is the proportion of vaccinated individuals protected by the vaccine against a defined outcome under ideal conditions, whereas for veterinary vaccines the term is used for a range of measures of vaccine protection. The evaluation of vaccine effectiveness, vaccine protection assessed under routine programme conditions, is largely limited to human vaccines. Challenge studies under controlled conditions and sero-conversion studies are widely used when evaluating veterinary vaccines, whereas human vaccines are generally evaluated in terms of protection against natural challenge assessed in trials or post-marketing observational studies. Although challenge studies provide a standardized platform on which to compare different vaccines, they do not capture the variation that occurs under field conditions. Field studies of vaccine effectiveness are needed to assess the performance of a vaccination programme. However, if vaccination is performed without central co-ordination, as is often the case for veterinary vaccines, evaluation will be limited. This paper reviews approaches to veterinary vaccine evaluation in comparison to evaluation methods used for human vaccines. Foot-and-mouth disease has been used to illustrate the veterinary approach. Recommendations are made for standardization of terminology and for rigorous evaluation of veterinary vaccines. PMID:24741009
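The human-vaccine definition of efficacy discussed here has a standard arithmetic form, VE = 1 - RR (one minus the relative risk in vaccinated versus unvaccinated groups); a one-line sketch with illustrative attack rates:

```python
def vaccine_efficacy(ar_vaccinated, ar_unvaccinated):
    """VE = 1 - RR: the proportionate reduction in attack rate among
    vaccinated individuals relative to unvaccinated individuals."""
    return 1.0 - ar_vaccinated / ar_unvaccinated

# Illustrative trial: 1% of vaccinated and 5% of unvaccinated
# participants develop disease, giving 80% efficacy.
ve = vaccine_efficacy(0.01, 0.05)
```

Vaccine effectiveness uses the same formula but with attack rates observed under routine programme conditions rather than in a controlled trial.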

  5. Washington Phase II Fish Diversion Screen Evaluations in the Yakima and Touchet River Basins, 2005-2006 Annual Reports.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamness, Mickie; Abernethy, C.; Tunnicliffe, Cherylyn

    2006-02-01

    In 2005, Pacific Northwest National Laboratory (PNNL) researchers evaluated 25 Phase II fish screen sites in the Yakima and Touchet river basins. PNNL performs these evaluations for Bonneville Power Administration (BPA) to determine whether the fish screening devices meet National Marine Fisheries Service (NMFS) criteria to promote safe and timely fish passage. Evaluations consist of measuring velocities in front of the screens, using an underwater camera to look at the condition and environment in front of the screens, and noting the general condition and operation of the sites. Results of the evaluations in 2005 include the following: (1) Most approach velocities met the NMFS criterion of less than or equal to 0.4 fps. Less than 13% of all approach measurements exceeded the criterion, and these occurred at 10 of the sites. Flat-plate screens had more problems than drum screens with high approach velocities. (2) Bypass velocities generally were greater than sweep velocities, but sweep velocities often did not increase toward the bypass. The latter condition could slow migration of fish through the facility. (3) Screen and seal materials generally were in good condition. (4) Automated cleaning brushes generally functioned properly; chains and other moving parts were typically well-greased and operative. (5) Washington Department of Fish and Wildlife (WDFW) and U.S. Bureau of Reclamation (USBR) generally operate and maintain fish screen facilities in a way that provides safe passage for juvenile fish. (6) In some instances, irrigators responsible for specific maintenance at their sites (e.g., debris removal) are not performing their tasks in a way that provides optimum operation of the fish screen facility. New ways need to be found to encourage them to maintain their facilities properly. (7) We recommend placing datasheets providing up-to-date operating criteria and design flows in each site's logbox. 
The datasheet should include bypass design flows and a table showing depths of water over the weir and corresponding bypass flow. This information is available at some of the sites but may be outdated. These data are used to determine whether the site is running within design criteria. (8) Modifying use of debris control plates at Gleed helped minimize the extreme fluctuations in flow, but approach velocities are still too high. Other ways to reduce the approach velocities need to be tried, possibly including redesign of the site. (9) Alternatives to a screen site at Taylor should be considered. Considerable effort was spent trying to increase water to the site, but it still was unable to operate within NMFS criteria for most of the year and may be a hazard to juvenile salmonids. We conclude that the conditions at most of the Phase II fish screen facilities we evaluated in 2005 would be expected to provide safe passage for juvenile fish. For those sites where conditions are not always optimum for safe fish passage, PNNL researchers will try to coordinate with the WDFW and USBR in 2006 to find solutions to the problems. Some of those problems are consistently high approach velocities at specific sites, including Congdon, Naches-Selah, Union Gap, and Yakima-Tieton. We would like to be able to monitor changes in velocities as soon as operations and maintenance personnel adjust the louvers or porosity boards at these sites. This will give them immediate feedback on the results of their modifications and allow additional adjustments as necessary until the conditions meet NMFS criteria. Pacific Northwest National Laboratory has performed evaluations at many of these sites over the past 8 years, providing information WDFW and USBR personnel can use to perform their operations and maintenance more effectively. Consequently, overall effectiveness of the screen facilities has improved over time.
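
    The core screening check in the report reduces to comparing measured approach velocities against the NMFS 0.4 fps criterion. A toy illustration, with hypothetical site names and readings:

```python
# Toy illustration of the NMFS screening check described above: flag
# approach-velocity measurements that exceed the 0.4 ft/s criterion.
# Site names and readings are hypothetical.
NMFS_APPROACH_LIMIT = 0.4  # ft/s

readings = {
    "Site A (drum screen)":       [0.31, 0.28, 0.35, 0.30],
    "Site B (flat-plate screen)": [0.38, 0.44, 0.52, 0.41],
}

def exceedances(site_readings, limit=NMFS_APPROACH_LIMIT):
    """Return, per site, the measurements that exceed the criterion."""
    return {site: [v for v in vals if v > limit]
            for site, vals in site_readings.items()}

for site, bad in exceedances(readings).items():
    status = "OK" if not bad else f"exceeds criterion: {bad}"
    print(f"{site}: {status}")
```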

  6. [Robotic fundoplication for gastro-oesophageal reflux disease].

    PubMed

    Costi, Renato; Himpens, Jacques; Iusco, Domenico; Sarli, Leopoldo; Violi, Vincenzo; Roncoroni, Luigi; Cadière, Guy Bernard

    2004-01-01

    Presented as a possible "second" revolution in general surgery after the introduction of laparoscopy during the last few years, the robotic approach to mini-invasive surgery has not yet witnessed wide, large-scale diffusion among general surgeons and is still considered an "experimental approach". In general surgery, the laparoscopic treatment of gastro-oesophageal reflux is the second most frequently performed robot-assisted procedure after cholecystectomy. A review of the literature and an analysis of the costs may allow a preliminary evaluation of the pros and cons of robotic fundoplication, which may then be applicable to other general surgery procedures. Eleven articles report 91 cases of robotic fundoplication (75 Nissen, 9 Thal, 7 Toupet). To date, there is no evidence of benefit in terms of duration of surgery, rate of complications and hospital stay. Moreover, robotic fundoplication is more expensive than the traditional laparoscopic approach (the additional cost per procedure due to robotics is 1,882.97 euros). Only further technological upgrades and advances will make the use of robotics competitive in general surgery. The development of multi-functional instruments and of tactile feedback at the console, enlargement of the three-dimensional laparoscopic view and specific "team" training will enable the use of robotic surgery to be extended to increasingly difficult procedures and to non-specialised environments.

  7. Evaluation of Three Instructional Methods for Teaching General Chemistry.

    ERIC Educational Resources Information Center

    Jackman, Lance E.; And Others

    1987-01-01

    Reports on a study designed to determine the relative effectiveness of different instructional approaches on chemistry laboratory achievement. Investigated differences in achievement in spectrophotometry among college freshmen who received either traditional, learning cycle, or computer simulation instruction. Results indicated that students…

  8. Development and evaluation of consensus-based sediment effect concentrations for polychlorinated biphenyls

    USGS Publications Warehouse

    MacDonald, Donald D.; Dipinto, Lisa M.; Field, Jay; Ingersoll, Christopher G.; Long, Edward R.; Swartz, Richard C.

    2000-01-01

    Sediment-quality guidelines (SQGs) have been published for polychlorinated biphenyls (PCBs) using both empirical and theoretical approaches. Empirically based guidelines have been developed using the screening-level concentration, effects range, effects level, and apparent effects threshold approaches. Theoretically based guidelines have been developed using the equilibrium-partitioning approach. Empirically-based guidelines were classified into three general categories, in accordance with their original narrative intents, and used to develop three consensus-based sediment effect concentrations (SECs) for total PCBs (tPCBs), including a threshold effect concentration, a midrange effect concentration, and an extreme effect concentration. Consensus-based SECs were derived because they estimate the central tendency of the published SQGs and, thus, reconcile the guidance values that have been derived using various approaches. Initially, consensus-based SECs for tPCBs were developed separately for freshwater sediments and for marine and estuarine sediments. Because the respective SECs were statistically similar, the underlying SQGs were subsequently merged and used to formulate more generally applicable SECs. The three consensus-based SECs were then evaluated for reliability using matching sediment chemistry and toxicity data from field studies, dose-response data from spiked-sediment toxicity tests, and SQGs derived from the equilibrium-partitioning approach. The results of this evaluation demonstrated that the consensus-based SECs can accurately predict both the presence and absence of toxicity in field-collected sediments. Importantly, the incidence of toxicity increases incrementally with increasing concentrations of tPCBs. Moreover, the consensus-based SECs are comparable to the chronic toxicity thresholds that have been estimated from dose-response data and equilibrium-partitioning models. 
Therefore, consensus-based SECs provide a unifying synthesis of existing SQGs, reflect causal rather than correlative effects, and accurately predict sediment toxicity in PCB-contaminated sediments.

  9. Direct Linearization and Adjoint Approaches to Evaluation of Atmospheric Weighting Functions and Surface Partial Derivatives: General Principles, Synergy and Areas of Application

    NASA Technical Reports Server (NTRS)

    Ustinov, Eugene A.

    2006-01-01

    This slide presentation reviews the observable radiances as functions of atmospheric parameters and of surface parameters; the mathematics of atmospheric weighting functions (WFs) and surface partial derivatives (PDs) are presented; and the equation of the forward radiative transfer (RT) problem is presented. For non-scattering atmospheres this can be done analytically, and all WFs and PDs can be computed analytically using the direct linearization approach. For scattering atmospheres, in the general case, the solution of the forward RT problem can be obtained only numerically, but we need only two numerical solutions: one of the forward RT problem and one of the adjoint RT problem to compute all WFs and PDs we can think of. In this presentation we discuss applications of both the linearization and adjoint approaches.

  10. A fractional approach to the Fermi-Pasta-Ulam problem

    NASA Astrophysics Data System (ADS)

    Machado, J. A. T.

    2013-09-01

    This paper studies the Fermi-Pasta-Ulam problem having in mind the generalization provided by Fractional Calculus (FC). The study starts by addressing the classical formulation, based on the standard integer order differential calculus and evaluates the time and frequency responses. A first generalization to be investigated consists in the direct replacement of the springs by fractional elements of the dissipative type. It is observed that the responses settle rapidly and no relevant phenomena occur. A second approach consists of replacing the springs by a blend of energy extracting and energy inserting elements of symmetrical fractional order with amplitude modulated by quadratic terms. The numerical results reveal a response close to chaotic behaviour.
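
    The classical (integer-order) starting point of the study can be sketched numerically. The following is a minimal simulation, not the paper's code: the chain size, quartic coupling, and time step are illustrative assumptions.

```python
import math

# Classical (integer-order) FPU-beta chain, integrated with velocity Verlet.
# Parameters here are illustrative, not taken from the paper.
N = 16        # number of moving masses (fixed ends)
BETA = 0.3    # quartic coupling strength
DT = 0.05     # time step

def forces(x):
    """Force on each mass from the nonlinear springs to its neighbours."""
    xp = [0.0] + x + [0.0]   # pad with fixed boundary masses
    f = []
    for i in range(1, N + 1):
        dl = xp[i - 1] - xp[i]   # extension of left spring
        dr = xp[i + 1] - xp[i]   # extension of right spring
        f.append(dl + BETA * dl**3 + dr + BETA * dr**3)
    return f

def simulate(steps):
    # Excite the lowest normal mode, the classical FPU initial condition
    x = [math.sin(math.pi * (i + 1) / (N + 1)) for i in range(N)]
    v = [0.0] * N
    f = forces(x)
    for _ in range(steps):
        x = [x[i] + v[i] * DT + 0.5 * f[i] * DT**2 for i in range(N)]
        fn = forces(x)
        v = [v[i] + 0.5 * (f[i] + fn[i]) * DT for i in range(N)]
        f = fn
    return x, v

x, v = simulate(200)
```

    The paper's fractional generalization would replace the spring force law with fractional-order (dissipative or energy-inserting) elements; the integer-order chain above is the baseline against which those responses are compared.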

  11. A head-up display format for transport aircraft approach and landing

    NASA Technical Reports Server (NTRS)

    Bray, R. S.; Scott, B. C.

    1981-01-01

    An electronic flight-guidance display format was designed for use in evaluations of the collimated head-up display concept applied to transport aircraft landing. In the design process of iterative evaluation and modification, some general principles, or guidelines, applicable to electronic flight displays were suggested. The usefulness of an indication of instantaneous inertial flightpath was clearly demonstrated. Evaluator pilot acceptance of the unfamiliar display concepts was very positive when careful attention was given to indoctrination and training.

  12. Using Eight Key Questions as an Inquiry-Based Framework for Ethical Reasoning Issues in a General Education Earth Systems and Climate Change Course

    NASA Astrophysics Data System (ADS)

    Johnson, E. A.; Ball, T. C.

    2014-12-01

    An important objective in general education geoscience courses is to help students evaluate social and ethical issues based upon scientific knowledge. It can be difficult for instructors trained in the physical sciences to design effective ways of including ethical issues in large lecture courses where whole-class discussions are not practical. The Quality Enhancement Plan for James Madison University, "The Madison Collaborative: Ethical Reasoning in Action," (http://www.jmu.edu/mc/index.shtml) has identified eight key questions to be used as a framework for developing ethical reasoning exercises and evaluating student learning. These eight questions are represented by the acronym FOR CLEAR and are represented by the concepts of Fairness, Outcomes, Responsibilities, Character, Liberty, Empathy, Authority, and Rights. In this study, we use the eight key questions as an inquiry-based framework for addressing ethical issues in a 100-student general education Earth systems and climate change course. Ethical reasoning exercises are presented throughout the course and range from questions of personal behavior to issues regarding potential future generations and global natural resources. In the first few exercises, key questions are identified for the students and calibrated responses are provided as examples. By the end of the semester, students are expected to identify key questions themselves and justify their own ethical and scientific reasoning. Evaluation rubrics are customized to this scaffolding approach to the exercises. Student feedback and course data will be presented to encourage discussion of this and other approaches to explicitly incorporating ethical reasoning in general education geoscience courses.

  13. An evaluation of three statistical estimation methods for assessing health policy effects on prescription drug claims.

    PubMed

    Mittal, Manish; Harrison, Donald L; Thompson, David M; Miller, Michael J; Farmer, Kevin C; Ng, Yu-Tze

    2016-01-01

    While the choice of analytical approach affects study results and their interpretation, there is no consensus to guide the choice of statistical approaches to evaluate public health policy change. This study compared and contrasted three statistical estimation procedures in the assessment of a U.S. Food and Drug Administration (FDA) suicidality warning, communicated in January 2008 and implemented in May 2009, on antiepileptic drug (AED) prescription claims. Longitudinal designs were utilized to evaluate Oklahoma (U.S. State) Medicaid claim data from January 2006 through December 2009. The study included 9289 continuously eligible individuals with prevalent diagnoses of epilepsy and/or psychiatric disorder. Segmented regression models using three estimation procedures [i.e., generalized linear models (GLM), generalized estimating equations (GEE), and generalized linear mixed models (GLMM)] were used to estimate trends of AED prescription claims across three time periods: before (January 2006-January 2008); during (February 2008-May 2009); and after (June 2009-December 2009) the FDA warning. All three statistical procedures estimated an increasing trend (P < 0.0001) in AED prescription claims before the FDA warning period. No procedure detected a significant change in trend during (GLM: -30.0%, 99% CI: -60.0% to 10.0%; GEE: -20.0%, 99% CI: -70.0% to 30.0%; GLMM: -23.5%, 99% CI: -58.8% to 1.2%) or after (GLM: 50.0%, 99% CI: -70.0% to 160.0%; GEE: 80.0%, 99% CI: -20.0% to 200.0%; GLMM: 47.1%, 99% CI: -41.2% to 135.3%) the FDA warning when compared to the pre-warning period. Although the three procedures provided consistent inferences, the GEE and GLMM approaches accounted appropriately for correlation. Further, marginal models estimated using GEE produced more robust and valid population-level estimations. Copyright © 2016 Elsevier Inc. All rights reserved.
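
    The segmented (interrupted time-series) design underlying all three estimators can be sketched with a slope-change design matrix. The sketch below uses ordinary least squares (the Gaussian GLM case) on simulated monthly counts; the data, change-point months, and coefficients are invented, and the study's GEE/GLMM variants would additionally model within-patient correlation.

```python
import numpy as np

# Segmented regression with two change points (warning communicated,
# warning implemented), fit by ordinary least squares. Simulated data.
rng = np.random.default_rng(0)

months = np.arange(48)   # Jan 2006 .. Dec 2009
warn = 25                # month the FDA warning was communicated (assumed)
impl = 41                # month the warning was implemented (assumed)

# Design matrix: intercept, baseline trend, and post-change-point slopes
X = np.column_stack([
    np.ones_like(months, dtype=float),
    months.astype(float),                        # pre-warning slope
    np.clip(months - warn, 0, None),             # slope change during warning
    np.clip(months - impl, 0, None),             # slope change after warning
])

# Simulate claims: rising baseline, flattening during, slight rebound after
y = 100 + 2.0 * X[:, 1] - 2.0 * X[:, 2] + 1.0 * X[:, 3] \
    + rng.normal(0, 1.0, months.size)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 1))  # roughly [100, 2, -2, 1]
```

    A significant coefficient on either clipped term would indicate a trend change at that change point; here the null-hypothesis tests (and the GEE/GLMM correlation structures) are omitted for brevity.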

  14. Terrain modeling for microwave landing system

    NASA Technical Reports Server (NTRS)

    Poulose, M. M.

    1991-01-01

    A powerful analytical approach for evaluating the terrain effects on a microwave landing system (MLS) is presented. The approach combines a multiplate model with a powerful and exhaustive ray tracing technique and an accurate formulation for estimating the electromagnetic fields due to the antenna array in the presence of terrain. Both uniform theory of diffraction (UTD) and impedance UTD techniques have been employed to evaluate these fields. Innovative techniques are introduced at each stage to make the model versatile to handle most general terrain contours and also to reduce the computational requirement to a minimum. The model is applied to several terrain geometries, and the results are discussed.

  15. Designs of Empirical Evaluations of Nonexperimental Methods in Field Settings.

    PubMed

    Wong, Vivian C; Steiner, Peter M

    2018-01-01

    Over the last three decades, a research design has emerged to evaluate the performance of nonexperimental (NE) designs and design features in field settings. It is called the within-study comparison (WSC) approach or the design replication study. In the traditional WSC design, treatment effects from a randomized experiment are compared to those produced by an NE approach that shares the same target population. The nonexperiment may be a quasi-experimental design, such as a regression-discontinuity or an interrupted time-series design, or an observational study approach that includes matching methods, standard regression adjustments, and difference-in-differences methods. The goals of the WSC are to determine whether the nonexperiment can replicate results from a randomized experiment (which provides the causal benchmark estimate), and the contexts and conditions under which these methods work in practice. This article presents a coherent theory of the design and implementation of WSCs for evaluating NE methods. It introduces and identifies the multiple purposes of WSCs, required design components, common threats to validity, design variants, and causal estimands of interest in WSCs. It highlights two general approaches for empirical evaluations of methods in field settings: WSC designs with independent versus dependent benchmark and NE arms. The advantages and disadvantages of each approach are discussed, along with the conditions and contexts under which each is optimal for addressing methodological questions.

  16. A Spike Cocktail Approach to Improve Microbial Performance Monitoring for Water Reuse.

    PubMed

    Zimmerman, Brian D; Korajkic, Asja; Brinkman, Nichole E; Grimm, Ann C; Ashbolt, Nicholas J; Garland, Jay L

      Water reuse, via either centralized treatment of traditional wastewater or decentralized treatment and on-site reuse, is becoming an increasingly important element of sustainable water management. Despite advances in waterborne pathogen detection methods, low and highly variable pathogen levels limit their utility for routine evaluation of health risks in water reuse systems. Therefore, there is a need to improve our understanding of the linkage between pathogens and more readily measured process indicators during treatment. This paper describes an approach for constructing spiking experiments to relate the behavior of viral, bacterial, and protozoan pathogens with relevant process indicators. General issues are reviewed, and the spiking protocol is applied as a case study example to improve microbial performance monitoring and health risk evaluation in a water reuse system. This approach provides a foundation for the development of novel approaches to improve real or near-real time performance monitoring of water recycling systems.

  17. Guidance law simulation studies for complex approaches using the Microwave Landing System (MLS)

    NASA Technical Reports Server (NTRS)

    Feather, J. B.

    1986-01-01

    This report documents results for MLS guidance algorithm development conducted by DAC for NASA under the Advanced Transport Operating Systems (ATOPS) Technology Studies program (NAS1-18028). The study consisted of evaluating guidance laws for vertical and lateral path control, as well as speed control, by simulating an MLS approach for the Washington National Airport. This work is an extension and generalization of a previous ATOPS contract (NAS1-16202) completed by DAC in 1985. The Washington river approach was simulated by six waypoints and one glideslope change and consisted of an eleven-nautical-mile approach path. Tracking performance was generated for 10 cases representing several different conditions, which included MLS noise, steady wind, turbulence, and windshear. Results of this simulation phase are suitable for use in future fixed-base simulator evaluations employing actual hardware (autopilot and a performance management system), as well as crew procedures and information requirements for MLS.

  18. Simulator evaluation of the effects of reduced spoiler and thrust authority on a decoupled longitudinal control system during landings in wind shear

    NASA Technical Reports Server (NTRS)

    Miller, G. K., Jr.

    1981-01-01

    The effect of reduced control authority, both in symmetric spoiler travel and thrust level, on the effectiveness of a decoupled longitudinal control system was examined during the approach and landing of the NASA terminal configured vehicle (TCV) aft flight deck simulator in the presence of wind shear. The evaluation was conducted in a fixed-base simulator that represented the TCV aft cockpit. There were no statistically significant effects of reduced spoiler and thrust authority on pilot performance during approach and landing. Increased wind severity degraded approach and landing performance by an amount that was often significant. However, every attempted landing was completed safely regardless of the wind severity. There were statistically significant differences in performance between subjects, but the differences were generally restricted to the control wheel and control-column activity during the approach.

  19. Is routine alcohol screening and brief intervention feasible in a New Zealand primary care environment?

    PubMed

    Gifford, Heather; Paton, Sue; Cvitanovic, Lynley; McMenamin, John; Newton, Chloe

    2012-05-11

    The aim was to test the feasibility of a systemised ABC alcohol screening and brief intervention (SBI) approach in general practice in a New Zealand region. Data were collected on patients over 15 years who had their alcohol status recorded using the AUDIT tool. A concurrent independent process evaluation was conducted to assess the effectiveness of ABC alcohol SBI related training and implementation of the intervention. In an 8-month period, general practices in the Whanganui region documented the alcohol consumption of 43% of their patients. Of the patients screened, 24% were drinking contrary to ALAC's low-risk drinking advice. Of these, 36% received brief advice or referral. Success of the approach can be attributed to the use of the Patient Dashboard reminder software and linked alcohol recording form. Other success factors included the use of a clinical champion and project leader, education and training, funding for extra GP and nurse assessment time and linking of the ABC alcohol SBI approach to existing services. Primary care in Whanganui has demonstrated the capacity to routinely query patient alcohol use and offer brief advice. If the approach were more widely adopted, there is considerable scope for general practice nationally to address potentially harmful patient alcohol use.

  20. Advanced symbology for general aviation approach to landing displays

    NASA Technical Reports Server (NTRS)

    Bryant, W. H.

    1983-01-01

    A set of flight tests designed to evaluate the relative utility of candidate displays with advanced symbology for general aviation terminal area instrument flight rules operations are discussed. The symbology was previously evaluated as part of the NASA Langley Research Center's Terminal Configured Vehicle Program for use in commercial airlines. The advanced symbology included vehicle track angle, flight path angle and a perspective representation of the runway. These symbols were selectively drawn on a cathode ray tube (CRT) display along with the roll attitude, pitch attitude, localizer deviation and glideslope deviation. In addition to the CRT display, the instrument panel contained standard turn and bank, altimeter, rate of climb, airspeed, heading, and engine instruments. The symbology was evaluated using tracking performance and pilot subjective ratings for an instrument landing system capture and tracking task.

  1. Dimensionless parameterization of lidar for laser remote sensing of the atmosphere and its application to systems with SiPM and PMT detectors.

    PubMed

    Agishev, Ravil; Comerón, Adolfo; Rodriguez, Alejandro; Sicard, Michaël

    2014-05-20

    In this paper, we show a renewed approach to the generalized methodology for atmospheric lidar assessment, which uses dimensionless parameterization as a core component. It is based on a series of our previous works in which the problem of universal parameterization across many lidar technologies was described and analyzed from different points of view. The modernized dimensionless parameterization concept, applied to relatively new silicon photomultiplier detectors (SiPMs) and traditional photomultiplier (PMT) detectors for remote-sensing instruments, allows the lidar receiver performance to be predicted in the presence of sky background. The renewed approach can be widely used to evaluate a broad range of lidar system capabilities for a variety of lidar remote-sensing applications, as well as to serve as a basis for selection of appropriate lidar system parameters for a specific application. Such a modernized methodology provides a generalized, uniform, and objective approach for evaluation of a broad range of lidar types and systems (aerosol, Raman, DIAL) operating on different targets (backscatter or topographic) and under intense sky background conditions. It can be used within the lidar community to compare different lidar instruments.

  2. A computer program for condensing heat exchanger performance in the presence of noncondensable gases

    NASA Technical Reports Server (NTRS)

    Yendler, Boris

    1994-01-01

    A computer model has been developed which evaluates the performance of a heat exchanger. This model is general enough to be used to evaluate many heat exchanger geometries and a number of different operating conditions. The film approach is used to describe condensation in the presence of noncondensables. The model is also easily expanded to include other effects like fog formation or suction.

  3. Modeling and Performance Evaluation of Backoff Misbehaving Nodes in CSMA/CA Networks

    DTIC Science & Technology

    2012-08-01

    Modeling and Performance Evaluation of Backoff Misbehaving Nodes in CSMA/CA Networks Zhuo Lu, Student Member, IEEE, Wenye Wang, Senior Member, IEEE... misbehaving nodes can obtain, we define and study two general classes of backoff misbehavior: continuous misbehavior, which keeps manipulating the backoff...misbehavior sporadically. Our approach is to introduce a new performance metric, namely order gain, to characterize the performance benefits of misbehaving

  4. Systematic Planning for Educational Facilities.

    ERIC Educational Resources Information Center

    McGuffey, Carroll W.

    This monograph provides a systematic approach to the problem of planning educational facilities. It first presents a conceptual framework for a general facilities planning and management system called Facilities Resource Allocation Management Evaluation System (FRAMES). The main components of FRAMES are identified as: (1) needs assessment, (2)…

  5. A Ranking-Theoretic Approach to Conditionals

    ERIC Educational Resources Information Center

    Spohn, Wolfgang

    2013-01-01

    Conditionals somehow express conditional beliefs. However, conditional belief is a bi-propositional attitude that is generally not truth-evaluable, in contrast to unconditional belief. Therefore, this article opts for an expressivistic semantics for conditionals, grounds this semantics in the arguably most adequate account of conditional belief,…

  6. NHS-based Tandem Mass Tagging of Proteins at the Level of Whole Cells: A Critical Evaluation in Comparison to Conventional TMT-Labeling Approaches for Quantitative Proteome Analysis.

    PubMed

    Megger, Dominik A; Pott, Leona L; Rosowski, Kristin; Zülch, Birgit; Tautges, Stephanie; Bracht, Thilo; Sitek, Barbara

    2017-01-01

    Tandem mass tags (TMT) are usually introduced at the levels of isolated proteins or peptides. Here, for the first time, we report the labeling of whole cells and a critical evaluation of its performance in comparison to conventional labeling approaches. The obtained results indicated that TMT protein labeling using intact cells is generally possible, if it is coupled to a subsequent enrichment using anti-TMT antibody. The quantitative results were similar to those obtained after labeling of isolated proteins and both were found to be slightly complementary to peptide labeling. Furthermore, when using NHS-based TMT, no specificity towards cell surface proteins was observed in the case of cell labeling. In summary, the conducted study revealed first evidence for the general possibility of TMT cell labeling and highlighted limitations of NHS-based labeling reagents. Future studies should therefore focus on the synthesis and investigation of membrane impermeable TMTs to increase specificity towards cell surface proteins.

  7. Review and evaluation of recent developments in Melick inlet dynamic flow distortion prediction and computer program documentation and user's manual for estimating maximum instantaneous inlet flow distortion from steady-state total pressure measurements with full, limited, or no dynamic data

    NASA Technical Reports Server (NTRS)

    Schweikhard, W. G.; Dennon, S. R.

    1986-01-01

    A review of the Melick method of inlet flow dynamic distortion prediction by statistical means is provided. These developments include the general Melick approach with full dynamic measurements, a limited dynamic measurement approach, and a turbulence modelling approach which requires no dynamic rms pressure fluctuation measurements. These modifications are evaluated by comparing predicted and measured peak instantaneous distortion levels from provisional inlet data sets. A nonlinear mean-line following vortex model is proposed and evaluated as a potential criterion for improving the peak instantaneous distortion map generated from the conventional linear vortex of the Melick method. The model is simplified to a series of linear vortex segments which lay along the mean line. Maps generated with this new approach are compared with conventionally generated maps, as well as measured peak instantaneous maps. Inlet data sets include subsonic, transonic, and supersonic inlets under various flight conditions.

  8. Review, evaluation, and discussion of the challenges of missing value imputation for mass spectrometry-based label-free global proteomics

    DOE PAGES

    Webb-Robertson, Bobbie-Jo M.; Wiberg, Holli K.; Matzke, Melissa M.; ...

    2015-04-09

    In this review, we apply selected imputation strategies to label-free liquid chromatography–mass spectrometry (LC–MS) proteomics datasets to evaluate the accuracy with respect to metrics of variance and classification. We evaluate several commonly used imputation approaches for individual merits and discuss the caveats of each approach with respect to the example LC–MS proteomics data. In general, local similarity-based approaches, such as the regularized expectation maximization and least-squares adaptive algorithms, yield the best overall performances with respect to metrics of accuracy and robustness. However, no single algorithm consistently outperforms the remaining approaches, and in some cases, performing classification without any imputation yielded the most accurate classification. Thus, because of the complex mechanisms of missing data in proteomics, which also vary from peptide to protein, no individual method is a single solution for imputation. In summary, on the basis of the observations in this review, the goal for imputation in the field of computational proteomics should be to develop new approaches that work generically for this data type and new strategies to guide users in the selection of the best imputation for their dataset and analysis objectives.
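
    The local similarity-based family the review favours can be illustrated with a minimal k-nearest-neighbour imputer: a missing value is filled from the rows most similar on the observed columns. This is a generic sketch with invented data, not one of the review's algorithms; real LC-MS intensities also exhibit abundance-dependent missingness that this ignores.

```python
import numpy as np

# Minimal local-similarity (k-nearest-neighbour) imputation sketch.
def knn_impute(X, k=2):
    """Fill NaNs in each row from the k most similar complete rows."""
    X = X.astype(float).copy()
    complete = X[~np.isnan(X).any(axis=1)]   # rows with no missing values
    for row in X:
        miss = np.isnan(row)
        if not miss.any():
            continue
        # Euclidean distance computed on the observed columns only
        d = np.sqrt(((complete[:, ~miss] - row[~miss]) ** 2).sum(axis=1))
        nearest = complete[np.argsort(d)[:k]]
        row[miss] = nearest[:, miss].mean(axis=0)
    return X

# Toy "peptide intensity" matrix: rows 0-2 complete, row 3 missing one value
X = np.array([[1.0, 2.0, 3.0],
              [1.1, 2.1, 3.2],
              [5.0, 6.0, 7.0],
              [1.0, np.nan, 3.1]])
print(knn_impute(X)[3])  # missing value filled from the two similar rows
```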

  9. Performance evaluation of an automatic MGRF-based lung segmentation approach

    NASA Astrophysics Data System (ADS)

    Soliman, Ahmed; Khalifa, Fahmi; Alansary, Amir; Gimel'farb, Georgy; El-Baz, Ayman

    2013-10-01

    The segmentation of the lung tissues in chest Computed Tomography (CT) images is an important step for developing any Computer-Aided Diagnostic (CAD) system for lung cancer and other pulmonary diseases. In this paper, we introduce a new framework for validating the accuracy of our developed Joint Markov-Gibbs based lung segmentation approach using 3D realistic synthetic phantoms. These phantoms are created using a 3D Generalized Gauss-Markov Random Field (GGMRF) model of voxel intensities with pairwise interaction to model the 3D appearance of the lung tissues. Then, the appearance of the generated 3D phantoms is simulated based on iterative minimization of an energy function that is based on the learned 3D-GGMRF image model. These 3D realistic phantoms can be used to evaluate the performance of any lung segmentation approach. The performance of our segmentation approach is evaluated using three metrics, namely, the Dice Similarity Coefficient (DSC), the modified Hausdorff distance, and the Average Volume Difference (AVD) between our segmentation and the ground truth. Our approach achieves mean values of 0.994±0.003, 8.844±2.495 mm, and 0.784±0.912 mm³ for the DSC, Hausdorff distance, and AVD, respectively.
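The first of the three reported metrics is straightforward to compute; a minimal sketch of the Dice Similarity Coefficient on binary masks (toy arrays, not the CT phantoms described above):

```python
import numpy as np

def dice(a, b):
    """Dice Similarity Coefficient between two binary masks:
    2 * |A intersect B| / (|A| + |B|), in [0, 1], 1 = perfect overlap."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

seg   = np.array([[0, 1, 1], [0, 1, 0]])  # hypothetical segmentation
truth = np.array([[0, 1, 1], [1, 1, 0]])  # hypothetical ground truth
print(dice(seg, truth))  # 2*3 / (3+4) = 6/7
```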

  10. The Adolescent Mentalization-based Integrative Treatment (AMBIT) approach to outcome evaluation and manualization: adopting a learning organization approach.

    PubMed

    Fuggle, Peter; Bevington, Dickon; Cracknell, Liz; Hanley, James; Hare, Suzanne; Lincoln, John; Richardson, Garry; Stevens, Nina; Tovey, Heather; Zlotowitz, Sally

    2015-07-01

    AMBIT (Adolescent Mentalization-Based Integrative Treatment) is a developing team approach to working with hard-to-reach adolescents. The approach applies the principle of mentalization to relationships with clients, team relationships and working across agencies. It places a high priority on the need for locally developed evidence-based practice, and proposes that outcome evaluation needs to be explicitly linked with processes of team learning using a learning organization framework. A number of innovative methods of team learning are incorporated into the AMBIT approach, particularly a system of web-based wiki-formatted AMBIT manuals individualized for each participating team. The paper describes early development work of the model and illustrates ways of establishing explicit links between outcome evaluation, team learning and manualization by describing these methods as applied to two AMBIT-trained teams; one team working with young people on the edge of care (AMASS - the Adolescent Multi-Agency Support Service) and another working with substance use (CASUS - Child and Adolescent Substance Use Service in Cambridgeshire). Measurement of the primary outcomes for each team (which were generally very positive) facilitated team learning and adaptations of methods of practice that were consolidated through manualization. © The Author(s) 2014.

  11. Three dimensional PNS solutions of hypersonic internal flows with equilibrium chemistry

    NASA Technical Reports Server (NTRS)

    Liou, May-Fun

    1989-01-01

    An implicit procedure for solving parabolized Navier-Stokes equations under the assumption of a general equation of state for a gas in chemical equilibrium is given. A general and consistent approach for the evaluation of Jacobian matrices in the implicit operator avoids the use of unnecessary auxiliary quantities and approximations, and leads to a simple expression. Applications to two- and three-dimensional flow problems show efficiency in computer time and economy in storage.

  12. Guidelines for composite materials research related to general aviation aircraft

    NASA Technical Reports Server (NTRS)

    Dow, N. F.; Humphreys, E. A.; Rosen, B. W.

    1983-01-01

    Guidelines for research on composite materials directed toward the improvement of all aspects of their applicability for general aviation aircraft were developed from extensive studies of their performance, manufacturability, and cost effectiveness. Specific areas for research and for manufacturing development were identified and evaluated. Inputs developed from visits to manufacturers were used in part to guide these evaluations, particularly in the area of cost effectiveness. Throughout the emphasis was to direct the research toward the requirements of general aviation aircraft, for which relatively low load intensities are encountered, economy of production is a prime requirement, and yet performance still commands a premium. A number of implications regarding further directions for developments in composites to meet these requirements also emerged from the studies. Chief among these is the need for an integrated (computer program) aerodynamic/structures approach to aircraft design.

  13. Integration of the Response Surface Methodology with the Compromise Decision Support Problem in Developing a General Robust Design Procedure

    NASA Technical Reports Server (NTRS)

    Chen, Wei; Tsui, Kwok-Leung; Allen, Janet K.; Mistree, Farrokh

    1994-01-01

    In this paper we introduce a comprehensive and rigorous robust design procedure to overcome some limitations of the current approaches. A comprehensive approach is general enough to model the two major types of robust design applications, namely, robust design associated with the minimization of the deviation of performance caused by the deviation of noise factors (uncontrollable parameters), and robust design due to the minimization of the deviation of performance caused by the deviation of control factors (design variables). We achieve mathematical rigor by using, as a foundation, principles from the design of experiments and optimization. Specifically, we integrate the Response Surface Method (RSM) with the compromise Decision Support Problem (DSP). Our approach is especially useful for design problems where there are no closed-form solutions and system performance is computationally expensive to evaluate. The design of a solar powered irrigation system is used as an example. Our focus in this paper is on illustrating our approach rather than on the results per se.

  14. Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method

    NASA Astrophysics Data System (ADS)

    Lee, G.; Jun, K. S.; Cung, E. S.

    2014-09-01

    This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with fuzzified data to quantify spatial flood vulnerability using multi-criteria evaluation indicators. In general, the GDM method is an effective tool for formulating a compromise solution involving various decision makers, since different stakeholders may have different perspectives on their flood risk/vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-based method can be used to obtain a nearly ideal solution according to all established criteria. Triangular fuzzy numbers are used to consider the uncertainty of the weights and the crisp data of the proxy variables. This approach can effectively propose compromise decisions by combining the GDM method and the fuzzy VIKOR method. The spatial flood vulnerability of the south Han River obtained using the GDM approach combined with the fuzzy VIKOR method was compared with results from general MCDM methods, such as fuzzy TOPSIS, and from classical GDM methods, such as those developed by Borda, Condorcet, and Copeland. The evaluated priorities depended significantly on the decision-making method employed. The proposed fuzzy GDM approach can reduce the uncertainty in data confidence and weight derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
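For orientation, the crisp (non-fuzzy) core of the VIKOR method can be sketched as follows; the study's fuzzy variant propagates triangular fuzzy numbers through the same S, R, and Q measures. The weights and indicator scores below are invented:

```python
import numpy as np

def vikor(F, w, v=0.5):
    """Classical (crisp) VIKOR ranking.

    F: alternatives x criteria matrix (larger = better for every
    criterion here, for simplicity); w: criteria weights summing to 1;
    v: weight of the 'majority of criteria' strategy.
    Returns S (group utility), R (individual regret), Q (compromise
    index); lower Q means a better-ranked alternative.
    """
    F = np.asarray(F, float)
    best, worst = F.max(axis=0), F.min(axis=0)
    norm = (best - F) / (best - worst)      # normalized regret per criterion
    S = (w * norm).sum(axis=1)
    R = (w * norm).max(axis=1)
    Q = v * (S - S.min()) / (S.max() - S.min()) \
        + (1 - v) * (R - R.min()) / (R.max() - R.min())
    return S, R, Q

# toy vulnerability scores for 3 sub-basins on 3 indicators
F = [[0.8, 0.6, 0.9],
     [0.5, 0.9, 0.4],
     [0.7, 0.7, 0.6]]
S, R, Q = vikor(F, w=np.array([0.5, 0.3, 0.2]))
print(Q.argsort())  # compromise ranking, best alternative first
```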

  15. A School Finance Computer Simulation Model

    ERIC Educational Resources Information Center

    Boardman, Gerald R.

    1974-01-01

    Presents a description of the computer simulation model developed by the National Educational Finance Project for use by States in planning and evaluating alternative approaches for State support programs. Provides a general introduction to the model, a program operation overview, a sample run, and some conclusions. (Author/WM)

  16. 40 CFR 1502.22 - Incomplete or unavailable information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... approaches or research methods generally accepted in the scientific community. For the purposes of this... credible scientific evidence which is relevant to evaluating the reasonably foreseeable significant adverse... scientific evidence, is not based on pure conjecture, and is within the rule of reason. (c) The amended...

  17. 40 CFR 1502.22 - Incomplete or unavailable information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... approaches or research methods generally accepted in the scientific community. For the purposes of this... credible scientific evidence which is relevant to evaluating the reasonably foreseeable significant adverse... scientific evidence, is not based on pure conjecture, and is within the rule of reason. (c) The amended...

  18. 40 CFR 1502.22 - Incomplete or unavailable information.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... approaches or research methods generally accepted in the scientific community. For the purposes of this... credible scientific evidence which is relevant to evaluating the reasonably foreseeable significant adverse... scientific evidence, is not based on pure conjecture, and is within the rule of reason. (c) The amended...

  19. Numerical and Qualitative Contrasts of Two Statistical Models for Water Quality Change in Tidal Waters

    EPA Science Inventory

    Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and...

  20. 40 CFR 1502.22 - Incomplete or unavailable information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... approaches or research methods generally accepted in the scientific community. For the purposes of this... credible scientific evidence which is relevant to evaluating the reasonably foreseeable significant adverse... scientific evidence, is not based on pure conjecture, and is within the rule of reason. (c) The amended...

  1. 40 CFR 1502.22 - Incomplete or unavailable information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... approaches or research methods generally accepted in the scientific community. For the purposes of this... credible scientific evidence which is relevant to evaluating the reasonably foreseeable significant adverse... scientific evidence, is not based on pure conjecture, and is within the rule of reason. (c) The amended...

  2. Determining and Communicating the Value of the Special Library.

    ERIC Educational Resources Information Center

    Matthews, Joseph R.

    2003-01-01

    Discusses performance measures for libraries that will indicate the goodness of the library and its services. Highlights include a general evaluation model that includes input, process, output, and outcome measures; balanced scorecard approach that includes financial perspectives; focusing on strategy; strategies for change; user criteria for…

  3. A General Survey of Qualitative Research Methodology.

    ERIC Educational Resources Information Center

    Cary, Rick

    Current definitions and philosophical foundations of qualitative research are presented; and designs, evaluation methods, and issues in application of qualitative research to education are discussed. The effects of positivism and the post-positivist era on qualitative research are outlined, and naturalist and positivist approaches are contrasted.…

  4. Online Learning Programs: Evaluation's Challenging Future

    ERIC Educational Resources Information Center

    Nord, Derek

    2011-01-01

    With the vast array of contextual factors, pedagogical approaches, models of implementation, and purposes of education and training related to online learning, educators, learners, and the general public alike are seeking answers regarding utility and effectiveness of online learning. This article identifies and responds to many of the challenges…

  5. Evaluating Rigor in Qualitative Methodology and Research Dissemination

    ERIC Educational Resources Information Center

    Trainor, Audrey A.; Graue, Elizabeth

    2014-01-01

    Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…

  6. A new approach to the concept of "relevance" in information retrieval (IR).

    PubMed

    Kagolovsky, Y; Möhr, J R

    2001-01-01

    The concept of "relevance" is the fundamental concept of information science in general and information retrieval, in particular. Although "relevance" is extensively used in evaluation of information retrieval, there are considerable problems associated with reaching an agreement on its definition, meaning, evaluation, and application in information retrieval. There are a number of different views on "relevance" and its use for evaluation. Based on a review of the literature the main problems associated with the concept of "relevance" in information retrieval are identified. The authors argue that the proposal for the solution of the problems can be based on the conceptual IR framework built using a systems analytic approach to IR. Using this framework different kinds of "relevance" relationships in the IR process are identified, and a methodology for evaluation of "relevance" based on methods of semantics capturing and comparison is proposed.

  7. DFTB+ and lanthanides

    NASA Astrophysics Data System (ADS)

    Hourahine, B.; Aradi, B.; Frauenheim, T.

    2010-07-01

    DFTB+ is a recent general purpose implementation of density-functional based tight binding. One of the early motivators to develop this code was to investigate lanthanide impurities in nitride semiconductors, leading to a series of successful studies into structure and electrical properties of these systems. Here we describe our general framework to treat the physical effects needed for these problematic impurities within a tight-binding formalism, additionally discussing forces and stresses in DFTB. We also present an approach to evaluate the general case of Slater-Koster transforms and all of their derivatives in Cartesian coordinates. These developments are illustrated by simulating isolated Gd impurities in GaN.

  8. Determination of transport wind speed in the gaussian plume diffusion equation for low-lying point sources

    NASA Astrophysics Data System (ADS)

    Wang, I. T.

    A general method for determining the effective transport wind speed, ū, in the Gaussian plume equation is discussed. Physical arguments are given for using the generalized ū instead of the often adopted release-level wind speed with the plume diffusion equation. Simple analytical expressions for ū applicable to low-level point releases and a wide range of atmospheric conditions are developed. A non-linear plume kinematic equation is derived using these expressions. Crosswind-integrated SF₆ concentration data from the 1983 PNL tracer experiment are used to evaluate the proposed analytical procedures along with the usual approach of using the release-level wind speed. Results of the evaluation are briefly discussed.
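The Gaussian plume equation in which ū appears can be sketched as follows (a standard ground-reflection form with illustrative inputs, not the paper's analytical expressions for ū itself):

```python
import numpy as np

def plume_conc(Q, ubar, sigma_y, sigma_z, y, z, H):
    """Gaussian plume concentration with ground reflection.

    Q: emission rate; ubar: effective transport wind speed (the u-bar
    the paper generalizes); sigma_y, sigma_z: lateral and vertical
    dispersion coefficients at the receptor's downwind distance;
    H: effective release height.
    """
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * np.pi * ubar * sigma_y * sigma_z) * lateral * vertical

# ground-level centerline value for a hypothetical low-level release
c = plume_conc(Q=1.0, ubar=3.0, sigma_y=20.0, sigma_z=10.0,
               y=0.0, z=0.0, H=5.0)
print(c)
```

Note that the predicted concentration scales inversely with ū, which is why the choice between the release-level wind speed and a generalized ū matters.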

  9. Reliability-based design optimization using a generalized subset simulation method and posterior approximation

    NASA Astrophysics Data System (ADS)

    Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing

    2018-05-01

    The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol' sequences and Bucher's design. Sequentially, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.
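For context, the failure probabilities that GSS estimates are defined as in the plain Monte Carlo sketch below; subset simulation exists precisely because crude sampling like this becomes infeasible for the small probabilities typical of RBDO. The limit-state function here is invented:

```python
import numpy as np

def mc_failure_prob(g, sample, n=200_000, seed=0):
    """Crude Monte Carlo estimate of P[g(X) <= 0].

    A plain stand-in for generalized subset simulation, which targets
    much rarer failure events far more efficiently.
    """
    rng = np.random.default_rng(seed)
    X = sample(rng, n)
    return np.mean(g(X) <= 0)

# hypothetical probabilistic constraint g(X) = 6 - X1 - X2,
# with X1, X2 independent N(2, 1); failure means X1 + X2 >= 6
g = lambda X: 6.0 - X[:, 0] - X[:, 1]
sample = lambda rng, n: rng.normal(2.0, 1.0, size=(n, 2))
pf = mc_failure_prob(g, sample)
print(pf)  # near the exact value P[Z >= 2/sqrt(2)] ~ 0.079
```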

  10. Coupling Sensing Hardware with Data Interrogation Software for Structural Health Monitoring

    DOE PAGES

    Farrar, Charles R.; Allen, David W.; Park, Gyuhae; ...

    2006-01-01

    The process of implementing a damage detection strategy for aerospace, civil and mechanical engineering infrastructure is referred to as structural health monitoring (SHM). The authors' approach is to address the SHM problem in the context of a statistical pattern recognition paradigm. In this paradigm, the process can be broken down into four parts: (1) Operational Evaluation, (2) Data Acquisition and Cleansing, (3) Feature Extraction and Data Compression, and (4) Statistical Model Development for Feature Discrimination. These processes must be implemented through hardware or software and, in general, some combination of these two approaches will be used. This paper will discuss each portion of the SHM process with particular emphasis on the coupling of a general purpose data interrogation software package for structural health monitoring with a modular wireless sensing and processing platform. More specifically, this paper will address the need to take an integrated hardware/software approach to developing SHM solutions.

  11. Discrete mathematics course supported by CAS MATHEMATICA

    NASA Astrophysics Data System (ADS)

    Ivanov, O. A.; Ivanova, V. V.; Saltan, A. A.

    2017-08-01

    In this paper, we discuss examples of assignments for a course in discrete mathematics for undergraduate students majoring in business informatics. We consider several problems with computer-based solutions and discuss general strategies for using computers in teaching mathematics and its applications. In order to evaluate the effectiveness of our approach, we conducted an anonymous survey. The results of the survey provide evidence that our approach contributes to high outcomes and aligns with the course aims and objectives.

  12. Statistical significance of trace evidence matches using independent physicochemical measurements

    NASA Astrophysics Data System (ADS)

    Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George

    1997-02-01

    A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to,' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when an object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated against these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons is also discussed. This statistical approach should not replace current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements, and the potential for confusion arising from statistical analysis, currently hamper this approach; ways of overcoming these obstacles are presented.
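A minimal sketch of the first step of such an approach: placing refractive index and elemental concentrations on a common scale before computing inter-fragment distances (the fragment values are illustrative only, not measured data):

```python
import numpy as np

def standardized_distances(X):
    """Pairwise Euclidean distances after z-scoring each variable, so
    that refractive index and elemental concentrations (Mg, Ca, ...)
    contribute on a common scale despite very different units."""
    X = np.asarray(X, float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

# toy fragments: [refractive index, Mg, Ca] -- illustrative values only
frags = np.array([[1.5181, 2.1, 8.0],
                  [1.5182, 2.0, 8.1],   # close to fragment 0
                  [1.5230, 3.5, 6.5]])  # plausibly a different source
D = standardized_distances(frags)
print(D.round(2))
```

A clustering algorithm would then operate on a distance matrix like `D`; here, fragments 0 and 1 group together while fragment 2 stands apart.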

  13. Cockpit Technology for Prevention of General Aviation Runway Incursions

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Jones, Denise R.

    2007-01-01

    General aviation accounted for 74 percent of runway incursions but only 57 percent of the operations during the four-year period from fiscal year (FY) 2001 through FY2004. Elements of the NASA Runway Incursion Prevention System were adapted and tested for general aviation aircraft. Sixteen General Aviation pilots, of varying levels of certification and amount of experience, participated in a piloted simulation study to evaluate the system for prevention of general aviation runway incursions compared to existing moving map displays. Pilots flew numerous complex, high workload approaches under varying weather and visibility conditions. A rare-event runway incursion scenario was presented, unbeknownst to the pilots, which represented a typical runway incursion situation. The results validated the efficacy and safety need for a runway incursion prevention system for general aviation aircraft.

  14. Evaluation of evidence-based literature and formulation of recommendations for the clinical preventive guidelines for immigrants and refugees in Canada

    PubMed Central

    Tugwell, Peter; Pottie, Kevin; Welch, Vivian; Ueffing, Erin; Chambers, Andrea; Feightner, John

    2011-01-01

    Background: This article describes the evidence review and guideline development method developed for the Clinical Preventive Guidelines for Immigrants and Refugees in Canada by the Canadian Collaboration for Immigrant and Refugee Health Guideline Committee. Methods: The Appraisal of Guidelines for Research and Evaluation (AGREE) best-practice framework was combined with the recently developed Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to produce evidence-based clinical guidelines for immigrants and refugees in Canada. Results: A systematic approach was designed to produce the evidence reviews and apply the GRADE approach, including building on evidence from previous systematic reviews, searching for and comparing evidence between general and specific immigrant populations, and applying the GRADE criteria for making recommendations. This method was used for priority health conditions that had been selected by practitioners caring for immigrants and refugees in Canada. Interpretation: This article outlines the 14-step method that was defined to standardize the guideline development process for each priority health condition. PMID:20573711

  15. Group theoretical formulation of free fall and projectile motion

    NASA Astrophysics Data System (ADS)

    Düztaş, Koray

    2018-07-01

    In this work we formulate the group theoretical description of free fall and projectile motion. We show that the kinematic equations for constant acceleration form a one parameter group acting on a phase space. We define the group elements ϕ t by their action on the points in the phase space. We also generalize this approach to projectile motion. We evaluate the group orbits regarding their relations to the physical orbits of particles and unphysical solutions. We note that the group theoretical formulation does not apply to more general cases involving a time-dependent acceleration. This method improves our understanding of the constant acceleration problem with its global approach. It is especially beneficial for students who want to pursue a career in theoretical physics.
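The one-parameter group structure described above is easy to verify numerically: composing the actions for durations t and s must agree with the action for t + s. A minimal sketch, assuming the phase-space point is (position, velocity):

```python
import numpy as np

A = 9.8  # constant acceleration (sign convention is arbitrary here)

def phi(t):
    """Group element phi_t acting on phase-space points (x, v):
    phi_t(x, v) = (x + v t + A t^2 / 2, v + A t)."""
    def act(state):
        x, v = state
        return (x + v * t + 0.5 * A * t * t, v + A * t)
    return act

# one-parameter group law: phi_s(phi_t(p)) == phi_{s+t}(p)
p = (0.0, 2.0)
lhs = phi(1.5)(phi(0.7)(p))
rhs = phi(2.2)(p)
print(np.allclose(lhs, rhs))  # True
```

The identity element is phi_0 and the inverse of phi_t is phi_{-t}; both follow directly from the same composition rule, which is exactly why a time-dependent acceleration breaks the structure.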

  16. On mitigating rapid onset natural disasters: Project THRUST (Tsunami Hazards Reduction Utilizing Systems Technology)

    NASA Astrophysics Data System (ADS)

    Bernard, E. N.; Behn, R. R.; Hebenstreit, G. T.; Gonzalez, F. I.; Krumpe, P.; Lander, J. F.; Lorca, E.; McManamon, P. M.; Milburn, H. B.

    Rapid onset natural hazards have claimed more than 2.8 million lives worldwide in the past 20 years. This category includes such events as earthquakes, landslides, hurricanes, tornados, floods, volcanic eruptions, wildfires, and tsunamis. Effective hazard mitigation is particularly difficult in such cases, since the time available to issue warnings can be very short or even nonexistent. This paper presents the concept of a local warning system that exploits and integrates the existing technologies of risk evaluation, environmental measurement, and telecommunications. We describe Project THRUST, a successful implementation of this general, systematic approach to tsunamis. The general approach includes pre-event emergency planning, real-time hazard assessment, and rapid warning via satellite communication links.

  17. Generalized regression neural network (GRNN)-based approach for colored dissolved organic matter (CDOM) retrieval: case study of Connecticut River at Middle Haddam Station, USA.

    PubMed

    Heddam, Salim

    2014-11-01

    The prediction of colored dissolved organic matter (CDOM) using artificial neural network approaches has received little attention in the past few decades. In this study, CDOM was modeled using generalized regression neural network (GRNN) and multiple linear regression (MLR) models as a function of water temperature (TE), pH, specific conductance (SC), and turbidity (TU). Evaluation of the prediction accuracy of the models is based on the root mean square error (RMSE), mean absolute error (MAE), coefficient of correlation (CC), and Willmott's index of agreement (d). The results indicated that GRNN can be applied successfully for CDOM prediction.
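A GRNN prediction is, in essence, a Gaussian-kernel-weighted average of the training targets (equivalent to Nadaraya-Watson kernel regression). A minimal sketch with invented water-quality rows, not the Connecticut River data or the paper's fitted model:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_new, sigma=1.0):
    """GRNN prediction: each new point's output is the average of the
    training targets, weighted by a Gaussian kernel of the squared
    distance to each training input (sigma is the smoothing parameter)."""
    X_train = np.asarray(X_train, float)
    X_new = np.asarray(X_new, float)
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ np.asarray(y_train, float)) / w.sum(axis=1)

# hypothetical rows of [temperature, pH, conductance, turbidity] -> CDOM
X = [[20.0, 7.1, 300.0, 5.0],
     [10.0, 6.8, 250.0, 12.0]]
y = [3.2, 7.5]
pred = grnn_predict(X, y, [[19.5, 7.0, 298.0, 5.5]], sigma=50.0)
print(pred)  # weighted toward the nearby first training row
```

In practice the inputs would be standardized first (as with the distance sketch above, raw conductance otherwise dominates the distance), and sigma tuned by cross-validation.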

  18. Intelligent fault diagnosis and failure management of flight control actuation systems

    NASA Technical Reports Server (NTRS)

    Bonnice, William F.; Baker, Walter

    1988-01-01

    The real-time fault diagnosis and failure management (FDFM) of current operational and experimental dual tandem aircraft flight control system actuators was investigated. Dual tandem actuators were studied because of the active FDFM capability required to manage the redundancy of these actuators. The FDFM methods used on current dual tandem actuators were determined by examining six specific actuators. The FDFM capability on these six actuators was also evaluated. One approach for improving the FDFM capability on dual tandem actuators may be through the application of artificial intelligence (AI) technology. Existing AI approaches and applications of FDFM were examined and evaluated. Based on the general survey of AI FDFM approaches, the potential role of AI technology for real-time actuator FDFM was determined. Finally, FDFM and maintainability improvements for dual tandem actuators were recommended.

  19. Advanced Multiple Processor Configuration Study. Final Report.

    ERIC Educational Resources Information Center

    Clymer, S. J.

    This summary of a study on multiple processor configurations includes the objectives, background, approach, and results of research undertaken to provide the Air Force with a generalized model of computer processor combinations for use in the evaluation of proposed flight training simulator computational designs. An analysis of a real-time flight…

  20. Pilot Study: Unit on White Racism.

    ERIC Educational Resources Information Center

    Hurwitz, Alan; Snook, Valerie

    This report is an attempt to explore approaches in which white people examine their own racism, understand its nature and its consequences, and then plan self-directed changes in the direction of increasingly anti-racist behavior. In the pilot study described and evaluated in the report, three general purposes indicated were: assisting…

  1. Adolescent Girls' Cognitive Appraisals of Coping Responses to Sexual Harassment

    ERIC Educational Resources Information Center

    Leaper, Campbell; Brown, Christia Spears; Ayres, Melanie M.

    2013-01-01

    Peer sexual harassment is a stressor for many girls in middle and high school. Prior research indicates that approach strategies (seeking support or confronting) are generally more effective than avoidance strategies in alleviating stress. However, the deployment of effective coping behaviors depends partly on how individuals evaluate different…

  2. BEHAVIOR OF MEALWORMS, TEACHER'S GUIDE.

    ERIC Educational Resources Information Center

    Elementary Science Study, Newton, MA.

    This teacher's guide is designed for use with an Elementary Science Study unit, the "Behavior of Mealworms." By making careful observations and performing simple experiments, the children learn how to approach a problem, how to interpret and evaluate data, and, in general, how to conduct a scientific investigation. The materials have…

  3. End of Project Report, 1967-1970.

    ERIC Educational Resources Information Center

    Metropolitan Educational Center in the Arts, St. Louis, MO.

    This Metropolitan Educational Center in the Arts (MECA) project report details activities directed toward exploration, implementation, and evaluation of new and exemplary approaches to education in the arts in the City of St. Louis and the surrounding five-county region. The MECA projects described fall generally into three categories: 1) Projects…

  4. 48 CFR 970.1504-1-5 - General considerations and techniques for determining fixed fees.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) The Department's fee policy recognizes that fee is remuneration to contractors for the entrepreneurial... and amounts for DOE management and operating contracts is inappropriate considering the limited level.... Instead of being solely cost-based, the desirable approach calls for a structure that allows evaluation of...

  5. 48 CFR 970.1504-1-5 - General considerations and techniques for determining fixed fees.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) The Department's fee policy recognizes that fee is remuneration to contractors for the entrepreneurial... and amounts for DOE management and operating contracts is inappropriate considering the limited level.... Instead of being solely cost-based, the desirable approach calls for a structure that allows evaluation of...

  6. An Evaluation of an Automated Approach to Concept-Based Grammar Instruction

    ERIC Educational Resources Information Center

    Lyddon, Paul A.

    2012-01-01

    Acquiring sufficient linguistic proficiency to perform competently in academic and professional contexts generally requires substantial study time beyond what most language programs can offer in the classroom. As such, teachers and students alike would benefit considerably from high quality self-access materials promoting independent learning out…

  7. Teaching Information Literacy and Scientific Process Skills: An Integrated Approach.

    ERIC Educational Resources Information Center

    Souchek, Russell; Meier, Marjorie

    1997-01-01

    Describes an online searching and scientific process component taught as part of the laboratory for a general zoology course. The activities were designed to be gradually more challenging, culminating in a student-developed final research project. Student evaluations were positive, and faculty indicated that student research skills transferred to…

  8. Approaches to the Analysis of School Costs, an Introduction.

    ERIC Educational Resources Information Center

    Payzant, Thomas

    A review and general discussion of quantitative and qualitative techniques for the analysis of economic problems outside of education is presented to help educators discover new tools for planning, allocating, and evaluating educational resources. The pamphlet covers some major components of cost accounting, cost effectiveness, cost-benefit…

  9. Developing Better Environmental Assessment and Protection Strategies: A Case Example on Early Detection Monitoring for Aquatic Invasive Species

    EPA Science Inventory

    A principal theme of our research group is to develop, evaluate, and improve monitoring approaches, ecological assessments, and environmental protection strategies. Over the past decade, we have conducted a number of studies under this general theme, across the Great Lakes basin...

  10. Abnormalities in early visual processes are linked to hypersociability and atypical evaluation of facial trustworthiness: An ERP study with Williams syndrome.

    PubMed

    Shore, Danielle M; Ng, Rowena; Bellugi, Ursula; Mills, Debra L

    2017-10-01

    Accurate assessment of trustworthiness is fundamental to successful and adaptive social behavior. Initially, people assess trustworthiness from facial appearance alone. These assessments then inform critical approach or avoid decisions. Individuals with Williams syndrome (WS) exhibit a heightened social drive, especially toward strangers. This study investigated the temporal dynamics of facial trustworthiness evaluation in neurotypic adults (TD) and individuals with WS. We examined whether differences in neural activity during trustworthiness evaluation may explain increased approach motivation in WS compared to TD individuals. Event-related potentials were recorded while participants appraised faces previously rated as trustworthy or untrustworthy. TD participants showed increased sensitivity to untrustworthy faces within the first 65-90 ms, indexed by the negative-going rise of the P1 onset (oP1). The amplitude of the oP1 difference to untrustworthy minus trustworthy faces was correlated with lower approachability scores. In contrast, participants with WS showed increased N170 amplitudes to trustworthy faces. The N170 difference to low-high-trust faces was correlated with low approachability in TD and high approachability in WS. The findings suggest that hypersociability associated with WS may arise from abnormalities in the timing and organization of early visual brain activity during trustworthiness evaluation. More generally, the study provides support for the hypothesis that impairments in low-level perceptual processes can have a cascading effect on social cognition.

  11. A head-up display for low-visibility approach and landing

    NASA Technical Reports Server (NTRS)

    Bray, R. S.; Scott, B. C.

    1981-01-01

    An electronic flight-guidance display format was designed for use in evaluations of the collimated head-up display concept in low-visibility landings of transport aircraft. In the design process of iterative evaluation and modification, some general principles, or guidelines, applicable to such flight displays were suggested. The usefulness of an indication of instantaneous inertial flightpath was clearly demonstrated, particularly in low-altitude transition to visual references. Evaluator pilot acceptance of the unfamiliar display concepts was very positive when careful attention was given to indoctrination and training.

  12. Sensitivity Analysis of ProSEDS (Propulsive Small Expendable Deployer System) Data Communication System

    NASA Technical Reports Server (NTRS)

    Park, Nohpill; Reagan, Shawn; Franks, Greg; Jones, William G.

    1999-01-01

    This paper discusses analytical approaches to evaluating the performance of spacecraft on-board computing systems, with the ultimate aim of achieving reliable spacecraft data communication systems. A sensitivity analysis of the memory system on ProSEDS (Propulsive Small Expendable Deployer System), as part of its data communication system, will be investigated. General issues and possible approaches to a reliable spacecraft on-board interconnection network and processor array will also be shown. Performance issues of spacecraft on-board computing systems, such as sensitivity, throughput, delay, and reliability, will be introduced and discussed.

  13. It depends: Approach and avoidance reactions to emotional expressions are influenced by the contrast emotions presented in the task.

    PubMed

    Paulus, Andrea; Wentura, Dirk

    2016-02-01

    Studies examining approach and avoidance reactions to emotional expressions have yielded conflicting results. For example, expressions of anger have been reported to elicit approach reactions in some studies but avoidance reactions in others. Nonetheless, the results were often explained by the same general underlying process, namely the influence that the social message signaled by the expression has on motivational responses. It is therefore unclear which reaction is triggered by which emotional expression, and which underlying process is responsible for these reactions. In order to address this issue, we examined the role of a potential moderator on approach and avoidance reactions to emotional expressions, namely the contrast emotion used in the task. We believe that different approach and avoidance reactions occur depending on the congruency or incongruency of the evaluation of the 2 emotions presented in the task. The results from a series of experiments supported these assumptions: Negative emotional expressions (anger, fear, sadness) elicited avoidance reactions if contrasted with expressions of happiness. However, if contrasted with a different negative emotional expression, anger and sadness triggered approach reactions and fear activated avoidance reactions. Importantly, these results also emerged if the emotional expression was not task-relevant. We propose that approach and avoidance reactions to emotional expressions are triggered by their evaluation if the 2 emotions presented in a task differ in evaluative connotation. If they have the same evaluative connotation, however, reactions are determined by their social message. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  14. Instructor and student pilots' subjective evaluation of a general aviation simulator with a terrain visual system

    NASA Technical Reports Server (NTRS)

    Kiteley, G. W.; Harris, R. L., Sr.

    1978-01-01

    Ten student pilots were given a 1-hour training session in the NASA Langley Research Center's General Aviation Simulator by a certified flight instructor, and a follow-up flight evaluation was performed by each student's own flight instructor, who had also flown the simulator. The students and instructors generally felt that the simulator session had a positive effect on the students. They recommended that a simulator with a visual scene and a motion base would be useful in performing such maneuvers as landing approaches, level flight, climbs, dives, turns, instrument work, and radio navigation, and that the simulator would be an efficient means of introducing the student to new maneuvers before doing them in flight. The students and instructors estimated that about 8 hours of simulator time could be profitably devoted to private pilot training.

  15. [Causality link in criminal law: role of epidemiology].

    PubMed

    Zocchetti, C; Riboldi, L

    2003-01-01

    This paper focuses on the role of epidemiology in demonstrating causality in criminal trials of toxic tort litigation. First of all, consideration is given to the specificity of the criminal trial and to the role of the epidemiologist as expert witness. As a second step, the concept of causality is examined, separating general from specific (individual-level) causality. As regards general causality, strategies based on criteria (for example, the Bradford-Hill criteria) are contrasted with approaches that do not consider causality a matter of science but one of health policy; and specific methods frequently used (meta-analysis, risk assessment, International Boards evaluation, etc.) are discussed, with special reference to the adoption of high-level standards and to the context of cross-examination. As regards individual-level causality, the difficulties of the epidemiologic approach to such evaluation are stressed, with special reference to topics like expected value, attributable risk, and probability of causation. All examples are taken from Italian court trials. A general comment is offered on the difficulties of using the criminal trial (dominated by the "but for" rule) for toxic tort litigation and on the opportunity to switch to trials (civil, administrative) with less stringent causal rules ("more probable than not"), with a consideration also of what are called "class actions".

  16. Temporal Planning for Compilation of Quantum Approximate Optimization Algorithm Circuits

    NASA Technical Reports Server (NTRS)

    Venturelli, Davide; Do, Minh Binh; Rieffel, Eleanor Gilbert; Frank, Jeremy David

    2017-01-01

    We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus our initial experiments on Quantum Approximate Optimization Algorithm (QAOA) circuits that have few ordering constraints and allow highly parallel plans. We report on experiments using several temporal planners to compile circuits of various sizes to realistic hardware. This early empirical evaluation suggests that temporal planning is a viable approach to quantum circuit compilation.

  17. Impact resistance of fiber composites - Energy-absorbing mechanisms and environmental effects

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.

    1985-01-01

    Energy absorbing mechanisms were identified by several approaches. The energy absorbing mechanisms considered are those in unidirectional composite beams subjected to impact. The approaches used include: mechanic models, statistical models, transient finite element analysis, and simple beam theory. Predicted results are correlated with experimental data from Charpy impact tests. The environmental effects on impact resistance are evaluated. Working definitions for energy absorbing and energy releasing mechanisms are proposed and a dynamic fracture progression is outlined. Possible generalizations to angle-plied laminates are described.

  18. Impact resistance of fiber composites: Energy absorbing mechanisms and environmental effects

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.

    1983-01-01

    Energy absorbing mechanisms were identified by several approaches. The energy absorbing mechanisms considered are those in unidirectional composite beams subjected to impact. The approaches used include: mechanic models, statistical models, transient finite element analysis, and simple beam theory. Predicted results are correlated with experimental data from Charpy impact tests. The environmental effects on impact resistance are evaluated. Working definitions for energy absorbing and energy releasing mechanisms are proposed and a dynamic fracture progression is outlined. Possible generalizations to angle-plied laminates are described.

  19. Survey of air cargo forecasting techniques

    NASA Technical Reports Server (NTRS)

    Kuhlthan, A. R.; Vermuri, R. S.

    1978-01-01

    Forecasting techniques currently in use in estimating or predicting the demand for air cargo in various markets are discussed, with emphasis on the fundamentals of the different forecasting approaches. References to specific studies are cited when appropriate. The effectiveness of current methods is evaluated and several prospects for future activities or approaches are suggested. Appendices contain summary-type analyses of about 50 specific publications on forecasting, and selected bibliographies on air cargo forecasting, air passenger demand forecasting, and general demand and modal-split modeling.

  20. Practical considerations in the measurement of the internal relative humidity of pharmaceutical packages with data loggers.

    PubMed

    Nelson, Eric D; Huang, Henry

    2011-03-01

    The utility of temperature/humidity data loggers is evaluated as a low-cost approach to enrich practical understanding of the actual time-dependent humidity that a pharmaceutical product is exposed to. While this approach is found to have significant utility in general, small systematic biases in the measurements due to the presence of the data logger are observed. Taking these biases into account enables more productive extrapolation of measured time/humidity profiles. © 2011 American Association of Pharmaceutical Scientists

  1. Evaluation of an Inverse Molecular Design Algorithm in a Model Binding Site

    PubMed Central

    Huggins, David J.; Altman, Michael D.; Tidor, Bruce

    2008-01-01

    Computational molecular design is a useful tool in modern drug discovery. Virtual screening is an approach that docks and then scores individual members of compound libraries. In contrast to this forward approach, inverse approaches construct compounds from fragments, such that the computed affinity, or a combination of relevant properties, is optimized. We have recently developed a new inverse approach to drug design based on the dead-end elimination and A* algorithms employing a physical potential function. This approach has been applied to combinatorially constructed libraries of small-molecule ligands to design high-affinity HIV-1 protease inhibitors [M. D. Altman et al. J. Am. Chem. Soc. 130: 6099–6013, 2008]. Here we have evaluated the new method using the well studied W191G mutant of cytochrome c peroxidase. This mutant possesses a charged binding pocket and has been used to evaluate other design approaches. The results show that overall the new inverse approach does an excellent job of separating binders from non-binders. For a few individual cases, scoring inaccuracies led to false positives. The majority of these involve erroneous solvation energy estimation for charged amines, anilinium ions and phenols, which has been observed previously for a variety of scoring algorithms. Interestingly, although inverse approaches are generally expected to identify some but not all binders in a library, due to limited conformational searching, these results show excellent coverage of the known binders while still showing strong discrimination of the non-binders. PMID:18831031
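
    The dead-end elimination step mentioned in this abstract can be illustrated with a toy sketch of the Goldstein criterion: a fragment choice at one position is pruned when some alternative at the same position is provably better in every possible completion. The positions, options, and energies below are made-up illustrations, not the authors' actual potential function.

```python
def goldstein_dee(self_e, pair_e, positions):
    """One pass of Goldstein dead-end elimination: prune choice r at
    position i if some alternative t at i satisfies
    E(i_r) - E(i_t) + sum_j min_s [E(i_r, j_s) - E(i_t, j_s)] > 0,
    i.e. r can never beat t in any full assignment."""
    alive = {i: set(opts) for i, opts in positions.items()}
    for i in positions:
        for r in list(alive[i]):
            for t in list(alive[i]):
                if t == r:
                    continue
                margin = self_e[(i, r)] - self_e[(i, t)]
                for j in positions:
                    if j != i:
                        margin += min(pair_e[(i, r, j, s)] - pair_e[(i, t, j, s)]
                                      for s in alive[j])
                if margin > 0:      # r is a dead end relative to t
                    alive[i].discard(r)
                    break
    return alive

# Toy problem: two positions, two fragment choices each, zero pair terms.
positions = {0: [0, 1], 1: [0, 1]}
self_e = {(0, 0): 5.0, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.0}
pair_e = {(i, r, j, s): 0.0
          for i in positions for r in positions[i]
          for j in positions for s in positions[j] if i != j}
print(goldstein_dee(self_e, pair_e, positions))  # choice 0 at position 0 is pruned
```

    In a real design run, the choices that survive elimination would then be enumerated in order by A* search, as the abstract describes.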

  2. Evaluation of an inverse molecular design algorithm in a model binding site.

    PubMed

    Huggins, David J; Altman, Michael D; Tidor, Bruce

    2009-04-01

    Computational molecular design is a useful tool in modern drug discovery. Virtual screening is an approach that docks and then scores individual members of compound libraries. In contrast to this forward approach, inverse approaches construct compounds from fragments, such that the computed affinity, or a combination of relevant properties, is optimized. We have recently developed a new inverse approach to drug design based on the dead-end elimination and A* algorithms employing a physical potential function. This approach has been applied to combinatorially constructed libraries of small-molecule ligands to design high-affinity HIV-1 protease inhibitors (Altman et al., J Am Chem Soc 2008;130:6099-6013). Here we have evaluated the new method using the well-studied W191G mutant of cytochrome c peroxidase. This mutant possesses a charged binding pocket and has been used to evaluate other design approaches. The results show that overall the new inverse approach does an excellent job of separating binders from nonbinders. For a few individual cases, scoring inaccuracies led to false positives. The majority of these involve erroneous solvation energy estimation for charged amines, anilinium ions, and phenols, which has been observed previously for a variety of scoring algorithms. Interestingly, although inverse approaches are generally expected to identify some but not all binders in a library, due to limited conformational searching, these results show excellent coverage of the known binders while still showing strong discrimination of the nonbinders. (c) 2008 Wiley-Liss, Inc.

  3. Evaluation methodology for query-based scene understanding systems

    NASA Astrophysics Data System (ADS)

    Huster, Todd P.; Ross, Timothy D.; Culbertson, Jared L.

    2015-05-01

    In this paper, we are proposing a method for the principled evaluation of scene understanding systems in a query-based framework. We can think of a query-based scene understanding system as a generalization of typical sensor exploitation systems where instead of performing a narrowly defined task (e.g., detect, track, classify, etc.), the system can perform general user-defined tasks specified in a query language. Examples of this type of system have been developed as part of DARPA's Mathematics of Sensing, Exploitation, and Execution (MSEE) program. There is a body of literature on the evaluation of typical sensor exploitation systems, but the open-ended nature of the query interface introduces new aspects to the evaluation problem that have not been widely considered before. In this paper, we state the evaluation problem and propose an approach to efficiently learn about the quality of the system under test. We consider the objective of the evaluation to be to build a performance model of the system under test, and we rely on the principles of Bayesian experiment design to help construct and select optimal queries for learning about the parameters of that model.
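
    The Bayesian experiment-design step described in this abstract can be sketched under simplifying assumptions: model the system's accuracy on each query type as a Bernoulli rate with a Beta posterior, and issue next whichever query type's outcome is expected to shrink posterior variance the most. The query types and pseudo-counts below are hypothetical, not taken from the MSEE program.

```python
def beta_var(a, b):
    """Variance of a Beta(a, b) posterior over a Bernoulli accuracy rate."""
    return a * b / ((a + b) ** 2 * (a + b + 1))

def expected_posterior_var(a, b):
    """Expected Beta variance after observing one more query outcome:
    a success updates a -> a + 1, a failure updates b -> b + 1."""
    p = a / (a + b)  # current posterior predictive probability of success
    return p * beta_var(a + 1, b) + (1 - p) * beta_var(a, b + 1)

def pick_next_query(posteriors):
    """Choose the query type whose next outcome is expected to reduce
    posterior uncertainty the most (simple one-step experiment design)."""
    def gain(item):
        _, (a, b) = item
        return beta_var(a, b) - expected_posterior_var(a, b)
    return max(posteriors.items(), key=gain)[0]

# Hypothetical per-query-type accuracy posteriors (alpha, beta counts):
posteriors = {"detect": (30, 5), "track": (3, 2), "count": (1, 1)}
print(pick_next_query(posteriors))  # -> 'count': the flat prior is least constrained
```

    The least-explored query type is selected, which matches the intuition that an evaluation budget should be spent where the performance model is most uncertain.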

  4. Determining Electrical Properties Based on B1 Fields Measured in an MR Scanner Using a Multi-channel Transmit/Receive Coil: a General Approach

    PubMed Central

    Liu, Jiaen; Zhang, Xiaotong; Van de Moortele, Pierre-Francois; Schmitter, Sebastian

    2013-01-01

    Electrical Property Tomography (EPT) is a recently developed noninvasive technology to image the electrical conductivity and permittivity of biological tissues at Larmor frequency in Magnetic Resonance (MR) scanners. The absolute phase of the complex radio-frequency (RF) magnetic field (B1) is necessary for electrical property calculation. However, due to the lack of practical methods to directly measure the absolute B1 phases, current EPT techniques have been achieved with B1 phase estimation based on certain assumptions on object anatomy, coil structure and/or electromagnetic wave behavior associated with the main magnetic field, limiting EPT from a larger variety of applications. In this study, using a multi-channel transmit/receive coil, the framework of a new general approach for EPT has been introduced, which is independent on the assumptions utilized in previous studies. Using a human head model with realistic geometry, a series of computer simulations at 7T were conducted to evaluate the proposed method under different noise levels. Results showed that the proposed method can be used to reconstruct the conductivity and permittivity images with noticeable accuracy and stability. The feasibility of this approach was further evaluated in a phantom experiment at 7T. PMID:23743673

  5. The elusive importance effect: more failure for the Jamesian perspective on the importance of importance in shaping self-esteem.

    PubMed

    Marsh, Herbert W

    2008-10-01

    Following William James (1890/1963), many leading self-esteem researchers continue to support the Individual-importance hypothesis: that the relation between specific facets of self-concept and global self-esteem depends on the importance an individual places on each specific facet. However, empirical support for the hypothesis is surprisingly elusive, whether evaluated in terms of an importance-weighted average model, a generalized multiple regression approach for testing self-concept-by-importance interactions, or idiographic approaches. How can actual empirical support for such an intuitively appealing and widely cited psychological principle be so elusive? Hardy and Moriarty (2006), acknowledging this previous failure of the Individual-importance hypothesis, claim to have solved the conundrum, demonstrating an innovative idiographic approach that provides clear support for it. However, a critical evaluation of their new approach, coupled with a reanalysis of their data, undermines their claims. Indeed, their data provide compelling evidence against the Individual-importance hypothesis, which remains as elusive as ever.

  6. Modeling of correlated data with informative cluster sizes: An evaluation of joint modeling and within-cluster resampling approaches.

    PubMed

    Zhang, Bo; Liu, Wei; Zhang, Zhiwei; Qu, Yanping; Chen, Zhen; Albert, Paul S

    2017-08-01

    Joint modeling and within-cluster resampling are two approaches used for analyzing correlated data with informative cluster sizes. Motivated by a developmental toxicity study, we examined the performance and validity of these two approaches in testing covariate effects in generalized linear mixed-effects models. We show that the joint modeling approach is robust to misspecification of cluster size models in terms of Type I and Type II errors when the corresponding covariates are not included in the random effects structure; otherwise, statistical tests may be affected. We also evaluate the performance of the within-cluster resampling procedure and thoroughly investigate its validity in modeling correlated data with informative cluster sizes. We show that within-cluster resampling is a valid alternative to joint modeling for cluster-specific covariates, but it is invalid for time-dependent covariates. The two methods are applied to a developmental toxicity study that investigated the effect of exposure to diethylene glycol dimethyl ether.
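
    The within-cluster resampling idea evaluated in this abstract can be sketched in a few lines: repeatedly draw one observation per cluster (so cluster size cannot influence the analysis), apply an ordinary estimator to each resampled data set, and average the estimates. The simulated litters and the simple mean-difference estimator below are illustrative assumptions, not the study's actual data or model.

```python
import random

def within_cluster_resample(clusters, estimator, n_resamples=500, seed=0):
    """Within-cluster resampling: draw one record per cluster, estimate,
    and average over resamples, so each cluster contributes equally
    regardless of its (possibly informative) size."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_resamples):
        sample = [rng.choice(members) for members in clusters.values()]
        estimates.append(estimator(sample))
    return sum(estimates) / len(estimates)

# Illustrative data: each cluster is a litter; records are (exposed, outcome).
clusters = {
    "litter1": [(1, 0.9), (1, 1.1), (1, 1.0)],
    "litter2": [(0, 0.1), (0, 0.2)],
    "litter3": [(1, 1.2)],
    "litter4": [(0, 0.0), (0, 0.1), (0, -0.1), (0, 0.2)],
}

def mean_difference(sample):
    """Exposed-minus-unexposed mean outcome (illustrative estimator)."""
    exposed = [y for x, y in sample if x == 1]
    control = [y for x, y in sample if x == 0]
    return sum(exposed) / len(exposed) - sum(control) / len(control)

print(round(within_cluster_resample(clusters, mean_difference), 2))
```

    Note that exposure here is a cluster-specific covariate, the setting in which the abstract reports within-cluster resampling to be a valid alternative to joint modeling.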

  7. Evaluation of Site Amplification Functions For Generalized Soil Types Using Earthquake Records and Spectral Models

    NASA Astrophysics Data System (ADS)

    Sokolov, V.; Loh, C. H.; Wen, K. L.

    When evaluating the local site influence on seismic ground motion, in certain cases (e.g. building code provisions) it is sufficient to describe the variety of soil conditions by a small number of generalized site classes. The site classification system that is widely used at present is based on the properties of the top 30 m of the soil column, disregarding the characteristics of the deeper geology. Six site categories are defined on the basis of averaged shear-wave velocity, namely: A - hard rock; B - rock; C - very dense or stiff soil; D - stiff soil; E - soft soil; F - soils requiring special studies. Generalized site amplification curves were developed for several site classes in the western US (Boore and Joyner, 1997) and Greece (Klimis et al., 1999) using available geotechnical data from near-surface boreholes. We propose to evaluate the amplification functions as the ratios between the spectra of real earthquake recordings and the spectra modeled for "very hard rock" (VHR) conditions. The VHR spectra (regional source scaling and attenuation models) are constructed on the basis of ground motion records. The approach allows, on the one hand, analysis of all obtained records. On the other hand, it is possible to test the applicability of the spectral model used. Moreover, the uncertainty of site response may be evaluated and described in terms of random-variable characteristics to be considered in seismic hazard analysis. The results of applying the approach are demonstrated for the case of the Taiwan region. The characteristics of the site amplification functions (mean values and standard deviation) were determined and analyzed in the frequency range of 0.2-13 Hz for site classes B and C using recordings of the 1999 Chi-Chi, Taiwan, earthquake (M=7.6), strong aftershocks (M=6.8), and several earthquakes (M < 6.5) that occurred in the region in 1995-1998. When comparing the empirical amplification function resulting from the Taiwan data with that proposed for the western US, it has been shown that, for both class B and class C, the US amplification functions exhibit lower values than the Taiwan class B function for frequencies of 1-8 Hz. The Hellenic class C amplification shows, in general, a similar shape and amplitude to that evaluated for the Taiwan region. Thus, generalized site amplification curves should also be considered region-dependent functions.
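
    The core quantity in this abstract, the ratio between the spectrum of an observed recording and a modeled very-hard-rock (VHR) reference spectrum, can be sketched with numpy. The synthetic records, sampling interval, and the log-domain class statistics below are illustrative assumptions, not the authors' actual spectral model.

```python
import numpy as np

def spectral_ratio(record, vhr_model, dt):
    """Site amplification as the amplitude-spectrum ratio between an
    observed record and the modeled very-hard-rock (VHR) reference."""
    freqs = np.fft.rfftfreq(len(record), d=dt)
    ratio = np.abs(np.fft.rfft(record)) / np.abs(np.fft.rfft(vhr_model))
    return freqs, ratio

def class_amplification(ratios):
    """Mean amplification and scatter over many records, computed in the
    log domain, as per-site-class statistics for hazard analysis."""
    log_r = np.log(np.asarray(ratios))
    return np.exp(log_r.mean(axis=0)), log_r.std(axis=0)

# Synthetic check: a site that doubles every spectral amplitude.
rng = np.random.default_rng(0)
vhr = rng.standard_normal(1024)
freqs, ratio = spectral_ratio(2.0 * vhr, vhr, dt=0.01)  # ratio == 2 at all bins
mean_amp, sigma = class_amplification([ratio, ratio])
```

    Treating the scatter of the log-ratio as a random variable is what allows the site-response uncertainty to be carried into a probabilistic seismic hazard analysis, as the abstract suggests.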

  8. Shearography for Non-Destructive Evaluation with Applications to BAT Mask Tile Adhesive Bonding and Specular Surface Honeycomb Panels

    NASA Technical Reports Server (NTRS)

    Lysak, Daniel B.

    2003-01-01

    In this report we examine the applicability of shearography techniques for nondestructive inspection and evaluation in two unique application areas. In the first application, shearography is used to evaluate the quality of adhesive bonds holding lead tiles to the BAT gamma ray mask for the NASA Swift program. By exciting the mask with a vibration, the more poorly bonded tiles can be distinguished by their greater displacement response, which is readily identifiable in the shearography image. A quantitative analysis is presented that compares the shearography results with a destructive pull test measuring the force at bond failure. Generally speaking, the results show good agreement. Further investigation would be useful to optimize certain test parameters such as vibration frequency and amplitude. The second application is to evaluate the bonding between the skin and core of a honeycomb structure with a specular (mirror-like) surface. In standard shearography techniques, the object under test must have a diffuse surface to generate the speckle patterns in laser light, which are then sheared. A novel configuration using the specular surface as a mirror to image speckles from a diffuser is presented, opening up the use of shearography to a new class of objects that could not have been examined with the traditional approach. This new technique readily identifies large scale bond failures in the panel, demonstrating the validity of this approach. For the particular panel examined here, some scaling issues should be examined further to resolve the measurement scale down to the very small size of the core cells. In addition, further development should be undertaken to determine the general applicability of the new approach and to establish a firm quantitative foundation.

  9. An approach for integrating toxicogenomic data in risk assessment: The dibutyl phthalate case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Euling, Susan Y., E-mail: euling.susan@epa.gov; Thompson, Chad M.; Chiu, Weihsueh A.

    An approach for evaluating and integrating genomic data in chemical risk assessment was developed based on the lessons learned from performing a case study for the chemical dibutyl phthalate. A case study prototype approach was first developed in accordance with EPA guidance and recommendations of the scientific community. Dibutyl phthalate (DBP) was selected for the case study exercise. The scoping phase of the dibutyl phthalate case study was conducted by considering the available DBP genomic data, taken together with the entire data set, for whether they could inform various risk assessment aspects, such as toxicodynamics, toxicokinetics, and dose-response. A description of weighing the available dibutyl phthalate data set for utility in risk assessment provides an example for considering genomic data for future chemical assessments. As a result of conducting the scoping process, two questions were selected to focus the case study exercise: 1) Do the DBP toxicogenomic data inform the mechanisms or modes of action? and 2) Do they inform the interspecies differences in toxicodynamics? Principles of the general approach include considering the genomic data in conjunction with all other data to determine their ability to inform the various qualitative and/or quantitative aspects of risk assessment, and evaluating the relationship between the available genomic and toxicity outcome data with respect to study comparability and phenotypic anchoring. Based on experience from the DBP case study, recommendations and a general approach for integrating genomic data in chemical assessment were developed to advance the broader effort to utilize 21st century data in risk assessment. Highlights: • Performed DBP case study for integrating genomic data in risk assessment • Present approach for considering genomic data in chemical risk assessment • Present recommendations for use of genomic data in chemical risk assessment.

  10. An information system design for watershed-wide modeling of water loss to the atmosphere using remote sensing techniques

    NASA Technical Reports Server (NTRS)

    Khorram, S.

    1977-01-01

    Results are presented of a study intended to develop a general location-specific remote-sensing procedure for watershed-wide estimation of water loss to the atmosphere by evaporation and transpiration. The general approach involves a stepwise sequence of required information definition (input data), appropriate sample design, mathematical modeling, and evaluation of results. More specifically, the remote sensing-aided system developed to evaluate evapotranspiration employs a basic two-stage two-phase sample of three information resolution levels. Based on the discussed design, documentation, and feasibility analysis to yield timely, relatively accurate, and cost-effective evapotranspiration estimates on a watershed or subwatershed basis, work is now proceeding to implement this remote sensing-aided system.

  11. A school-based interdisciplinary approach to promote health and academic achievement among children in a deprived neighborhood: study protocol for a mixed-methods evaluation.

    PubMed

    Abrahamse, Mariëlle E; Jonkman, Caroline S; Harting, Janneke

    2018-04-10

    The large number of children that grow up in poverty is concerning, especially given the negative developmental outcomes that can persist into adulthood. Poverty has been found as a risk factor to negatively affect academic achievement and health outcomes in children. Interdisciplinary interventions can be an effective way to promote health and academic achievement. The present study aims to evaluate a school-based interdisciplinary approach on child health, poverty, and academic achievement using a mixed-method design. Generally taken, outcomes of this study increase the knowledge about effective ways to give disadvantaged children equal chances early in their lives. An observational study with a mixed-methods design including both quantitative and qualitative data collection methods will be used to evaluate the interdisciplinary approach. The overall research project exists of three study parts including a longitudinal study, a cross-sectional study, and a process evaluation. Using a multi-source approach we will assess child health as the primary outcome. Child poverty and child academic achievement will be assessed as secondary outcomes. The process evaluation will observe the program's effects on the school environment and the program's implementation in order to obtain more knowledge on how to disseminate the interdisciplinary approach to other schools and neighborhoods. The implementation of a school-based interdisciplinary approach via primary schools combining the cross-sectoral domains health, poverty, and academic achievement is innovative and a step forward to reach an ethnic minority population. However, the large variety of the interventions and activities within the approach can limit the validity of the study. Including a process evaluation will therefore help to improve the interpretation of our findings. 
In order to contribute to policy and practice focused on decreasing the unequal chances of children growing up in deprived neighborhoods, it is important to study whether the intervention leads to positive developmental outcomes in children. (NTR 6571; retrospectively registered on August 4, 2017.)

  12. Generalized linear and generalized additive models in studies of species distributions: Setting the scene

    USGS Publications Warehouse

    Guisan, Antoine; Edwards, T.C.; Hastie, T.

    2002-01-01

    An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled: Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of their application to ecological modeling. © 2002 Elsevier Science B.V. All rights reserved.
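
    To make the GLM idea concrete, here is a minimal, self-contained sketch (synthetic data, not taken from the workshop papers) of fitting a binomial GLM for species presence/absence against a single environmental predictor, using iteratively reweighted least squares (IRLS), the standard GLM fitting algorithm:

```python
import numpy as np

# Hypothetical example: a binomial GLM (logistic regression) relating
# species presence/absence to one environmental predictor, fitted with
# iteratively reweighted least squares (IRLS). All data are synthetic.
rng = np.random.default_rng(0)
elevation = rng.uniform(0.0, 2.0, 200)                  # predictor
true_eta = -2.0 + 2.5 * elevation                       # true linear predictor
presence = rng.random(200) < 1 / (1 + np.exp(-true_eta))

X = np.column_stack([np.ones_like(elevation), elevation])  # design matrix
y = presence.astype(float)

beta = np.zeros(2)
for _ in range(25):                    # IRLS: Newton-Raphson on the likelihood
    eta = X @ beta
    mu = 1 / (1 + np.exp(-eta))        # inverse logit link
    W = mu * (1 - mu)                  # binomial variance function
    z = eta + (y - mu) / W             # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

# beta now holds the fitted intercept and slope.
```

    A GAM would replace the linear term with smooth functions of the predictors; in practice one would use an established package (e.g. R's glm/gam or Python's statsmodels) rather than hand-rolled IRLS.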

  13. Disease management in Latinos with schizophrenia: a family-assisted, skills training approach.

    PubMed

    Kopelowicz, Alex; Zarate, Roberto; Gonzalez Smith, Veronica; Mintz, Jim; Liberman, Robert Paul

    2003-01-01

    This study evaluated the effectiveness of a skills training program designed to teach disease management to Latinos with schizophrenia treated at a community mental health center. Ninety-two Latino outpatients with schizophrenia and their designated relatives were randomly assigned to 3 months of skills training (ST) versus customary outpatient care (CC) and followed for a total of 9 months. The skills training approach was culturally adapted mainly by including the active participation of key relatives to facilitate acquisition and generalization of disease management skills into the patients' natural environment. There was a significant advantage for the ST group over the CC group on several symptom measures, skill acquisition and generalization, level of functioning, and rates of rehospitalization. There were no significant differences between the groups on quality of life or caregiver burden. Skills training had a direct effect on skill acquisition and generalization, and utilization of disease management skills led to decreased rates of rehospitalization. Incorporating an intensive, culturally relevant generalization effort into skills training for Latinos with schizophrenia appeared to be effective in teaching disease management and viable in a community mental health center.

  14. A new method for calculating differential distributions directly in Mellin space

    NASA Astrophysics Data System (ADS)

    Mitov, Alexander

    2006-12-01

    We present a new method for the calculation of differential distributions directly in Mellin space without recourse to the usual momentum-fraction (or z-) space. The method is completely general and can be applied to any process. It is based on solving the integration-by-parts identities when one of the powers of the propagators is an abstract number. The method retains the full dependence on the Mellin variable and can be implemented in any program for solving the IBP identities based on algebraic elimination, such as the Laporta algorithm. General features of the method are: (1) faster reduction, (2) a smaller number of master integrals compared to the usual z-space approach, and (3) master integrals that satisfy difference instead of differential equations. This approach generalizes previous results related to fully inclusive observables, like the recently calculated three-loop space-like anomalous dimensions and coefficient functions in inclusive DIS, to more general processes requiring separate treatment of the various physical cuts. Many possible applications of this method exist, the most notable being the direct evaluation of the three-loop time-like splitting functions in QCD.
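
    For orientation, the Mellin moment of a momentum-fraction distribution, and the difference-equation structure the abstract refers to, can be written as follows (standard definitions, not taken from the paper; R(N) is schematic):

```latex
% N-th Mellin moment of a momentum-fraction (z-space) distribution f(z):
\tilde{f}(N) = \int_0^1 \mathrm{d}z \, z^{\,N-1} f(z)
% In this approach the master integrals F(N) retain the Mellin variable N
% and satisfy difference (recurrence) equations, schematically
F(N+1) = R(N)\, F(N),
% rather than differential equations in z.
```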

  15. Generalized Stoner-Wohlfarth model accurately describing the switching processes in pseudo-single ferromagnetic particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cimpoesu, Dorin, E-mail: cdorin@uaic.ro; Stoleriu, Laurentiu; Stancu, Alexandru

    2013-12-14

    We propose a generalized Stoner-Wohlfarth (SW) type model to describe various experimentally observed angular dependencies of the switching field in non-single-domain magnetic particles. Because nonuniform magnetic states are generally characterized by complicated spin configurations with no simple analytical description, we maintain the macrospin hypothesis and phenomenologically include the effects of nonuniformities only in the anisotropy energy, preserving as much as possible the elegance of the SW model, the concept of the critical curve, and its geometric interpretation. We compare the results obtained with our model against full micromagnetic simulations in order to evaluate the performance and limits of our approach.

  16. [Correlation between iridology and general pathology].

    PubMed

    Demea, Sorina

    2002-01-01

    The aim of this research was to evaluate the association between certain iris signs and the general pathology of the patients studied. Fifty-seven hospitalized patients were included; iris images were taken of all of them and analyzed using iridological protocols. At the same time, each patient's pathology was noted from the hospital records, in accordance with the clinical diagnosis, and all of this information was entered into a database for computerized processing. The resulting correlations show a strong association between the iris constitution, established through iridological criteria, and the existing pathology. Iris examination can thus be useful in the diagnosis of certain general pathology, within a holistic approach to the patient.

  17. Workplace Literacy Programs: Variations of Approach and Limits of Impact.

    ERIC Educational Resources Information Center

    Mikulecky, Larry

    Six workplace literacy programs were evaluated for impact upon learners, learners' families, and learners' productivity. Site 1 was an automotive plant where learners were involved in technical preparation, the General Educational Development program, and English as a Second Language (ESL). Site 2 was a wood-processing plant with a communication…

  18. 48 CFR 1515.404-471 - EPA structured approach for developing profit or fee objectives.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... profit or fee objective. (5) The weight factors discussed in this section are designed for arriving at... involving creative design. (B) Consideration should be given to the managerial and technical efforts.../technical and general labor. Analysis of labor should include evaluation of the comparative quality and...

  19. 48 CFR 1515.404-471 - EPA structured approach for developing profit or fee objectives.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... profit or fee objective. (5) The weight factors discussed in this section are designed for arriving at... involving creative design. (B) Consideration should be given to the managerial and technical efforts.../technical and general labor. Analysis of labor should include evaluation of the comparative quality and...

  20. 48 CFR 1515.404-471 - EPA structured approach for developing profit or fee objectives.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... profit or fee objective. (5) The weight factors discussed in this section are designed for arriving at... involving creative design. (B) Consideration should be given to the managerial and technical efforts.../technical and general labor. Analysis of labor should include evaluation of the comparative quality and...

  1. 48 CFR 1515.404-471 - EPA structured approach for developing profit or fee objectives.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... profit or fee objective. (5) The weight factors discussed in this section are designed for arriving at... involving creative design. (B) Consideration should be given to the managerial and technical efforts.../technical and general labor. Analysis of labor should include evaluation of the comparative quality and...

  2. 48 CFR 1515.404-471 - EPA structured approach for developing profit or fee objectives.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... profit or fee objective. (5) The weight factors discussed in this section are designed for arriving at... involving creative design. (B) Consideration should be given to the managerial and technical efforts.../technical and general labor. Analysis of labor should include evaluation of the comparative quality and...

  3. Population Education: A Source Book on Content and Methodology.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and Oceania.

    A collection of 12 essays provides an overview of population education in Asia and Oceania with regard to concepts, status, approaches in curriculum and materials development, methodologies, and research and evaluation. The collection is presented in five sections. Section I explores general definitions of population education; its role as part of…

  4. A Teacher's Guide to Resource - Use Outdoor Education Center.

    ERIC Educational Resources Information Center

    Taylor County Board of Public Instruction, Perry, FL.

    Included in this guide are ideas on evaluation and conduct of outdoor education classes and unit guides and lesson plans. An interdisciplinary approach is emphasized and the unit guides and lesson plans present activities related to science, mathematics, social studies, art, and music relevant to outdoor education in general. Unit guides for soil,…

  5. Assessing the cost of fuel reduction treatments: a critical review

    Treesearch

    Bob Rummer

    2008-01-01

    The basic costs of the operations for implementing fuel reduction treatments are used to evaluate treatment effectiveness, select among alternatives, estimate total project costs, and build national program strategies. However, a review of the literature indicates that there is a questionable basis for many of the general estimates used to date. Different approaches to...

  6. 76 FR 79708 - Draft Environmental Impact Statement/General Management Plan, Golden Gate National Recreation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-22

    ...). The Plan/DEIS evaluates four alternatives for updating the current approach to management in Golden.... In recognition of the complexity of the proposed plan alternatives, and with deference to interest... DEPARTMENT OF THE INTERIOR National Park Service [NPS-PWR-PWRO-1108-8862; 2031-A038-409] Draft...

  7. Analysing Language Education Policy for China's Minority Groups in Its Entirety

    ERIC Educational Resources Information Center

    Feng, Anwei; Sunuodula, Mamtimyn

    2009-01-01

    Two main bodies of literature are identifiable in minority education policy studies in China. Many adopt a descriptive approach to examining policy documents and general outcomes in their historical contexts while others focus on evaluating preferential policies made to address inequality issues in minority education. In most discussions,…

  8. Analysis of Student Responses to Peer-Instruction Conceptual Questions Answered Using an Electronic Response System: Trends by Gender and Ethnicity

    ERIC Educational Resources Information Center

    Steer, David; McConnell, David; Gray, Kyle; Kortz, Karen; Liang, Xin

    2009-01-01

    This descriptive study investigated students' answers to geoscience conceptual questions answered using electronic personal response systems. Answer patterns were examined to evaluate the peer-instruction pedagogical approach in a large general education classroom setting. (Contains 3 figures and 2 tables.)

  9. Generalizing Word Lattice Translation

    DTIC Science & Technology

    2008-02-01

    Keywords: word lattice translation, phrase-based and hierarchical... Our experiments evaluating the approach demonstrate substantial gains for Chinese-English and Arabic-English translation. 1 Introduction: When Brown and colleagues introduced statistical machine translation in the

  10. Measurement of Workforce Readiness Competencies: Design of Prototype Measures.

    ERIC Educational Resources Information Center

    O'Neil, Harold F., Jr.; And Others

    A general methodological approach is suggested for the measurement of workforce readiness competencies, in the context of overall work by the National Center for Research on Evaluation, Standards, and Student Testing on the domain-independent measurement of workforce readiness skills. The methodology consists of 14 steps, from the initial selection of a…

  11. A Market Segmentation Approach to the Study of the Daily Newspaper Audience.

    ERIC Educational Resources Information Center

    Larkin, Ernest F.; Grotta, Gerald L.

    To determine whether differing attitudes toward, and the utilization of, the daily newspaper are related to the variable of age, 481 persons responded to a questionnaire designed to measure their attitudes and opinions about mass media in general and their evaluations of newspaper content in particular. The findings revealed the following…

  12. Young People and Caregivers' Perceptions of an Intervention Program for Children Who Deliberately Light Fires

    ERIC Educational Resources Information Center

    Lambie, Ian; Seymour, Fred; Popaduk, Tanya

    2012-01-01

    A significant number of children and adolescents engage in deliberate fire setting, beyond the scope of curiosity and experimentation. Interventions developed to respond to the needs of such fire setters generally involve educational and/or psychosocial approaches. Research evaluating the effectiveness of these interventions is dominated by…

  13. Reducing Environmental Risks by Information Disclosure: Evidence in Residential Lead Paint Disclosure Rule

    ERIC Educational Resources Information Center

    Bae, Hyunhoe

    2012-01-01

    Recently, there has been a surge in environmental regulations that require information disclosure. However, existing empirical evidence is limited to certain applications and has yet to generalize the effectiveness of this approach as a policy strategy to reduce environmental risks. This study evaluates the disclosure rule of the residential lead…

  14. A General Approach to the Evaluation of Ventilation-Perfusion Ratios in Normal and Abnormal Lungs

    ERIC Educational Resources Information Center

    Wagner, Peter D.

    1977-01-01

    Outlines methods for manipulating multiple gas data so as to gain the greatest amount of insight into the properties of ventilation-perfusion distributions. Refers to data corresponding to normal and abnormal lungs. Uses a two-dimensional framework with the respiratory gases of oxygen and carbon dioxide. (CS)

  15. Design-Comparable Effect Sizes in Multiple Baseline Designs: A General Modeling Framework

    ERIC Educational Resources Information Center

    Pustejovsky, James E.; Hedges, Larry V.; Shadish, William R.

    2014-01-01

    In single-case research, the multiple baseline design is a widely used approach for evaluating the effects of interventions on individuals. Multiple baseline designs involve repeated measurement of outcomes over time and the controlled introduction of a treatment at different times for different individuals. This article outlines a general…

  16. Applying Coaching Strategies to Support Youth- and Family-Focused Extension Programming

    ERIC Educational Resources Information Center

    Olson, Jonathan R.; Hawkey, Kyle R.; Smith, Burgess; Perkins, Daniel F.; Borden, Lynne M.

    2016-01-01

    In this article, we describe how a peer-coaching model has been applied to support community-based Extension programming through the Children, Youth, and Families at Risk (CYFAR) initiative. We describe the general approaches to coaching that have been used to help with CYFAR program implementation, evaluation, and sustainability efforts; we…

  17. A SYSTEMS APPROACH UTILIZING GENERAL-PURPOSE AND SPECIAL-PURPOSE TEACHING MACHINES.

    ERIC Educational Resources Information Center

    SILVERN, LEONARD C.

    In order to improve the employee training-evaluation method, teaching machines and performance aids must be physically and operationally integrated into the system, thus returning training to the actual job environment. Given these conditions, training can be measured, calibrated, and controlled with respect to actual job performance standards and…

  18. Developing a Classroom Management Plan Using a Tiered Approach

    ERIC Educational Resources Information Center

    Sayeski, Kristin L.; Brown, Monica R.

    2011-01-01

    In this article, the authors present a response-to-intervention (RTI) framework that both special and general education teachers can use in evaluating existing class structures and developing comprehensive classroom management plans for the purpose of managing challenging behaviors. They applied the concept of a three-tiered model of support at…

  19. Behavioral Profiles in 4-5 Year-Old Children: Normal and Pathological Variants

    ERIC Educational Resources Information Center

    Larsson, Jan-Olov; Bergman, Lars R.; Earls, Felton; Rydelius, Per-Anders

    2004-01-01

    Normal and psychopathological patterns of behavior symptoms in preschool children were described by a classification approach using cluster analysis. The behavior of 406 children, average age 4 years 9 months, from the general population was evaluated at home visits. Seven clusters were identified based on empirically defined dimensions:…

  20. Two-stage autotransplantation of human submandibular gland: a novel approach to treat postradiogenic xerostomia.

    PubMed

    Hagen, Rudolf; Scheich, Matthias; Kleinsasser, Norbert; Burghartz, Marc

    2016-08-01

    Xerostomia is a persistent side effect of radiotherapy (RT) that severely reduces the quality of life of the patients affected. Besides drug treatment and new irradiation strategies, surgical procedures aim to protect the tissue of the submandibular gland. Using a new surgical approach, the submandibular gland was autotransplanted in 6 patients to the patient's forearm for the period of RT and reimplanted into the floor of the mouth 2-3 months after completion of RT. Saxon's test was performed at different time points to evaluate the patients' saliva production. Furthermore, patients answered the EORTC QLQ-HN35 questionnaire and a visual analog scale. Following this two-stage autotransplantation, xerostomia in the patients was markedly reduced owing to the improved saliva production of the reimplanted gland. Whether this promising novel approach is a reliable treatment option for RT patients in general should be evaluated in further studies.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webb-Robertson, Bobbie-Jo M.; Wiberg, Holli K.; Matzke, Melissa M.

    In this review, we apply selected imputation strategies to label-free liquid chromatography–mass spectrometry (LC–MS) proteomics datasets to evaluate their accuracy with respect to metrics of variance and classification. We evaluate several commonly used imputation approaches on their individual merits and discuss the caveats of each approach with respect to the example LC–MS proteomics data. In general, local similarity-based approaches, such as the regularized expectation maximization and least-squares adaptive algorithms, yield the best overall performance with respect to metrics of accuracy and robustness. However, no single algorithm consistently outperforms the remaining approaches, and in some cases performing classification without imputation yielded the most accurate classification. Thus, because of the complex mechanisms of missing data in proteomics, which also vary from peptide to protein, no individual method is a single solution for imputation. In summary, on the basis of the observations in this review, the goal for imputation in the field of computational proteomics should be to develop new approaches that work generically for this data type, along with new strategies to guide users in selecting the best imputation for their dataset and analysis objectives.
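
    As an illustration of the contrast the review draws between global and local similarity-based imputation, the following sketch (synthetic data and deliberately simplified methods, not the review's actual algorithms) compares column-mean imputation with a simple k-nearest-neighbour imputation:

```python
import numpy as np

# Toy peptide-abundance matrix with two "classes" of peptides and ~10%
# of values knocked out at random. All data are synthetic.
rng = np.random.default_rng(1)
X = rng.normal(20, 2, size=(30, 8))        # 30 peptides x 8 runs
X[:15] += 5.0                              # first class has higher abundance
truth = X.copy()
mask = rng.random(X.shape) < 0.1           # positions to treat as missing
X[mask] = np.nan

def mean_impute(M):
    """Global approach: fill each missing value with its column mean."""
    out = M.copy()
    col_means = np.nanmean(M, axis=0)
    idx = np.where(np.isnan(out))
    out[idx] = np.take(col_means, idx[1])
    return out

def knn_impute(M, k=3):
    """Local-similarity approach: average the k most similar rows."""
    out = M.copy()
    filled = mean_impute(M)                # provisional fill for distances
    for i in range(M.shape[0]):
        miss = np.isnan(M[i])
        if not miss.any():
            continue
        d = np.linalg.norm(filled - filled[i], axis=1)
        d[i] = np.inf                      # exclude the row itself
        nbrs = np.argsort(d)[:k]
        out[i, miss] = filled[nbrs][:, miss].mean(axis=0)
    return out

err_mean = np.abs(mean_impute(X) - truth)[mask].mean()
err_knn = np.abs(knn_impute(X) - truth)[mask].mean()
```

    Because the column means average over both peptide classes, the local approach typically recovers the missing values more accurately here, mirroring the review's observation about local similarity-based methods.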

  2. Determining Functional Reliability of Pyrotechnic Mechanical Devices

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.; Multhaup, Herbert A.

    1997-01-01

    This paper describes a new approach for evaluating mechanical performance and predicting the mechanical functional reliability of pyrotechnic devices. Not included are other possible failure modes, such as the initiation of the pyrotechnic energy source. The requirement of hundreds or thousands of consecutive, successful tests on identical components for reliability predictions, using the generally accepted go/no-go statistical approach routinely ignores physics of failure. The approach described in this paper begins with measuring, understanding and controlling mechanical performance variables. Then, the energy required to accomplish the function is compared to that delivered by the pyrotechnic energy source to determine mechanical functional margin. Finally, the data collected in establishing functional margin is analyzed to predict mechanical functional reliability, using small-sample statistics. A careful application of this approach can provide considerable cost improvements and understanding over that of go/no-go statistics. Performance and the effects of variables can be defined, and reliability predictions can be made by evaluating 20 or fewer units. The application of this approach to a pin puller used on a successful NASA mission is provided as an example.
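
    The margin-based idea described above can be sketched as follows (all numbers are hypothetical, not the NASA pin-puller data; delivered and required energies are assumed independent and normally distributed):

```python
import math

# Hypothetical small-sample energy measurements, in joules, from ~20 units.
required_mean, required_sd = 12.0, 1.0    # energy the mechanism needs
delivered_mean, delivered_sd = 20.0, 1.5  # energy the source delivers

# Functional margin: separation of the two distributions in units of the
# combined standard deviation (assuming independence).
combined_sd = math.hypot(required_sd, delivered_sd)
margin = (delivered_mean - required_mean) / combined_sd

# Reliability = P(delivered > required) for independent normal variables.
reliability = 0.5 * (1.0 + math.erf(margin / math.sqrt(2.0)))

# Contrast: the go/no-go approach would need roughly ln(0.05)/ln(R)
# consecutive successes to demonstrate reliability R at 95% confidence.
R_target = 0.9999
trials = math.ceil(math.log(0.05) / math.log(R_target))
```

    The point of the comparison is that a few dozen instrumented measurements can support a margin-based reliability estimate, whereas the go/no-go route demands tens of thousands of consecutive successes for comparable claims.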

  3. Reducing health inequalities in people with learning disabilities: a multi-disciplinary team approach to care under general anaesthesia.

    PubMed

    Clough, S; Shehabi, Z; Morgan, C

    2016-05-27

    Background: There remains significant inequality in health and healthcare in people with learning disabilities (LD). A lack of coordination and the episodic nature of care provision are contributory factors. Recognising the need to improve outcomes for this group, we evaluate a multi-disciplinary team (MDT) approach to care whereby additional medical procedures are carried out under the same episode of general anaesthesia (GA) as dental treatment for people with severe LD. This is the first published evaluation of its kind in the UK. Aim: To evaluate the need for and outcomes of an MDT approach to care among people with severe LD receiving dental treatment under GA. Method: One hundred patients with severe LD and behaviour that challenges attended Barts Health Dental Hospital for dental assessment and subsequent treatment under GA. Details of failed or forthcoming medical interventions were determined. Where appropriate, care was coordinated with the relevant medical team. Findings: Twenty-one percent (n = 21/100) had recent medical interventions attempted that had been abandoned, and 7.0% (n = 7/100) had future investigations or treatment planned under GA with medical specialties. An MDT approach was indicated in 28.0% (n = 28/100). For these complex cases, a successful MDT outcome was achieved in 89.3% (n = 25/28). This included ophthalmological/orthoptic, ENT and gastroenterological interventions in addition to medical imaging. Conclusion: An MDT approach to care for people with LD offers improved patient-centred outcomes in addition to financial and resource efficiency. It requires a high level of cooperation between specialties, with consideration of the practicalities of a shared surgical space and equipment needs. Re-shaping of services and contractual flexibility are essential to support the future implementation of MDTs and to ensure long-term sustainability. Adoption of a holistic culture in the care of this vulnerable patient group is encouraged.

  4. Structural Loads Analysis for Wave Energy Converters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Rij, Jennifer A; Yu, Yi-Hsiang; Guo, Yi

    2017-06-03

    This study explores and verifies the generalized body-modes method for evaluating the structural loads on a wave energy converter (WEC). Historically, WEC design methodologies have focused primarily on accurately evaluating hydrodynamic loads, while methodologies for evaluating structural loads have yet to be fully considered and incorporated into the WEC design process. As wave energy technologies continue to advance, however, it has become increasingly evident that an accurate evaluation of the structural loads will enable an optimized structural design, as well as the potential utilization of composites and flexible materials, and hence reduce WEC costs. Although there are many computational fluid dynamics, structural analysis and fluid-structure-interaction (FSI) codes available, the application of these codes is typically too computationally intensive to be practical in the early stages of the WEC design process. The generalized body-modes method, however, is a reduced-order, linearized, frequency-domain FSI approach, performed in conjunction with the linear hydrodynamic analysis, with computation times that could realistically be incorporated into the WEC design process.

  5. Aligning Theory and Design: The Development of an Online Learning Intervention to Teach Evidence-based Practice for Maximal Reach.

    PubMed

    Delagran, Louise; Vihstadt, Corrie; Evans, Roni

    2015-09-01

    Online educational interventions to teach evidence-based practice (EBP) are a promising mechanism for overcoming some of the barriers to incorporating research into practice. However, attention must be paid to aligning strategies with adult learning theories to achieve optimal outcomes. We describe the development of a series of short self-study modules, each covering a small set of learning objectives. Our approach, informed by design-based research (DBR), involved 6 phases: analysis, design, design evaluation, redesign, development/implementation, and evaluation. Participants were faculty and students in 3 health programs at a complementary and integrative educational institution. We chose a reusable learning object approach that allowed us to apply 4 main learning theories: events of instruction, cognitive load, dual processing, and ARCS (attention, relevance, confidence, satisfaction). A formative design evaluation suggested that the identified theories and instructional approaches were likely to facilitate learning and motivation. Summative evaluation was based on a student survey (N=116) that addressed how these theories supported learning. Results suggest that, overall, the selected theories helped students learn. The DBR approach allowed us to evaluate the specific intervention and theories for general applicability. This process also helped us define and document the intervention at a level of detail that covers almost all the proposed Guideline for Reporting Evidence-based practice Educational intervention and Teaching (GREET) items. This thorough description will facilitate the interpretation of future research and implementation of the intervention. Our approach can also serve as a model for others considering online EBP intervention development.

  6. Automated glioblastoma segmentation based on a multiparametric structured unsupervised classification.

    PubMed

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V; Robles, Montserrat; Aparici, F; Martí-Bonmatí, L; García-Gómez, Juan M

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of the supervised methods. In this sense, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms we evaluated K-means, Fuzzy K-means and the Gaussian Mixture Model (GMM), whereas as structured classification algorithms we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after the segmentations. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation.
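
    As a minimal illustration of the kind of unsupervised mixture model the abstract applies, here is a hand-rolled two-component EM fit on synthetic 1-D intensities (a sketch only, not the authors' multiparametric pipeline):

```python
import numpy as np

# Synthetic "intensity" values from two populations, standing in for
# voxel intensities of two tissue classes. Not real MR data.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(1.0, 0.3, 400),    # majority class
                    rng.normal(3.0, 0.5, 200)])   # minority class

# Initialise two Gaussian components from data quantiles.
mu = np.quantile(x, [0.25, 0.75])
sigma = np.array([x.std(), x.std()])
pi = np.array([0.5, 0.5])

for _ in range(100):
    # E-step: responsibility of each component for each point.
    dens = (pi / (sigma * np.sqrt(2 * np.pi))
            * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixing weights, means, and variances.
    nk = resp.sum(axis=0)
    pi = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

labels = resp.argmax(axis=1)   # hard assignment = the "segmentation"
```

    In the paper's setting the same idea runs per voxel over multiple MR channels, followed by the tissue-probability-map postprocess; a structured variant such as GHMRF additionally couples neighbouring voxels.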

  7. Aligning Theory and Design: The Development of an Online Learning Intervention to Teach Evidence-based Practice for Maximal Reach

    PubMed Central

    Vihstadt, Corrie; Evans, Roni

    2015-01-01

    Background: Online educational interventions to teach evidence-based practice (EBP) are a promising mechanism for overcoming some of the barriers to incorporating research into practice. However, attention must be paid to aligning strategies with adult learning theories to achieve optimal outcomes. Methods: We describe the development of a series of short self-study modules, each covering a small set of learning objectives. Our approach, informed by design-based research (DBR), involved 6 phases: analysis, design, design evaluation, redesign, development/implementation, and evaluation. Participants were faculty and students in 3 health programs at a complementary and integrative educational institution. Results: We chose a reusable learning object approach that allowed us to apply 4 main learning theories: events of instruction, cognitive load, dual processing, and ARCS (attention, relevance, confidence, satisfaction). A formative design evaluation suggested that the identified theories and instructional approaches were likely to facilitate learning and motivation. Summative evaluation was based on a student survey (N=116) that addressed how these theories supported learning. Results suggest that, overall, the selected theories helped students learn. Conclusion: The DBR approach allowed us to evaluate the specific intervention and theories for general applicability. This process also helped us define and document the intervention at a level of detail that covers almost all the proposed Guideline for Reporting Evidence-based practice Educational intervention and Teaching (GREET) items. This thorough description will facilitate the interpretation of future research and implementation of the intervention. Our approach can also serve as a model for others considering online EBP intervention development. PMID:26421233

  8. Multiscale Simulations of Protein Landscapes: Using Coarse Grained Models as Reference Potentials to Full Explicit Models

    PubMed Central

    Messer, Benjamin M.; Roca, Maite; Chu, Zhen T.; Vicatos, Spyridon; Kilshtain, Alexandra Vardi; Warshel, Arieh

    2009-01-01

    Evaluating the free energy landscape of proteins and the corresponding functional aspects presents a major challenge for computer simulation approaches. This challenge is due to the complexity of the landscape and the enormous computer time needed for converging simulations. The use of simplified coarse grained (CG) folding models offers an effective way of sampling the landscape; such a treatment, however, may not give the correct description of the effect of the actual protein residues. A general way around this problem, put forward in our early work (Fan et al, Theor Chem Acc (1999) 103:77-80), uses the CG model as a reference potential for free energy calculations of different properties of the explicit model. This method is refined and extended here, focusing on improving the electrostatic treatment and on demonstrating key applications. These applications include: evaluation of changes of folding energy upon mutations, calculations of transition-state binding free energies (which are crucial for rational enzyme design), evaluation of catalytic landscapes, and simulation of time-dependent responses to pH changes. Furthermore, the general potential of our approach in overcoming major challenges in studies of structure-function correlation in proteins is discussed. PMID:20052756
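    The reference-potential idea rests on the standard free energy perturbation identity; as a generic sketch (this is the textbook formula, not the paper's refined electrostatic treatment), the free energy of moving from the CG reference to the full explicit model is

    ```latex
    \Delta G_{\mathrm{CG} \to \mathrm{full}}
      = -k_{B}T \,\ln \left\langle
          \exp\!\left[-\frac{E_{\mathrm{full}} - E_{\mathrm{CG}}}{k_{B}T}\right]
        \right\rangle_{\mathrm{CG}}
    ```

    where the average is taken over configurations sampled on the cheap CG surface, so the expensive explicit model only needs to be evaluated on those samples.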

  9. Evaluation of an E-learning resource on approach to the first unprovoked seizure.

    PubMed

    Le Marne, Fleur A; McGinness, Hannah; Slade, Rob; Cardamone, Michael; Balbir Singh, Shirleen; Connolly, Anne M; Bye, Ann Me

    2016-09-01

    To develop and evaluate an online educational package instructing paediatricians and trainees in the diagnosis and management of a first unprovoked seizure in children. The E-learning content was created following a comprehensive literature review that referenced current international guidelines. Rigorous consultation with local paediatric neurologists, paediatricians and epilepsy nurses was undertaken. A series of learning modules was created and sequenced to reflect steps needed to achieve optimal diagnosis and management in a real-life situation of a child presenting with a paroxysmal event. Paediatric registrars and advanced trainees from the Sydney Children's Hospitals Network were assessed before and after using the E-learning Resource. Measures included general epilepsy knowledge, case-based scenario knowledge; self-rated measures of satisfaction with instruction and confidence regarding clinical approach to the child with first unprovoked seizure; and open ended questions evaluating the usefulness of the E-learning resource. Performance on measures of general epilepsy knowledge and on the seizure-related case scenarios improved significantly following completion of the E-learning as did self-rated satisfaction with instruction and confidence across all aspects of managing first seizure. The E-learning resource has been validated as a useful educational resource regarding the first afebrile unprovoked seizure for paediatricians. © 2016 Paediatrics and Child Health Division (The Royal Australasian College of Physicians).

  10. Embracing heterothermic diversity: non-stationary waveform analysis of temperature variation in endotherms.

    PubMed

    Levesque, Danielle L; Menzies, Allyson K; Landry-Cuerrier, Manuelle; Larocque, Guillaume; Humphries, Murray M

    2017-07-01

    Recent research is revealing incredible diversity in the thermoregulatory patterns of wild and captive endotherms. As a result of these findings, classic thermoregulatory categories of 'homeothermy', 'daily heterothermy', and 'hibernation' are becoming harder to delineate, impeding our understanding of the physiological and evolutionary significance of variation within and around these categories. However, we lack a generalized analytical approach for evaluating and comparing the complex and diversified nature of the full breadth of heterothermy expressed by individuals, populations, and species. Here we propose a new approach that decomposes body temperature time series into three inherent properties (waveform, amplitude, and period) using a non-stationary technique that accommodates the temporal variability of body temperature patterns. This approach quantifies circadian and seasonal variation in thermoregulatory patterns, and uses the distribution of observed thermoregulatory patterns as a basis for intra- and inter-specific comparisons. We analyse body temperature time series from multiple species, including classical hibernators, tropical heterotherms, and homeotherms, to highlight the approach's general usefulness and the major axes of thermoregulatory variation that it reveals.

  11. A personalized health-monitoring system for elderly by combining rules and case-based reasoning.

    PubMed

    Ahmed, Mobyen Uddin

    2015-01-01

    Health monitoring of the elderly in the home environment is a promising way to provide efficient medical services and is of increasing interest to researchers in this area. It is more challenging still when the system must be self-served and personalized. This paper proposes a personalized, self-served health-monitoring system for the elderly in the home environment that combines general rules with a case-based reasoning approach. The system generates feedback, recommendations and alarms in a personalized manner based on the elderly person's medical information and health parameters such as blood pressure, blood glucose, weight, activity, pulse, etc. A set of general rules is used to classify individual health parameters. The case-based reasoning approach is then used to combine the different health parameters into an overall classification of health condition. In an evaluation over 323 cases with k=2, i.e., the top 2 most similar retrieved cases, the sensitivity, specificity and overall accuracy achieved are 90%, 97% and 96%, respectively. The preliminary result of the system is acceptable, since the feedback, recommendation and alarm messages are personalized and differ from the general messages. Thus, this approach could be adapted for other situations in personalized elderly monitoring.
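    The rules-plus-retrieval combination described above can be sketched in a few lines. This is a hypothetical illustration, not the paper's system: the rule thresholds, parameter encoding and case base below are invented, and only the top-k retrieval with a majority vote mirrors the described k=2 setup.

    ```python
    # Hypothetical sketch: rule-based classification of single parameters,
    # then k-NN case retrieval to fuse them into an overall condition.
    def classify_bp(systolic):
        # General rule for one health parameter (thresholds are illustrative)
        if systolic < 120: return 0   # normal
        if systolic < 140: return 1   # elevated
        return 2                      # high

    def retrieve(case_base, query, k=2):
        # Nearest-neighbour retrieval over rule-classified parameter vectors
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return sorted(case_base, key=lambda c: dist(c["params"], query))[:k]

    def overall_condition(case_base, query, k=2):
        # Majority vote among the k most similar past cases
        labels = [c["label"] for c in retrieve(case_base, query, k)]
        return max(set(labels), key=labels.count)

    cases = [
        {"params": [0, 0, 1], "label": "ok"},
        {"params": [2, 1, 2], "label": "alert"},
        {"params": [0, 1, 0], "label": "ok"},
    ]
    print(overall_condition(cases, [0, 0, 0]))  # both nearest cases are "ok"
    ```

    A real system would also attach the personalized feedback and alarm messages of the retrieved cases rather than just their labels.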

  12. Current management of acute diverticulitis: a survey of Australasian surgeons.

    PubMed

    Jaung, Rebekah; Robertson, Jason; Rowbotham, David; Bissett, Ian

    2016-03-11

    To evaluate the current practice and degree of consensus amongst Australasian surgeons regarding non-surgical management of acute diverticulitis (AD) and to determine whether newer approaches to management are being translated into practice. An online survey was distributed to all Australasian colorectal surgeons and all general surgeons in the Auckland region. Responses were collected over two months and analysed to identify points of consensus and areas of significant difference in opinion between these groups. Responses were received from a total of 99 of 200 (49.5%) colorectal surgeons, and 19 of 36 (52.7%) general surgeons. The Hinchey Classification was the most commonly used measure of disease severity, used by 67 (95.7%) colorectal surgeons and 12 (92.3%) general surgeons. There was lack of consensus around important aspects of AD management, including antibiotic therapy, and use and modality of follow-up imaging. Selective antibiotic therapy and use of anti-inflammatory medication as adjuncts to treatment were practised by a minority of those surveyed. Newer approaches to management were being utilised by some respondents. The lack of consensus regarding management of AD may be a consequence of a paucity of high-level evidence to support specific management approaches, particularly in patients with uncomplicated AD.

  13. Evaluating interventions in health: a reconciliatory approach.

    PubMed

    Wolff, Jonathan; Edwards, Sarah; Richmond, Sarah; Orr, Shepley; Rees, Geraint

    2012-11-01

    Health-related Quality of Life measures have recently been attacked from two directions, both of which criticize the preference-based method of evaluating health states they typically incorporate. One attack, based on work by Daniel Kahneman and others, argues that 'experience' is a better basis for evaluation. The other, inspired by Amartya Sen, argues that 'capability' should be the guiding concept. In addition, opinion differs as to whether health evaluation measures are best derived from consultations with the general public, with patients, or with health professionals. And there is disagreement about whether these opinions should be solicited individually and aggregated, or derived instead from a process of collective deliberation. These distinctions yield a wide variety of possible approaches, with potentially differing policy implications. We consider some areas of disagreement between some of these approaches. We show that many of the perspectives seem to capture something important, such that it may be a mistake to reject any of them. Instead we suggest that some of the existing 'instruments' designed to measure HR QoLs may in fact successfully already combine these attributes, and with further refinement such instruments may be able to provide a reasonable reconciliation between the perspectives. © 2011 Blackwell Publishing Ltd.

  14. Nested polynomial trends for the improvement of Gaussian process-based predictors

    NASA Astrophysics Data System (ADS)

    Perrin, G.; Soize, C.; Marque-Pucheu, S.; Garnier, J.

    2017-10-01

    The role of simulation keeps increasing for the sensitivity analysis and the uncertainty quantification of complex systems. Such numerical procedures are generally based on the processing of a huge number of code evaluations. When the computational cost associated with one particular evaluation of the code is high, such direct approaches, based on the computer code only, are not affordable. Surrogate models therefore have to be introduced to interpolate the information given by a fixed set of code evaluations to the whole input space. When confronted with deterministic mappings, Gaussian process regression (GPR), or kriging, presents a good compromise between complexity, efficiency and error control. Such a method considers the quantity of interest of the system as a particular realization of a Gaussian stochastic process, whose mean and covariance functions have to be identified from the available code evaluations. In this context, this work proposes an innovative parametrization of this mean function, which is based on the composition of two polynomials. This approach is particularly relevant for the approximation of strongly nonlinear quantities of interest from very little information. After presenting the theoretical basis of this method, this work compares its efficiency to alternative approaches on a series of examples.
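    As a minimal sketch of the standard setup this work builds on: plain GPR with a polynomial trend as the mean function, in the universal-kriging style. The kernel, length-scale and test function below are arbitrary choices, and the authors' nested composition of two polynomials is not reproduced.

    ```python
    import numpy as np

    # Generic GP regression with a polynomial trend as the mean function
    # (universal kriging). Not the authors' nested-polynomial parametrization.
    def rbf(X1, X2, ell=0.1):
        d = X1[:, None] - X2[None, :]
        return np.exp(-0.5 * (d / ell) ** 2)

    def fit_predict(X, y, Xs, degree=2, noise=1e-8):
        F = np.vander(X, degree + 1)        # polynomial basis at training points
        Fs = np.vander(Xs, degree + 1)      # ... and at prediction points
        K = rbf(X, X) + noise * np.eye(len(X))
        Kinv = np.linalg.inv(K)
        # Generalized least squares estimate of the trend coefficients
        beta = np.linalg.solve(F.T @ Kinv @ F, F.T @ Kinv @ y)
        resid = y - F @ beta
        # Trend prediction plus GP interpolation of the residuals
        return Fs @ beta + rbf(Xs, X) @ Kinv @ resid

    X = np.linspace(0, 1, 8)
    y = X ** 2 + 0.1 * np.sin(20 * X)
    pred = fit_predict(X, y, X)             # interpolates the training data
    ```

    With a near-zero noise term the predictor reproduces the training outputs, which is the interpolation property that makes kriging attractive as a surrogate for deterministic codes.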

  15. The MIT Lincoln Laboratory RT-04F Diarization Systems: Applications to Broadcast Audio and Telephone Conversations

    DTIC Science & Technology

    2004-11-01

    this paper we describe the systems developed by MITLL and used in the DARPA EARS Rich Transcription Fall 2004 (RT-04F) speaker diarization evaluation...many types of audio sources, the focus of the DARPA EARS project and the NIST Rich Transcription evaluations is primarily speaker diarization...present or samples of any of the speakers. An overview of the general diarization problem and approaches can be found in [1]. In this paper, we

  16. Implicit and Explicit Motivational Tendencies to Faces Varying in Trustworthiness and Dominance in Men

    PubMed Central

    Radke, Sina; Kalt, Theresa; Wagels, Lisa; Derntl, Birgit

    2018-01-01

    Motivational tendencies to happy and angry faces are well-established, e.g., in the form of aggression. Approach-avoidance reactions are not only elicited by emotional expressions, but also linked to the evaluation of stable, social characteristics of faces. Grounded in the two fundamental dimensions of face-based evaluations proposed by Oosterhof and Todorov (2008), the current study tested whether emotionally neutral faces varying in trustworthiness and dominance potentiate approach-avoidance in 50 healthy male participants. Given that evaluations of social traits are influenced by testosterone, we further tested for associations of approach-avoidance tendencies with endogenous and prenatal indicators of testosterone. Computer-generated faces signaling high and low trustworthiness and dominance were used to elicit motivational reactions in three approach-avoidance tasks, i.e., one implicit and one explicit joystick-based paradigm, and an additional rating task. When participants rated their behavioral tendencies, highly trustworthy faces evoked approach, and highly dominant faces evoked avoidance. This pattern, however, did not translate to faster initiation times of corresponding approach-avoidance movements. Instead, the joystick tasks revealed general effects, such as faster reactions to faces signaling high trustworthiness or high dominance. These findings partially support the framework of Oosterhof and Todorov (2008) in guiding approach-avoidance decisions, but not behavioral tendencies. Contrary to our expectations, neither endogenous nor prenatal indicators of testosterone were associated with motivational tendencies. Future studies should investigate the contexts in which testosterone influences social motivation. PMID:29410619

  17. Happy but not so approachable: the social judgments of individuals with generalized social phobia.

    PubMed

    Campbell, D W; Sareen, J; Stein, M B; Kravetsky, L B; Paulus, M P; Hassard, S T; Reiss, J P

    2009-01-01

    We examined social approachability judgments in a psychiatric population that frequently experiences interpersonal difficulties and reduced social satisfaction, individuals with generalized social phobia (gSP). Our objective was to broaden the understanding of the social cognitive tendencies of individuals with gSP by systematically investigating their interpretation of positive facial expressions. We hypothesized that approachability ratings would be lower for positive as well as negative emotional faces in the gSP group compared to the healthy comparison group. Each participant evaluated 24 emotional faces presented on a computer screen. Participants first labeled the faces as either happy, disgust, or angry in emotional expression, and then they rated each face's approachability. Analysis of variance and post hoc analyses were used to identify group, emotion, and group by emotion rating differences. Happy face approachability ratings were higher than disgust and anger in both groups. The central finding was that individuals with gSP rated happy faces as less approachable than the healthy participants and that degree of social anxiety was associated with lower approachability ratings within the gSP sample. Explicit approachability judgments of negative faces did not differ as predicted. Consistent with earlier indirect evidence of interpretation biases of positive social emotional information, this study reveals that individuals with gSP demonstrate explicit, subjective social interpretation biases of overtly positive social feedback. The therapeutic relevance of these results is discussed.

  18. Deformation-Aware Log-Linear Models

    NASA Astrophysics Data System (ADS)

    Gass, Tobias; Deselaers, Thomas; Ney, Hermann

    In this paper, we present a novel deformation-aware discriminative model for handwritten digit recognition. Unlike previous approaches, our model directly considers image deformations and allows discriminative training of all parameters, including those accounting for non-linear transformations of the image. This is achieved by extending a log-linear framework to incorporate a latent deformation variable. The resulting model has an order of magnitude fewer parameters than competing approaches to handling image deformations. We tune and evaluate our approach on the USPS task and show its generalization capabilities by applying the tuned model to the MNIST task. We gain interesting insights and achieve highly competitive results on both tasks.
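    The latent-deformation idea can be illustrated with a toy log-linear classifier whose class scores are maximized over a small set of deformations before the softmax. Everything below (1-D "images", shift-only deformations, fixed untrained weights) is a simplified stand-in for the model described above.

    ```python
    import numpy as np

    # Toy log-linear classifier with a latent deformation variable: each
    # class score is maximized over a small set of shifts before softmax.
    def shifts(x):
        # Candidate deformations: shift the 1-D "image" by -1, 0, +1 pixels
        return [np.roll(x, d) for d in (-1, 0, 1)]

    def class_scores(x, W):
        # Max over deformations of the linear score, one entry per class
        return np.array([max(float(w @ xd) for xd in shifts(x)) for w in W])

    def predict(x, W):
        s = class_scores(x, W)
        p = np.exp(s - s.max())
        return p / p.sum()    # softmax over deformation-maximized scores

    W = np.array([[1.0, 0.0, 0.0, 0.0],   # class 0 template: peak at pixel 0
                  [0.0, 0.0, 1.0, 0.0]])  # class 1 template: peak at pixel 2
    x = np.array([0.0, 1.0, 0.0, 0.0])    # peak at pixel 1
    probs = predict(x, W)
    ```

    The input's peak sits one shift away from both class templates, so after maximizing over deformations the two classes tie at probability 0.5, illustrating the deformation invariance the latent variable buys.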

  19. The posterior transtriceps approach for elbow arthrography: a forgotten technique?

    PubMed

    Lohman, M; Borrero, C; Casagranda, B; Rafiee, B; Towers, J

    2009-05-01

    To evaluate the technical feasibility of performing elbow MR arthrography via a posterior approach through the triceps. The images of 19 patients with elbow MR arthrography via a posterior transtriceps approach were retrospectively studied. The injections were performed by four musculoskeletal radiologists, using fluoroscopic guidance and a 22- or 25-gauge needle. The fluoroscopic and subsequent MR images were reviewed by two musculoskeletal radiologists and evaluated for adequacy of joint capsular distention, degree and location of contrast leakage, and presence of gas bubbles. The injection was diagnostic in all 19 patients, with a sufficient amount of contrast agent seen in the elbow joint. No significant contrast leakage occurred in 12 patients who received injections of 8 cc or less of contrast agent, but moderate contrast leakage occurred in 6/7 patients who received injections of greater than 8 cc. Contrast leakage generally occurred within the triceps myotendinous junction. No gas bubbles were identified in the injected joints. Patients often present for MR arthrography of the elbow with medial or lateral elbow pain. Contrast leakage during a radiocapitellar approach may complicate evaluation of the lateral collateral ligament or the common extensor tendon origin. Transtriceps MR arthrography offers an alternative to the more commonly used radiocapitellar approach. With injected volumes not exceeding 8 cc, the risk of significant contrast leakage is small. An advantage of the transtriceps injection is that contrast leakage through the posterior needle tract does not interfere with evaluation of the lateral structures.

  20. Local fields and effective conductivity tensor of ellipsoidal particle composite with anisotropic constituents

    NASA Astrophysics Data System (ADS)

    Kushch, Volodymyr I.; Sevostianov, Igor; Giraud, Albert

    2017-11-01

    An accurate semi-analytical solution of the conductivity problem for a composite with anisotropic matrix and arbitrarily oriented anisotropic ellipsoidal inhomogeneities has been obtained. The developed approach combines the superposition principle with the multipole expansion of perturbation fields of inhomogeneities in terms of ellipsoidal harmonics and reduces the boundary value problem to an infinite system of linear algebraic equations for the induced multipole moments of inhomogeneities. A complete full-field solution is obtained for the multi-particle models comprising inhomogeneities of diverse shape, size, orientation and properties which enables an adequate account for the microstructure parameters. The solution is valid for the general-type anisotropy of constituents and arbitrary orientation of the orthotropy axes. The effective conductivity tensor of the particulate composite with anisotropic constituents is evaluated in the framework of the generalized Maxwell homogenization scheme. Application of the developed method to composites with imperfect ellipsoidal interfaces is straightforward. Their incorporation yields probably the most general model of a composite that may be considered in the framework of analytical approach.
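    For orientation, the simplest isotropic special case of the Maxwell homogenization scheme (spherical inhomogeneities of conductivity $\sigma_i$ at volume fraction $c$ in a matrix of conductivity $\sigma_m$; this is the classical formula, not the anisotropic generalization derived in the paper) reads

    ```latex
    \sigma_{\mathrm{eff}}
      = \sigma_{m}\left(1 + \frac{3c\,\beta}{1 - c\,\beta}\right),
    \qquad
    \beta = \frac{\sigma_{i} - \sigma_{m}}{\sigma_{i} + 2\sigma_{m}}
    ```

    The paper's contribution is to replace the single-inclusion dipole term behind $\beta$ with full multipole moments for arbitrarily oriented anisotropic ellipsoids.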

  1. An analysis of annual maximum streamflows in Terengganu, Malaysia using TL-moments approach

    NASA Astrophysics Data System (ADS)

    Ahmad, Ummi Nadiah; Shabri, Ani; Zakaria, Zahrahtul Amani

    2013-02-01

    The TL-moments approach has been used in an analysis to determine the best-fitting distributions to represent the annual series of maximum streamflow data over 12 stations in Terengganu, Malaysia. TL-moments with different trimming values are used to estimate the parameters of the selected distributions, namely the generalized Pareto (GPA), generalized logistic, and generalized extreme value distributions. The influence of TL-moments on the estimated probability distribution functions is examined by evaluating the relative root mean square error and relative bias of quantile estimates through Monte Carlo simulations. The boxplot is used to show the location of the median and the dispersion of the data, which helps in reaching decisive conclusions. For most of the cases, the results show that TL-moments with the single smallest value trimmed from the conceptual sample (TL-moments(1,0)), combined with the GPA distribution, were the most appropriate choice in the majority of stations for describing the annual maximum streamflow series in Terengganu, Malaysia.
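    The relative RMSE / relative bias comparison via Monte Carlo simulation mentioned above can be sketched generically. The distribution (standard normal) and the plain sample-quantile estimator below are placeholders, not the GPA fit or TL-moment estimators of the study.

    ```python
    import random, statistics

    # Monte Carlo evaluation of a quantile estimator: simulate many samples,
    # estimate a high quantile from each, and compare against the truth via
    # relative bias and relative root mean square error (RRMSE).
    random.seed(1)

    def sample_quantile(data, p):
        s = sorted(data)
        return s[int(p * (len(s) - 1))]

    true_q = statistics.NormalDist().inv_cdf(0.99)   # true 99th percentile
    ests = []
    for _ in range(500):
        data = [random.gauss(0, 1) for _ in range(100)]
        ests.append(sample_quantile(data, 0.99))

    rel_bias = (statistics.mean(ests) - true_q) / true_q
    rrmse = (statistics.mean([(e - true_q) ** 2 for e in ests]) ** 0.5) / true_q
    ```

    Repeating this for each candidate distribution and trimming value, and comparing the resulting RRMSE and bias, is the selection logic the abstract describes.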

  2. An Evaluation of Four MTS Recurrent Training Courses,

    DTIC Science & Technology

    1978-09-01

    management-by-objectives and team action. The evaluation of each of these four recurrent training courses follows the general approach taken in the earlier... Federal Aviation Administration, Washington DC. An Evaluation of Four MTS Recurrent Training Courses. Sep 78.

  3. Assessing older drivers: a primary care protocol to evaluate driving safety risk.

    PubMed

    Murden, Robert A; Unroe, Kathleen

    2005-08-01

    Most articles on elder drivers offer either general advice, or review testing protocols that divide drivers into two distinct groups: safe or unsafe. We believe it is unreasonable to expect any testing to fully separate drivers into just these two mutually exclusive groups, so we offer a protocol for a more practical approach. This protocol can be applied by primary care physicians. We review the justification for the many steps of this protocol, which have branches that lead to identifying drivers as low risk, high risk (for accidents) or needing further evaluation. Options for further evaluation are provided.

  4. The trajectory of scientific discovery: concept co-occurrence and converging semantic distance.

    PubMed

    Cohen, Trevor; Schvaneveldt, Roger W

    2010-01-01

    The paradigm of literature-based knowledge discovery originated by Swanson involves finding meaningful associations between terms or concepts that have not occurred together in any previously published document. While several automated approaches have been applied to this problem, these generally evaluate the literature at a point in time, and do not evaluate the role of change over time in distributional statistics as an indicator of meaningful implicit associations. To address this issue, we develop and evaluate Symmetric Random Indexing (SRI), a novel variant of the Random Indexing (RI) approach that is able to measure implicit association over time. SRI is found to compare favorably to existing RI variants in the prediction of future direct co-occurrence. Summary statistics over several experiments suggest a trend of converging semantic distance prior to the co-occurrence of key terms for two seminal historical literature-based discoveries.
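    The core quantity, semantic distance between two terms tracked across literature snapshots, can be sketched with plain cosine distance over context-count vectors. The counts below are invented (the fish oil / Raynaud pair is Swanson's classic discovery, used here only as a label), and the paper's Symmetric Random Indexing builds its vectors from random index vectors rather than raw counts.

    ```python
    import math

    # Cosine distance between term vectors; convergence of this distance
    # over time is the proposed signal of an emerging implicit association.
    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv)

    # Invented context counts for two terms at two points in time
    fish_oil_1995 = [3, 0, 1, 0]
    raynaud_1995  = [0, 2, 0, 4]
    fish_oil_1998 = [3, 1, 2, 1]
    raynaud_1998  = [1, 2, 1, 4]

    # Shrinking distance before any direct co-occurrence would suggest
    # an emerging implicit link between the two terms
    d_early = 1 - cosine(fish_oil_1995, raynaud_1995)
    d_late  = 1 - cosine(fish_oil_1998, raynaud_1998)
    ```

    Here the early vectors share no contexts (distance 1), while the later ones overlap, so the distance falls; the paper evaluates whether such convergence predicts future direct co-occurrence.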

  5. Management of unmanned moving sensors through human decision layers: a bi-level optimization process with calls to costly sub-processes

    NASA Astrophysics Data System (ADS)

    Dambreville, Frédéric

    2013-10-01

    While there is a variety of approaches and algorithms for optimizing the mission of an unmanned moving sensor, there are much less works which deal with the implementation of several sensors within a human organization. In this case, the management of the sensors is done through at least one human decision layer, and the sensors management as a whole arises as a bi-level optimization process. In this work, the following hypotheses are considered as realistic: Sensor handlers of first level plans their sensors by means of elaborated algorithmic tools based on accurate modelling of the environment; Higher level plans the handled sensors according to a global observation mission and on the basis of an approximated model of the environment and of the first level sub-processes. This problem is formalized very generally as the maximization of an unknown function, defined a priori by sampling a known random function (law of model error). In such case, each actual evaluation of the function increases the knowledge about the function, and subsequently the efficiency of the maximization. The issue is to optimize the sequence of value to be evaluated, in regards to the evaluation costs. There is here a fundamental link with the domain of experiment design. Jones, Schonlau and Welch proposed a general method, the Efficient Global Optimization (EGO), for solving this problem in the case of additive functional Gaussian law. In our work, a generalization of the EGO is proposed, based on a rare event simulation approach. It is applied to the aforementioned bi-level sensor planning.

  6. The effects of substrate pre-treatment on anaerobic digestion systems: a review.

    PubMed

    Carlsson, My; Lagerkvist, Anders; Morgan-Sagastume, Fernando

    2012-09-01

    Focus is placed on substrate pre-treatment in anaerobic digestion (AD) as a means of increasing biogas yields using today's diversified substrate sources. Current pre-treatment methods to improve AD are examined with regard to their effects on different substrate types, highlighting approaches and associated challenges in evaluating substrate pre-treatment in AD systems and its influence on the overall system of evaluation. WWTP residues represent the substrate type that is most frequently assessed in pre-treatment studies, followed by energy crops/harvesting residues, organic fraction of municipal solid waste, organic waste from food industry and manure. The pre-treatment effects are complex and generally linked to substrate characteristics and pre-treatment mechanisms. Overall, substrates containing lignin or bacterial cells appear to be the most amenable to pre-treatment for enhancing AD. Approaches used to evaluate AD enhancement in different systems are further reviewed, and challenges and opportunities for improved evaluations are identified. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. A Non-parametric Cutout Index for Robust Evaluation of Identified Proteins*

    PubMed Central

    Serang, Oliver; Paulo, Joao; Steen, Hanno; Steen, Judith A.

    2013-01-01

    This paper proposes a novel, automated method for evaluating sets of proteins identified using mass spectrometry. The remaining peptide-spectrum match score distributions of protein sets are compared to an empirical absent peptide-spectrum match score distribution, and a Bayesian non-parametric method reminiscent of the Dirichlet process is presented to accurately perform this comparison. Thus, for a given protein set, the process computes the likelihood that the proteins identified are correctly identified. First, the method is used to evaluate protein sets chosen using different protein-level false discovery rate (FDR) thresholds, assigning each protein set a likelihood. The protein set assigned the highest likelihood is used to choose a non-arbitrary protein-level FDR threshold. Because the method can be used to evaluate any protein identification strategy (and is not limited to mere comparisons of different FDR thresholds), we subsequently use the method to compare and evaluate multiple simple methods for merging peptide evidence over replicate experiments. The general statistical approach can be applied to other types of data (e.g. RNA sequencing) and generalizes to multivariate problems. PMID:23292186
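    For context, a protein-level FDR threshold of the kind being compared can be sketched with the standard Benjamini-Hochberg procedure. The paper's contribution, scoring the resulting candidate protein sets by a non-parametric likelihood to pick a non-arbitrary threshold, is not shown, and the p-values below are invented.

    ```python
    # Benjamini-Hochberg: accept the largest prefix of the sorted p-values
    # whose members satisfy p <= fdr * rank / n.
    def bh_accept(pvals, fdr):
        n = len(pvals)
        order = sorted(range(n), key=lambda i: pvals[i])
        k = 0
        for rank, i in enumerate(order, start=1):
            if pvals[i] <= fdr * rank / n:
                k = rank
        return sorted(order[:k])   # indices of accepted proteins

    pvals = [0.001, 0.008, 0.039, 0.041, 0.30, 0.74]
    accepted = bh_accept(pvals, fdr=0.05)
    ```

    Sweeping `fdr` over a grid yields the family of candidate protein sets that the paper's likelihood-based evaluation would then compare.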

  8. Male sexual strategies modify ratings of female models with specific waist-to-hip ratios.

    PubMed

    Brase, Gary L; Walker, Gary

    2004-06-01

    Female waist-to-hip ratio (WHR) has generally been an important predictor of ratings of physical attractiveness and related characteristics. Individual differences in ratings do exist, however, and may be related to differences in the reproductive tactics of the male raters, such as pursuit of short-term or long-term relationships and adjustments based on perceptions of one's own quality as a mate. Forty males, categorized according to sociosexual orientation and physical qualities (WHR, Body Mass Index, and self-rated desirability), rated female models on both attractiveness and the likelihood that they would approach them. Sociosexually restricted males were less likely to approach females rated as most attractive (with 0.68-0.72 WHR), as compared with unrestricted males. Males with lower scores in terms of physical qualities gave ratings indicating more favorable evaluations of female models with lower WHR. The results indicate that attractiveness and willingness to approach are overlapping but distinguishable constructs, both of which are influenced by variations in characteristics of the raters.

  9. Generalizing boundaries for triangular designs, and efficacy estimation at extended follow-ups.

    PubMed

    Allison, Annabel; Edwards, Tansy; Omollo, Raymond; Alves, Fabiana; Magirr, Dominic; E Alexander, Neal D

    2015-11-16

    Visceral leishmaniasis (VL) is a parasitic disease transmitted by sandflies and is fatal if left untreated. Phase II trials of new treatment regimens for VL are primarily carried out to evaluate safety and efficacy, while pharmacokinetic data are also important to inform future combination treatment regimens. The efficacy of VL treatments is evaluated at two time points: initial cure, when treatment is completed, and definitive cure, commonly 6 months post end of treatment, to allow for slow response to treatment and detection of relapses. This paper investigates a generalization of the triangular design to impose a minimum sample size for pharmacokinetic or other analyses, and methods to estimate efficacy at extended follow-up accounting for the sequential design and changes in cure status during extended follow-up. We provide R functions that generalize the triangular design to impose a minimum sample size before allowing stopping for efficacy. For estimation of efficacy at a second, extended, follow-up time, the performance of a shrinkage estimator (SHE), a probability tree estimator (PTE) and the maximum likelihood estimator (MLE) was assessed by simulation. The SHE and PTE are viable approaches for estimating efficacy at extended follow-up, although the SHE performed better than the PTE: the bias and root mean square error were lower and coverage probabilities higher. Generalization of the triangular design is simple to implement for adaptations to meet requirements for pharmacokinetic analyses. Using the simple MLE approach to estimate efficacy at extended follow-up will lead to biased results, generally over-estimating treatment success. The SHE is recommended in trials of two or more treatments. The PTE is an acceptable alternative for one-arm trials or where use of the SHE is not possible due to computational complexity. NCT01067443, February 2010.
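    The shrinkage idea can be sketched at its simplest: pull each arm's observed cure proportion toward the pooled proportion. The fixed 0.5 weight and the counts below are invented, and the paper's SHE additionally accounts for the sequential stopping rule, which this sketch ignores.

    ```python
    # Toy shrinkage estimator across trial arms: each arm's observed cure
    # proportion is shrunk toward the pooled proportion. The weight would
    # normally be data-driven; here it is a made-up constant.
    def shrink(successes, totals, weight=0.5):
        pooled = sum(successes) / sum(totals)
        return [weight * (s / n) + (1 - weight) * pooled
                for s, n in zip(successes, totals)]

    # Two arms: 16/20 and 28/30 definitive cures
    ests = shrink(successes=[16, 28], totals=[20, 30])
    ```

    Shrinking toward the pooled rate trades a little bias for lower variance, which is why the SHE outperforms the raw MLE in the simulations described above.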

  10. Evaluation of telephone first approach to demand management in English general practice: observational study.

    PubMed

    Newbould, Jennifer; Abel, Gary; Ball, Sarah; Corbett, Jennie; Elliott, Marc; Exley, Josephine; Martin, Adam; Saunders, Catherine; Wilson, Edward; Winpenny, Eleanor; Yang, Miaoqing; Roland, Martin

    2017-09-27

    Objective  To evaluate a "telephone first" approach, in which all patients wanting to see a general practitioner (GP) are asked to speak to a GP on the phone before being given an appointment for a face to face consultation. Design  Time series and cross sectional analysis of routine healthcare data, data from national surveys, and primary survey data. Participants  147 general practices adopting the telephone first approach compared with a 10% random sample of other practices in England. Intervention  Management support for workload planning and introduction of the telephone first approach provided by two commercial companies. Main outcome measures  Number of consultations, total time consulting (59 telephone first practices, no controls). Patient experience (GP Patient Survey, telephone first practices plus controls). Use and costs of secondary care (hospital episode statistics, telephone first practices plus controls). The main analysis was intention to treat, with sensitivity analyses restricted to practices thought to be closely following the companies' protocols. Results  After the introduction of the telephone first approach, face to face consultations decreased considerably (adjusted change within practices -38%, 95% confidence interval -45% to -29%; P<0.001). An average practice experienced a 12-fold increase in telephone consultations (1204%, 633% to 2290%; P<0.001). The average duration of both telephone and face to face consultations decreased, but there was an overall increase of 8% in the mean time spent consulting by GPs, albeit with large uncertainty on this estimate (95% confidence interval -1% to 17%; P=0.088). These average workload figures mask wide variation between practices, with some practices experiencing a substantial reduction in workload and others a large increase. 
Compared with other English practices in the national GP Patient Survey, in practices using the telephone first approach there was a large (20.0 percentage points, 95% confidence interval 18.2 to 21.9; P<0.001) improvement in length of time to be seen. In contrast, other scores on the GP Patient Survey were slightly more negative. Introduction of the telephone first approach was followed by a small (2.0%) increase in hospital admissions (95% confidence interval 1% to 3%; P=0.006), no initial change in emergency department attendance, but a small (2% per year) decrease in the subsequent rate of rise of emergency department attendance (1% to 3%; P=0.005). There was a small net increase in secondary care costs. Conclusions  The telephone first approach shows that many problems in general practice can be dealt with over the phone. The approach does not suit all patients or practices and is not a panacea for meeting demand. There was no evidence to support claims that the approach would, on average, save costs or reduce use of secondary care.

  11. Evaluation of telephone first approach to demand management in English general practice: observational study

    PubMed Central

    Newbould, Jennifer; Abel, Gary; Ball, Sarah; Corbett, Jennie; Elliott, Marc; Exley, Josephine; Martin, Adam; Saunders, Catherine; Wilson, Edward; Winpenny, Eleanor; Yang, Miaoqing

    2017-01-01

    Objective To evaluate a “telephone first” approach, in which all patients wanting to see a general practitioner (GP) are asked to speak to a GP on the phone before being given an appointment for a face to face consultation. Design Time series and cross sectional analysis of routine healthcare data, data from national surveys, and primary survey data. Participants 147 general practices adopting the telephone first approach compared with a 10% random sample of other practices in England. Intervention Management support for workload planning and introduction of the telephone first approach provided by two commercial companies. Main outcome measures Number of consultations, total time consulting (59 telephone first practices, no controls). Patient experience (GP Patient Survey, telephone first practices plus controls). Use and costs of secondary care (hospital episode statistics, telephone first practices plus controls). The main analysis was intention to treat, with sensitivity analyses restricted to practices thought to be closely following the companies’ protocols. Results After the introduction of the telephone first approach, face to face consultations decreased considerably (adjusted change within practices −38%, 95% confidence interval −45% to −29%; P<0.001). An average practice experienced a 12-fold increase in telephone consultations (1204%, 633% to 2290%; P<0.001). The average duration of both telephone and face to face consultations decreased, but there was an overall increase of 8% in the mean time spent consulting by GPs, albeit with large uncertainty on this estimate (95% confidence interval −1% to 17%; P=0.088). These average workload figures mask wide variation between practices, with some practices experiencing a substantial reduction in workload and others a large increase. 
Compared with other English practices in the national GP Patient Survey, in practices using the telephone first approach there was a large (20.0 percentage points, 95% confidence interval 18.2 to 21.9; P<0.001) improvement in length of time to be seen. In contrast, other scores on the GP Patient Survey were slightly more negative. Introduction of the telephone first approach was followed by a small (2.0%) increase in hospital admissions (95% confidence interval 1% to 3%; P=0.006), no initial change in emergency department attendance, but a small (2% per year) decrease in the subsequent rate of rise of emergency department attendance (1% to 3%; P=0.005). There was a small net increase in secondary care costs. Conclusions The telephone first approach shows that many problems in general practice can be dealt with over the phone. The approach does not suit all patients or practices and is not a panacea for meeting demand. There was no evidence to support claims that the approach would, on average, save costs or reduce use of secondary care. PMID:28954741

  12. On the equivalence of generalized least-squares approaches to the evaluation of measurement comparisons

    NASA Astrophysics Data System (ADS)

    Koo, A.; Clare, J. F.

    2012-06-01

    Analysis of CIPM international comparisons is increasingly being carried out using a model-based approach that leads naturally to a generalized least-squares (GLS) solution. While this method offers the advantages of being easier to audit and having general applicability to any form of comparison protocol, there is a lack of consensus over aspects of its implementation. Two significant results are presented that show the equivalence of three differing approaches discussed by or applied in comparisons run by Consultative Committees of the CIPM. Both results depend on a mathematical condition equivalent to the requirement that any two artefacts in the comparison are linked through a sequence of measurements of overlapping pairs of artefacts. The first result is that a GLS estimator excluding all sources of error common to all measurements of a participant is equal to the GLS estimator incorporating all sources of error, including those associated with any bias in the standards or procedures of the measuring laboratory. The second result identifies the component of uncertainty in the estimate of bias that arises from possible systematic effects in the participants' measurement standards and procedures. The expression so obtained is a generalization of an expression previously published for a one-artefact comparison with no inter-participant correlations, to one for a comparison comprising any number of repeat measurements of multiple artefacts and allowing for inter-laboratory correlations.
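    The model-based GLS evaluation the abstract describes can be sketched on a toy comparison. The pairwise measurement layout, values, and variances below are invented for illustration, not taken from any CIPM exercise; note that every artefact is linked to every other through overlapping pairs, the condition both equivalence results rely on.

```python
import numpy as np

# Hypothetical toy comparison: three artefacts A0, A1, A2 measured in
# overlapping pairs.  Each measurement y_k is the difference between two
# artefact values plus noise.
X = np.array([
    [1.0, -1.0,  0.0],   # measures A0 - A1
    [0.0,  1.0, -1.0],   # measures A1 - A2
    [1.0,  0.0, -1.0],   # measures A0 - A2
    [1.0, -1.0,  0.0],   # repeat of A0 - A1
])
y = np.array([0.50, 0.30, 0.82, 0.48])

# Covariance of the measurements (here uncorrelated, unequal variances).
V = np.diag([0.01, 0.01, 0.02, 0.01])

# The difference design is rank deficient (only differences are observable),
# so fix the reference value A0 = 0 by dropping its column.
Xr = X[:, 1:]
Vinv = np.linalg.inv(V)
beta = np.linalg.solve(Xr.T @ Vinv @ Xr, Xr.T @ Vinv @ y)  # GLS estimator
cov_beta = np.linalg.inv(Xr.T @ Vinv @ Xr)                 # its covariance
print(beta)  # estimates of A1 and A2 relative to A0
```

    In a real comparison the covariance V would also carry the inter-laboratory correlation terms whose treatment the paper shows to be equivalent across the three approaches.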

  13. A POGIL approach to teaching engineering hydrology

    NASA Astrophysics Data System (ADS)

    Rutten, M.

    2012-12-01

    This paper presents a case study of the author's experience using Process Oriented Guided Inquiry Learning (POGIL) in an engineering hydrology course. The course is part of an interdisciplinary Bachelor-level Water Management programme in the Netherlands. The aims of this approach were to promote active construction of knowledge, activate critical thinking, and reduce math anxiety. POGIL was developed for chemistry education in the United States; to the author's knowledge, this is the first application of the approach in Europe. A first trial was run in 2010-2011 and a second in 2011-2012, with 55 students participating. The problems that motivated the novel approach, general information on POGIL, and its implementation in the course are discussed, and the results so far are evaluated.

  14. Approaches to veterinary education--tracking versus a final year broad clinical experience. Part one: effects on career outcome.

    PubMed

    Klosterman, E S; Kass, P H; Walsh, D A

    2009-08-01

    This is the first of two papers that provide extensive data and analysis on the two major approaches to clinical veterinary education: those that give students experience with a broad range of species (often defined as omni/general clinical competence), and those that focus on just a few species (sometimes only one), usually termed 'tracking'. Together the two papers provide a detailed analysis of these two approaches for the first time. The responsibilities of veterinary medicine and veterinary education are rapidly increasing throughout the globe. It is critical for all in veterinary education to reassess the approaches that have been used, and to evaluate on a school-by-school basis which may best meet each school's expanding and ever-deepening responsibilities.

  15. Life support approaches for Mars missions

    NASA Astrophysics Data System (ADS)

    Drysdale, A. E.; Ewert, M. K.; Hanford, A. J.

    Life support approaches for Mars missions are evaluated using an equivalent system mass (ESM) approach, in which all significant costs are converted into mass units. The best approach, as defined by the lowest mission ESM, depends on several mission parameters, notably duration, environment and consequent infrastructure costs, and crew size, as well as the characteristics of the technologies which are available. Generally, for the missions under consideration, physicochemical regeneration is most cost effective. However, bioregeneration is likely to be of use for producing salad crops for any mission, for producing staple crops for medium duration missions, and for most food, air and water regeneration for long missions (durations of a decade). Potential applications of in situ resource utilization need to be considered further.
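    The equivalent system mass idea folds every significant cost (volume, power, cooling, crew time) into kilograms via equivalence factors, so options can be compared on one axis. A minimal sketch; the factors and the two option profiles below are made-up illustrative numbers, not the paper's mission parameters.

```python
# Hypothetical ESM comparison of two life-support options.  The equivalence
# factors (kg per m^3, per kW of power, per kW of cooling, per crew hour)
# are invented for illustration.
def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw, crewtime_h,
                           v_eq=9.0, p_eq=87.0, c_eq=60.0, ct_eq=0.5):
    """Convert all significant resource costs of an option into mass units."""
    return (mass_kg + volume_m3 * v_eq + power_kw * p_eq
            + cooling_kw * c_eq + crewtime_h * ct_eq)

# Invented option profiles: a compact physicochemical system versus a
# bioregenerative one with large volume, power, and crew-time demands.
physicochemical = equivalent_system_mass(4000, 20, 6, 6, 200)
bioregenerative = equivalent_system_mass(9000, 120, 60, 60, 4000)
print(physicochemical, bioregenerative)
```

    With these invented inputs the physicochemical option wins, mirroring the paper's finding for short and medium missions; for decade-long missions, avoided food resupply mass would shift the balance.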

  16. Juvenile Crime, Juvenile Justice. Panel on Juvenile Crime: Prevention, Treatment, and Control.

    ERIC Educational Resources Information Center

    McCord, Joan, Ed.; Widom, Cathy Spatz, Ed.; Crowell, Nancy A., Ed.

    This book discusses patterns and trends in crimes committed by children and adolescents, analyzing youth crime as a subset of general crime and studying the impact of race and gender. It evaluates different approaches to forecasting future crime rates. Data come from a national panel that examined what is known about juvenile crime and its…

  17. Does Extending Foster Care beyond Age 18 Promote Postsecondary Educational Attainment? Chapin Hall Issue Brief

    ERIC Educational Resources Information Center

    Dworsky, Amy; Courtney, Mark

    2010-01-01

    Although foster youth approaching the transition to adulthood have postsecondary educational aspirations similar to those of young people in the general population, for too many foster youth with these aspirations, a college education remains an unfulfilled dream. Previous analyses of data from the Midwest Evaluation of the Adult Functioning of…

  18. A Quantitative and Model-Driven Approach to Assessing Higher Education in the United States of America

    ERIC Educational Resources Information Center

    Huang, Zuqing; Qiu, Robin G.

    2016-01-01

    University ranking or higher education assessment in general has been attracting more and more public attention over the years. However, the subjectivity-based evaluation index and indicator selections and weights that are widely adopted in most existing ranking systems have been called into question. In other words, the objectivity and…

  19. Crystals for stellar spectrometers

    NASA Technical Reports Server (NTRS)

    Alexandropoulos, N. G.; Cohen, G. G.

    1974-01-01

    Crystal evaluation as it applies to instrumentation employed in X-ray astronomy is reviewed, and some solutions are offered to problems that are commonly encountered. A general approach for selecting the most appropriate crystals for a given problem is also suggested. The energy dependence of the diffraction properties of (002) PET, (111) Ge, (101) ADP, (101) KAP, and (001) RAP is reported.

  20. Effects of a "Learn to Think" Intervention Programme on Primary School Students

    ERIC Educational Resources Information Center

    Hu, Weiping; Adey, Philip; Jia, Xiaojuan; Liu, Jia; Zhang, Lei; Li, Jing; Dong, Xiaomei

    2011-01-01

    Background: Methods for teaching thinking may be described as out-of-context or infusion. Both approaches have potential to raise students' general cognitive processing ability and so raise academic achievement, but each has disadvantages. Aims: To describe and evaluate a theory-based learn to think (LTT) curriculum for primary school students,…

  1. The Concerted Services Approach to Developmental Change in Rural Areas: An Interim Evaluation. Center Research and Development Report No. 1.

    ERIC Educational Resources Information Center

    Griessman, B. Eugene, Ed.

    In 1965 Concerted Services in Training and Education (CSTE) began operation in three selected rural counties of New Mexico, Arkansas, and Minnesota with objectives of: (1) developing general operational patterns for alleviation and solution of occupational education problems, (2) identifying employment opportunities and occupational education…

  2. Decoding Skills of Middle-School Students with Autism: An Evaluation of the Nonverbal Reading Approach

    ERIC Educational Resources Information Center

    Leytham, Patrick Allen

    2013-01-01

    Students diagnosed with autism demonstrate a deficit in communication skills, which affects their literacy skills. Federal legislation mandates that students with disabilities receive a free appropriate public education, be taught how to read, and have access to the general education curriculum. Students with autism are being included more in the…

  3. Minimizing Experimental Error in Thinning Research

    Treesearch

    C. B. Briscoe

    1964-01-01

    Many diverse approaches have been proposed for prescribing and evaluating thinnings on an objective basis. None of the techniques proposed has been widely accepted; indeed, none has been proven superior to the others, nor even widely applicable. There are at least two possible reasons for this: none of the techniques suggested is of any general utility, and/or experimental error...

  4. Expanding horizons of forest ecosystem management: proceedings of the third habitat futures workshop; 1992 October; Vernon, B.C.

    Treesearch

    Mark H. Huff; Lisa K. Norris; J. Brian Nyberg; Nancy L. Wilkin; coords.

    1994-01-01

    New approaches and technologies to evaluate wildlife-habitat relations, implement integrated forest management, and improve public participation in the process are needed to implement ecosystem management. Presented here are five papers that examine ecosystem management concepts at international, national, regional, and local scales. Two general management problems...

  5. Estimating tree species diversity across geographic scales

    Treesearch

    Susanne Winter; Andreas Böck; Ronald E. McRoberts

    2012-01-01

    The relationship between number of species and area observed has been described using numerous approaches and has been discussed for more than a century. The general objectives of our study were fourfold: (1) to evaluate the behaviour of species-area curves across geographic scales, (2) to determine sample sizes necessary to produce acceptably precise estimates of tree...

  6. Assessing tree and stand biomass: a review with examples and critical comparisons

    Treesearch

    Bernard R. Parresol

    1999-01-01

    There is considerable interest today in estimating the biomass of trees and forests for both practical forestry issues and scientific purposes. New techniques and procedures are brought together along with the more traditional approaches to estimating woody biomass. General model forms and weighted analysis are reviewed, along with statistics for evaluating and...

  7. Securing the Long-Term Bases of the Dual System: A Realistic Evaluation of Apprenticeship Marketing in Switzerland

    ERIC Educational Resources Information Center

    Sager, Fritz

    2008-01-01

    The dual system of vocational training, utilising both company training and vocational school attendance, is generally acknowledged to be a successful model for reducing youth unemployment. However, the decreasing number of training opportunities in countries with this system poses a crisis for the approach. One strategy for overcoming the problem…

  8. A Fuzzy-Based Prior Knowledge Diagnostic Model with Multiple Attribute Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Huang, Yueh-Min

    2013-01-01

    Prior knowledge is a very important part of teaching and learning, as it affects how instructors and students interact with the learning materials. In general, tests are used to assess students' prior knowledge. Nevertheless, conventional testing approaches usually assign only an overall score to each student, and this may mean that students are…

  9. Multivariable feedback design - Concepts for a classical/modern synthesis

    NASA Technical Reports Server (NTRS)

    Doyle, J. C.; Stein, G.

    1981-01-01

    This paper presents a practical design perspective on multivariable feedback control problems. It reviews the basic issue - feedback design in the face of uncertainties - and generalizes known single-input, single-output (SISO) statements and constraints of the design problem to multiinput, multioutput (MIMO) cases. Two major MIMO design approaches are then evaluated in the context of these results.

  10. Measuring Library Space Use and Preferences: Charting a Path toward Increased Engagement

    ERIC Educational Resources Information Center

    Webb, Kathleen M.; Schaller, Molly A.; Hunley, Sawyer A.

    2008-01-01

    The University of Dayton (UD) used a multi-method research approach to evaluate current space use in the library. A general campus survey on study spaces, online library surveys, a week-long video study, and data from the "National Survey of Student Engagement (NSSE)" were examined to understand student choices in library usage. Results…

  11. Psycho-Ecological Systems Model: A Systems Approach to Planning and Gauging the Community Impact of Community-Engaged Scholarship

    ERIC Educational Resources Information Center

    Reeb, Roger N.; Snow-Hill, Nyssa L.; Folger, Susan F.; Steel, Anne L.; Stayton, Laura; Hunt, Charles A.; O'Koon, Bernadette; Glendening, Zachary

    2017-01-01

    This article presents the Psycho-Ecological Systems Model (PESM)--an integrative conceptual model rooted in General Systems Theory (GST). PESM was developed to inform and guide the development, implementation, and evaluation of transdisciplinary (and multilevel) community-engaged scholarship (e.g., a participatory community action research project…

  12. Doing Dissections Differently: A Structured, Peer-Assisted Learning Approach to Maximizing Learning in Dissections

    ERIC Educational Resources Information Center

    Hall, Emma R.; Davis, Rachel C.; Weller, Renate; Powney, Sonya; Williams, Sarah B.

    2013-01-01

    Areas of difficulty faced by our veterinary medicine students, with respect to their learning in dissection classes, were identified. These challenges were both general adult-learning related and specific to the discipline of anatomy. Our aim was to design, implement, and evaluate a modified reciprocal peer-assisted/team-based learning…

  13. Study of launch site processing and facilities for future launch vehicles

    NASA Astrophysics Data System (ADS)

    Shaffer, Rex

    1995-03-01

    The purpose of this research is to provide innovative and creative approaches to assess the impact to the Kennedy Space Center and other launch sites for a range of candidate manned and unmanned space transportation systems. The general scope of the research includes the engineering activities, analyses, and evaluations defined in the four tasks below: (1) development of innovative approaches and computer aided tools; (2) operations analyses of launch vehicle concepts and designs; (3) assessment of ground operations impacts; and (4) development of methodologies to identify promising technologies.

  14. Study of launch site processing and facilities for future launch vehicles

    NASA Technical Reports Server (NTRS)

    Shaffer, Rex

    1995-01-01

    The purpose of this research is to provide innovative and creative approaches to assess the impact to the Kennedy Space Center and other launch sites for a range of candidate manned and unmanned space transportation systems. The general scope of the research includes the engineering activities, analyses, and evaluations defined in the four tasks below: (1) development of innovative approaches and computer aided tools; (2) operations analyses of launch vehicle concepts and designs; (3) assessment of ground operations impacts; and (4) development of methodologies to identify promising technologies.

  15. Shape-based approach for the estimation of individual facial mimics in craniofacial surgery planning

    NASA Astrophysics Data System (ADS)

    Gladilin, Evgeny; Zachow, Stefan; Deuflhard, Peter; Hege, Hans-Christian

    2002-05-01

    Besides the static soft tissue prediction, the estimation of basic facial emotion expressions is another important criterion for the evaluation of craniofacial surgery planning. For a realistic simulation of facial mimics, an adequate biomechanical model of soft tissue including the mimic musculature is needed. In this work, we present an approach for the modeling of arbitrarily shaped muscles and the estimation of basic individual facial mimics, which is based on the geometrical model derived from the individual tomographic data and the general finite element modeling of soft tissue biomechanics.

  16. Assessment of inappropriate antibiotic prescribing among a large cohort of general dentists in the United States.

    PubMed

    Durkin, Michael J; Feng, Qianxi; Warren, Kyle; Lockhart, Peter B; Thornhill, Martin H; Munshi, Kiraat D; Henderson, Rochelle R; Hsueh, Kevin; Fraser, Victoria J

    2018-05-01

    The purpose of this study was to assess dental antibiotic prescribing trends over time, to quantify the number and types of antibiotics dentists prescribe inappropriately, and to estimate the excess health care costs of inappropriate antibiotic prescribing with the use of a large cohort of general dentists in the United States. We used a quasi-Poisson regression model to analyze antibiotic prescription trends by general dentists between January 1, 2013, and December 31, 2015, with the use of data from Express Scripts Holding Company, a large pharmacy benefits manager. We evaluated antibiotic duration and appropriateness for general dentists. Appropriateness was evaluated by reviewing the antibiotic prescribed and the duration of the prescription. Overall, the number and rate of antibiotic prescriptions prescribed by general dentists remained stable in our cohort. During the 3-year study period, approximately 14% of antibiotic prescriptions were deemed inappropriate, based on the antibiotic prescribed, antibiotic treatment duration, or both indicators. The quasi-Poisson regression model, which adjusted for the number of beneficiaries covered, revealed a small but statistically significant decrease in the monthly rate of inappropriate antibiotic prescriptions by 0.32% (95% confidence interval, 0.14% to 0.50%; P = .001). Overall antibiotic prescribing practices among general dentists in this cohort remained stable over time. The rate of inappropriate antibiotic prescriptions by general dentists decreased slightly over time. By these authors' definition of appropriate antibiotic prescription choice and duration, inappropriate antibiotic prescriptions are common (14% of all antibiotic prescriptions) among general dentists. Further analyses with the use of chart review, administrative data sets, or other approaches are needed to better evaluate antibiotic prescribing practices among dentists.
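    As a rough illustration of estimating a monthly trend like the reported -0.32% change, here is a crude ordinary-least-squares fit on synthetic log rates. This is a stand-in, not the authors' method: they fitted a quasi-Poisson GLM adjusted for covered beneficiaries, and the data below are invented.

```python
import numpy as np

# Synthetic monthly rate of inappropriate prescriptions, Jan 2013 - Dec 2015,
# generated with a built-in -0.32%/month trend plus small multiplicative noise.
months = np.arange(36)
true_monthly_change = -0.0032
rng = np.random.default_rng(0)
rate = 0.14 * np.exp(true_monthly_change * months) * rng.normal(1.0, 0.005, 36)

# Log-linear OLS fit recovers the per-month relative change.
slope, intercept = np.polyfit(months, np.log(rate), 1)
print(f"estimated monthly change: {100 * (np.exp(slope) - 1):+.2f}%")
```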

  17. Life expectancy of colon, breast, and testicular cancer patients: an analysis of US-SEER population-based data.

    PubMed

    Capocaccia, R; Gatta, G; Dal Maso, L

    2015-06-01

    Cancer survivorship is an increasingly important issue in cancer control. Life expectancy of patients diagnosed with breast, colon, and testicular cancers, stratified by age at diagnosis and time since diagnosis, is provided as an indicator to evaluate future mortality risks and health care needs of cancer survivors. The standard period life table methodology was applied to estimate excess mortality risk for cancer patients diagnosed in 1985-2011 from SEER registries and mortality data of the general US population. The sensitivity of life expectancy estimates on different assumptions was evaluated. Younger patients with colon cancer showed wider differences in life expectancy compared with that of the general population (11.2 years in women and 10.7 in men at age 45-49 years) than older patients (6.3 and 5.8 at age 60-64 years, respectively). Life expectancy progressively increases in patients surviving the first years, up to 4 years from diagnosis, and then starts to decrease again, approaching that of the general population. For breast cancer, the initial drop in life expectancy is less marked, and again with wider differences in younger patients, varying from 8.7 at age 40-44 years to 2.4 at ages 70-74 years. After diagnosis, life expectancy still decreases with time, but less than that in the general population, slowly approaching that of cancer-free women. Life expectancy of men diagnosed with testicular cancer at age 30 years is estimated as 45.2 years, 2 years less than cancer-free men of the same age. The difference becomes 1.3 years for patients surviving the first year, and then slowly approaches zero with increasing survival time. Life expectancy provides meaningful information on cancer patients, and can help in assessing when a cancer survivor can be considered as cured.
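    The period life-table logic, remaining life expectancy as the sum of cumulative survival probabilities, can be sketched with an invented excess-hazard profile. The hazard values below are illustrative, not SEER estimates; they only reproduce the qualitative pattern of an early excess risk that fades with time since diagnosis.

```python
import numpy as np

# Remaining life expectancy approximated as the sum of one-year cumulative
# survival probabilities over a bounded age range.
def life_expectancy(annual_hazard):
    survival = np.cumprod(np.exp(-annual_hazard))
    return survival.sum()

ages = np.arange(45, 101)                               # ages 45..100
general_hazard = 0.002 * np.exp(0.09 * (ages - 45))     # Gompertz-like baseline
excess_hazard = np.where(ages - 45 < 5, 0.04, 0.0)      # excess in first 5 years

e_general = life_expectancy(general_hazard)
e_patient = life_expectancy(general_hazard + excess_hazard)
print(round(e_general, 1), round(e_patient, 1))
```

    Because the excess hazard vanishes after year 5, the patient curve runs parallel to, but below, the general-population curve from then on, which is the "slowly approaching" behaviour the abstract describes.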

  18. A framework for human-hydrologic system model development integrating hydrology and water management: application to the Cutzamala water system in Mexico

    NASA Astrophysics Data System (ADS)

    Wi, S.; Freeman, S.; Brown, C.

    2017-12-01

    This study presents a general approach to developing computational models of human-hydrologic systems in which human modification of hydrologic surface processes is significant or dominant. A river basin system is represented by a network of human-hydrologic response units (HHRUs) identified from the locations where river regulation occurs (e.g., reservoir operation and diversions). Natural and human processes in HHRUs are simulated in a holistic framework that integrates component models representing rainfall-runoff, river routing, reservoir operation, flow diversion, and water use processes. We illustrate the approach in a case study of the Cutzamala water system (CWS) in Mexico, a complex inter-basin water transfer system supplying the Mexico City Metropolitan Area (MCMA). The human-hydrologic system model for the CWS (CUTZSIM) is evaluated against streamflow and reservoir storage measured across the CWS and against water supplied to the MCMA. CUTZSIM improves the representation of hydrology and river-operation interaction and, in so doing, advances evaluation of system-wide water management consequences under altered climatic and demand regimes. The integrated modeling framework enables evaluation and simulation of model errors throughout the river basin, including errors in representation of the human component processes. Heretofore, model error evaluation, predictive error intervals, and the resultant improved understanding have been limited to hydrologic processes. The general framework represents an initial step towards fuller understanding and prediction of the many and varied processes that determine the hydrologic fluxes and state variables in real river basins.
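    A single HHRU of the kind such a framework composes can be sketched as a reservoir node that combines inflow, a demand-driven release rule, and spill. The rule and the numbers are invented for illustration; CUTZSIM's actual operating policies are more elaborate.

```python
# One time step of a hypothetical reservoir HHRU: serve demand if storage
# plus inflow allows, then spill whatever exceeds capacity.
def step_reservoir(storage, inflow, demand, capacity):
    """Advance one step; returns (new_storage, release, spill)."""
    release = min(demand, storage + inflow)    # meet demand if water allows
    raw = storage + inflow - release           # water left before spill check
    spill = max(raw - capacity, 0.0)           # excess over capacity spills
    return min(raw, capacity), release, spill

# Route three invented (inflow, demand) steps through the node.
storage = 50.0
for inflow, demand in [(20.0, 30.0), (20.0, 5.0), (0.0, 80.0)]:
    storage, release, spill = step_reservoir(storage, inflow, demand, 100.0)
print(storage, release, spill)
```

    Chaining such nodes along the river network, with diversions as additional demand terms, gives the holistic simulation structure the abstract describes.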

  19. RichMol: A general variational approach for rovibrational molecular dynamics in external electric fields

    NASA Astrophysics Data System (ADS)

    Owens, Alec; Yachmenev, Andrey

    2018-03-01

    In this paper, a general variational approach for computing the rovibrational dynamics of polyatomic molecules in the presence of external electric fields is presented. Highly accurate, full-dimensional variational calculations provide a basis of field-free rovibrational states for evaluating the rovibrational matrix elements of high-rank Cartesian tensor operators and for solving the time-dependent Schrödinger equation. The effect of the external electric field is treated as a multipole moment expansion truncated at the second hyperpolarizability interaction term. Our fully numerical and computationally efficient method has been implemented in a new program, RichMol, which can simulate the effects of multiple external fields of arbitrary strength, polarization, pulse shape, and duration. Illustrative calculations of two-color orientation and rotational excitation with an optical centrifuge of NH3 are discussed.
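    The multipole expansion truncated at the second hyperpolarizability, as described in the abstract, takes the standard form (summation over repeated Cartesian indices; mu the dipole moment, alpha the polarizability, beta and gamma the first and second hyperpolarizabilities):

```latex
H_{\mathrm{int}}(t) = -\mu_i E_i(t)
  - \tfrac{1}{2}\,\alpha_{ij}\, E_i(t) E_j(t)
  - \tfrac{1}{6}\,\beta_{ijk}\, E_i(t) E_j(t) E_k(t)
  - \tfrac{1}{24}\,\gamma_{ijkl}\, E_i(t) E_j(t) E_k(t) E_l(t)
```

    The field-free rovibrational basis mentioned in the abstract supplies the matrix elements of these Cartesian tensor operators, which is what makes the time propagation tractable.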

  20. Stacking dependence of carrier transport properties in multilayered black phosphorous

    NASA Astrophysics Data System (ADS)

    Sengupta, A.; Audiffred, M.; Heine, T.; Niehaus, T. A.

    2016-02-01

    We present the effect of different stacking orders on the carrier transport properties of multi-layer black phosphorous. We consider three different stacking orders, AAA, ABA and ACA, with increasing number of layers (from 2 to 6). We employ a hierarchical approach in density functional theory (DFT), with structural simulations performed with the generalized gradient approximation (GGA) and the band structure, carrier effective masses and optical properties evaluated with the meta-generalized gradient approximation (MGGA). Carrier transmission in the various black phosphorous sheets was calculated with the non-equilibrium Green's function (NEGF) approach. The results show that ACA stacking has the highest electron and hole transmission probabilities, and that band gaps, carrier effective masses and transmission are tunable over a wide range, holding great promise for lattice engineering (stacking order and layers) in black phosphorous.

  1. Building a biomedical tokenizer using the token lattice design pattern and the adapted Viterbi algorithm

    PubMed Central

    2011-01-01

    Background Tokenization is an important component of language processing, yet there is no widely accepted tokenization method for English texts, including biomedical texts. Other than rule-based techniques, tokenization in the biomedical domain has been regarded as a classification task. Biomedical classifier-based tokenizers either split or join textual objects through classification to form tokens. The idiosyncratic nature of each biomedical tokenizer’s output complicates adoption and reuse. Furthermore, biomedical tokenizers generally lack guidance on how to apply an existing tokenizer to a new domain (subdomain). We identify and complete a novel tokenizer design pattern and suggest a systematic approach to tokenizer creation. We implement a tokenizer based on our design pattern that combines regular expressions and machine learning. Our machine learning approach differs from the previous split-join classification approaches. We evaluate our approach against three other tokenizers on the task of tokenizing biomedical text. Results Medpost and our adapted Viterbi tokenizer performed best, with 92.9% and 92.4% accuracy respectively. Conclusions Our evaluation of our design pattern and guidelines supports our claim that the design pattern and guidelines are a viable approach to tokenizer construction (producing tokenizers matching leading custom-built tokenizers in a particular domain). Our evaluation also demonstrates that ambiguous tokenizations can be disambiguated through POS tagging, and that POS tag sequences and training data have a significant impact on proper text tokenization. PMID:21658288
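    The lattice-plus-Viterbi idea can be sketched with a minimal decoder: each inter-character gap is labelled "join" (0) or "split" (1), and Viterbi picks the best label path. The emission and transition scores below are hand-picked toy log-probabilities, not the paper's trained model.

```python
import numpy as np

# Generic Viterbi decoder over T positions and S states.
def viterbi(emis, trans):
    """emis: (T, S) log emission scores; trans: (S, S) log transition scores."""
    T, S = emis.shape
    score = emis[0].copy()
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + trans + emis[t]   # cand[prev, cur]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Toy lattice for the three gaps in "p53+": only the gap before '+' should split.
emis = np.log(np.array([[0.9, 0.1],    # gap p|5: join likely
                        [0.9, 0.1],    # gap 5|3: join likely
                        [0.2, 0.8]]))  # gap 3|+: split likely
trans = np.log(np.array([[0.7, 0.3],
                         [0.6, 0.4]]))
print(viterbi(emis, trans))  # → [0, 0, 1]
```

    In the paper's setting the scores would come from the learned classifier, and the POS-tag context would further disambiguate ties.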

  2. Nursing physical assessment for patient safety in general wards: reaching consensus on core skills.

    PubMed

    Douglas, Clint; Booker, Catriona; Fox, Robyn; Windsor, Carol; Osborne, Sonya; Gardner, Glenn

    2016-07-01

    To determine consensus across acute care specialty areas on core physical assessment skills necessary for early recognition of changes in patient status in general wards. Current approaches to physical assessment are inconsistent and have not evolved to meet increased patient and system demands. New models of nursing assessment are needed in general wards that ensure a proactive and patient safety approach. A modified Delphi study. Focus group interviews with 150 acute care registered nurses at a large tertiary referral hospital generated a framework of core skills that were developed into a web-based survey. We then sought consensus with a panel of 35 senior acute care registered nurses following a classical Delphi approach over three rounds. Consensus was predefined as at least 80% agreement for each skill across specialty areas. Content analysis of focus group transcripts identified 40 discrete core physical assessment skills. In the Delphi rounds, 16 of these were consensus validated as core skills and were conceptually aligned with the primary survey: (Airway) Assess airway patency; (Breathing) Measure respiratory rate, Evaluate work of breathing, Measure oxygen saturation; (Circulation) Palpate pulse rate and rhythm, Measure blood pressure by auscultation, Assess urine output; (Disability) Assess level of consciousness, Evaluate speech, Assess for pain; (Exposure) Measure body temperature, Inspect skin integrity, Inspect and palpate skin for signs of pressure injury, Observe any wounds, dressings, drains and invasive lines, Observe ability to transfer and mobilise, Assess bowel movements. Among a large and diverse group of experienced acute care registered nurses consensus was achieved on a structured core physical assessment to detect early changes in patient status. Although further research is needed to refine the model, clinical application should promote systematic assessment and clinical reasoning at the bedside. © 2016 John Wiley & Sons Ltd.
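    The predefined consensus rule (at least 80% agreement across specialty areas) is simple to state in code. A minimal sketch; skill names and panel votes below are invented for illustration, not the study's data.

```python
# A skill is validated as "core" when the share of endorsing panellists
# meets the predefined 80% threshold.
def consensus(votes, threshold=0.80):
    return sum(votes) / len(votes) >= threshold

# Invented votes from a 35-member panel (1 = endorse, 0 = do not endorse).
panel = {
    "measure respiratory rate": [1] * 33 + [0] * 2,   # ~94% agreement
    "auscultate heart sounds":  [1] * 24 + [0] * 11,  # ~69% agreement
}
core = [skill for skill, votes in panel.items() if consensus(votes)]
print(core)
```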

  3. Transient electromagnetic scattering by a radially uniaxial dielectric sphere: Debye series, Mie series and ray tracing methods

    NASA Astrophysics Data System (ADS)

    Yazdani, Mohsen

    Transient electromagnetic scattering by a radially uniaxial dielectric sphere is explored using three well-known methods: Debye series, Mie series, and ray tracing theory. In the first approach, the general solutions for the impulse and step responses of a uniaxial sphere are evaluated using the inverse Laplace transformation of the generalized Mie series solution. Following the high-frequency scattering solution of a large uniaxial sphere, the Mie series summation is split into high-frequency (HF) and low-frequency terms, where the HF term is replaced by its asymptotic expression, allowing a significant reduction in the computation time of the numerical Bromwich integral. In the second approach, the generalized Debye series for a radially uniaxial dielectric sphere is introduced and the Mie series coefficients are replaced by their equivalent Debye series formulations. The results are then applied to examine the transient response of each individual Debye term, allowing the identification of impulse returns in the transient response of the uniaxial sphere. In the third approach, ray tracing theory in a uniaxial sphere is investigated to evaluate the propagation path as well as the arrival time of the ordinary and extraordinary returns in the transient response of the uniaxial sphere. This is achieved by extracting the reflection and transmission angles of a plane wave obliquely incident on the radially oriented air-uniaxial and uniaxial-air boundaries, and expressing the phase velocities as well as the refractive indices of the ordinary and extraordinary waves in terms of the incident angle, optic axis and propagation direction. The results indicate satisfactory agreement between the Debye series, Mie series and ray tracing methods.

  4. The current status of emergent laparoscopic colectomy: a population-based study of clinical and financial outcomes.

    PubMed

    Keller, Deborah S; Pedraza, Rodrigo; Flores-Gonzalez, Juan Ramon; LeFave, Jean Paul; Mahmood, Ali; Haas, Eric M

    2016-08-01

    Population-based studies evaluating laparoscopic colectomy and outcomes compared with open surgery have concentrated on elective resections. As such, data assessing non-elective laparoscopic colectomies are limited. Our goal was to evaluate the current usage and outcomes of laparoscopic colectomy in the urgent and emergent setting in the USA. A national inpatient database was reviewed from 2008 to 2011 for right, left, and sigmoid colectomies in the non-elective setting. Cases were stratified by approach into open or laparoscopic groups. Demographics, perioperative clinical variables, and financial outcomes were compared across each group. A total of 22,719 non-elective colectomies were analyzed. The vast majority (95.8%) were open. Most cases were performed in an urban setting at non-teaching hospitals by general surgeons. Colorectal surgeons were significantly more likely to perform a case laparoscopically than general surgeons (p < 0.001). Demographics were similar between open and laparoscopic groups; however, the disease distribution by approach varied, with significantly more severe cases in the open colectomy arm (p < 0.001). Cases performed laparoscopically had significantly better mortality and complication rates. Laparoscopic cases also had significantly improved outcomes, including shorter length of stay and lower hospital costs (all p < 0.001). Our analysis revealed less than 5% of urgent and emergent colectomies in the USA are performed laparoscopically. Colorectal surgeons were more likely to approach a case laparoscopically than general surgeons. Outcomes following laparoscopic colectomy in this setting included reduced length of stay, lower complication rates, and lower costs. Increased adoption of laparoscopy in the non-elective setting should be considered.

  5. Knowledge translation on dementia: a cluster randomized trial to compare a blended learning approach with a "classical" advanced training in GP quality circles.

    PubMed

    Vollmar, Horst C; Butzlaff, Martin E; Lefering, Rolf; Rieger, Monika A

    2007-06-22

    Thus far, important findings regarding the dementia syndrome have been implemented into patients' medical care only inadequately. A professional training that accounts for both general practitioners' (GP) needs and learning preferences and care-relevant aspects could be a major step towards improving medical care. In the WIDA study, entitled "Knowledge translation on dementia in general practice", two different training concepts are developed, implemented and evaluated. Both concepts build on an evidence-based, GP-related dementia guideline and communicate the guideline's essential insights. Both development and implementation emphasize a procedure that is well accepted in practice and can thus achieve a high degree of external validity. This is particularly ensured through the preparation of training material and the fact that general practitioners' quality circles (QC) are addressed. The evaluation of the two training concepts is carried out by comparing two groups of GPs to which several quality circles have been randomly assigned. The primary outcome is the GPs' knowledge gain. Secondary outcomes are designed to indicate the training's potential effects on the GPs' practical actions. In the first training concept (study arm A), GPs participate in a structured case discussion prepared for by internet-based learning material ("blended learning" approach). The second training concept (study arm B) relies on frontal medical training in the form of a slide presentation and follow-up discussion ("classical" approach). This paper presents the outline of a cluster-randomized trial which has been peer reviewed and supported by a national funding organization--the Federal Ministry of Education and Research (BMBF)--and approved by an ethics commission. The data collection started in August 2006 and the results will be published independently of the study's outcome. Current Controlled Trials [ISRCTN36550981].

  6. Girls, boys and conceptual physics: An evaluation of a senior secondary physics course

    NASA Astrophysics Data System (ADS)

    Woolnough, J. A.; Cameron, R. S.

    1991-12-01

    This paper reports an evaluation of the physics course at Dickson College (ACT) looking at students' high school experience, their expectations before beginning and their impressions and feelings during the course. In general, students seem to have a fairly negative approach to physics, enrolling for a variety of often vague utilitarian reasons but with little expectation of enjoyment or interest. These opinions were most prevalent in girls who tend to find the content difficult and the course as a whole uninteresting. There is also a significant difference between girls and boys in their response to different types of assessment items. In an attempt to enhance the level of interest and enjoyment in students we have been phasing in a more ‘conceptual’ approach to the teaching of physics.

  7. Probability and Confidence Trade-space (PACT) Evaluation: Accounting for Uncertainty in Sparing Assessments

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Box, Neil; Carter, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    There are two general shortcomings to the current annual sparing assessment: 1. The vehicle functions are currently assessed according to confidence targets, which can be misleading, i.e., overly conservative or optimistic. 2. The current confidence levels are arbitrarily determined and do not account for epistemic uncertainty (lack of knowledge) in the ORU failure rate. There are two major categories of uncertainty that impact the sparing assessment: (a) Aleatory uncertainty: natural variability in the distribution of actual failures around a Mean Time Between Failures (MTBF); (b) Epistemic uncertainty: lack of knowledge about the true value of an Orbital Replacement Unit's (ORU) MTBF. We propose an approach to revise confidence targets and account for both categories of uncertainty, an approach we call Probability and Confidence Trade-space (PACT) evaluation.
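    The two uncertainty categories can be illustrated with a small Monte Carlo sketch: an outer draw over plausible MTBF values (epistemic) and an inner draw of the failure count given that MTBF (aleatory), yielding the probability that a spare allocation suffices. The lognormal model and all parameter values are illustrative assumptions, not taken from the assessment itself:

```python
import math
import random

random.seed(7)

def poisson(lam):
    """Knuth's Poisson sampler; adequate for the small rates used here."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def prob_spares_sufficient(n_spares, mtbf_median, mtbf_sigma,
                           mission_hours, n_draws=20000):
    """Monte Carlo estimate of P(failures <= n_spares) for one ORU.

    Epistemic uncertainty: the true MTBF is drawn from a lognormal
    around its estimate. Aleatory uncertainty: the failure count is
    drawn from a Poisson given that MTBF. All values are illustrative.
    """
    ok = 0
    for _ in range(n_draws):
        mtbf = random.lognormvariate(math.log(mtbf_median), mtbf_sigma)
        if poisson(mission_hours / mtbf) <= n_spares:
            ok += 1
    return ok / n_draws

for spares in (0, 1, 2):
    print(spares, prob_spares_sufficient(spares, 50000.0, 0.5, 8760.0))
```

    Collapsing the epistemic draw (fixing the MTBF at its estimate) recovers the conventional single-distribution assessment the record criticizes.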

  8. Automated Glioblastoma Segmentation Based on a Multiparametric Structured Unsupervised Classification

    PubMed Central

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V.; Robles, Montserrat; Aparici, F.; Martí-Bonmatí, L.; García-Gómez, Juan M.

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of supervised methods. In this sense, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms we evaluated K-means, Fuzzy K-means and the Gaussian Mixture Model (GMM), whereas as a structured classification algorithm we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after segmentation. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM improves the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453

  9. Walla Walla River Basin Fish Screen Evaluations; Nursery Bridge Fishway and Garden City/Lowden II Sites, 2005-2006 Progress Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamness, Mickie

    2006-06-01

    Pacific Northwest National Laboratory (PNNL) evaluated two fish screen facilities in the Walla Walla River basin in 2005 and early 2006. The Garden City/Lowden screen site was evaluated in April and June 2005 to determine whether the fish screens met National Marine Fisheries Service criteria to provide safe passage for juvenile salmonids. Louvers behind the screens at the Nursery Bridge Fishway were modified in fall 2005 in an attempt to minimize high approach velocities. PNNL evaluated the effects of those modifications in March 2006. Results of the Garden City/Lowden evaluations indicate the site performs well at varying river levels and canal flows. Approach velocities did not exceed 0.4 feet per second (fps) at any time. Sweep velocities increased toward the fish ladder in March but not in June. The air-burst mechanism appears to keep large debris off the screens, although it does not prevent algae and periphyton from growing on the screen face, especially near the bottom of the screens. At Nursery Bridge, results indicate all the approach velocities were below 0.4 fps under the moderate river levels and operational conditions encountered on March 7, 2006. Sweep did not consistently increase toward the fish ladder, but the site generally met the criteria for safe passage of juvenile salmonids. Modifications to the louvers seem to allow more control over the amount of water moving through the screens. We will measure approach velocities when river levels are higher to determine whether the louver modifications can help correct excessive approach velocities under a range of river levels and auxiliary water supply flows.

  10. Point cloud registration from local feature correspondences-Evaluation on challenging datasets.

    PubMed

    Petricek, Tomas; Svoboda, Tomas

    2017-01-01

    Registration of laser scans, or point clouds in general, is a crucial step of localization and mapping with mobile robots or in object modeling pipelines. A coarse alignment of the point clouds is generally needed before applying local methods such as the Iterative Closest Point (ICP) algorithm. We propose a feature-based approach to point cloud registration and evaluate the proposed method and its individual components on challenging real-world datasets. For a moderate overlap between the laser scans, the method provides superior registration accuracy compared to state-of-the-art methods including Generalized ICP, 3D Normal-Distribution Transform, Fast Point-Feature Histograms, and 4-Points Congruent Sets. Points, rather than surface normals, as the underlying features yield higher performance in both keypoint detection and establishing local reference frames. Moreover, sign disambiguation of the basis vectors proves to be an important aspect of creating repeatable local reference frames. A novel method for sign disambiguation is proposed which yields highly repeatable reference frames.
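    One common sign-disambiguation heuristic, flipping each principal axis so that the neighbour displacements project onto it positively in aggregate, can be sketched as follows. This is a generic illustration of why disambiguation makes frames repeatable, not necessarily the novel method the paper proposes:

```python
import numpy as np

def local_reference_frame(points, keypoint):
    """PCA frame at `keypoint` with a simple sign-disambiguation step.

    Each eigenvector's sign is flipped so that the neighbour
    displacements project onto it positively in aggregate; the middle
    axis is recomputed as a cross product to keep the frame
    right-handed. A generic heuristic, not necessarily the paper's.
    """
    d = points - keypoint                 # displacements to neighbours
    cov = d.T @ d / len(points)
    _, vecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
    x, z = vecs[:, 2], vecs[:, 0]         # major and minor axes
    if np.sum(d @ x) < 0:                 # disambiguate signs
        x = -x
    if np.sum(d @ z) < 0:
        z = -z
    y = np.cross(z, x)                    # right-handed completion
    return np.stack([x, y, z])

# Repeatability check: permuting the neighbours (a different scan
# order) must produce the identical frame.
rng = np.random.default_rng(0)
pts = rng.normal(size=(50, 3)) + np.array([0.6, 0.3, 0.1])
frame = local_reference_frame(pts, np.zeros(3))
frame_permuted = local_reference_frame(pts[rng.permutation(50)], np.zeros(3))
print(np.allclose(frame, frame_permuted))
```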

  11. Further summation formulae related to generalized harmonic numbers

    NASA Astrophysics Data System (ADS)

    Zheng, De-Yin

    2007-11-01

    By employing the univariate series expansion of classical hypergeometric series formulae, Shen [L.-C. Shen, Remarks on some integrals and series involving the Stirling numbers and ζ(n), Trans. Amer. Math. Soc. 347 (1995) 1391-1399] and Choi and Srivastava [J. Choi, H.M. Srivastava, Certain classes of infinite series, Monatsh. Math. 127 (1999) 15-25; J. Choi, H.M. Srivastava, Explicit evaluation of Euler and related sums, Ramanujan J. 10 (2005) 51-70] investigated the evaluation of infinite series related to generalized harmonic numbers. More summation formulae have systematically been derived by Chu [W. Chu, Hypergeometric series and the Riemann Zeta function, Acta Arith. 82 (1997) 103-118], who fully developed this approach in the multivariate case. The present paper will explore the hypergeometric series method further and establish numerous summation formulae expressing infinite series related to generalized harmonic numbers in terms of the Riemann Zeta function ζ(m) with m=5,6,7, including several known ones as examples.
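    The flavour of such identities can be checked numerically with the classical Euler sum sum_{n>=1} H_n/n^2 = 2ζ(3), where H_n is the nth harmonic number; the paper's results concern analogous identities evaluating to ζ(5), ζ(6), ζ(7). A quick partial-sum check:

```python
# Classical Euler sum: sum_{n>=1} H_n / n^2 = 2*zeta(3), where
# H_n = 1 + 1/2 + ... + 1/n. The paper's identities are of this type
# but evaluate to zeta(5), zeta(6), zeta(7).
N = 200_000
H = 0.0
total = 0.0
for n in range(1, N + 1):
    H += 1.0 / n                 # running harmonic number H_n
    total += H / (n * n)

zeta3 = 1.2020569031595943       # Apery's constant, zeta(3)
print(f"partial sum = {total:.6f}, 2*zeta(3) = {2 * zeta3:.6f}")
```

    The tail of the series is of order (ln N)/N, so the partial sum agrees with 2ζ(3) to roughly four decimal places at this N.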

  12. Multiprofessional Primary Care Units: What Affects the Clinical Performance of Italian General Practitioners?

    PubMed

    Armeni, Patrizio; Compagni, Amelia; Longo, Francesco

    2014-08-01

    Multiprofessional primary care models promise to deliver better care and reduce waste. This study evaluates the impact of such a model, the primary care unit (PCU), on three outcomes. A multilevel analysis within a "pre- and post-PCU" study design and a cross-sectional analysis were conducted on 215 PCUs located in the Emilia-Romagna region in Italy. Seven dimensions captured a set of processes and services characterizing a well-functioning PCU, or its degree of vitality. The impact of each dimension on outcomes was evaluated. The analyses show that certain dimensions of PCU vitality (i.e., the possibility for general practitioners to meet and share patients) can lead to better outcomes. However, dimensions related to the interaction and the joint works of general practitioners with other professionals tend not to have a significant or positive impact. This suggests that more effort needs to be invested to realize all the potential benefits of the PCU's multiprofessional approach to care. © The Author(s) 2014.

  13. A Comprehensive Prevention Approach to Reducing Assault Offenses and Assault Injuries Among Youth

    PubMed Central

    Heinze, Justin E.; Reischl, Thomas M.; Bai, Mengqiao; Roche, Jessica S.; Morrel-Samuels, Susan; Cunningham, Rebecca M.; Zimmerman, Marc A.

    2018-01-01

    Since 2011, the CDC-funded Michigan Youth Violence Prevention Center (MI-YVPC), working with community partners, has implemented a comprehensive prevention approach to reducing youth violence in Flint, MI, based on public health principles. MI-YVPC employed an intervention strategy that capitalizes on existing community resources and the application of evidence-based programs using a social-ecological approach to change. We evaluated the combined effect of six programs in reducing assaults and injury among 10–24 year olds in the intervention area relative to a matched comparison community. We used generalized linear mixed models to examine change in the intervention area's counts of reported assault offenses and assault-injury presentations, relative to the comparison area, over a period spanning six years before and two and a half years after the intervention. Results indicated that youth victimization and assault injuries fell in the intervention area subsequent to the initiation of the interventions and that these reductions were sustained over time. Our evaluation demonstrated that a comprehensive multi-level approach can be effective for reducing youth violence and injury. PMID:26572898

  14. Evaluating views of lecturers on the consistency of teaching content with teaching approach: traditional versus reform calculus

    NASA Astrophysics Data System (ADS)

    Sevimli, Eyup

    2016-08-01

    This study aims to evaluate the consistency of teaching content with teaching approaches in calculus on the basis of lecturers' views. In this sense, the structures of the examples given in two commonly used calculus textbooks, one traditional and one reform, are compared. The content analysis findings show that the examples in both textbooks are presented in a rather formal language and generally highlight procedural knowledge. Even though the examples in the chosen reform book are structured using multiple representations, only a small number of them incorporate the use of instructional technology. The lecturers' views indicated that while the example structures of the traditional textbook largely overlapped with the characteristics of the traditional approach, the example structures of the reform textbook were inconsistent with the characteristics of the reform approach, especially with regard to its environment and knowledge components. At the end of the paper, some suggestions for further studies are provided for book authors and researchers.

  15. Multigroup Propensity Score Approach to Evaluating an Effectiveness Trial of the New Beginnings Program.

    PubMed

    Tein, Jenn-Yun; Mazza, Gina L; Gunn, Heather J; Kim, Hanjoe; Stuart, Elizabeth A; Sandler, Irwin N; Wolchik, Sharlene A

    2018-06-01

    We used a multigroup propensity score approach to evaluate a randomized effectiveness trial of the New Beginnings Program (NBP), an intervention targeting divorced or separated families. Two features of effectiveness trials, high nonattendance rates and inclusion of an active control, make program effects harder to detect. To estimate program effects based on actual intervention participation, we created a synthetic inactive control comprised of nonattenders and assessed the impact of attending the NBP or active control relative to no intervention (inactive control). We estimated propensity scores using generalized boosted models and applied inverse probability of treatment weighting for the comparisons. Relative to the inactive control, NBP strengthened parenting quality as well as reduced child exposure to interparental conflict, parent psychological distress, and child internalizing problems. Some effects were moderated by parent gender, parent ethnicity, or child age. On the other hand, the effects of active versus inactive control were minimal for parenting and in the unexpected direction for child internalizing problems. Findings from the propensity score approach complement and enhance the interpretation of findings from the intention-to-treat approach.
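    The weighting step can be sketched with a toy dataset: estimate P(group | covariates), weight each participant by the inverse of the probability of the group they actually joined, and compare weighted outcome means. Here the propensity model is a simple empirical stratification rather than the generalized boosted models the study used, and all data, group labels and effect sizes are simulated assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 6000

# Simulated data: one binary covariate drives both group membership
# (0 = inactive control, 1 = active control, 2 = NBP) and the outcome.
x = rng.binomial(1, 0.5, n)
p_group = np.where(x[:, None] == 1, [0.5, 0.25, 0.25], [0.2, 0.4, 0.4])
group = np.array([rng.choice(3, p=p) for p in p_group])
true_effect = np.array([0.0, 0.05, 0.40])           # hypothetical: NBP helps most
y = true_effect[group] + 0.5 * x + rng.normal(0, 1, n)

# Propensity scores estimated empirically within covariate strata
# (a stand-in for the generalized boosted models used in the study).
ps = np.empty(n)
for xv in (0, 1):
    stratum = x == xv
    for g in (0, 1, 2):
        ps[stratum & (group == g)] = np.mean(group[stratum] == g)

weights = 1.0 / ps                                  # IPT weights
means = [np.average(y[group == g], weights=weights[group == g])
         for g in (0, 1, 2)]
print("estimated NBP effect vs inactive control:", means[2] - means[0])
```

    The unweighted group means would be confounded by the covariate; the inverse-probability weights remove that imbalance, so the weighted contrast recovers the simulated effect.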

  16. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, and Data Base Management Systems (DBMS) in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements and the corresponding solutions for two reference MDO frameworks, a general one and an aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improving the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.

  17. Trade Study of Excavation Tools and Equipment for Lunar Outpost Development and ISRU

    NASA Astrophysics Data System (ADS)

    Mueller, R. P.; King, R. H.

    2008-01-01

    The NASA Lunar Architecture Team (LAT) has developed a candidate architecture to establish a lunar outpost that includes in-situ resource utilization (ISRU). Outpost development requires excavation for landing and launch sites, roads, trenches, foundations, radiation and thermal shielding, etc. Furthermore, ISRU requires excavation to provide feed stock for water processing and oxygen production plants. The design environment for lunar excavation tools and equipment, which includes low gravity, the cost of launching massive equipment, limited power, limited size, high reliability requirements, and extreme temperatures, is significantly different from that of terrestrial excavation equipment. Consequently, the lunar application requires new approaches to developing excavation tools and equipment in the context of a systems engineering approach to building a Lunar Outpost. Several authors have proposed interesting and innovative general excavation approaches in the literature, and the authors of this paper will propose adaptations and/or new excavation concepts specific to the Lunar Outpost. The requirements for excavation from the LAT architecture will be examined and quantified with corresponding figures of merit and evaluation criteria. This paper will evaluate the proposed approaches using traditional decision-making-under-uncertainty techniques.

  18. Cartographic generalization of urban street networks based on gravitational field theory

    NASA Astrophysics Data System (ADS)

    Liu, Gang; Li, Yongshu; Li, Zheng; Guo, Jiawei

    2014-05-01

    The automatic generalization of urban street networks is a constant and important aspect of geographical information science. Previous studies show that the dual graph of street-street relationships reflects the overall morphological properties and importance of streets more accurately than other representations. In this study, we construct a dual graph to represent street-street relationships and propose an approach to generalizing street networks based on gravitational field theory. We retain the global structural properties and topological connectivity of the original street network and borrow from gravitational field theory to define the gravitational force between nodes. The concept of multi-order neighbors is introduced and the gravitational force is taken as the measure of the importance contribution between nodes. The importance of a node is defined as the result of the interaction between a given node and its multi-order neighbors. Degree distribution is used to evaluate how well the global structure and topological characteristics of a street network are maintained and to illustrate the efficiency of the suggested method. Experimental results indicate that the proposed approach can generalize street networks while retaining their density characteristics, connectivity and global structure.
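    A toy version of the gravitational measure might look like this, with node "mass" taken as degree in the dual graph and "distance" as neighbour order (BFS hops). The graph, the mass definition and the constant G are illustrative assumptions, not the paper's exact formulation:

```python
from collections import deque

def importance(adj, node, max_order=3, G=1.0):
    """Summed gravitational force G*m_i*m_j/d^2 over a node's
    multi-order neighbours, with mass = degree and d = neighbour order."""
    dist = {node: 0}
    queue = deque([node])
    while queue:                          # BFS up to max_order hops
        u = queue.popleft()
        if dist[u] == max_order:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    mass = {u: len(adj[u]) for u in adj}
    return sum(G * mass[node] * mass[v] / d ** 2
               for v, d in dist.items() if d > 0)

# Hypothetical dual graph of a small street network: each node is a
# street, edges are intersections; A is the best-connected street.
adj = {
    "A": ["B", "C", "D", "F"],
    "B": ["A", "C"],
    "C": ["A", "B", "E"],
    "D": ["A"],
    "E": ["C"],
    "F": ["A"],
}
scores = {u: importance(adj, u) for u in adj}
print(max(scores, key=scores.get), scores)
```

    Generalization would then keep the highest-importance streets and prune the rest, which is how the measure preserves the network's global structure.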

  19. Increasing independent decision-making skills of women with mental retardation in simulated interpersonal situations of abuse.

    PubMed

    Khemka, I

    2000-09-01

    The effectiveness of two decision-making training approaches in increasing independent decision-making skills of 36 women with mild mental retardation in response to hypothetical social interpersonal situations involving abuse was evaluated. Participants were randomly assigned to a control or one of two training conditions (a decision-making training approach that either addressed both cognitive and motivational aspects of decision-making or included only instruction on the cognitive aspect of decision-making). Although both approaches were effective relative to a control condition, the combined cognitive and motivational training approach was superior to the cognitive only training approach. The superiority of this approach was also reflected on a verbally presented generalization task requiring participants to respond to a decision-making situation involving abuse from their own perspective and on a locus of control scale that measured perceptions of control.

  20. A quasi-likelihood approach to non-negative matrix factorization

    PubMed Central

    Devarajan, Karthik; Cheung, Vincent C.K.

    2017-01-01

    A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
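    One concrete member of this family, NMF under a Poisson (signal-dependent) noise model fitted with the classical multiplicative KL updates, can be sketched as follows; the quasi-likelihood framework generalizes this to other exponential-family models, and all sizes and parameter choices here are illustrative:

```python
import numpy as np

def nmf_kl(V, rank, n_iter=200, eps=1e-9, seed=0):
    """NMF under a Poisson noise model via multiplicative KL updates."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    ones = np.ones_like(V)
    for _ in range(n_iter):
        H *= (W.T @ (V / (W @ H + eps))) / (W.T @ ones + eps)
        W *= ((V / (W @ H + eps)) @ H.T) / (ones @ H.T + eps)
    return W, H

def kl_div(V, WH, eps=1e-9):
    """Generalized Kullback-Leibler divergence (the Poisson deviance),
    a goodness-of-fit measure for the factorization."""
    return float(np.sum(V * np.log((V + eps) / (WH + eps)) - V + WH))

# Synthetic count data with a true rank-2 structure.
rng = np.random.default_rng(1)
V = rng.poisson(5 * rng.random((30, 2)) @ rng.random((2, 40))).astype(float)
W, H = nmf_kl(V, rank=2)
print("final KL divergence:", kl_div(V, W @ H))
```

    The multiplicative form keeps W and H non-negative by construction, and each update is guaranteed not to increase the KL objective, which is the convergence property the paper proves in its more general setting.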

  1. Equivalent Linearization Analysis of Geometrically Nonlinear Random Vibrations Using Commercial Finite Element Codes

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Muravyov, Alexander A.

    2002-01-01

    Two new equivalent linearization implementations for geometrically nonlinear random vibrations are presented. Both implementations are based upon a novel approach for evaluating the nonlinear stiffness within commercial finite element codes and are suitable for use with any finite element code having geometrically nonlinear static analysis capabilities. The formulation includes a traditional force-error minimization approach and a relatively new version of a potential energy-error minimization approach, which has been generalized for multiple degree-of-freedom systems. Results for a simply supported plate under random acoustic excitation are presented and comparisons of the displacement root-mean-square values and power spectral densities are made with results from a nonlinear time domain numerical simulation.

  2. Framework for Evaluating Water Quality of the New England Crystalline Rock Aquifers

    USGS Publications Warehouse

    Harte, Philip T.; Robinson, Gilpin R.; Ayotte, Joseph D.; Flanagan, Sarah M.

    2008-01-01

    Little information exists on regional ground-water-quality patterns for the New England crystalline rock aquifers (NECRA). A systematic approach to facilitate regional evaluation is needed for several reasons. First, the NECRA are vulnerable to anthropogenic and natural contaminants such as methyl tert-butyl ether (MTBE), arsenic, and radon gas. Second, the physical characteristics of the aquifers, termed 'intrinsic susceptibility', can lead to variable and degraded water quality. A framework approach for characterizing the aquifer region into areas of similar hydrogeology is described in this report and is based on hypothesized relevant physical features and chemical conditions (collectively termed 'variables') that affect regional patterns of ground-water quality. A framework for comparison of water quality across the NECRA consists of a group of spatial variables related to aquifer properties, hydrologic conditions, and contaminant sources. These spatial variables are grouped under four general categories (features) that can be mapped across the aquifers: (1) geologic, (2) hydrophysiographic, (3) land-use land-cover, and (4) geochemical. On a regional scale, these variables represent indicators of natural and anthropogenic sources of contaminants, as well as generalized physical and chemical characteristics of the aquifer system that influence ground-water chemistry and flow. These variables can be used in varying combinations (depending on the contaminant) to categorize the aquifer into areas of similar hydrogeologic characteristics to evaluate variation in regional water quality through statistical testing.

  3. An Analysis of Frame Semantics of Continuous Processes

    DTIC Science & Technology

    2016-08-10

    in natural text involving a variety of continuous processes. Keywords: Frame Semantics; Qualitative Reasoning Introduction & Background Daily...We evaluate our mapping on science texts, but expect our approach to be domain general. Qualitative Process Theory In QP theory, changes within a...fragments from text could reason about real-world scenarios, predicting, for example, that our tub of water may overflow. However, the incremental

  4. Early Place-Value Understanding as a Precursor for Later Arithmetic Performance--A Longitudinal Study on Numerical Development

    ERIC Educational Resources Information Center

    Moeller, K.; Pixner, S.; Zuber, J.; Kaufmann, L.; Nuerk, H. C.

    2011-01-01

    It is assumed that basic numerical competencies are important building blocks for more complex arithmetic skills. The current study aimed at evaluating this interrelation in a longitudinal approach. It was investigated whether first graders' performance in basic numerical tasks in general as well as specific processes involved (e.g., place-value…

  5. Cases and Controversy: Guide to Teaching the Public Issues Series/Harvard Social Studies Project, and Supplement.

    ERIC Educational Resources Information Center

    Oliver, Donald W.; Newmann, Fred M.

    This general guide presents an overview and explains the rationale of the teaching approach of the "Public Issues Series," units produced by the Harvard Social Studies Project to help students in grades 9-12 analyze and discuss human dilemmas related to public issues. (A detailed report on the nature, development, and evaluation of the Harvard…

  6. A re-evaluation of a case-control model with contaminated controls for resource selection studies

    Treesearch

    Christopher T. Rota; Joshua J. Millspaugh; Dylan C. Kesler; Chad P. Lehman; Mark A. Rumble; Catherine M. B. Jachowski

    2013-01-01

    A common sampling design in resource selection studies involves measuring resource attributes at sample units used by an animal and at sample units considered available for use. Few models can estimate the absolute probability of using a sample unit from such data, but such approaches are generally preferred over statistical methods that estimate a relative probability...

  7. To Flip or Not to Flip? Analysis of a Flipped Classroom Pedagogy in a General Biology Course

    ERIC Educational Resources Information Center

    Heyborne, William H.; Perrett, Jamis J.

    2016-01-01

    In an attempt to better understand the flipped technique and evaluate its purported superiority in terms of student learning gains, the authors conducted an experiment comparing a flipped classroom to a traditional lecture classroom. Although the outcomes were mixed regarding the superiority of either pedagogical approach, there does seem to be a…

  8. Evaluating Economic Impacts of Expanded Global Wood Energy Consumption with the USFPM/GFPM Model

    Treesearch

    Peter J. Ince; Andrew Kramp; Kenneth E. Skog

    2012-01-01

    A U.S. forest sector market module was developed within the general Global Forest Products Model. The U.S. module tracks regional timber markets, timber harvests by species group, and timber product outputs in greater detail than does the global model. This hybrid approach provides detailed regional market analysis for the United States while retaining the...

  9. Flight evaluation of LORAN-C in the State of Vermont

    NASA Technical Reports Server (NTRS)

    Mackenzie, F. D.; Lytle, C. D.

    1981-01-01

    A flight evaluation of LORAN-C as a supplement to existing navigation aids for general aviation aircraft was conducted, particularly for mountainous regions of the United States where VOR coverage is limited. Flights, initiated in the summer months, extended through four seasons and practically all weather conditions typical of northeastern U.S. operations. Assessment of all the available data indicates that LORAN-C signals are suitable as a means of navigation during enroute, terminal, and nonprecision approach operations and that performance exceeds the minimum accuracy criteria.

  10. Enhancing the quality and credibility of qualitative analysis.

    PubMed

    Patton, M Q

    1999-12-01

    Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems.

  11. Approach Towards an Evidence-Oriented Knowledge and Data Acquisition for the Optimization of Interdisciplinary Care in Dentistry and General Medicine.

    PubMed

    Seitz, Max W; Haux, Christian; Knaup, Petra; Schubert, Ingrid; Listl, Stefan

    2018-01-01

    Associations between dental and chronic-systemic diseases have frequently been observed in medical research; however, these findings have so far found little relevance in everyday clinical treatment. Major problems are the assessment of evidence for correlations between such diseases and how to integrate current medical knowledge into the intersectoral care of dentists and general practitioners. Using the example of dental and chronic-systemic diseases, the Dent@Prevent project develops an interdisciplinary decision support system (DSS), which provides the specialists with information relevant for the treatment of such cases. To provide the physicians with relevant medical knowledge, a mixed-methods approach is developed to acquire the knowledge in an evidence-oriented way. This procedure includes a literature review, routine data analyses, focus groups of dentists and general practitioners, as well as the identification and integration of applicable guidelines and Patient Reported Measures (PRMs) into the treatment process. The developed mixed-methods approach for evidence-oriented knowledge acquisition appears to be applicable and supportive for interdisciplinary projects. It can raise the systematic quality of the knowledge-acquisition process and can be applied to evidence-based system development. Further research is necessary to assess the impact on patient care and to evaluate possible applicability in other interdisciplinary areas.

  12. A Comparison of Two-Stage Approaches for Fitting Nonlinear Ordinary Differential Equation (ODE) Models with Mixed Effects

    PubMed Central

    Chow, Sy-Miin; Bendezú, Jason J.; Cole, Pamela M.; Ram, Nilam

    2016-01-01

    Several approaches currently exist for estimating the derivatives of observed data for model exploration purposes, including functional data analysis (FDA), generalized local linear approximation (GLLA), and generalized orthogonal local derivative approximation (GOLD). These derivative estimation procedures can be used in a two-stage process to fit mixed effects ordinary differential equation (ODE) models. While the performance and utility of these routines for estimating linear ODEs have been established, they have not yet been evaluated in the context of nonlinear ODEs with mixed effects. We compared properties of the GLLA and GOLD to an FDA-based two-stage approach denoted herein as functional ordinary differential equation with mixed effects (FODEmixed) in a Monte Carlo study using a nonlinear coupled oscillators model with mixed effects. Simulation results showed that overall, the FODEmixed outperformed both the GLLA and GOLD across all the embedding dimensions considered, but a novel use of a fourth-order GLLA approach combined with very high embedding dimensions yielded estimation results that almost paralleled those from the FODEmixed. We discuss the strengths and limitations of each approach and demonstrate how output from each stage of FODEmixed may be used to inform empirical modeling of young children’s self-regulation. PMID:27391255

  13. A Comparison of Two-Stage Approaches for Fitting Nonlinear Ordinary Differential Equation Models with Mixed Effects.

    PubMed

    Chow, Sy-Miin; Bendezú, Jason J; Cole, Pamela M; Ram, Nilam

    2016-01-01

    Several approaches exist for estimating the derivatives of observed data for model exploration purposes, including functional data analysis (FDA; Ramsay & Silverman, 2005 ), generalized local linear approximation (GLLA; Boker, Deboeck, Edler, & Peel, 2010 ), and generalized orthogonal local derivative approximation (GOLD; Deboeck, 2010 ). These derivative estimation procedures can be used in a two-stage process to fit mixed effects ordinary differential equation (ODE) models. While the performance and utility of these routines for estimating linear ODEs have been established, they have not yet been evaluated in the context of nonlinear ODEs with mixed effects. We compared properties of the GLLA and GOLD to an FDA-based two-stage approach denoted herein as functional ordinary differential equation with mixed effects (FODEmixed) in a Monte Carlo (MC) study using a nonlinear coupled oscillators model with mixed effects. Simulation results showed that overall, the FODEmixed outperformed both the GLLA and GOLD across all the embedding dimensions considered, but a novel use of a fourth-order GLLA approach combined with very high embedding dimensions yielded estimation results that almost paralleled those from the FODEmixed. We discuss the strengths and limitations of each approach and demonstrate how output from each stage of FODEmixed may be used to inform empirical modeling of young children's self-regulation.
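    The GLLA stage of the two-stage procedure described above can be sketched as follows, assuming the standard formulation in which a time-delay-embedding matrix is multiplied by a fixed least-squares weight matrix built from local polynomial terms. The embedding dimension and sampling interval below are illustrative, not values from the study.

```python
import math
import numpy as np

def glla(x, dt, embed=5, order=2):
    """Generalized local linear approximation (GLLA) sketch: estimate a
    signal and its first `order` derivatives at the center of each
    time-delay-embedding window via a fixed least-squares weight matrix."""
    n = len(x) - embed + 1
    # Rows of X are overlapping windows of `embed` successive samples
    X = np.column_stack([x[i:i + n] for i in range(embed)])
    # Time offsets of the samples in a window, centered on the window middle
    offsets = (np.arange(embed) - (embed - 1) / 2.0) * dt
    # Polynomial basis: column k holds offsets**k / k!
    L = np.column_stack([offsets ** k / math.factorial(k)
                         for k in range(order + 1)])
    W = L @ np.linalg.inv(L.T @ L)
    return X @ W  # columns: [x, dx/dt, ..., d^order x / dt^order]
```

    For a polynomial signal of degree at most `order` the estimates are exact, which makes a convenient sanity check before feeding the derivatives into the second-stage ODE fit.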

  14. Selecting Indicator Portfolios for Marine Species and Food Webs: A Puget Sound Case Study

    PubMed Central

    Kershner, Jessi; Samhouri, Jameal F.; James, C. Andrew; Levin, Phillip S.

    2011-01-01

    Ecosystem-based management (EBM) has emerged as a promising approach for maintaining the benefits humans want and need from the ocean, yet concrete approaches for implementing EBM remain scarce. A key challenge lies in the development of indicators that can provide useful information on ecosystem status and trends, and assess progress towards management goals. In this paper, we describe a generalized framework for the methodical and transparent selection of ecosystem indicators. We apply the framework to the second largest estuary in the United States – Puget Sound, Washington – where one of the most advanced EBM processes is currently underway. Rather than introduce a new method, this paper integrates a variety of familiar approaches into one step-by-step approach that will lead to more consistent and reliable reporting on ecosystem condition. Importantly, we demonstrate how a framework linking indicators to policy goals, as well as a clearly defined indicator evaluation and scoring process, can result in a portfolio of useful and complementary indicators based on the needs of different users (e.g., policy makers and scientists). Although the set of indicators described in this paper is specific to marine species and food webs, we provide a general approach that could be applied to any set of management objectives or ecological system. PMID:21991305

  15. Improving the accuracy of Density Functional Theory (DFT) calculation for homolysis bond dissociation energies of Y-NO bond: generalized regression neural network based on grey relational analysis and principal component analysis.

    PubMed

    Li, Hong Zhi; Tao, Wei; Gao, Ting; Li, Hui; Lu, Ying Hua; Su, Zhong Min

    2011-01-01

    We propose a generalized regression neural network (GRNN) approach based on grey relational analysis (GRA) and principal component analysis (PCA) (GP-GRNN) to improve the accuracy of density functional theory (DFT) calculation for homolysis bond dissociation energies (BDE) of Y-NO bond. As a demonstration, this combined quantum chemistry calculation with the GP-GRNN approach has been applied to evaluate the homolysis BDE of 92 Y-NO organic molecules. The results show that the full-descriptor GRNN without GRA and PCA (F-GRNN) and with GRA (G-GRNN) approaches reduce the root-mean-square (RMS) error of the calculated homolysis BDE of 92 organic molecules from 5.31 to 0.49 and 0.39 kcal mol(-1) for the B3LYP/6-31G(d) calculation. Then the newly developed GP-GRNN approach further reduces the RMS error to 0.31 kcal mol(-1). Thus, the GP-GRNN correction on top of B3LYP/6-31G(d) can improve the accuracy of calculating the homolysis BDE in quantum chemistry and can predict homolysis BDE which cannot be obtained experimentally.
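    At its core a GRNN is Gaussian kernel regression over the training set: the prediction is a distance-weighted average of training targets. A minimal sketch of that stage only (the GRA/PCA descriptor screening and the quantum-chemistry inputs are omitted, and the data below are made up):

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """GRNN prediction (Nadaraya-Watson kernel regression): a Gaussian-
    weighted average of training targets, with bandwidth sigma."""
    preds = []
    for x in np.atleast_2d(X_query):
        d2 = np.sum((X_train - x) ** 2, axis=1)   # squared distances
        w = np.exp(-d2 / (2.0 * sigma ** 2))      # Gaussian kernel weights
        preds.append(np.dot(w, y_train) / np.sum(w))
    return np.array(preds)
```

    In the GP-GRNN scheme the descriptors would first be screened by GRA and compressed by PCA before being passed in as `X_train`.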

  16. Life support approaches for Mars missions

    NASA Technical Reports Server (NTRS)

    Drysdale, A. E.; Ewert, M. K.; Hanford, A. J.

    2003-01-01

    Life support approaches for Mars missions are evaluated using an equivalent system mass (ESM) approach, in which all significant costs are converted into mass units. The best approach, as defined by the lowest mission ESM, depends on several mission parameters, notably duration, environment and consequent infrastructure costs, and crew size, as well as the characteristics of the technologies which are available. Generally, for the missions under consideration, physicochemical regeneration is most cost effective. However, bioregeneration is likely to be of use for producing salad crops for any mission, for producing staple crops for medium duration missions, and for most food, air and water regeneration for long missions (durations of a decade). Potential applications of in situ resource utilization need to be considered further. © 2002 Published by Elsevier Science Ltd on behalf of COSPAR.

  17. Life support approaches for Mars missions.

    PubMed

    Drysdale, A E; Ewert, M K; Hanford, A J

    2003-01-01

    Life support approaches for Mars missions are evaluated using an equivalent system mass (ESM) approach, in which all significant costs are converted into mass units. The best approach, as defined by the lowest mission ESM, depends on several mission parameters, notably duration, environment and consequent infrastructure costs, and crew size, as well as the characteristics of the technologies which are available. Generally, for the missions under consideration, physicochemical regeneration is most cost effective. However, bioregeneration is likely to be of use for producing salad crops for any mission, for producing staple crops for medium duration missions, and for most food, air and water regeneration for long missions (durations of a decade). Potential applications of in situ resource utilization need to be considered further. © 2002 Published by Elsevier Science Ltd on behalf of COSPAR.
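    The ESM bookkeeping described in the abstracts above reduces to a weighted sum: each non-mass cost (volume, power, cooling, crew time) is multiplied by a mission-specific equivalency factor and added to the hardware mass. The factors below are illustrative placeholders, not the values used in the study.

```python
def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                           crew_time_hr_per_yr, duration_yr,
                           v_eq=66.7, p_eq=237.0, c_eq=60.0, ct_eq=1.0):
    """Equivalent system mass (ESM): fold all significant life-support
    costs into kilograms so competing architectures can be compared.
    The equivalency factors (kg per m^3, per kW, per crew-hour) are
    hypothetical defaults; real factors depend on mission infrastructure."""
    return (mass_kg
            + volume_m3 * v_eq            # pressurized volume cost
            + power_kw * p_eq             # power generation cost
            + cooling_kw * c_eq           # heat rejection cost
            + crew_time_hr_per_yr * duration_yr * ct_eq)  # crew time cost
```

    The candidate with the lowest ESM for a given mission duration and infrastructure would be preferred, which is why the ranking shifts between physicochemical and bioregenerative options as duration grows.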

  18. SSA Sensor Calibration Best Practices

    NASA Astrophysics Data System (ADS)

    Johnson, T.

    Best practices for calibrating orbit determination sensors in general and space situational awareness (SSA) sensors in particular are presented. These practices were developed over the last ten years within AGI and most recently applied to over 70 sensors in AGI's Commercial Space Operations Center (ComSpOC) and the US Air Force Space Command (AFSPC) Space Surveillance Network (SSN) to evaluate and configure new sensors and perform ongoing system calibration. They are generally applicable to any SSA sensor and leverage some unique capabilities of an SSA estimation approach using an optimal sequential filter and smoother. Real world results are presented and analyzed.

  19. Reinforcing loose foundation stones in trait-based plant ecology.

    PubMed

    Shipley, Bill; De Bello, Francesco; Cornelissen, J Hans C; Laliberté, Etienne; Laughlin, Daniel C; Reich, Peter B

    2016-04-01

    The promise of "trait-based" plant ecology is one of generalized prediction across organizational and spatial scales, independent of taxonomy. This promise is a major reason for the increased popularity of this approach. Here, we argue that some important foundational assumptions of trait-based ecology have not received sufficient empirical evaluation. We identify three such assumptions and, where possible, suggest methods of improvement: (i) traits are functional to the degree that they determine individual fitness, (ii) intraspecific variation in functional traits can be largely ignored, and (iii) functional traits show general predictive relationships to measurable environmental gradients.

  20. The results of a limited study of approaches to the design, fabrication, and testing of a dynamic model of the NASA IOC space station. Executive summary

    NASA Technical Reports Server (NTRS)

    Brooks, George W.

    1985-01-01

    The options for the design, construction, and testing of a dynamic model of the space station were evaluated. Since the definition of the space station structure is still evolving, the Initial Operating Capacity (IOC) reference configuration was used as the general guideline. The results of the studies treat: general considerations of the need for and use of a dynamic model; factors which deal with the model design and construction; and a proposed system for supporting the dynamic model in the planned Large Spacecraft Laboratory.

  1. Motivational Differences in Seeking Out Evaluative Categorization Information.

    PubMed

    Smallman, Rachel; Becker, Brittney

    2017-07-01

    Previous research shows that people draw finer evaluative distinctions when rating liked versus disliked objects (e.g., wanting a 5-point scale to evaluate liked cuisines and a 3-point scale to rate disliked cuisines). Known as the preference-categorization effect, this pattern may exist not only in how individuals form evaluative distinctions but also in how individuals seek out evaluative information. The current research presents three experiments that examine motivational differences in evaluative information seeking (rating scales and attributes). Experiment 1 found that freedom of choice (the ability to avoid undesirable stimuli) and sensitivity to punishment (as measured by the Behavior Inhibition System/Behavioral Approach System [BIS/BAS] scale) influenced preferences for desirable and undesirable evaluative information in a health-related decision. Experiment 2 examined choice optimization, finding that maximizers prefer finer evaluative information for both liked and disliked options in a consumer task. Experiment 3 found that this pattern generalizes to another type of evaluative categorization, attributes.

  2. Phase 2 STS new user development program. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Mcdowell, J. R.

    1976-01-01

    A methodology was developed for cultivating new users for STS beyond NASA and DoD, thereby maximizing use of the STS system. The approach to user development, reflected in the implementation plan, and the attendant informational material were evaluated by conducting a series of test cases with selected user organizations. These test case organizations were, in effect, used as consultants to evaluate the effectiveness, needs, completeness, and adequacy of the user development approach and informational material. The selection of the test cases provided a variety of potential STS users covering industry, other government agencies, and the educational sector. The test cases covered various use areas and provided a mix of user organization types. A summary of the actual test cases conducted is given. The conduct of the test cases verified the general approach of the implementation plan, the validity of the user development strategy prepared for each test case organization, and the effectiveness of the STS basic and user-customized informational material.

  3. Experiences with Recruitment of Marginalized Groups in a Danish Health Promotion Program: A Document Evaluation Study.

    PubMed

    Rasmussen, Marianne; Poulsen, Eva Kanstrup; Rytter, Anne Stoffersen; Kristiansen, Tine Mechlenborg; Bak, Carsten Kronborg

    2016-01-01

    Studies have found that marginalized groups living in deprived neighborhoods are less likely to participate in health programs compared to the majority of society. This study evaluates recruitment approaches used during a national government-funded project in 12 deprived neighborhoods across Denmark between 2010 and 2014. The aim of this study was to understand how recruitment approaches could promote participation in health programs within deprived neighborhoods to reach marginalized groups. Documents from all 12 of the included municipalities were collected to conduct a document evaluation. The collected documents consisted of 1,500 pages of written material with 12 project descriptions, three midterm and 10 final evaluations. The collected data were analyzed through a qualitative content analysis. The results reflect the fact that only 10 municipalities developed evaluations related to recruitment, and only three evaluations described which marginalized groups were recruited. Challenges related to recruitment consist of difficulties involving the target group, including general distrust, language barriers, and a lack of ability to cope with new situations and strangers. Additional geographical challenges emerged, especially in rural areas. Positive experiences with recruitment approaches were mainly related to relationship and trust building, especially through face-to-face contact and the project employees' presence in the neighborhood. Additionally, adjusting some of the interventions and the recruitment strategy increased participation. This study found that relationships and trust between residents and project employees are an important factor in the recruitment of marginalized groups in deprived neighborhoods, as is adjusting the health interventions or recruitment strategy to the target groups.
In future research, it is necessary to examine which recruitment approaches are effective under which circumstances to increase participation among marginalized groups.

  4. Inter-professional in-situ simulated team and resuscitation training for patient safety: Description and impact of a programmatic approach.

    PubMed

    Zimmermann, Katja; Holzinger, Iris Bachmann; Ganassi, Lorena; Esslinger, Peter; Pilgrim, Sina; Allen, Meredith; Burmester, Margarita; Stocker, Martin

    2015-10-29

    Inter-professional teamwork is key for patient safety and team training is an effective strategy to improve patient outcome. In-situ simulation is a relatively new strategy with emerging efficacy, but best practices for the design, delivery and implementation have yet to be evaluated. Our aim is to describe and evaluate the implementation of an inter-professional in-situ simulated team and resuscitation training in a teaching hospital with a programmatic approach. We designed and implemented a team and resuscitation training program according to Kern's six steps approach for curriculum development. General and specific needs assessments were conducted as independent cross-sectional surveys. Teamwork, technical skills and detection of latent safety threats were defined as specific objectives. Inter-professional in-situ simulation was used as educational strategy. The training was embedded within the workdays of participants and implemented in our highest acuity wards (emergency department, intensive care unit, intermediate care unit). Self-perceived impact and self-efficacy were sampled with an anonymous evaluation questionnaire after every simulated training session. Assessment of team performance was done with the team-based self-assessment tool TeamMonitor applying Van der Vleuten's conceptual framework of longitudinal evaluation after experienced real events. Latent safety threats were reported during training sessions and after experienced real events. The general and specific needs assessments clearly identified the problems, revealed specific training needs and assisted with stakeholder engagement. Ninety-five interdisciplinary staff members of the Children's Hospital participated in 20 in-situ simulated training sessions within 2 years. Participant feedback showed a high effect and acceptance of training with reference to self-perceived impact and self-efficacy. 
Thirty-five team members experiencing 8 real critical events assessed team performance with TeamMonitor. Team performance assessment with TeamMonitor was feasible and identified specific areas to target future team training sessions. Training sessions as well as experienced real events revealed important latent safety threats that directed system changes. The programmatic approach of Kern's six steps for curriculum development helped to overcome barriers of design, implementation and assessment of an in-situ team and resuscitation training program. This approach may help improve effectiveness and impact of an in-situ simulated training program.

  5. Performance evaluation of automated manufacturing systems using generalized stochastic Petri Nets. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Al-Jaar, Robert Y.; Desrochers, Alan A.

    1989-01-01

    The main objective of this research is to develop a generic modeling methodology with a flexible and modular framework to aid in the design and performance evaluation of integrated manufacturing systems using a unified model. After a thorough examination of the available modeling methods, the Petri Net approach was adopted. The concurrent and asynchronous nature of manufacturing systems is easily captured by Petri Net models. Three basic modules were developed: machine, buffer, and Decision Making Unit. The machine and buffer modules are used for modeling transfer lines and production networks. The Decision Making Unit models the functions of a computer node in a complex Decision Making Unit Architecture. The underlying model is a Generalized Stochastic Petri Net (GSPN) that can be used for performance evaluation and structural analysis. GSPNs were chosen because they help manage the complexity of modeling large manufacturing systems. There is no need to enumerate all the possible states of the Markov Chain since they are automatically generated from the GSPN model.

  6. General Analytical Schemes for the Characterization of Pectin-Based Edible Gelled Systems

    PubMed Central

    Haghighi, Maryam; Rezaei, Karamatollah

    2012-01-01

    Pectin-based gelled systems have gained increasing attention for the design of newly developed food products. For this reason, the characterization of such formulas is a necessity in order to present scientific data and to introduce an appropriate finished product to the industry. Various analytical techniques are available for the evaluation of the systems formulated on the basis of pectin and the designed gel. In this paper, general analytical approaches for the characterization of pectin-based gelled systems were categorized into several subsections including physicochemical analysis, visual observation, textural/rheological measurement, microstructural image characterization, and psychorheological evaluation. Three-dimensional trials to assess correlations among microstructure, texture, and taste were also discussed. Practical examples of advanced objective techniques including experimental setups for small and large deformation rheological measurements and microstructural image analysis were presented in more details. PMID:22645484

  7. Towards a general theory of implementation

    PubMed Central

    2013-01-01

    Understanding and evaluating the implementation of complex interventions in practice is an important problem for healthcare managers and policy makers, and for patients and others who must operationalize them beyond formal clinical settings. It has been argued that this work should be founded on theory that provides a foundation for understanding, designing, predicting, and evaluating dynamic implementation processes. This paper sets out core constituents of a general theory of implementation, building on Normalization Process Theory and linking it to key constructs from recent work in sociology and psychology. These are informed by ideas about agency and its expression within social systems and fields, social and cognitive mechanisms, and collective action. This approach unites a number of contending perspectives in a way that makes possible a more comprehensive explanation of the implementation and embedding of new ways of thinking, enacting and organizing practice. PMID:23406398

  8. Towards a general theory of implementation.

    PubMed

    May, Carl

    2013-02-13

    Understanding and evaluating the implementation of complex interventions in practice is an important problem for healthcare managers and policy makers, and for patients and others who must operationalize them beyond formal clinical settings. It has been argued that this work should be founded on theory that provides a foundation for understanding, designing, predicting, and evaluating dynamic implementation processes. This paper sets out core constituents of a general theory of implementation, building on Normalization Process Theory and linking it to key constructs from recent work in sociology and psychology. These are informed by ideas about agency and its expression within social systems and fields, social and cognitive mechanisms, and collective action. This approach unites a number of contending perspectives in a way that makes possible a more comprehensive explanation of the implementation and embedding of new ways of thinking, enacting and organizing practice.

  9. Surveillance Monitoring Management for General Care Units: Strategy, Design, and Implementation.

    PubMed

    McGrath, Susan P; Taenzer, Andreas H; Karon, Nancy; Blike, George

    2016-07-01

    The growing number of monitoring devices, combined with suboptimal patient monitoring and alarm management strategies, has increased "alarm fatigue," which has led to serious consequences. Most reported alarm management approaches have focused on the critical care setting. Since 2007 Dartmouth-Hitchcock (D-H; Lebanon, New Hampshire) has developed a generalizable and effective design, implementation, and performance evaluation approach to alarm systems for continuous monitoring in general care settings (that is, patient surveillance monitoring). In late 2007, a patient surveillance monitoring system was piloted on the basis of a structured design and implementation approach in a 36-bed orthopedics unit. Beginning in early 2009, it was expanded to cover more than 200 inpatient beds in all medicine and surgical units, except for psychiatry and labor and delivery. Improvements in clinical outcomes (reduction of unplanned transfers by 50% and reduction of rescue events by more than 60% in 2008) and approximately two alarms per patient per 12-hour nursing shift in the original pilot unit have been sustained across most D-H general care units in spite of increasing patient acuity and unit occupancy. Sample analysis of pager notifications indicates that more than 85% of all alarm conditions are resolved within 30 seconds and that more than 99% are resolved before escalation is triggered. The D-H surveillance monitoring system employs several important, generalizable features to manage alarms in a general care setting: alarm delays, static thresholds set appropriately for the prevalence of events in this setting, directed alarm annunciation, and policy-driven customization of thresholds to allow clinicians to respond to needs of individual patients. The systematic approach to design, implementation, and performance management has been key to the success of the system.

  10. Compatible Models of Carbon Content of Individual Trees on a Cunninghamia lanceolata Plantation in Fujian Province, China

    PubMed Central

    Zhuo, Lin; Tao, Hong; Wei, Hong; Chengzhen, Wu

    2016-01-01

    We tried to establish compatible carbon content models of individual trees for a Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) plantation from Fujian province in southeast China. In general, compatibility requires that the sum of components equal the whole tree, meaning that the sum of percentages calculated from component equations should equal 100%. Thus, we used multiple approaches to simulate carbon content in boles, branches, foliage leaves, roots and the whole individual trees. The approaches included (i) single optimal fitting (SOF), (ii) nonlinear adjustment in proportion (NAP) and (iii) nonlinear seemingly unrelated regression (NSUR). These approaches were used in combination with variables based on diameter at breast height (D) and tree height (H), such as D, D2H, DH and D&H (where D&H means two separate variables in a bivariate model). Power, exponential, and polynomial functions were tested, and a new general function model was proposed in this study. Weighted least squares regression models were employed to eliminate heteroscedasticity. Model performances were evaluated by using mean residuals, residual variance, mean square error and the determination coefficient. The results indicated that models with two dimensional variables (DH, D2H and D&H) were always superior to those with a single variable (D). The D&H variable combination was found to be the most useful predictor. Of all the approaches, SOF could establish a single optimal model separately, but there were deviations in estimating results due to existing incompatibilities, while NAP and NSUR could ensure predictions compatibility. Simultaneously, we found that the new general model had better accuracy than others. In conclusion, we recommend that the new general model be used to estimate carbon content for Chinese fir and considered for other vegetation types as well. PMID:26982054
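    The compatibility constraint, that component predictions sum exactly to the whole-tree prediction, can be enforced by proportional adjustment, which is the idea behind the NAP approach above. A minimal sketch with made-up component predictions (the actual study adjusts the nonlinear component equations themselves, not just their outputs):

```python
def reconcile_components(components, whole_tree):
    """Rescale component carbon predictions in proportion so that they
    sum exactly to the whole-tree prediction (compatibility constraint)."""
    total = sum(components.values())
    return {name: pred * whole_tree / total
            for name, pred in components.items()}

# Hypothetical predictions (kg C) that sum to 110 against a whole-tree
# prediction of 105; rescaling restores compatibility.
adjusted = reconcile_components(
    {"bole": 52.0, "branch": 18.0, "foliage": 9.0, "root": 31.0}, 105.0)
```

    After adjustment the component percentages sum to 100% by construction, which is what the incompatible SOF fits fail to guarantee.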

  11. Effectiveness evaluation of objective and subjective weighting methods for aquifer vulnerability assessment in urban context

    NASA Astrophysics Data System (ADS)

    Sahoo, Madhumita; Sahoo, Satiprasad; Dhar, Anirban; Pradhan, Biswajeet

    2016-10-01

Groundwater vulnerability assessment is an accepted practice for identifying zones with relatively increased potential for groundwater contamination. DRASTIC is the most popular secondary-information-based vulnerability assessment approach. The original DRASTIC approach considers the relative importance of features/sub-features based on subjective weighting/rating values; however, the variability of features at a smaller scale is not reflected in this subjective assessment process. In contrast, objective weighting-based methods provide flexibility in weight assignment depending on the variation of the local system, but experts' opinion is not directly considered in them. Thus the effectiveness of both subjective and objective weighting-based approaches needs to be evaluated. In the present study, three methods, the entropy information method (E-DRASTIC), the fuzzy pattern recognition method (F-DRASTIC), and single parameter sensitivity analysis (SA-DRASTIC), were used to modify the weights of the original DRASTIC features to include local variability. Moreover, grey incidence analysis was used to evaluate the relative performance of the subjective (DRASTIC and SA-DRASTIC) and objective (E-DRASTIC and F-DRASTIC) weighting-based methods. The methodology was tested in an urban area of Kanpur City, India. The relative performance of the subjective and objective methods varies with the choice of water quality parameters. The methodology can be applied elsewhere, with or without suitable modification; these evaluations establish its potential applicability for general vulnerability assessment in an urban context.
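The entropy information idea behind E-DRASTIC can be sketched in a few lines. This is an illustrative implementation of generic entropy weighting under assumed inputs (a made-up rating matrix), not the paper's exact formulation:

```python
import numpy as np

def entropy_weights(X):
    """Objective weights for the columns (features) of a data matrix X
    using the entropy information method: features whose values vary
    more across locations carry more information and get larger weight."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                      # column-wise proportions
    n = X.shape[0]
    logs = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)    # entropy per feature, in [0, 1]
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()

# Hypothetical rating matrix: rows = grid cells, columns = DRASTIC features.
X = np.array([[5, 3, 7],
              [5, 9, 2],
              [5, 1, 6],
              [5, 7, 4]])
w = entropy_weights(X)
# The first feature is constant across cells, so it receives weight ~0.
```

In a DRASTIC-style index these weights would then multiply the feature ratings in place of the original subjective weights.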

  12. A Methodology for Evaluating and Ranking Water Quantity Indicators in Support of Ecosystem-Based Management

    NASA Astrophysics Data System (ADS)

    James, C. Andrew; Kershner, Jessi; Samhouri, Jameal; O'Neill, Sandra; Levin, Phillip S.

    2012-03-01

Ecosystem-based management (EBM) is an approach that includes different management priorities and requires a balance between anthropogenic and ecological resource demands. Indicators can be used to monitor ecosystem status and trends and to assess whether projects and/or programs are leading to the achievement of management goals. As such, the careful selection of a suite of indicators is a crucial exercise. In this paper we describe an indicator evaluation and selection process designed to support the EBM approach in Puget Sound. The first step in this process was the development of a general framework for selecting indicators. The framework, designed to incorporate both scientific and policy considerations transparently into the selection and evaluation process, was developed and then used to organize and determine a preliminary set of indicators. Next, the indicators were assessed against a set of nineteen distinct criteria that describe the model characteristics of an indicator. A literature review was performed for each indicator to determine the extent to which it satisfied each of the evaluation criteria. The result of each literature review was summarized in a numerical matrix, allowing comparison and demonstrating the extent of scientific reliability. Finally, an approach for ranking indicators was developed to explore the effects of intended purpose on indicator selection. We identified several sets of scientifically valid and policy-relevant indicators, including metrics such as annual 7-day low flow and water system reliability, which support the EBM approach in Puget Sound.

  13. Using Open Geographic Data to Generate Natural Language Descriptions for Hydrological Sensor Networks.

    PubMed

    Molina, Martin; Sanchez-Soriano, Javier; Corcho, Oscar

    2015-07-03

Providing descriptions of isolated sensors and sensor networks in natural language, understandable by the general public, is useful to help users find relevant sensors and analyze sensor data. In this paper, we discuss the feasibility of using geographic knowledge from public databases available on the Web (such as OpenStreetMap, GeoNames, or DBpedia) to automatically construct such descriptions. We present a general method that uses such information to generate sensor descriptions in natural language. The evaluation of our method on a national hydrological sensor network showed that this approach is feasible and capable of generating adequate sensor descriptions with a lower development effort compared to other approaches. In the paper we also analyze certain problems that we found in public databases (e.g., heterogeneity, non-standard use of labels, or rigid search methods) and their impact on the generation of sensor descriptions.

  14. The discipline of ergonomics in Cuba within the occupational health framework: background and trends.

    PubMed

    Torres, Yaniel; Rodríguez, Yordán; Viña, Silvio

    2013-01-01

The concept of ergonomics was introduced in Cuba at the beginning of the 1970s. More than 40 years later, the prevailing approach to workers' health is still generally reactive rather than proactive, despite the government's commitment to the subject. A contributing factor is a general lack of recognition of the benefits of applying ergonomic principles within most occupational activities. Recent efforts to move occupational health practice toward a more preventive approach have been made, frequently with international support. The introduction of a set of Cuban standards proposing the necessity of ergonomic evaluations is an example of this progress. The main challenge for Cuban ergonomists is to transfer knowledge to occupational health practitioners so that practice conforms to basic standards and regulations regarding ergonomics. The article offers a short description of the history of ergonomics and an overview of ergonomics practice in Cuba.

  15. Blood Tests for People with Severe Learning Disabilities Receiving Dental Treatment under General Anaesthesia.

    PubMed

    Clough, Stacey; Shehabi, Zahra; Morgan, Claire; Sheppey, Claire

    2016-11-01

People with learning disabilities (LDs) have poorer health than their non-disabled peers due to failures to make reasonable adjustments. One hundred patients with severe LD and challenging behaviour attended for dental treatment under general anaesthesia (GA), during which routine blood testing was provided. Communication with general medical practitioners (GMPs) and blood test results were evaluated, showing poor communication with GMPs and significant undiagnosed disease among this group. Blood tests generate similar costs in primary and secondary care, but a holistic approach to care under GA reduces the expense of clinical time and resources lost to complex behaviours in an out-patient setting. Clinical relevance: This article discusses a holistic approach to healthcare for people with severe LD, including patient outcomes and financial and resource implications, and offers practical guidance on venepuncture technique, which is relevant to many aspects of both community and hospital dental practice.

  16. Remote sensing-aided systems for snow qualification, evapotranspiration estimation, and their application in hydrologic models

    NASA Technical Reports Server (NTRS)

    Korram, S.

    1977-01-01

The design of general remote sensing-aided methodologies was studied to provide estimates of several important inputs to water yield forecast models: snow area extent, snow water content, and evapotranspiration. The study area is the Feather River Watershed (780,000 hectares) in Northern California. The general approach involved a stepwise sequence of identification of the required information, sample design, measurement/estimation, and evaluation of results. All relevant and available information types needed in the estimation process were defined, including Landsat, meteorological satellite, and aircraft imagery; topographic and geologic data; ground truth data; and climatic data from ground stations. A cost-effective multistage sampling approach was employed to quantify all required parameters. Physical and statistical models for both snow quantification and evapotranspiration estimation were developed; these models use aerial and ground data obtained through an appropriate statistical sampling design.

  17. Using Open Geographic Data to Generate Natural Language Descriptions for Hydrological Sensor Networks

    PubMed Central

    Molina, Martin; Sanchez-Soriano, Javier; Corcho, Oscar

    2015-01-01

Providing descriptions of isolated sensors and sensor networks in natural language, understandable by the general public, is useful to help users find relevant sensors and analyze sensor data. In this paper, we discuss the feasibility of using geographic knowledge from public databases available on the Web (such as OpenStreetMap, GeoNames, or DBpedia) to automatically construct such descriptions. We present a general method that uses such information to generate sensor descriptions in natural language. The evaluation of our method on a national hydrological sensor network showed that this approach is feasible and capable of generating adequate sensor descriptions with a lower development effort compared to other approaches. In the paper we also analyze certain problems that we found in public databases (e.g., heterogeneity, non-standard use of labels, or rigid search methods) and their impact on the generation of sensor descriptions. PMID:26151211

  18. Artificial General Intelligence: Concept, State of the Art, and Future Prospects

    NASA Astrophysics Data System (ADS)

    Goertzel, Ben

    2014-12-01

In recent years a broad community of researchers has emerged, focusing on the original ambitious goals of the AI field: the creation and study of software or hardware systems with general intelligence comparable to, and ultimately perhaps greater than, that of human beings. This paper surveys this diverse community and its progress. Approaches to defining the concept of Artificial General Intelligence (AGI) are reviewed, including mathematical formalisms and engineering- and biology-inspired perspectives. The spectrum of designs for AGI systems includes systems with symbolic, emergentist, hybrid and universalist characteristics. Metrics for general intelligence are evaluated, with the conclusion that, although metrics for assessing the achievement of human-level AGI may be relatively straightforward (e.g. the Turing Test, or a robot that can graduate from elementary school or university), metrics for assessing partial progress remain more controversial and problematic.

  19. Feature extraction with deep neural networks by a generalized discriminant analysis.

    PubMed

    Stuhlsatz, André; Lippel, Jens; Zielke, Thomas

    2012-04-01

We present an approach to feature extraction that is a generalization of the classical linear discriminant analysis (LDA) on the basis of deep neural networks (DNNs). As with LDA, the discriminative features are assumed to be generated from independent Gaussian class conditionals. This modeling has the advantages that the intrinsic dimensionality of the feature space is bounded by the number of classes and that the optimal discriminant function is linear. Unfortunately, linear transformations are insufficient to extract optimal discriminative features from arbitrarily distributed raw measurements. The generalized discriminant analysis (GerDA) proposed in this paper uses nonlinear transformations that are learnt by DNNs in a semisupervised fashion. We show that feature extraction based on our approach displays excellent performance on real-world recognition and detection tasks, such as handwritten digit recognition and face detection. In a series of experiments, we evaluate GerDA features with respect to dimensionality reduction, visualization, classification, and detection. Moreover, we show that GerDA DNNs can preprocess truly high-dimensional input data to low-dimensional representations that facilitate accurate predictions even if simple linear predictors or measures of similarity are used.
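GerDA generalizes the linear LDA map with a DNN-learnt nonlinear transformation. As a point of reference, the linear baseline it generalizes (classical Fisher discriminant features) can be sketched as follows; the synthetic data are invented for illustration:

```python
import numpy as np

def lda_features(X, y, n_components):
    """Classical (linear) discriminant features: project data onto the
    leading eigenvectors of Sw^{-1} Sb (within- vs between-class scatter).
    GerDA replaces this linear map with a nonlinear DNN transformation."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(vals.real)[::-1]
    W = vecs.real[:, order[:n_components]]
    return X @ W

# Two well-separated 2D classes (synthetic illustration).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
Z = lda_features(X, y, n_components=1)   # 1D features that separate the classes
```

Note the dimensionality bound mentioned in the abstract: with K classes, Sb has rank at most K - 1, so at most K - 1 useful discriminant directions exist.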

  20. A score-statistic approach for determining threshold values in QTL mapping.

    PubMed

    Kao, Chen-Hung; Ho, Hsiang-An

    2012-06-01

To date, the threshold values used in QTL mapping have mostly been investigated for backcross and F2 populations, whose genome structures are relatively simple. These issues are generally inadequately investigated for the progeny populations after F2 (advanced populations), which have relatively more complicated genomes. As these advanced populations are now well established in QTL mapping, it is important to address these issues for them in more detail. Owing to the increasing number of meiosis cycles, the genomes of the advanced populations can differ considerably from the backcross and F2 genomes. Therefore, special devices that consider the specific genome structures of the advanced populations are required to resolve these issues. By considering the differences in genome structure between populations, we formulate more general score test statistics and Gaussian processes to evaluate their threshold values. In general, we found that, given a significance level and a genome size, threshold values for QTL detection are higher for denser marker maps and for more advanced populations. Simulations were performed to validate our approach.

  1. Regional frequency analysis of extreme rainfalls using partial L moments method

    NASA Astrophysics Data System (ADS)

    Zakaria, Zahrahtul Amani; Shabri, Ani

    2013-07-01

An approach to regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the methods of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed to determine the best-fit distribution. Comparison of the three approaches showed that the GLO and GEV distributions were the most suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments outperforms the L and LH moments methods for the estimation of large-return-period events.
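The ordinary L-moments that the PL-moments method extends are estimated from probability-weighted moments. A minimal sketch of the first two sample L-moments (not the censored PL-moment estimators themselves) is:

```python
import numpy as np

def sample_l_moments(x):
    """First two sample L-moments via unbiased probability-weighted
    moments b0 and b1: l1 = b0 (L-location), l2 = 2*b1 - b0 (L-scale).
    PL moments modify these estimators by censoring the smallest
    observations; that extension is not shown here."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    b0 = x.mean()
    j = np.arange(1, n + 1)
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    return b0, 2 * b1 - b0

l1, l2 = sample_l_moments([1, 2, 3, 4, 5])   # l1 = 3.0, l2 = 1.0
```

Ratios such as t = l2/l1 (L-CV) and higher-order analogues are what the moment ratio diagram and homogeneity tests mentioned above operate on.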

  2. Applying TOGAF for e-government implementation based on service oriented architecture methodology towards good government governance

    NASA Astrophysics Data System (ADS)

    Hodijah, A.; Sundari, S.; Nugraha, A. C.

    2018-05-01

As a local government agency that performs public services, the General Government Office already operates the Reporting Information System of Local Government Implementation (E-LPPD). However, E-LPPD has upgrade limitations: its integration processes cannot accommodate the General Government Office's needs for achieving Good Government Governance (GGG), and successful e-government implementation ultimately requires good governance practices. Citizens now demand public services of the standard delivered by the private sector, which calls for service innovation that utilizes the legacy system in a service-based e-government implementation. Service Oriented Architecture (SOA) redefines business processes as a set of IT-enabled services, and the enterprise architecture of The Open Group Architecture Framework (TOGAF) offers a comprehensive approach to redefining business processes as service innovation towards GGG. This paper takes as a case study the Performance Evaluation of Local Government Implementation (EKPPD) system at the General Government Office. The results show that TOGAF can guide the development of integrated business processes for the EKPPD system that fit good governance practices to attain GGG, with SOA methodology as the technical approach.

  3. A General Model for Performance Evaluation in DS-CDMA Systems with Variable Spreading Factors

    NASA Astrophysics Data System (ADS)

    Chiaraluce, Franco; Gambi, Ennio; Righi, Giorgia

This paper extends previous analytical approaches for the study of CDMA systems to the relevant case of multipath environments in which users can operate at different bit rates. This scenario is of interest for the wideband CDMA strategy employed in UMTS, and the model permits performance comparison of classic and more innovative spreading signals. The method is based on the characteristic function approach, which makes it possible to model the various kinds of interference accurately. Some numerical examples are given with reference to the ITU-R M.1225 Recommendation, but the analysis could be extended to different channel descriptions.

  4. Message Bus Architectures - Simplicity in the Right Places

    NASA Technical Reports Server (NTRS)

    Smith, Dan

    2010-01-01

There will always be a new latest-and-greatest architecture for satellite ground systems. This paper discusses the use of a proven message-oriented middleware (MOM) architecture using publish/subscribe functions and the strengths it brings to these mission-critical systems. An even newer approach gaining popularity is the Service Oriented Architecture (SOA). SOAs are generally considered more powerful than the MOM approach and address many mission-critical system challenges. A MOM-versus-SOA discussion can highlight capabilities supported or enabled by the underlying architecture and can identify the benefits of MOMs and SOAs when applied to differing sets of mission requirements or evaluation criteria.

  5. Validation of a plant-wide phosphorus modelling approach with minerals precipitation in a full-scale WWTP.

    PubMed

    Kazadi Mbamba, Christian; Flores-Alsina, Xavier; John Batstone, Damien; Tait, Stephan

    2016-09-01

    The focus of modelling in wastewater treatment is shifting from single unit to plant-wide scale. Plant-wide modelling approaches provide opportunities to study the dynamics and interactions of different transformations in water and sludge streams. Towards developing more general and robust simulation tools applicable to a broad range of wastewater engineering problems, this paper evaluates a plant-wide model built with sub-models from the Benchmark Simulation Model No. 2-P (BSM2-P) with an improved/expanded physico-chemical framework (PCF). The PCF includes a simple and validated equilibrium approach describing ion speciation and ion pairing with kinetic multiple minerals precipitation. Model performance is evaluated against data sets from a full-scale wastewater treatment plant, assessing capability to describe water and sludge lines across the treatment process under steady-state operation. With default rate kinetic and stoichiometric parameters, a good general agreement is observed between the full-scale datasets and the simulated results under steady-state conditions. Simulation results show differences between measured and modelled phosphorus as little as 4-15% (relative) throughout the entire plant. Dynamic influent profiles were generated using a calibrated influent generator and were used to study the effect of long-term influent dynamics on plant performance. Model-based analysis shows that minerals precipitation strongly influences composition in the anaerobic digesters, but also impacts on nutrient loading across the entire plant. A forecasted implementation of nutrient recovery by struvite crystallization (model scenario only), reduced the phosphorus content in the treatment plant influent (via centrate recycling) considerably and thus decreased phosphorus in the treated outflow by up to 43%. 
Overall, the evaluated plant-wide model is able to jointly describe the physico-chemical and biological processes, and is advocated for future use as a tool for design, performance evaluation and optimization of whole wastewater treatment plants.

  6. A unified procedure for meta-analytic evaluation of surrogate end points in randomized clinical trials

    PubMed Central

    Dai, James Y.; Hughes, James P.

    2012-01-01

The meta-analytic approach to evaluating surrogate end points assesses how well the treatment effect on the surrogate predicts the treatment effect on the clinical end point, based on multiple clinical trials. Definition and estimation of the correlation of treatment effects were developed in linear mixed models and later extended to binary or failure time outcomes on a case-by-case basis. In a general regression setting that covers nonnormal outcomes, we discuss in this paper several metrics that are useful in the meta-analytic evaluation of surrogacy. We propose a unified 3-step procedure to assess these metrics in settings with binary end points, time-to-event outcomes, or repeated measures. First, the joint distribution of estimated treatment effects is ascertained by an estimating equation approach; second, the restricted maximum likelihood method is used to estimate the means and the variance components of the random treatment effects; finally, confidence intervals are constructed by a parametric bootstrap procedure. The proposed method is evaluated by simulations and applications to 2 clinical trials. PMID:22394448

  7. Evaluating Sleep Disturbance: A Review of Methods

    NASA Technical Reports Server (NTRS)

    Smith, Roy M.; Oyung, R.; Gregory, K.; Miller, D.; Rosekind, M.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

There are three general approaches to evaluating noise-related sleep disturbance: subjective, behavioral, and physiological. Subjective methods range from standardized questionnaires and scales to self-report measures designed for specific research questions. There are two behavioral methods that provide useful sleep disturbance data. One is actigraphy: a motion detector, worn on the non-dominant wrist, that provides a 24-hr estimate of the rest/activity cycle and thus an empirical estimate of sleep quantity and quality. The other involves a behavioral response, either to a specific probe or stimulus or subject-initiated (e.g., indicating wakefulness). The classic gold standard for evaluating sleep disturbance is continuous physiological monitoring of brain, eye, and muscle activity, which allows detailed distinctions of the states and stages of sleep, awakenings, and sleep continuity. Physiological data can be obtained in controlled laboratory settings and in natural environments; current ambulatory physiological recording equipment allows evaluation in home and work settings. These approaches are described and the relative strengths and limitations of each method are discussed.

  8. Surgery via natural orifices in human beings: yesterday, today, tomorrow.

    PubMed

    Moris, Demetrios N; Bramis, Konstantinos J; Mantonakis, Eleftherios I; Papalampros, Efstathios L; Petrou, Athanasios S; Papalampros, Alexandros E

    2012-07-01

We performed an evaluation of the models, techniques, and clinical applicability of natural orifice surgery (mainly natural orifice transluminal endoscopic surgery [NOTES]), primarily in general surgery procedures. NOTES has attracted much attention recently for its potential to establish a completely alternative approach to traditional surgical procedures, performed entirely through a natural orifice. Beyond potentially scar-free surgery and the elimination of dermal incision-related complications, the safety and efficacy of this new surgical technology must be evaluated. Studies were identified by searching MEDLINE, EMBASE, the Cochrane Library, and Entrez PubMed from 2007 to February 2011; most of the references were identified from 2009 to 2010. The population evaluated was limited to human beings (no cadavers or animals), but no limitations were placed on the level of evidence of the studies evaluated. The studies deemed applicable for our review were published mainly from 2007 to 2010 (see Methods section). We studied the orifices most commonly referred to in the literature: vaginal, oral, gastric, esophageal, anal, and urethral. The optimal access route and method could not be established because of the different nature of each procedure. We mainly studied procedures in the field of general surgery, such as cholecystectomy, intestinal and renal cancer procedures, appendectomy, mediastinoscopy, and peritoneoscopy. All procedures were feasible, and most had an uneventful postoperative course. A number of technical problems were encountered, especially in pure NOTES procedures, making the need to develop new endoscopic instruments to facilitate each approach undeniable.
NOTES is still in the early stages of development, and more robust technologies will be needed to achieve reliable closure and overcome technical challenges. Well-designed studies in human beings need to be conducted to determine the safety and efficacy of NOTES in a clinical setting. Among the NOTES approaches, the transvaginal route seems least complicated because it virtually eliminates concerns about leakage and fistulas. The transvaginal approach further favors upper-abdominal surgeries because it provides better maneuverability to upper-abdominal organs (e.g., liver, gallbladder, spleen, abdominal esophagus, and stomach). The stomach is considered one of the most promising targets because this large organ, once adequately mobilized, can be transected easily with a stapler. The majority of the approaches seem feasible even with the equipment available today, but achieving better results and wider application in human beings will require the development of new endoscopic instruments to facilitate each approach.

  9. Algorithm for evaluating the effectiveness of a high-rise development project based on current yield

    NASA Astrophysics Data System (ADS)

    Soboleva, Elena

    2018-03-01

The article addresses the operational evaluation of development project efficiency in high-rise construction under current economic conditions in Russia. The author discusses the following issues: problems of implementing development projects; the influence of the quality of operational evaluation of high-rise construction projects on overall efficiency; assessment of the influence of a project's external environment on the effectiveness of project activities under crisis conditions; and the quality of project management. The article proposes an algorithm and a methodological approach to quality management of developer project efficiency based on operational evaluation of current yield. The methodology for calculating the current efficiency of a development project for high-rise construction has been updated.

  10. Evaluation of Genetic Algorithm Concepts Using Model Problems. Part 2; Multi-Objective Optimization

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Pulliam, Thomas H.

    2003-01-01

A genetic algorithm (GA) approach suitable for solving multi-objective optimization problems is described and evaluated using a series of simple model problems. Several new features, including a binning selection algorithm and a gene-space transformation procedure, are included. The genetic algorithm is suitable for finding Pareto-optimal solutions in search spaces that are defined by any number of genes and that contain any number of local extrema. Results indicate that the genetic algorithm optimization approach is flexible in application and extremely reliable, providing optimal results for all optimization problems attempted. The binning algorithm generally provides Pareto front quality enhancements and moderate convergence efficiency improvements for most of the model problems. The gene-space transformation procedure provides a large convergence efficiency enhancement for problems with non-convoluted Pareto fronts and a degradation in efficiency for problems with convoluted Pareto fronts. The most difficult problems (multi-mode search spaces with a large number of genes and convoluted Pareto fronts) require a large number of function evaluations for GA convergence, but always converge.
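The Pareto-optimality test at the heart of any multi-objective GA can be sketched as a simple non-dominated filter. This illustrates the concept only (assuming minimization and distinct points), not the paper's binning selection algorithm:

```python
def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors
    (minimization). A point is dominated if another point is no worse
    in every objective and differs from it (distinct points assumed)."""
    front = []
    for p in points:
        dominated = any(
            all(qi <= pi for qi, pi in zip(q, p)) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
front = pareto_front(pts)   # (3, 3) and (5, 5) are dominated by (2, 2)
```

In a multi-objective GA, selection pressure pushes the population toward this front, and devices such as binning help keep the front well spread.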

  11. Space shuttle post-entry and landing analysis. Volume 1: Candidate system evaluations

    NASA Technical Reports Server (NTRS)

    Crawford, B. S.; Duiven, E. M.

    1973-01-01

The general purpose of this study is to aid in the evaluation and design of multi-sensor navigation schemes proposed for the orbiter. The scope of the effort is limited to the post-entry, energy management, and approach and landing mission phases. One candidate system based on conventional navigation aids is illustrated, including two DME (Distance Measuring Equipment) stations and ILS (Instrument Landing System) glide slope and localizer antennas. Key elements of the system not shown are the onboard IMUs (Inertial Measurement Units), altimeters, and a computer; the latter is programmed to mix together (filter) the IMU data and the externally derived data. A completely automatic, all-weather landing capability is required. Since no air-breathing engines will be carried on orbital flights, there will be no chance to go around and try again following a missed approach.

  12. Force and Moment Approach for Achievable Dynamics Using Nonlinear Dynamic Inversion

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Bacon, Barton J.

    1999-01-01

    This paper describes a general form of nonlinear dynamic inversion control for use in a generic nonlinear simulation to evaluate candidate augmented aircraft dynamics. The implementation is specifically tailored to the task of quickly assessing an aircraft's control power requirements and defining the achievable dynamic set. The achievable set is evaluated while undergoing complex mission maneuvers, and perfect tracking will be accomplished when the desired dynamics are achievable. Variables are extracted directly from the simulation model each iteration, so robustness is not an issue. Included in this paper is a description of the implementation of the forces and moments from simulation variables, the calculation of control effectiveness coefficients, methods for implementing different types of aerodynamic and thrust vectoring controls, adjustments for control effector failures, and the allocation approach used. A few examples illustrate the perfect tracking results obtained.
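The core inversion step described above can be illustrated with a small sketch. This is a generic hypothetical example (the matrices and the pseudo-inverse allocation are assumptions, not the paper's allocation approach): given the current uncontrolled accelerations f(x), a control-effectiveness matrix B, and the desired dynamics v, solve B·u = v − f(x) for the controls u:

```python
import numpy as np

def dynamic_inversion(f_x, B, v_desired):
    """One step of nonlinear dynamic inversion: solve B u = v - f(x)
    for the control vector u. A Moore-Penrose pseudo-inverse stands in
    for a full control-allocation scheme when effectors are redundant."""
    return np.linalg.pinv(B) @ (v_desired - f_x)

# Hypothetical 2-axis example with 3 effectors (redundant controls).
f_x = np.array([0.1, -0.2])            # accelerations with controls at zero
B = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0]])        # control-effectiveness coefficients
v = np.array([1.0, 0.5])               # desired accelerations
u = dynamic_inversion(f_x, B, v)
# Perfect tracking check: B @ u + f_x reproduces the desired dynamics.
```

When the commanded u exceeds effector limits, the desired dynamics are not achievable, which is exactly the condition the paper's assessment procedure is designed to expose.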

  13. Industrial applications study. Volume V. Bibliography of relevant literature. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Harry L.; Hamel, Bernard B.; Karamchetty, Som

    1976-12-01

This five-volume report represents an initial Phase 0 evaluation of waste heat recovery and utilization potential in the manufacturing portion of the industrial sector. The scope of this initial phase was limited to the two-digit SIC level and addressed the feasibility of obtaining in-depth energy information in the industrial sector. Within this phase, a successful methodology and approaches for data gathering and assessment were established. Using these approaches, energy use and waste heat profiles were developed at the 2-digit level; with these data, waste heat utilization technologies were evaluated. The first section of the bibliography lists extensive citations for all industries. The next section comprises an extensive literature search, with abstracts, on industrial energy conservation. EPA publications on specific industries and general references conclude the publication. (MCW)

  14. Flight Experiment Investigation of General Aviation Self-Separation and Sequencing Tasks

    NASA Technical Reports Server (NTRS)

    Murdoch, Jennifer L.; Ramiscal, Ermin R.; McNabb, Jennifer L.; Bussink, Frank J. L.

    2005-01-01

    A new flight operations concept called Small Aircraft Transportation System (SATS) Higher Volume Operations (HVO) was developed to increase capacity during Instrument Meteorological Conditions (IMC) at non-towered, non-radar airports by enabling concurrent operations of multiple aircraft. One aspect of this concept involves having pilots safely self-separate from other aircraft during approaches into these airports using appropriate SATS HVO procedures. A flight experiment was conducted to determine if instrument-rated general aviation (GA) pilots could self-separate and sequence their ownship aircraft, while following a simulated aircraft, into a non-towered, non-radar airport during simulated IMC. Six GA pilots' workload levels and abilities to perform self-separation and sequencing procedures while flying a global positioning system (GPS) instrument approach procedure were examined. The results showed that the evaluation pilots maintained at least the minimum specified separation between their ownship aircraft and simulated traffic and maintained their assigned landing sequence 100 percent of the time. Neither flight path deviations nor subjective workload assessments were negatively impacted by the additional tasks of self-separating and sequencing during these instrument approaches.

  15. Pattern recognition tool based on complex network-based approach

    NASA Astrophysics Data System (ADS)

    Casanova, Dalcimar; Backes, André Ricardo; Martinez Bruno, Odemir

    2013-02-01

    This work proposes a generalization of the method introduced by the authors in 'A complex network-based approach for boundary shape analysis'. Instead of modelling a contour as a graph and using complex network rules to characterize it, the technique is generalized into a mathematical tool for characterizing signals, curves and sets of points. To evaluate the descriptive power of the proposal, an experiment on plant identification based on leaf vein images was conducted. Leaf venation is a taxonomic characteristic used for plant identification, and these structures are complex and difficult to represent as signals or curves for analysis with classical pattern recognition approaches. Here, the veins are modelled as a set of points and represented as graphs, using degree and joint degree measurements in a dynamic evolution as features. The results demonstrate that the technique has good discriminative power and can be used for plant identification, as well as for other complex pattern recognition tasks.

  16. Reliability and Validity in Hospital Case-Mix Measurement

    PubMed Central

    Pettengill, Julian; Vertrees, James

    1982-01-01

    There is widespread interest in the development of a measure of hospital output. This paper describes the problem of measuring the expected cost of the mix of inpatient cases treated in a hospital (hospital case-mix) and a general approach to its solution. The solution is based on a set of homogeneous groups of patients, defined by a patient classification system, and a set of estimated relative cost weights corresponding to the patient categories. This approach is applied to develop a summary measure of the expected relative costliness of the mix of Medicare patients treated in 5,576 participating hospitals. The Medicare case-mix index is evaluated by estimating a hospital average cost function. This provides a direct test of the hypothesis that the relationship between Medicare case-mix and Medicare cost per case is proportional. The cost function analysis also provides a means of simulating the effects of classification error on our estimate of this relationship. Our results indicate that this general approach to measuring hospital case-mix provides a valid and robust measure of the expected cost of a hospital's case-mix. PMID:10309909
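
The case-mix construction described in this abstract can be illustrated with a minimal sketch (function names and data are hypothetical, not taken from the paper): relative cost weights are estimated per patient category as category mean cost over overall mean cost, and a hospital's index is the average weight of its caseload.

```python
from collections import defaultdict

def relative_cost_weights(cases):
    """cases: list of (category, cost) pairs across all hospitals.
    Weight for a category = its mean cost / overall mean cost."""
    by_cat = defaultdict(list)
    for cat, cost in cases:
        by_cat[cat].append(cost)
    overall_mean = sum(c for _, c in cases) / len(cases)
    return {cat: (sum(v) / len(v)) / overall_mean for cat, v in by_cat.items()}

def case_mix_index(hospital_cases, weights):
    """Expected relative costliness of one hospital's caseload:
    the average category weight over its treated cases."""
    return sum(weights[cat] for cat in hospital_cases) / len(hospital_cases)
```

An index of 1.0 means a hospital's mix is expected to cost the national average per case; values above 1.0 indicate a costlier-than-average mix.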

  17. Cancer risk assessment of polycyclic aromatic hydrocarbon contaminated soils determined using bioassay-derived levels of benzo[a]pyrene equivalents.

    PubMed

    Lemieux, Christine L; Long, Alexandra S; Lambert, Iain B; Lundstedt, Staffan; Tysklind, Mats; White, Paul A

    2015-02-03

    Here we evaluate the excess lifetime cancer risk (ELCR) posed by 10 PAH-contaminated soils using (i) the currently advocated, targeted chemical-specific approach that assumes dose additivity for carcinogenic PAHs and (ii) a bioassay-based approach that employs the in vitro mutagenic activity of the soil fractions to determine levels of benzo[a]pyrene equivalents and, by extension, ELCR. Mutagenic activity results are presented in our companion paper [1]. The results show that ELCR values for the PAH-containing fractions, determined using the chemical-specific approach, are generally (i.e., 8 out of 10) greater than those calculated using the bioassay-based approach; most are less than 5-fold greater. Only two chemical-specific ELCR estimates are less than their corresponding bioassay-derived values; differences are less than 10%. The bioassay-based approach, which permits estimation of ELCR without a priori knowledge of mixture composition, proved to be a useful tool to evaluate the chemical-specific approach. The results suggest that ELCR estimates for complex PAH mixtures determined using a targeted, chemical-specific approach are reasonable, albeit conservative. Calculated risk estimates still depend on contentious potency equivalency factors (PEFs) and cancer slope factors. Follow-up in vivo mutagenicity assessments will be required to validate the results and their relevance for human health risk assessment of PAH-contaminated soils.
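
The dose-additive, chemical-specific approach the authors compare against can be sketched generically as below. The PEF and slope-factor values in the example are illustrative placeholders, not the regulatory values used in the paper, and the intake term is deliberately simplified to a single chronic daily dose.

```python
def bap_equivalents(concentrations, pefs):
    """Dose-additive benzo[a]pyrene equivalents: sum of each PAH's
    concentration times its potency equivalency factor (PEF) relative
    to benzo[a]pyrene."""
    return sum(concentrations[pah] * pefs[pah] for pah in concentrations)

def excess_lifetime_cancer_risk(bap_eq, intake_mg_per_kg_day, slope_factor):
    """Simplified ELCR: chronic daily B[a]P-equivalent dose (mg/kg-day)
    times an oral cancer slope factor (per mg/kg-day)."""
    return bap_eq * intake_mg_per_kg_day * slope_factor
```

The bioassay-based alternative in the paper replaces the PEF-weighted sum with a B[a]P-equivalent level derived from measured mutagenic activity, then applies the same risk arithmetic.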

  18. "iBIM"--internet-based interactive modules: an easy and interesting learning tool for general surgery residents.

    PubMed

    Azer, Nader; Shi, Xinzhe; de Gara, Chris; Karmali, Shahzeer; Birch, Daniel W

    2014-04-01

    The increased use of information technology supports a resident-centred educational approach that promotes autonomy, flexibility and time management and helps residents to assess their competence, promoting self-awareness. We established a web-based e-learning tool to introduce general surgery residents to bariatric surgery and evaluate them to determine the most appropriate implementation strategy for Internet-based interactive modules (iBIM) in surgical teaching. Usernames and passwords were assigned to general surgery residents at the University of Alberta. They were directed to the Obesity101 website and prompted to complete a multiple-choice precourse test. Afterwards, they were able to access the interactive modules. Residents could review the course material as often as they wanted before completing a multiple-choice postcourse test and exit survey. We used paired t tests to assess the difference between pre- and postcourse scores. Of the 34 residents who agreed to participate in the project, 12 completed it (35.3%). For these 12 residents, the precourse mean score was 50 ± 17.3 and the postcourse mean score was 67 ± 14 (p = 0.020). Most residents who participated in this study recommended using the iBIMs as a study tool for bariatric surgery. Course evaluation scores suggest this novel approach was successful in transferring knowledge to surgical trainees. Further development of this tool and assessment of implementation strategies will determine how iBIM in bariatric surgery may be integrated into the curriculum.
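
The pre/post comparison reported here is a paired t test. A minimal sketch of the statistic (the sample scores below are hypothetical, not the study's data):

```python
import math

def paired_t(pre, post):
    """Paired t statistic for pre/post scores: t = mean(d) / (sd(d)/sqrt(n)),
    where d = post - pre and sd uses the sample (n-1) denominator."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)
    return mean_d / math.sqrt(var_d / n)
```

The resulting t is compared against a t distribution with n-1 degrees of freedom to obtain the p-value reported in the abstract.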

  19. A predictive modeling approach to increasing the economic effectiveness of disease management programs.

    PubMed

    Bayerstadler, Andreas; Benstetter, Franz; Heumann, Christian; Winter, Fabian

    2014-09-01

    Predictive Modeling (PM) techniques are gaining importance in the worldwide health insurance business. Modern PM methods are used for customer relationship management, risk evaluation or medical management. This article illustrates a PM approach that enables the economic potential of (cost-) effective disease management programs (DMPs) to be fully exploited by optimized candidate selection as an example of successful data-driven business management. The approach is based on a Generalized Linear Model (GLM) that is easy to apply for health insurance companies. By means of a small portfolio from an emerging country, we show that our GLM approach is stable compared to more sophisticated regression techniques in spite of the difficult data environment. Additionally, we demonstrate for this example of a setting that our model can compete with the expensive solutions offered by professional PM vendors and outperforms non-predictive standard approaches for DMP selection commonly used in the market.
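
A GLM-based candidate-selection step of the kind described might look like the following sketch. The log link, coefficient values, feature names and the "enrol the costliest" ranking rule are illustrative assumptions, not the authors' fitted model.

```python
import math

def expected_cost(features, coefs, intercept):
    """GLM with log link: E[cost] = exp(b0 + sum_k b_k * x_k)."""
    eta = intercept + sum(coefs[k] * features[k] for k in coefs)
    return math.exp(eta)

def select_candidates(members, coefs, intercept, top_n):
    """Rank insured members by predicted cost and select the costliest
    top_n for enrolment in a disease management program."""
    ranked = sorted(members,
                    key=lambda m: expected_cost(m["x"], coefs, intercept),
                    reverse=True)
    return [m["id"] for m in ranked[:top_n]]
```

In practice the coefficients would be estimated from claims history; the point of the abstract is that even this simple GLM structure can outperform non-predictive selection rules.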

  20. Assessing college students' attitudes toward responsible drinking messages to identify promising binge drinking intervention strategies.

    PubMed

    Pilling, Valerie K; Brannon, Laura A

    2007-01-01

    Health communication appeals were utilized through a Web site simulation to evaluate the potential effectiveness of 3 intervention approaches to promote responsible drinking among college students. Within the Web site simulation, participants were exposed to a persuasive message designed to represent either the generalized social norms advertising approach (based on others' behavior), the personalized behavioral feedback approach (tailored to the individual's behavior), or the schema-based approach (tailored to the individual's self-schema, or personality). A control group was exposed to a message that was designed to be neutral (it was designed to discourage heavy drinking, but it did not represent any of the previously mentioned approaches). It was hypothesized that the more personalized the message was to the individual, the more favorable college students' attitudes would be toward the responsible drinking message. Participants receiving the more personalized messages did report more favorable attitudes toward the responsible drinking message.

  1. DNA enrichment approaches to identify unauthorized genetically modified organisms (GMOs).

    PubMed

    Arulandhu, Alfred J; van Dijk, Jeroen P; Dobnik, David; Holst-Jensen, Arne; Shi, Jianxin; Zel, Jana; Kok, Esther J

    2016-07-01

    With the increased global production of different genetically modified (GM) plant varieties, chances increase that unauthorized GM organisms (UGMOs) may enter the food chain. At the same time, the detection of UGMOs is a challenging task because of the limited sequence information that will generally be available. PCR-based methods are available to detect and quantify known UGMOs in specific cases. If this approach is not feasible, DNA enrichment of the unknown adjacent sequences of known GMO elements is one way to detect the presence of UGMOs in a food or feed product. These enrichment approaches are also known as chromosome walking or gene walking (GW). In recent years, enrichment approaches have been coupled with next generation sequencing (NGS) analysis and implemented in, amongst others, the medical and microbiological fields. The present review will provide an overview of these approaches and an evaluation of their applicability in the identification of UGMOs in complex food or feed samples.

  2. A methodology for computing uncertainty bounds of multivariable systems based on sector stability theory concepts

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.

    1992-01-01

    The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and of the sector-based approach is presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the model can help to reduce this conservatism, however, and the solution approach has some degrees of freedom that may be further exploited to reduce the conservatism.

  3. Design of a general methodology for the evaluation and categorization of an environmental program with special reference to Costa Rica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castillo, H.

    1982-01-01

    The Government of Costa Rica has stated the need for a formal procedure for the evaluation and categorization of an environmental program. Methodological studies were prepared as the basis for the development of the general methodology, which each government or institution can adapt and implement. The methodology was established by using different techniques according to their contribution to the evaluation process, such as the Systemic Approach and the Delphi and Saaty methods. The methodology consists of two main parts: 1) evaluation of the environmental aspects by using different techniques; 2) categorization of the environmental aspects by applying the methodology to Costa Rican environmental affairs, using questionnaire answers supplied by experts both inside and outside of the country. The second part of the research includes appendixes presenting general information concerning institutions related to environmental affairs; a description of the methods used; results of the current status evaluation and its scale; the final scale of categorization; and the questionnaires and a list of experts. The methodology developed in this research will have a beneficial impact on environmental concerns in Costa Rica. As a result of this research, a Commission Office of Environmental Affairs, providing links between consumers, engineers, scientists, and the Government, is recommended. There is also significant potential for using this methodology in developed countries for better balancing the budgets of major research programs such as cancer, heart, and other research areas.

  4. Performance and Perception in the Flipped Learning Model: An Initial Approach to Evaluate the Effectiveness of a New Teaching Methodology in a General Science Classroom

    ERIC Educational Resources Information Center

    González-Gómez, David; Jeong, Jin Su; Airado Rodríguez, Diego; Cañada-Cañada, Florentina

    2016-01-01

    "Flipped classroom" teaching methodology is a type of blended learning in which the traditional class setting is inverted. Lecture is shifted outside of class, while the classroom time is employed to solve problems or doing practical works through the discussion/peer collaboration of students and instructors. This relatively new…

  5. Training Adults and Children with an Autism Spectrum Disorder to Be Compliant with a Clinical Dental Assessment Using a TEACCH-Based Approach

    ERIC Educational Resources Information Center

    Orellana, Lorena M.; Martínez-Sanchis, Sonia; Silvestre, Francisco J.

    2014-01-01

    The specific neuropsychological and sensory profile found in persons with autism spectrum disorders complicate dental procedures and as a result of this, most are treated under general anesthesia or unnecessary sedation. The main goal of the present study was to evaluate the effectiveness of a short treatment and education of autistic and related…

  6. A System Approach to Navy Medical Education and Training. Appendix 29. Competency Curriculum for Advanced General Duty Corpsman.

    DTIC Science & Technology

    1974-08-31

    c. Perform qualitative tests for fecal fat, bilirubin, urobilirubin and starch granules by staining methods. PERFORMANCE OBJECTIVE f (Stimulus): Upon... Cardiovascular: myocardial infarction, pulmonary embolism. Gastrointestinal: gastroenteritis, stomatitis, appendicitis, ulcer, gastritis, intestinal obstruction, cholecystitis... fractured tooth; j. Reinsert temporary crown; k. Treat dry socket, cellulitis, gingivitis, etc.; l. Evaluate patient's progress/response to therapeutic regimen...

  7. The Role of the European Inspections in the European Educational Space--Echoes from Portugal Regarding the Assessment of Schools

    ERIC Educational Resources Information Center

    Costa, Estela; Pires, Ana Márcia

    2011-01-01

    This paper is an approach to the construction of a European educational space (Nóvoa & Lawn, 2002), which is due to new modes of regulation in education. The policy under consideration is the institutional evaluation of schools carried out by the Portuguese General Inspectorate of Education. The aim is to explore how concepts and policies get…

  8. Disappearing Act: Persistence and Attrition of Uniform Resource Locators (URLs) in an Open Access Medical Journal

    ERIC Educational Resources Information Center

    Nagaraja, Aragudige; Joseph, Shine A.; Polen, Hyla H.; Clauson, Kevin A.

    2011-01-01

    Purpose: The aim of this paper is to assess and catalogue the magnitude of URL attrition in a high-impact, open access (OA) general medical journal. Design/methodology/approach: All "Public Library of Science Medicine (PLoS Medicine)" articles for 2005-2007 were evaluated and the following items were assessed: number of entries per issue; type of…

  9. Development of an hypothesis for simulating anti-orthostatic bed rest

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.; Grounds, D. J.; Fitzjerrell, D. G.

    1978-01-01

    The Guyton model, modified by the addition of leg compartments and the effect of the gravity vector, was used to evaluate hypotheses describing leg dehydration and fluid shifts. While the study is not complete, the basic approach was shown to be useful by identifying important mechanisms, identifying systems which need further experimental description and by assisting in the development of a general hypothesis.

  10. Section V: Conclusions

    DTIC Science & Technology

    2001-06-01

    ...aeromedical issues. The section concludes with a sample review of a candidate medication, in this case losartan potassium, with a possible approach to... Medications such as losartan appear to hold promise for use in aircrew; allergic rhinitis is a very common condition and generally compatible with military aircrew... Evaluating the aeromedical suitability of H1 antihistamines for this condition in aircrew led our group to designate losartan as a...

  11. Weight of evidence approaches for the identification of endocrine disrupting properties of chemicals: Review and recommendations for EU regulatory application.

    PubMed

    Gross, Melanie; Green, Richard M; Weltje, Lennart; Wheeler, James R

    2017-12-01

    A weight-of-evidence (WoE) evaluation should be applied in assessing all the available data for the identification of endocrine disrupting (ED) properties of chemicals. The European Commission draft acts specifying criteria under the biocidal products and plant protection products regulations require that WoE is implemented for the assessment of such products. However, only some general considerations and principles of how a WoE should be conducted are provided. This paper reviews WoE approaches to distil key recommendations specifically for the evaluation of potential ED properties of chemicals. In a manner consistent with existing, published WoE frameworks, the WoE evaluation of ED properties can be divided into four phases: 1) definition of causal questions and data gathering and selection; 2) review of individual studies; 3) data integration and evaluation; and 4) drawing conclusions based on inferences. Recommendations are made on how to conduct each phase robustly and transparently to help guide the WoE evaluation of potential endocrine disrupting properties of chemicals within a European regulatory context. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  12. Measuring the impact of alcohol-related disorders on quality of life through general population preferences.

    PubMed

    Rodríguez-Míguez, Eva; Mosquera Nogueira, Jacinto

    To estimate the intangible effects of alcohol misuse on the drinker's quality of life, based on general population preferences. The most important effects (dimensions) were identified by means of two focus groups conducted with patients and specialists. The levels of these dimensions were combined to yield different scenarios. A sample of 300 people taken from the general Spanish population evaluated a subset of these scenarios, selected by using a fractional factorial design. We used the probability lottery equivalent method to derive the utility score for the evaluated scenarios, and a random-effects regression model to estimate the relative importance of each dimension and to derive the utility score for the remaining scenarios not directly evaluated. Four main dimensions were identified (family, physical health, psychological health and social) and divided into three levels of intensity. We found a wide variation in the utilities associated with the scenarios directly evaluated (ranging from 0.09 to 0.78). The dimensions with the greatest relative importance were physical health (36.4%) and family consequences (31.3%), followed by psychological (20.5%) and social consequences (11.8%). Our findings confirm the benefits of adopting a heterogeneous approach to measuring the effects of alcohol misuse. The estimated utilities could have both clinical and economic applications. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  13. Channelized relevance vector machine as a numerical observer for cardiac perfusion defect detection task

    NASA Astrophysics Data System (ADS)

    Kalayeh, Mahdi M.; Marin, Thibault; Pretorius, P. Hendrik; Wernick, Miles N.; Yang, Yongyi; Brankov, Jovan G.

    2011-03-01

    In this paper, we present a numerical observer for image quality assessment, aiming to predict human observer accuracy in a cardiac perfusion defect detection task for single-photon emission computed tomography (SPECT). In medical imaging, image quality should be assessed by evaluating the human observer accuracy for a specific diagnostic task. This approach is known as task-based assessment. Such evaluations are important for optimizing and testing imaging devices and algorithms. Unfortunately, human observer studies with expert readers are costly and time-consuming. To address this problem, numerical observers have been developed as a surrogate for human readers to predict human diagnostic performance. The channelized Hotelling observer (CHO) with internal noise model has been found to predict human performance well in some situations, but does not always generalize well to unseen data. We have argued in the past that finding a model to predict human observers could be viewed as a machine learning problem. Following this approach, in this paper we propose a channelized relevance vector machine (CRVM) to predict human diagnostic scores in a detection task. We have previously used channelized support vector machines (CSVM) to predict human scores and have shown that this approach offers better and more robust predictions than the classical CHO method. The comparison of the proposed CRVM with our previously introduced CSVM method suggests that CRVM can achieve similar generalization accuracy, while dramatically reducing model complexity and computation time.

  14. Defining, illustrating and reflecting on logic analysis with an example from a professional development program.

    PubMed

    Tremblay, Marie-Claude; Brousselle, Astrid; Richard, Lucie; Beaudet, Nicole

    2013-10-01

    Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Machine Learning Approach to Extract Diagnostic and Prognostic Thresholds: Application in Prognosis of Cardiovascular Mortality

    PubMed Central

    Mena, Luis J.; Orozco, Eber E.; Felix, Vanessa G.; Ostos, Rodolfo; Melgarejo, Jesus; Maestre, Gladys E.

    2012-01-01

    Machine learning has become a powerful tool for analysing medical domains, assessing the importance of clinical parameters, and extracting medical knowledge for outcomes research. In this paper, we present a machine learning method for extracting diagnostic and prognostic thresholds, based on a symbolic classification algorithm called REMED. We evaluated the performance of our method by determining new prognostic thresholds for well-known and potential cardiovascular risk factors that are used to support medical decisions in the prognosis of fatal cardiovascular diseases. Our approach predicted 36% of cardiovascular deaths with 80% specificity and 75% general accuracy. The new method provides an innovative approach that might be useful to support decisions about medical diagnoses and prognoses. PMID:22924062
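
REMED itself is a symbolic classification algorithm, but the operating point reported above (80% specificity, 75% general accuracy) implies a threshold-selection step that can be sketched generically. The selection rule and data below are hypothetical, not REMED's actual procedure.

```python
def pick_threshold(scores, labels, min_specificity=0.80):
    """Among candidate cut-offs (predict positive when score >= t), return the
    lowest threshold whose specificity meets the target, together with its
    sensitivity, specificity and overall accuracy. labels: 1 = event, 0 = no event."""
    neg = [s for s, y in zip(scores, labels) if y == 0]
    pos = [s for s, y in zip(scores, labels) if y == 1]
    for t in sorted(set(scores)):
        specificity = sum(s < t for s in neg) / len(neg)
        if specificity >= min_specificity:
            sensitivity = sum(s >= t for s in pos) / len(pos)
            correct = sum(s < t for s in neg) + sum(s >= t for s in pos)
            return t, sensitivity, specificity, correct / len(scores)
    return None
```

Fixing a specificity floor before reading off sensitivity mirrors how prognostic thresholds are typically reported: false alarms are constrained first, then detection rate is measured at that operating point.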

  16. On the application of multilevel modeling in environmental and ecological studies

    USGS Publications Warehouse

    Qian, Song S.; Cuffney, Thomas F.; Alameddine, Ibrahim; McMahon, Gerard; Reckhow, Kenneth H.

    2010-01-01

    This paper illustrates the advantages of a multilevel/hierarchical approach for predictive modeling, including flexibility of model formulation, explicitly accounting for hierarchical structure in the data, and the ability to predict the outcome of new cases. As a generalization of the classical approach, the multilevel modeling approach explicitly models the hierarchical structure in the data by considering both the within- and between-group variances leading to a partial pooling of data across all levels in the hierarchy. The modeling framework provides means for incorporating variables at different spatiotemporal scales. The examples used in this paper illustrate the iterative process of model fitting and evaluation, a process that can lead to improved understanding of the system being studied.
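
The partial pooling described above can be illustrated with the classical one-way shrinkage estimator. This is a simplified sketch that takes the within-group and between-group variance components as given, whereas a real multilevel fit would estimate them from the data.

```python
def partial_pool(group_values, sigma2_within, tau2_between):
    """One-way partial pooling: each group mean is shrunk toward the grand mean
    with a weight that depends on within-group (sigma^2/n_j) and between-group
    (tau^2) variance -- small groups borrow more strength from the pooled mean."""
    grand = sum(v for g in group_values.values() for v in g) / sum(
        len(g) for g in group_values.values())
    pooled = {}
    for name, vals in group_values.items():
        n = len(vals)
        group_mean = sum(vals) / n
        w = tau2_between / (tau2_between + sigma2_within / n)  # shrinkage weight
        pooled[name] = w * group_mean + (1 - w) * grand
    return pooled
```

As tau^2 grows (groups genuinely differ), estimates approach the unpooled group means; as it shrinks toward zero, all groups collapse to the complete-pooling grand mean.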

  17. Team approach to treatment of the posttraumatic stiff hand. A case report.

    PubMed

    Morey, K R; Watson, A H

    1986-02-01

    Posttraumatic hand stiffness is a common but complex problem treated in many general clinics and in hand treatment centers. Although much information is available regarding various treatment procedures, the use of a team approach to evaluate and treat hand stiffness has not been examined thoroughly in the Journal. The problems of the patient with a stiff hand include both physical and psychological components that must be addressed in a structured manner. The clinical picture of posttraumatic hand stiffness involves edema, immobility, pain, and the inability to incorporate the affected extremity into daily activities. In this case report, we review the purpose and philosophy of the team approach to hand therapy and the clarification of responsibilities for physical therapy and occupational therapy intervention.

  18. Assuring safety without animal testing: Unilever's ongoing research programme to deliver novel ways to assure consumer safety.

    PubMed

    Westmoreland, Carl; Carmichael, Paul; Dent, Matt; Fentem, Julia; MacKay, Cameron; Maxwell, Gavin; Pease, Camilla; Reynolds, Fiona

    2010-01-01

    Assuring consumer safety without the generation of new animal data is currently a considerable challenge. However, through the application of new technologies and the further development of risk-based approaches for safety assessment, we remain confident it is ultimately achievable. For many complex, multi-organ consumer safety endpoints, the development, evaluation and application of new, non-animal approaches is hampered by a lack of biological understanding of the underlying mechanistic processes involved. The enormity of this scientific challenge should not be underestimated. To tackle this challenge a substantial research programme was initiated by Unilever in 2004 to critically evaluate the feasibility of a new conceptual approach based upon the following key components: 1. Developing new, exposure-driven risk assessment approaches. 2. Developing new biological (in vitro) and computer-based (in silico) predictive models. 3. Evaluating the applicability of new technologies for generating data (e.g. "omics", informatics) and for integrating new types of data (e.g. systems approaches) for risk-based safety assessment. Our research efforts are focussed in the priority areas of skin allergy, cancer and general toxicity (including inhaled toxicity). In all of these areas, a long-term investment is essential to increase the scientific understanding of the underlying biology and molecular mechanisms that we believe will ultimately form a sound basis for novel risk assessment approaches. Our research programme in these priority areas consists of in-house research as well as Unilever-sponsored academic research, involvement in EU-funded projects (e.g. Sens-it-iv, Carcinogenomics), participation in cross-industry collaborative research (e.g. Colipa, EPAA) and ongoing involvement with other scientific initiatives on non-animal approaches to risk assessment (e.g. UK NC3Rs, US "Human Toxicology Project" consortium).

  19. Evaluation of Visualization Software

    NASA Technical Reports Server (NTRS)

    Globus, Al; Uselton, Sam

    1995-01-01

    Visualization software is widely used in scientific and engineering research. But computed visualizations can be very misleading, and the errors are easy to miss. We feel that the software producing the visualizations must be thoroughly evaluated and the evaluation process as well as the results must be made available. Testing and evaluation of visualization software is not a trivial problem. Several methods used in testing other software are helpful, but these methods are (apparently) often not used. When they are used, the description and results are generally not available to the end user. Additional evaluation methods specific to visualization must also be developed. We present several useful approaches to evaluation, ranging from numerical analysis of mathematical portions of algorithms to measurement of human performance while using visualization systems. Along with this brief survey, we present arguments for the importance of evaluations and discussions of appropriate use of some methods.

  20. A randomized, single-blind cross-over design evaluating the effectiveness of an individually defined, targeted physical therapy approach in treatment of children with cerebral palsy.

    PubMed

    Franki, Inge; Van den Broeck, Christine; De Cat, Josse; Tijhuis, Wieke; Molenaers, Guy; Vanderstraeten, Guy; Desloovere, Kaat

    2014-10-01

A pilot study to compare the effectiveness of an individual therapy program with the effects of a general physical therapy program. A randomized, single-blind cross-over design. Ten ambulant children with bilateral spastic cerebral palsy, age four to nine years. Participants were randomly assigned to either a ten-week individually defined, targeted program or a general program, followed by a cross-over. Evaluation was performed using the Gross Motor Function Measure-88 and three-dimensional gait analysis. General outcome parameters were Gross Motor Function Measure-88 scores, time and distance parameters, gait profile score and movement analysis profiles. Individual goal achievement was evaluated using z-scores for gait parameters and Goal Attainment Scale for gross motor function. No significant changes were observed regarding gross motor function. Step- and stride-length increased significantly only after individualized therapy (p = 0.022; p = 0.017). Change in step-length was higher after the individualized program (p = 0.045). Within-group effects were found for the pelvis in transversal plane after the individualized program (p = 0.047) and in coronal plane after the general program (p = 0.047). Between-program differences were found for changes in the knee in sagittal plane, in favor of the individual program (p = 0.047). Median differences in z-score of 0.279 and 0.419 were measured after the general and individualized program, respectively. Functional goal attainment was higher after the individual therapy program compared with the general program (48 to 43.5). The results indicate slightly favorable effects of the individualized program. To detect clinically significant changes, future studies require a minimum sample size of 72 to 90 participants. © The Author(s) 2014.

  1. Fine-Scale Exposure to Allergenic Pollen in the Urban Environment: Evaluation of Land Use Regression Approach.

    PubMed

    Hjort, Jan; Hugg, Timo T; Antikainen, Harri; Rusanen, Jarmo; Sofiev, Mikhail; Kukkonen, Jaakko; Jaakkola, Maritta S; Jaakkola, Jouni J K

    2016-05-01

    Despite the recent developments in physically and chemically based analysis of atmospheric particles, no models exist for resolving the spatial variability of pollen concentration at urban scale. We developed a land use regression (LUR) approach for predicting spatial fine-scale allergenic pollen concentrations in the Helsinki metropolitan area, Finland, and evaluated the performance of the models against available empirical data. We used grass pollen data monitored at 16 sites in an urban area during the peak pollen season and geospatial environmental data. The main statistical method was generalized linear model (GLM). GLM-based LURs explained 79% of the spatial variation in the grass pollen data based on all samples, and 47% of the variation when samples from two sites with very high concentrations were excluded. In model evaluation, prediction errors ranged from 6% to 26% of the observed range of grass pollen concentrations. Our findings support the use of geospatial data-based statistical models to predict the spatial variation of allergenic grass pollen concentrations at intra-urban scales. A remote sensing-based vegetation index was the strongest predictor of pollen concentrations for exposure assessments at local scales. The LUR approach provides new opportunities to estimate the relations between environmental determinants and allergenic pollen concentration in human-modified environments at fine spatial scales. This approach could potentially be applied to estimate retrospectively pollen concentrations to be used for long-term exposure assessments. Hjort J, Hugg TT, Antikainen H, Rusanen J, Sofiev M, Kukkonen J, Jaakkola MS, Jaakkola JJ. 2016. Fine-scale exposure to allergenic pollen in the urban environment: evaluation of land use regression approach. Environ Health Perspect 124:619-626; http://dx.doi.org/10.1289/ehp.1509761.
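
    As a sketch of the land-use-regression step described above: pollen concentrations at monitoring sites are regressed on geospatial predictors. Everything below is synthetic and illustrative (the vegetation-index and distance predictors, the coefficients, and the 16-site sample size are assumptions, and ordinary least squares on log concentrations stands in for the paper's GLM):

```python
import numpy as np

# Hypothetical LUR fit: log pollen concentration modelled from two
# geospatial predictors. All data here are simulated, not from the study.
rng = np.random.default_rng(0)
n = 16                                    # one row per monitoring site
ndvi = rng.uniform(0.2, 0.8, n)           # vegetation index (assumed predictor)
dist = rng.uniform(0.1, 5.0, n)           # km to nearest grassland (assumed)
log_pollen = 2.0 + 3.0 * ndvi - 0.4 * dist + rng.normal(0, 0.1, n)

# Design matrix with an intercept; OLS stands in for the paper's GLM
# (a Gaussian GLM with identity link reduces to ordinary least squares).
X = np.column_stack([np.ones(n), ndvi, dist])
beta, *_ = np.linalg.lstsq(X, log_pollen, rcond=None)

# R^2: proportion of spatial variation explained by the land-use predictors.
resid = log_pollen - X @ beta
r2 = 1 - resid.var() / log_pollen.var()
```

    With real data the explained variance (79% and 47% in the abstract) would be read off in the same way from the fitted model.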

  2. Variation of student numerical and figural reasoning approaches by pattern generalization type, strategy use and grade level

    NASA Astrophysics Data System (ADS)

    El Mouhayar, Rabih; Jurdak, Murad

    2016-02-01

    This paper explored variation of student numerical and figural reasoning approaches across different pattern generalization types and across grade level. An instrument was designed for this purpose. The instrument was given to a sample of 1232 students from grades 4 to 11 from five schools in Lebanon. Analysis of data showed that the numerical reasoning approach seems to be more dominant than the figural reasoning approach for the near and far pattern generalization types but not for the immediate generalization type. The findings showed that for the recursive strategy, the numerical reasoning approach seems to be more dominant than the figural reasoning approach for each of the three pattern generalization types. However, the figural reasoning approach seems to be more dominant than the numerical reasoning approach for the functional strategy, for each generalization type. The findings also showed that the numerical reasoning was more dominant than the figural reasoning in lower grade levels (grades 4 and 5) for each generalization type. In contrast, the figural reasoning became more dominant than the numerical reasoning in the upper grade levels (grades 10 and 11).

  3. A Two-Stage Algorithm for Origin-Destination Matrices Estimation Considering Dynamic Dispersion Parameter for Route Choice

    PubMed Central

    Wang, Yong; Ma, Xiaolei; Liu, Yong; Gong, Ke; Henricakson, Kristian C.; Xu, Maozeng; Wang, Yinhai

    2016-01-01

This paper proposes a two-stage algorithm to simultaneously estimate origin-destination (OD) matrix, link choice proportion, and dispersion parameter using partial traffic counts in a congested network. A non-linear optimization model is developed which incorporates a dynamic dispersion parameter, followed by a two-stage algorithm in which Generalized Least Squares (GLS) estimation and a Stochastic User Equilibrium (SUE) assignment model are iteratively applied until convergence is reached. To evaluate the performance of the algorithm, the proposed approach is implemented in a hypothetical network using input data with high error, and tested under a range of variation coefficients. The root mean squared error (RMSE) of the estimated OD demand and link flows is used to evaluate the model estimation results. The results indicate that the estimated dispersion parameter theta is insensitive to the choice of variation coefficients. The proposed approach is shown to outperform two established OD estimation methods and produce parameter estimates that are close to the ground truth. In addition, the proposed approach is applied to an empirical network in Seattle, WA to validate the robustness and practicality of this methodology. In summary, this study proposes and evaluates an innovative computational approach to accurately estimate OD matrices using link-level traffic flow data, and provides useful insight for optimal parameter selection in modeling travelers’ route choice behavior. PMID:26761209
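
    The GLS stage of such an estimator can be sketched in a few lines: given link-choice proportions P (from an assignment model) and observed link counts y, OD demand q minimizes (y - Pq)' Σ⁻¹ (y - Pq). The network, proportions, and error covariance below are toy values, not from the paper:

```python
import numpy as np

# Toy GLS stage of OD estimation: 3 observed links, 2 OD pairs.
P = np.array([[0.7, 0.1],
              [0.3, 0.9],
              [0.5, 0.5]])               # link choice proportions (assumed)
q_true = np.array([100.0, 200.0])        # ground-truth OD demand (assumed)
Sigma = np.diag([4.0, 4.0, 4.0])         # count error covariance (assumed)

rng = np.random.default_rng(1)
y = P @ q_true + rng.normal(0, 2.0, 3)   # noisy observed link counts

# GLS estimator: q_hat = (P' S^-1 P)^-1 P' S^-1 y
Si = np.linalg.inv(Sigma)
q_hat = np.linalg.solve(P.T @ Si @ P, P.T @ Si @ y)
```

    In the full two-stage algorithm this solve would alternate with a SUE assignment that updates P and the dispersion parameter until convergence.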

  4. Systematic review and meta-analysis of the performance of clinical risk assessment instruments for screening for osteoporosis or low bone density

    PubMed Central

    Edwards, D. L.; Saleh, A. A.; Greenspan, S. L.

    2015-01-01

Summary: We performed a systematic review and meta-analysis of the performance of clinical risk assessment instruments for screening for DXA-determined osteoporosis or low bone density. Commonly evaluated risk instruments showed high sensitivity approaching or exceeding 90% at particular thresholds within various populations but low specificity at thresholds required for high sensitivity. Simpler instruments, such as OST, generally performed as well as or better than more complex instruments. Introduction: The purpose of the study is to systematically review the performance of clinical risk assessment instruments for screening for dual-energy X-ray absorptiometry (DXA)-determined osteoporosis or low bone density. Methods: Systematic review and meta-analysis were performed. Multiple literature sources were searched, and data extracted and analyzed from included references. Results: One hundred eight references met inclusion criteria. Studies assessed many instruments in 34 countries, most commonly the Osteoporosis Self-Assessment Tool (OST), the Simple Calculated Osteoporosis Risk Estimation (SCORE) instrument, the Osteoporosis Self-Assessment Tool for Asians (OSTA), the Osteoporosis Risk Assessment Instrument (ORAI), and body weight criteria. Meta-analyses of studies evaluating OST using a cutoff threshold of <1 to identify US postmenopausal women with osteoporosis at the femoral neck provided summary sensitivity and specificity estimates of 89% (95% CI 82–96%) and 41% (95% CI 23–59%), respectively. Meta-analyses of studies evaluating OST using a cutoff threshold of 3 to identify US men with osteoporosis at the femoral neck, total hip, or lumbar spine provided summary sensitivity and specificity estimates of 88% (95% CI 79–97%) and 55% (95% CI 42–68%), respectively. Frequently evaluated instruments each had thresholds and populations for which sensitivity for osteoporosis or low bone mass detection approached or exceeded 90% but always with a trade-off of relatively low specificity. Conclusions: Commonly evaluated clinical risk assessment instruments each showed high sensitivity approaching or exceeding 90% for identifying individuals with DXA-determined osteoporosis or low BMD at certain thresholds in different populations but low specificity at thresholds required for high sensitivity. Simpler instruments, such as OST, generally performed as well as or better than more complex instruments. PMID:25644147
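
    The pooled operating characteristics above follow the standard screening definitions, sketched here with invented counts chosen only to echo the reported 89%/41% operating point:

```python
# Illustrative sensitivity/specificity computation against a
# DXA-determined reference. These counts are made up for demonstration.
tp, fn = 89, 11        # osteoporotic subjects flagged / missed by the score
tn, fp = 41, 59        # non-osteoporotic subjects cleared / falsely flagged

sensitivity = tp / (tp + fn)   # fraction of true cases detected
specificity = tn / (tn + fp)   # fraction of non-cases correctly excluded
```

    This makes the review's trade-off concrete: a threshold tuned for ~90% sensitivity can leave specificity near 41%.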

  5. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    NASA Astrophysics Data System (ADS)

    Neggers, Roel

    2016-04-01

Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. 
This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach), and iii) process-level evaluation at climate time-scales. The advantages and disadvantages of each approach will be identified and discussed, and some thoughts about possible future developments will be given.

  6. Implementing Generalized Additive Models to Estimate the Expected Value of Sample Information in a Microsimulation Model: Results of Three Case Studies.

    PubMed

    Rabideau, Dustin J; Pei, Pamela P; Walensky, Rochelle P; Zheng, Amy; Parker, Robert A

    2018-02-01

The expected value of sample information (EVSI) can help prioritize research but its application is hampered by computational infeasibility, especially for complex models. We investigated an approach by Strong and colleagues to estimate EVSI by applying generalized additive models (GAM) to results generated from a probabilistic sensitivity analysis (PSA). For 3 potential HIV prevention and treatment strategies, we estimated life expectancy and lifetime costs using the Cost-effectiveness of Preventing AIDS Complications (CEPAC) model, a complex patient-level microsimulation model of HIV progression. We fitted a GAM (a flexible regression model that estimates the functional form as part of the model fitting process) to the incremental net monetary benefits obtained from the CEPAC PSA. For each case study, we calculated the expected value of partial perfect information (EVPPI) using both the conventional nested Monte Carlo approach and the GAM approach. EVSI was calculated using the GAM approach. For all 3 case studies, the GAM approach consistently gave similar estimates of EVPPI compared with the conventional approach. The EVSI behaved as expected: it increased and converged to EVPPI for larger sample sizes. For each case study, generating the PSA results for the GAM approach required 3 to 4 days on a shared cluster, after which EVPPI and EVSI across a range of sample sizes were evaluated in minutes. The conventional approach required approximately 5 weeks for the EVPPI calculation alone. Estimating EVSI using the GAM approach with results from a PSA dramatically reduced the time required to conduct a computationally intense project, which would otherwise have been impractical. Using the GAM approach, we can efficiently provide policy makers with EVSI estimates, even for complex patient-level microsimulation models.
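
    The regression-based estimator attributed to Strong and colleagues can be sketched as follows. The two-strategy decision problem, its parameters, and the use of a cubic polynomial in place of a GAM smoother are all illustrative assumptions made for this sketch:

```python
import numpy as np

# Regression-based EVPPI sketch: fit a smooth regression of incremental
# net benefit (INB) on the parameter of interest using PSA samples, then
# average the per-sample optimal conditional-mean net benefits.
rng = np.random.default_rng(2)
n = 5000
theta = rng.normal(0, 1, n)                      # parameter of interest
inb = 10 * theta - 2 + rng.normal(0, 5, n)       # toy INB of B vs A + PSA noise

# Smooth conditional mean E[INB | theta]; a cubic polynomial stands in
# for the GAM smoother used in the paper.
coef = np.polyfit(theta, inb, deg=3)
fitted = np.polyval(coef, theta)

# EVPPI = E[max over decisions of conditional-mean NB] - max of mean NB.
# With strategy A as the zero-net-benefit baseline:
evppi = np.mean(np.maximum(fitted, 0.0)) - max(np.mean(inb), 0.0)
```

    The appeal, as the abstract reports, is that once the PSA samples exist, this regression step takes minutes, whereas nested Monte Carlo re-runs the simulation model inside every conditional expectation.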

  7. Semi-quantitative evaluation of fecal contamination potential by human and ruminant sources using multiple lines of evidence

    USGS Publications Warehouse

    Stoeckel, D.M.; Stelzer, E.A.; Stogner, R.W.; Mau, D.P.

    2011-01-01

Protocols for microbial source tracking of fecal contamination generally are able to identify when a source of contamination is present, but thus far have been unable to evaluate what portion of fecal-indicator bacteria (FIB) came from various sources. A mathematical approach to estimate relative amounts of FIB, such as Escherichia coli, from various sources based on the concentration and distribution of microbial source tracking markers in feces was developed. The approach was tested using dilute fecal suspensions, then applied as part of an analytical suite to a contaminated headwater stream in the Rocky Mountains (Upper Fountain Creek, Colorado). In one single-source fecal suspension, a source that was not present could not be excluded because of incomplete marker specificity; however, human and ruminant sources were detected whenever they were present. In the mixed-feces suspension (pet and human), the minority contributor (human) was detected at a concentration low enough to preclude human contamination as the dominant source of E. coli to the sample. Without the semi-quantitative approach described, simple detects of human-associated marker in stream samples would have provided inaccurate evidence that human contamination was a major source of E. coli to the stream. In samples from Upper Fountain Creek the pattern of E. coli, general and host-associated microbial source tracking markers, nutrients, and wastewater-associated chemical detections, augmented with local observations and land-use patterns, indicated that, contrary to expectations, birds rather than humans or ruminants were the predominant source of fecal contamination to Upper Fountain Creek. This new approach to E. coli allocation, validated by a controlled study and tested by application in a relatively simple setting, represents a widely applicable step forward in the field of microbial source tracking of fecal contamination. © 2011 Elsevier Ltd.
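
    The allocation idea can be illustrated as a linear unmixing problem: observed marker concentrations in a sample are modelled as a mixture of per-source marker signatures, and source fractions are recovered by least squares. The signature matrix and sample below are invented for illustration and are not the paper's data or exact method:

```python
import numpy as np

# Rows: markers (human-associated, ruminant-associated, general FIB);
# columns: candidate sources (human, ruminant). Values are hypothetical
# marker concentrations per unit of source material.
S = np.array([[8.0, 0.1],
              [0.2, 6.0],
              [5.0, 5.0]])
sample = np.array([0.9, 5.9, 5.6])     # measured marker levels (invented)

# Least-squares unmixing, with a crude non-negativity clip and
# normalization to source proportions.
frac, *_ = np.linalg.lstsq(S, sample, rcond=None)
frac = np.clip(frac, 0, None)
frac = frac / frac.sum()
```

    In this toy sample the ruminant signature dominates, which is the kind of semi-quantitative conclusion (majority vs. minority contributor) the abstract describes.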

  8. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with an aim to improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated compared with Latin hypercube sampling (LHS) through analyzing sampling efficiency, multiple metrics performance, parameter uncertainty and flood forecasting uncertainty with a case study of flood forecasting uncertainty evaluation based on Xinanjiang model (XAJ) for Qing River reservoir, China. Results obtained demonstrate the following advantages of the ɛ-NSGAII based sampling approach in comparison to LHS: (1) The former performs more effectively and efficiently than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is roughly nine times shorter; (2) The Pareto tradeoffs between metrics are demonstrated clearly with the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, indicating better forecasting accuracy for the ɛ-NSGAII parameter sets; (3) The parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). The flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
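
    A minimal GLUE sketch may help readers unfamiliar with the framework. Everything below is an illustrative stand-in: random sampling replaces both LHS and ɛ-NSGAII, a one-parameter toy model replaces the Xinanjiang model, and the behavioural threshold is assumed:

```python
import numpy as np

# GLUE in miniature: sample parameter sets, score each with a likelihood
# metric (here Nash-Sutcliffe efficiency, NSE), keep "behavioural" sets
# above a threshold, and weight their predictions by rescaled likelihoods.
rng = np.random.default_rng(3)
obs = np.array([1.0, 2.0, 3.0, 2.5, 1.5])      # toy observed flows

def simulate(k):
    # stand-in rainfall-runoff model: scales the observed pattern by k
    return k * obs

params = rng.uniform(0.5, 1.5, 1000)           # stand-in for LHS/ɛ-NSGAII
nse = np.array([1 - np.sum((simulate(k) - obs) ** 2)
                  / np.sum((obs - obs.mean()) ** 2)
                for k in params])

behavioural = nse > 0.9                        # GLUE threshold (assumed)
weights = nse[behavioural] / nse[behavioural].sum()
preds = np.array([simulate(k) for k in params[behavioural]])
```

    The paper's contribution is in how the parameter sets are generated: an ɛ-NSGAII search concentrates samples in the behavioural region across several conflicting metrics at once, so far fewer model runs are wasted than with uniform LHS.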

  9. Another Approach to Generalizing the Mean

    ERIC Educational Resources Information Center

    Matejas, J.; Bahovec, V.

    2008-01-01

    This article presents a new approach to generalizing the definition of means. By this approach we easily obtain generalized means which are quite different from standard arithmetic, geometric and harmonic means.
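
    For context, one standard way to generalize the mean is the power mean; this sketch is not necessarily the article's construction, but it shows how a single definition recovers the arithmetic, geometric and harmonic means as special cases:

```python
import math

def power_mean(xs, p):
    """Power mean M_p of positive numbers xs.

    p = 1 gives the arithmetic mean, p -> 0 the geometric mean
    (handled as the limiting case), and p = -1 the harmonic mean.
    """
    if p == 0:
        # limiting case: geometric mean via the mean of logs
        return math.exp(sum(math.log(x) for x in xs) / len(xs))
    return (sum(x ** p for x in xs) / len(xs)) ** (1.0 / p)
```

    For example, power_mean([2, 8], 1) is 5 (arithmetic), power_mean([2, 8], 0) is 4 (geometric), and power_mean([2, 8], -1) is 3.2 (harmonic).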

  10. Bayesian correction for covariate measurement error: A frequentist evaluation and comparison with regression calibration.

    PubMed

    Bartlett, Jonathan W; Keogh, Ruth H

    2018-06-01

    Bayesian approaches for handling covariate measurement error are well established and yet arguably are still relatively little used by researchers. For some this is likely due to unfamiliarity or disagreement with the Bayesian inferential paradigm. For others a contributory factor is the inability of standard statistical packages to perform such Bayesian analyses. In this paper, we first give an overview of the Bayesian approach to handling covariate measurement error, and contrast it with regression calibration, arguably the most commonly adopted approach. We then argue why the Bayesian approach has a number of statistical advantages compared to regression calibration and demonstrate that implementing the Bayesian approach is usually quite feasible for the analyst. Next, we describe the closely related maximum likelihood and multiple imputation approaches and explain why we believe the Bayesian approach to generally be preferable. We then empirically compare the frequentist properties of regression calibration and the Bayesian approach through simulation studies. The flexibility of the Bayesian approach to handle both measurement error and missing data is then illustrated through an analysis of data from the Third National Health and Nutrition Examination Survey.
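
    Regression calibration, the comparator method in this paper, is easy to sketch: replace the error-prone measurement W of the true covariate X with its best linear predictor E[X|W], then run the outcome regression as usual. The simulation below is illustrative (true coefficient beta = 2, known variances used in place of replicate/validation data):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20000
x = rng.normal(0, 1, n)                 # true covariate (unobserved)
w = x + rng.normal(0, 0.5, n)           # error-prone measurement
y = 2.0 * x + rng.normal(0, 1, n)       # outcome

# Naive regression of y on w is attenuated toward zero:
beta_naive = np.cov(y, w)[0, 1] / np.var(w)

# Calibration: E[X|W] = lam * W with lam = var(X) / var(W), the
# reliability ratio (here computed from the known variances).
lam = 1.0 / (1.0 + 0.25)
x_hat = lam * w
beta_rc = np.cov(y, x_hat)[0, 1] / np.var(x_hat)
```

    The naive estimate lands near 1.6 (attenuated by the reliability ratio 0.8), while the calibrated estimate recovers approximately 2. The Bayesian approach the paper advocates additionally propagates the uncertainty in the calibration model itself.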

  11. Consumer perceptions of prescription drug websites: a pilot study.

    PubMed

    Wymer, Walter

    2010-04-01

Consumer perceptions of the information content contained on prescription drug websites were of interest in this investigation. Twenty branded prescription drugs were selected because they were evaluated as being poor consumer choices for safety reasons or because better alternatives existed. Study participants visited each of 20 websites for the selected drugs, and then they answered a series of questions for each website, in order to evaluate each website's information content. Participants, unaware that the drugs had been selected because they were problematic, reported that the drug company information was complete, fully presenting benefit and risk information, without being false or misleading in any respect. Pricing information was generally not provided by drug companies. Alternative medicines, treatments, and behavioral approaches for dealing with an illness or health condition were generally not part of the information provided by drug companies. Public policy implications are also discussed.

  12. Discrete adjoint of fractional step Navier-Stokes solver in generalized coordinates

    NASA Astrophysics Data System (ADS)

    Wang, Mengze; Mons, Vincent; Zaki, Tamer

    2017-11-01

    Optimization and control in transitional and turbulent flows require evaluation of gradients of the flow state with respect to the problem parameters. Using adjoint approaches, these high-dimensional gradients can be evaluated with a similar computational cost as the forward Navier-Stokes simulations. The adjoint algorithm can be obtained by discretizing the continuous adjoint Navier-Stokes equations or by deriving the adjoint to the discretized Navier-Stokes equations directly. The latter algorithm is necessary when the forward-adjoint relations must be satisfied to machine precision. In this work, our forward model is the fractional step solution to the Navier-Stokes equations in generalized coordinates, proposed by Rosenfeld, Kwak & Vinokur. We derive the corresponding discrete adjoint equations. We also demonstrate the accuracy of the combined forward-adjoint model, and its application to unsteady wall-bounded flows. This work has been partially funded by the Office of Naval Research (Grant N00014-16-1-2542).
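
    The discrete-adjoint requirement highlighted above, that forward and adjoint operators agree to machine precision, is commonly checked with a dot-product (adjoint) test: ⟨Ax, y⟩ must equal ⟨x, Aᵀy⟩. In this sketch a random matrix stands in for the linearized Navier-Stokes step; the test itself carries over unchanged to matrix-free solvers:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.normal(size=(50, 40))           # stand-in forward (tangent) operator
x = rng.normal(size=40)                 # arbitrary input state
y = rng.normal(size=50)                 # arbitrary adjoint input

lhs = np.dot(A @ x, y)                  # forward action paired with y
rhs = np.dot(x, A.T @ y)                # adjoint action paired with x
err = abs(lhs - rhs)                    # should be at round-off level
```

    A continuous-adjoint discretization would typically fail this test at the level of the truncation error, which is why discrete adjoints are needed when exact forward-adjoint consistency is required.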

  13. Economic gains stimulate negative evaluations of corporate sustainability initiatives

    NASA Astrophysics Data System (ADS)

    Makov, Tamar; Newman, George E.

    2016-09-01

In recent years, many organizations have sought to align their financial goals with environmental ones by identifying strategies that maximize profits while minimizing environmental impacts. Examples of this 'win-win' approach can be found across a wide range of industries, from encouraging the reuse of hotel towels, to the construction of energy efficient buildings, to the large-scale initiatives of multi-national corporations. Although win-win strategies are generally thought to reflect positively on the organizations that employ them, here we find that people tend to respond negatively to the notion of profiting from environmental initiatives. In fact, observers may evaluate environmental win-wins less favourably than profit-seeking strategies that have no environmental benefits. The present studies suggest that how those initiatives are communicated to the general public may be of central importance. Therefore, organizations would benefit from carefully crafting the discourse around their win-win initiatives to ensure that they avoid this type of backlash.

  14. Closed-Form Jensen-Renyi Divergence for Mixture of Gaussians and Applications to Group-Wise Shape Registration*

    PubMed Central

    Wang, Fei; Syeda-Mahmood, Tanveer; Vemuri, Baba C.; Beymer, David; Rangarajan, Anand

    2010-01-01

    In this paper, we propose a generalized group-wise non-rigid registration strategy for multiple unlabeled point-sets of unequal cardinality, with no bias toward any of the given point-sets. To quantify the divergence between the probability distributions – specifically Mixture of Gaussians – estimated from the given point sets, we use a recently developed information-theoretic measure called Jensen-Renyi (JR) divergence. We evaluate a closed-form JR divergence between multiple probabilistic representations for the general case where the mixture models differ in variance and the number of components. We derive the analytic gradient of the divergence measure with respect to the non-rigid registration parameters, and apply it to numerical optimization of the group-wise registration, leading to a computationally efficient and accurate algorithm. We validate our approach on synthetic data, and evaluate it on 3D cardiac shapes. PMID:20426043
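
    To make the divergence measure concrete, here is the discrete-distribution analogue of the Jensen-Renyi divergence (the paper evaluates a closed form for Gaussian mixtures; this sketch only illustrates the definition): with Renyi entropy H_a, JR(p_1..p_k; w) = H_a(sum_i w_i p_i) - sum_i w_i H_a(p_i).

```python
import numpy as np

def renyi_entropy(p, alpha=2.0):
    # Renyi entropy of order alpha (alpha != 1) of a discrete distribution.
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi(dists, weights, alpha=2.0):
    # Entropy of the weighted mixture minus the weighted entropies.
    dists = np.asarray(dists, dtype=float)
    weights = np.asarray(weights, dtype=float)
    mixture = weights @ dists
    return renyi_entropy(mixture, alpha) - weights @ np.array(
        [renyi_entropy(p, alpha) for p in dists])

# Identical distributions give zero divergence; dissimilar ones do not.
p = [0.5, 0.3, 0.2]
jr_same = jensen_renyi([p, p], [0.5, 0.5])
jr_diff = jensen_renyi([[0.9, 0.05, 0.05], [0.05, 0.05, 0.9]], [0.5, 0.5])
```

    Because the measure handles any number of distributions with arbitrary weights, it fits the group-wise, unbiased registration setting described in the abstract.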

  15. Closed-form Jensen-Renyi divergence for mixture of Gaussians and applications to group-wise shape registration.

    PubMed

    Wang, Fei; Syeda-Mahmood, Tanveer; Vemuri, Baba C; Beymer, David; Rangarajan, Anand

    2009-01-01

In this paper, we propose a generalized group-wise non-rigid registration strategy for multiple unlabeled point-sets of unequal cardinality, with no bias toward any of the given point-sets. To quantify the divergence between the probability distributions, specifically Mixture of Gaussians, estimated from the given point sets, we use a recently developed information-theoretic measure called Jensen-Renyi (JR) divergence. We evaluate a closed-form JR divergence between multiple probabilistic representations for the general case where the mixture models differ in variance and the number of components. We derive the analytic gradient of the divergence measure with respect to the non-rigid registration parameters, and apply it to numerical optimization of the group-wise registration, leading to a computationally efficient and accurate algorithm. We validate our approach on synthetic data, and evaluate it on 3D cardiac shapes.

  16. Planning attitudes, lay philosophies, and water allocation: A preliminary analysis and research agenda

    NASA Astrophysics Data System (ADS)

    Syme, Geoffrey J.; Nancarrow, Blair E.

Despite the important societal consequences of water policy, community attitudes toward planning, ethics, and equity for allocation of water have received little research attention. This preliminary research was conducted to assess the range and structure of planning attitudes and equity and ethical considerations which might be relevant to the general public's evaluation of water allocation systems. The relationship of these to priorities for water allocation was also examined. The results showed a complex structure for planning attitudes. There were also generalized but clearly defined community approaches to water allocation. A number of significant relationships between planning attitudes and philosophies of allocation were shown. Planning attitudes also related to priorities for water allocation. In practical terms the research provides some preliminary, ethically based evaluative criteria which could be applied to allocation decision-making systems. Theoretical research possibilities are also outlined.

  17. Clinical and histopathological evaluation of 16 dogs with T-zone lymphoma

    PubMed Central

    MIZUTANI, Noriyuki; GOTO-KOSHINO, Yuko; TAKAHASHI, Masashi; UCHIDA, Kazuyuki; TSUJIMOTO, Hajime

    2016-01-01

    Clinical and histopathological characteristics of 16 dogs with nodal paracortical (T-zone) lymphoma (TZL) were evaluated. At initial examination, generalized lymphadenopathy was found in all dogs, and peripheral lymphocytosis was found in 10 of the 16 dogs. At initial diagnosis or during the disease course, 8 dogs (50%) were affected with demodicosis. Immunohistochemical analysis for CD3, CD20 and CD25 was performed for 6 dogs with TZL; the tumor cells were positive for CD3 and CD25 and negative for CD20. Median overall survival time was 938 days. A watchful waiting approach was adopted for 6 cases (38%), and 5 of the 6 dogs were still alive at the end of follow-up. The clinical course of TZL in dogs is generally indolent; however, many cases develop a variety of infectious and other neoplastic diseases after the diagnosis of TZL. PMID:27098109

  18. International and National Expert Group Evaluations: Biological/Health Effects of Radiofrequency Fields

    PubMed Central

    Vijayalaxmi; Scarfi, Maria R.

    2014-01-01

The escalated use of various wireless communication devices, which emit non-ionizing radiofrequency (RF) fields, has raised concerns among the general public regarding the potential adverse effects on human health. During the last six decades, researchers have used different parameters to investigate the effects of in vitro and in vivo exposures of animals and humans or their cells to RF fields. Data reported in peer-reviewed scientific publications were contradictory: some indicated effects while others did not. International organizations have considered all of these data as well as the observations reported in human epidemiological investigations to set up the guidelines or standards (based on the quality of published studies and the “weight of scientific evidence” approach) for RF exposures in occupationally exposed individuals and the general public. Scientists with relevant expertise in various countries have also considered the published data to provide the required scientific information for policy-makers to develop and disseminate authoritative health information to the general public regarding RF exposures. This paper is a compilation of the conclusions, on the biological effects of RF exposures, from various national and international expert groups, based on their analyses. In general, the expert groups suggested a reduction in exposure levels, precautionary approach, and further research. PMID:25211777

  19. Pterional approach versus unilateral frontal approach on tuberculum sellae meningioma: Single centre experiences

    PubMed Central

    Arifin, Muhammad Zafrullah; Mardjono, Ignatius; Sidabutar, Roland; Wirjomartani, Beny Atmadja; Faried, Ahmad

    2012-01-01

Introduction: Resection of tuberculum sellae meningioma is one of the most challenging surgeries for neurosurgeons. Many approaches have been established in the effort of removing the tumor and some of them are supported by an advanced neurosurgical technology. In this study, we aim to compare the efficacy of the two most common approaches, the pterional and the unilateral frontal. Materials and Methods: This was a retrospective study that aimed to observe the efficacy of the two most common approaches used in our center, the pterional and the unilateral frontal, in resecting the tuberculum sellae meningioma, conducted at Dr. Hasan Sadikin General Hospital, Bandung, from July 2007 to July 2010. Twenty patients were enrolled with half of them operated by the pterional approach and the rest by unilateral frontal approach. We evaluated six parameters: tumor size, degree of tumor removal, surgery duration, post-operative cerebral edema, patients' outcome, and length of stay, which were evaluated to take measure of the efficacy of each procedure. Results: We found that the pterional approach gave more advantages than the unilateral frontal. Total tumor removal, especially in tumor size ≥ 3 cm, was achieved in a greater number of subjects in the pterional group (P<0.023). Other advantages of the pterional compared to the unilateral frontal were a shorter surgical duration (P=0.024), shorter length of stay (P=0.009) and less frequency of post-operative cerebral edema incidence (P=0.023). Conclusion: According to our facilities and conditions, it seems that the pterional approach has more advantages than the unilateral frontal approach in tuberculum sellae meningioma surgery. PMID:22639687

  20. Arrow-wing supersonic cruise aircraft structural design concepts evaluation. Volume 1: Sections 1 through 6

    NASA Technical Reports Server (NTRS)

    Sakata, I. F.; Davis, G. W.

    1975-01-01

    The structural approach best suited for the design of a Mach 2.7 arrow-wing supersonic cruise aircraft was investigated. Results, procedures, and principal justification of results are presented. Detailed substantiation data are given. In general, each major analysis is presented sequentially in separate sections to provide continuity in the flow of the design concepts analysis effort. In addition to the design concepts evaluation and the detailed engineering design analyses, supporting tasks encompassing: (1) the controls system development; (2) the propulsion-airframe integration study; and (3) the advanced technology assessment are presented.

  1. Interim reliability-evaluation program: analysis of the Browns Ferry, Unit 1, nuclear plant. Appendix C - sequence quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mays, S.E.; Poloski, J.P.; Sullivan, W.H.

    1982-07-01

    This report describes a risk study of the Browns Ferry, Unit 1, nuclear plant. The study is one of four such studies sponsored by the NRC Office of Research, Division of Risk Assessment, as part of its Interim Reliability Evaluation Program (IREP), Phase II. This report is contained in four volumes: a main report and three appendixes. Appendix C generally describes the methods used to estimate accident sequence frequency values. Information is presented concerning the approach, example collection, failure data, candidate dominant sequences, uncertainty analysis, and sensitivity analysis.

  2. Development and testing of a contamination potential mapping system for a portion of the General Separations Area, Savannah River Site, South Carolina

    USGS Publications Warehouse

    Rine, J.M.; Berg, R.C.; Shafer, J.M.; Covington, E.R.; Reed, J.K.; Bennett, C.B.; Trudnak, J.E.

    1998-01-01

    A methodology was developed to evaluate and map the contamination potential or aquifer sensitivity of the upper groundwater flow system of a portion of the General Separations Area (GSA) at the Department of Energy's Savannah River Site (SRS) in South Carolina. A Geographic Information System (GIS) was used to integrate diverse subsurface geologic data, soils data, and hydrology utilizing a stack-unit mapping approach to construct mapping layers. This is the first time that such an approach has been used to delineate the hydrogeology of a coastal plain environment. Unit surface elevation maps were constructed for the tops of six Tertiary units derived from over 200 boring logs. Thickness or isopach maps were created for five hydrogeologic units by differencing top and basal surface elevations. The geologic stack-unit map was created by stacking the five isopach maps and adding codes for each stack-unit polygon. Stacked-units were rated according to their hydrogeologic properties and ranked using a logarithmic approach (utility theory) to establish a contamination potential index. Colors were assigned to help display relative importance of stacked-units in preventing or promoting transport of contaminants. The sensitivity assessment included the effects of surface soils on contaminants which are particularly important for evaluating potential effects from surface spills. Hydrogeologic/hydrologic factors did not exhibit sufficient spatial variation to warrant incorporation into contamination potential assessment. Development of this contamination potential mapping system provides a useful tool for site planners, environmental scientists, and regulatory agencies.

  3. Quality assurance strategies for investigating IAQ problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collett, C.W.; Ross, J.A.; Sterling, E.M.

    Thousands of buildings have now been investigated throughout North America and western Europe. The evaluative strategies and protocols used by various investigators have been described in the scientific and technical literature, including those used by government agencies, private consultants, researchers, and physicians. Review of these strategies shows a consistency and commonality in approach, despite differences in terminology and organization. Most of the published protocols recognize the need to employ a multidisciplinary approach to the evaluation of indoor environmental problems, an approach that views buildings as complex, dynamic systems. The multidisciplinary approaches advocated by investigators gather information about the physical building (architectural), the mechanical systems that control indoor environmental conditions (engineering), the type and extent of occupant health and comfort concerns (medical), the objective quality of the air (industrial hygiene), and the occupants' subjective perceptions of conditions in their work environment (social science). These components have generally been organized into a series of steps or phases, with each phase extending the information gathered from the preceding phase until the causes of problems can be identified.

  4. An approach to developing numeric water quality criteria for coastal waters using the SeaWiFS Satellite Data Record.

    PubMed

    Schaeffer, Blake A; Hagy, James D; Conmy, Robyn N; Lehrter, John C; Stumpf, Richard P

    2012-01-17

    Human activities on land increase nutrient loads to coastal waters, which can increase phytoplankton production and biomass and associated ecological impacts. Numeric nutrient water quality standards are needed to protect coastal waters from eutrophication impacts. The Environmental Protection Agency determined that numeric nutrient criteria were necessary to protect designated uses of Florida's waters. The objective of this study was to evaluate a reference condition approach for developing numeric water quality criteria for coastal waters, using data from Florida. Florida's coastal waters have not been monitored comprehensively via field sampling to support numeric criteria development. However, satellite remote sensing had the potential to provide adequate data. Spatial and temporal measures of SeaWiFS OC4 chlorophyll-a (Chl(RS)-a, mg m(-3)) were resolved across Florida's coastal waters between 1997 and 2010 and compared with in situ measurements. Statistical distributions of Chl(RS)-a were evaluated to determine a quantitative reference baseline. A binomial approach was implemented to consider how new data could be assessed against the criteria. The proposed satellite remote sensing approach to derive numeric criteria may be generally applicable to other coastal waters.
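    The binomial assessment step described above can be sketched as follows. This is an illustrative reading only: the function name, the 10% allowable-exceedance rate, and the sample counts are assumptions, not values taken from the study.

    ```python
    from math import comb

    def exceedance_p_value(n_samples, n_exceed, p_allow=0.10):
        # P(X >= n_exceed) for X ~ Binomial(n_samples, p_allow): the chance of
        # observing at least this many criterion exceedances if the water body
        # truly meets the criterion with allowable exceedance rate p_allow.
        return sum(comb(n_samples, k) * p_allow**k * (1 - p_allow)**(n_samples - k)
                   for k in range(n_exceed, n_samples + 1))
    ```

    With, say, 12 monthly samples of which 5 exceed the chlorophyll-a criterion, the resulting probability is well below conventional significance levels, flagging the site for further assessment.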

  5. An Approach to Developing Numeric Water Quality Criteria for Coastal Waters Using the SeaWiFS Satellite Data Record

    PubMed Central

    2011-01-01

    Human activities on land increase nutrient loads to coastal waters, which can increase phytoplankton production and biomass and associated ecological impacts. Numeric nutrient water quality standards are needed to protect coastal waters from eutrophication impacts. The Environmental Protection Agency determined that numeric nutrient criteria were necessary to protect designated uses of Florida’s waters. The objective of this study was to evaluate a reference condition approach for developing numeric water quality criteria for coastal waters, using data from Florida. Florida’s coastal waters have not been monitored comprehensively via field sampling to support numeric criteria development. However, satellite remote sensing had the potential to provide adequate data. Spatial and temporal measures of SeaWiFS OC4 chlorophyll-a (ChlRS-a, mg m–3) were resolved across Florida’s coastal waters between 1997 and 2010 and compared with in situ measurements. Statistical distributions of ChlRS-a were evaluated to determine a quantitative reference baseline. A binomial approach was implemented to consider how new data could be assessed against the criteria. The proposed satellite remote sensing approach to derive numeric criteria may be generally applicable to other coastal waters. PMID:22192062

  6. Moving beyond qualitative evaluations of Bayesian models of cognition.

    PubMed

    Hemmer, Pernille; Tauber, Sean; Steyvers, Mark

    2015-06-01

    Bayesian models of cognition provide a powerful way to understand the behavior and goals of individuals from a computational point of view. Much of the focus in the Bayesian cognitive modeling approach has been on qualitative model evaluations, where predictions from the models are compared to data that is often averaged over individuals. In many cognitive tasks, however, there are pervasive individual differences. We introduce an approach to directly infer individual differences related to subjective mental representations within the framework of Bayesian models of cognition. In this approach, Bayesian data analysis methods are used to estimate cognitive parameters and motivate the inference process within a Bayesian cognitive model. We illustrate this integrative Bayesian approach on a model of memory. We apply the model to behavioral data from a memory experiment involving the recall of heights of people. A cross-validation analysis shows that the Bayesian memory model with inferred subjective priors predicts withheld data better than a Bayesian model where the priors are based on environmental statistics. In addition, the model with inferred priors at the individual subject level led to the best overall generalization performance, suggesting that individual differences are important to consider in Bayesian models of cognition.
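    The idea of inferring subjective priors can be illustrated with a minimal conjugate-normal shrinkage model. This is a sketch of the general approach, not the authors' actual model; the function names and the closed-form inversion are assumptions.

    ```python
    # Hedged sketch: a remembered value is shrunk toward a subjective prior
    # mean, and that prior mean is inferred back from recall errors.

    def posterior_mean(observed, prior_mean, prior_var, noise_var):
        # Precision-weighted average of the noisy memory trace and the prior.
        w = prior_var / (prior_var + noise_var)
        return w * observed + (1 - w) * prior_mean

    def infer_prior_mean(studied, recalled, prior_var, noise_var):
        # Invert the shrinkage model: the prior mean that best explains the
        # average recall bias, in closed form (least squares).
        w = prior_var / (prior_var + noise_var)
        total_bias = sum(r - w * s for s, r in zip(studied, recalled))
        return total_bias / ((1 - w) * len(studied))
    ```

    Recalled heights that regress toward a group-specific mean are the signature such a model exploits: the inferred prior mean is the point the recall errors pull toward.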

  7. Evaluating Adaptive Governance Approaches to Sustainable Water Management in North-West Thailand

    NASA Astrophysics Data System (ADS)

    Clark, Julian R. A.; Semmahasak, Chutiwalanch

    2013-04-01

    Adaptive governance is advanced as a potent means of addressing institutional fit of natural resource systems with prevailing modes of political-administrative management. Its advocates also argue that it enhances participatory and learning opportunities for stakeholders over time. Yet an increasing number of studies demonstrate real difficulties in implementing adaptive governance `solutions'. This paper builds on these debates by examining the introduction of adaptive governance to water management in Chiang Mai province, north-west Thailand. The paper considers, first, the limitations of current water governance modes at the provincial scale, and the rationale for implementation of an adaptive approach. The new approach is then critically examined, with its initial performance and likely future success evaluated by (i) analysis of water stakeholders' opinions of its first year of operation; and (ii) comparison of its governance attributes against recent empirical accounts of implementation difficulty and failure of adaptive governance of natural resource management more generally. The analysis confirms the potentially significant role that the new approach can play in brokering and resolving the underlying differences in stakeholder representation and knowledge construction at the heart of the prevailing water governance modes in north-west Thailand.

  8. Technical approach to groundwater restoration. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-11-01

    The Technical Approach to Groundwater Restoration (TAGR) provides general technical guidance for implementing the groundwater restoration phase of the Uranium Mill Tailings Remedial Action (UMTRA) Project. The TAGR includes a brief overview of the surface remediation and groundwater restoration phases of the UMTRA Project and describes the regulatory requirements, the National Environmental Policy Act (NEPA) process, and regulatory compliance. A section on program strategy discusses program optimization, the role of risk assessment, the observational approach, strategies for meeting groundwater cleanup standards, and remedial action decision-making. A section on data requirements for groundwater restoration evaluates the data quality objectives (DQO) and minimum data required to implement the options and comply with the standards. A section on site implementation explores the development of a conceptual site model, approaches to site characterization, development of remedial action alternatives, selection of the groundwater restoration method, and remedial design and implementation in the context of site-specific documentation in the site observational work plan (SOWP) and the remedial action plan (RAP). Finally, the TAGR elaborates on the groundwater monitoring necessary to evaluate compliance with the groundwater cleanup standards and protection of human health and the environment, and outlines licensing procedures.

  9. Bioactives from microalgal dinoflagellates.

    PubMed

    Gallardo-Rodríguez, J; Sánchez-Mirón, A; García-Camacho, F; López-Rosales, L; Chisti, Y; Molina-Grima, E

    2012-01-01

    Dinoflagellate microalgae are an important source of marine biotoxins. Bioactives from dinoflagellates are attracting increasing attention because of their impact on the safety of seafood and their potential uses in biomedical, toxicological, and pharmacological research. Here we review the potential applications of dinoflagellate toxins and the methods for producing them. Only sparing quantities of dinoflagellate toxins are generally available, and this hinders bioactivity characterization and evaluation in possible applications. Approaches to producing increased quantities of dinoflagellate bioactives are discussed. Although many dinoflagellates are fragile and grow slowly, controlled culture in bioreactors appears to be generally suitable for producing many of the metabolites of interest. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Multiobjective Aerodynamic Shape Optimization Using Pareto Differential Evolution and Generalized Response Surface Metamodels

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.

    2004-01-01

    Differential Evolution (DE) is a simple, fast, and robust evolutionary algorithm that has proven effective in determining the global optimum for several difficult single-objective optimization problems. The DE algorithm has recently been extended to multiobjective optimization problems by using a Pareto-based approach. In this paper, a Pareto DE algorithm is applied to multiobjective aerodynamic shape optimization problems that are characterized by computationally expensive objective function evaluations. To reduce the computational expense, the algorithm is coupled with generalized response surface metamodels based on artificial neural networks. Results are presented for some test optimization problems from the literature to demonstrate the capabilities of the method.
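    The core DE variation step can be sketched as follows. This is a generic DE/rand/1/bin operator, not the paper's specific Pareto-based implementation; the function name and the default F and CR values are illustrative.

    ```python
    import random

    def de_trial_vector(pop, i, F=0.8, CR=0.9):
        # Classic DE/rand/1/bin: mutate with three distinct random donors,
        # then apply binomial crossover against the i-th target vector.
        idxs = [j for j in range(len(pop)) if j != i]
        a, b, c = (pop[j] for j in random.sample(idxs, 3))
        target = pop[i]
        jrand = random.randrange(len(target))  # ensure at least one mutated component
        return [a[k] + F * (b[k] - c[k]) if (random.random() < CR or k == jrand)
                else target[k]
                for k in range(len(target))]
    ```

    In the multiobjective setting, trial vectors are then compared with the population by Pareto dominance rather than by a single objective value, which is where metamodel-based objective estimates can stand in for expensive evaluations.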

  11. Employee Health Behaviors, Self-Reported Health Status, and Association With Absenteeism: Comparison With the General Population.

    PubMed

    Yun, Young Ho; Sim, Jin Ah; Park, Eun-Gee; Park, June Dong; Noh, Dong-Young

    2016-09-01

    Objectives were to compare the health behaviors and health status of employees with those of the general population, and to evaluate the association between employee health behaviors, health status, and absenteeism. This cross-sectional study enrolled 2433 employees from 16 Korean companies in 2014 and randomly recruited 1000 members of the general population in 2012. The distribution of employee health behaviors and health status, and their association with absenteeism, were assessed. Employees had significantly worse health status and lower rates of health-behavior maintenance compared with the general population. A multiple logistic regression model revealed that regular exercise, smoking cessation, work-life balance, proactive living, religious practice, and good physical health status were associated with lower absenteeism. Maintaining health behaviors and having good health status were associated with less absenteeism. This study suggests investing in a multidimensional health approach in workplace health and wellness (WHW) programs.

  12. Adapting generalization tools to physiographic diversity for the united states national hydrography dataset

    USGS Publications Warehouse

    Buttenfield, B.P.; Stanislawski, L.V.; Brewer, C.A.

    2011-01-01

    This paper reports on generalization and data modeling to create reduced scale versions of the National Hydrographic Dataset (NHD) for dissemination through The National Map, the primary data delivery portal for USGS. Our approach distinguishes local differences in physiographic factors, to demonstrate that knowledge about varying terrain (mountainous, hilly or flat) and varying climate (dry or humid) can support decisions about algorithms, parameters, and processing sequences to create generalized, smaller scale data versions which preserve distinct hydrographic patterns in these regions. We work with multiple subbasins of the NHD that provide a range of terrain and climate characteristics. Specifically tailored generalization sequences are used to create simplified versions of the high resolution data, which was compiled for 1:24,000 scale mapping. Results are evaluated cartographically and metrically against a medium resolution benchmark version compiled for 1:100,000, developing coefficients of linear and areal correspondence.

  13. A multi-harmonic generalized energy balance method for studying autonomous oscillations of nonlinear conservative systems

    NASA Astrophysics Data System (ADS)

    Balaji, Nidish Narayanaa; Krishna, I. R. Praveen; Padmanabhan, C.

    2018-05-01

    The Harmonic Balance Method (HBM) is a frequency-domain approximation approach used for obtaining the steady-state periodic behavior of forced dynamical systems. Intrinsically these systems are non-autonomous, and the method offers many computational advantages over time-domain methods when the fundamental period of oscillation is known (generally fixed as the forcing period itself, or a corresponding sub-harmonic if such behavior is expected). In the current study, a modified approach, based on He's Energy Balance Method (EBM), is applied to obtain the periodic solutions of conservative systems. It is shown that by this approach, periodic solutions of conservative systems on iso-energy manifolds in the phase space can be obtained very efficiently. The energy level provides the additional constraint on the HBM formulation, which enables the determination of the period of the solutions. The method is applied to the linear harmonic oscillator, a pair of nonlinear oscillators, the elastic pendulum, and the Henon-Heiles system. The approach is used to trace the bifurcations of the periodic solutions of the last two, which are 2-degree-of-freedom systems demonstrating very rich dynamical behavior. In the process, the advantages offered by the current formulation of the energy balance are brought out. A harmonic perturbation approach is used to evaluate the stability of the solutions for the bifurcation diagram.
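    As a one-term illustration of the energy-balance idea (the classical single-harmonic He EBM result for the undamped Duffing oscillator, not the paper's multi-harmonic formulation):

    ```latex
    % Undamped Duffing oscillator with trial solution u(t) = A\cos\omega t:
    \ddot{u} + u + \epsilon u^3 = 0
    % Energy residual relative to the initial (iso-energy) level:
    R(t) = \tfrac{1}{2}\dot{u}^2 + \tfrac{1}{2}u^2 + \tfrac{\epsilon}{4}u^4
           - \left( \tfrac{1}{2}A^2 + \tfrac{\epsilon}{4}A^4 \right)
    % Collocating R(t) = 0 at \omega t = \pi/4 gives the amplitude-dependent frequency:
    \omega(A) = \sqrt{1 + \tfrac{3}{4}\,\epsilon A^2}
    ```

    The multi-harmonic generalization replaces the single cosine with a truncated Fourier series and enforces the energy constraint alongside the usual balance equations, which is what fixes the otherwise unknown period of the autonomous system.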

  14. Accuracy Estimation and Parameter Advising for Protein Multiple Sequence Alignment

    PubMed Central

    DeBlasio, Dan

    2013-01-01

    Abstract We develop a novel and general approach to estimating the accuracy of multiple sequence alignments without knowledge of a reference alignment, and use our approach to address a new task that we call parameter advising: the problem of choosing values for alignment scoring function parameters from a given set of choices to maximize the accuracy of a computed alignment. For protein alignments, we consider twelve independent features that contribute to a quality alignment. An accuracy estimator is learned that is a polynomial function of these features; its coefficients are determined by minimizing its error with respect to true accuracy using mathematical optimization. Compared to prior approaches for estimating accuracy, our new approach (a) introduces novel feature functions that measure nonlocal properties of an alignment yet are fast to evaluate, (b) considers more general classes of estimators beyond linear combinations of features, and (c) develops new regression formulations for learning an estimator from examples; in addition, for parameter advising, we (d) determine the optimal parameter set of a given cardinality, which specifies the best parameter values from which to choose. Our estimator, which we call Facet (for “feature-based accuracy estimator”), yields a parameter advisor that on the hardest benchmarks provides more than a 27% improvement in accuracy over the best default parameter choice, and for parameter advising significantly outperforms the best prior approaches to assessing alignment quality. PMID:23489379

  15. Electronic evaluation for video commercials by impression index.

    PubMed

    Kong, Wanzeng; Zhao, Xinxin; Hu, Sanqing; Vecchiato, Giovanni; Babiloni, Fabio

    2013-12-01

    How to evaluate the effect of commercials is an important question in neuromarketing. In this paper, we proposed an electronic way to evaluate the influence of video commercials on consumers using an impression index. The impression index combines both a memorization and an attention index computed while consumers observe video commercials, by tracking EEG activity. It extracts features from scalp EEG to evaluate the effectiveness of video commercials in the time-frequency-space domain, and the general global field power was used as an impression index for evaluating video commercial scenes as time series. Results of the experiment demonstrate that the proposed approach is able to track variations of the cerebral activity related to cognitive tasks such as observing video commercials, and helps to judge from EEG signals whether a scene in a video commercial is impressive or not.
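    Global field power itself has a standard definition, the spatial standard deviation of the scalp potentials at each time point, which can be computed directly. This is a minimal sketch; the channel layout, preprocessing, and index weighting used in the study are not reproduced here.

    ```python
    import numpy as np

    def global_field_power(eeg):
        # eeg: (n_channels, n_samples) array of scalp potentials; GFP at each
        # time point is the standard deviation across electrodes.
        return np.asarray(eeg, float).std(axis=0)
    ```

    Applied to an epoch time-locked to a commercial scene, the resulting GFP time series gives a single scalar per sample that can be compared across scenes.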

  16. Morphometry and subpopulation structure of Holstein bull spermatozoa: variations in ejaculates and cryopreservation straws

    PubMed Central

    Valverde, Anthony; Arenán, Héctor; Sancho, María; Contell, Jesús; Yániz, Jesús; Fernández, Alejandro; Soler, Carles

    2016-01-01

    Sperm quality is evaluated for the calculation of sperm dosage in artificial reproductive programs. The most common parameter used is motility, but morphology has a higher potential as a predictor of genetic quality. Morphometry calculations from CASA-Morph technology improve morphological evaluation and allow mathematical approaches to the problem. Semen from 28 Holstein bulls was collected by artificial vagina, and several ejaculates were studied. After general evaluation, samples were diluted, packaged in 0.25 ml straws, and stored in liquid nitrogen. Two straws per sample were thawed, and slides were processed and stained with Diff-Quik. Samples were analyzed by a CASA-Morph system for eight morphometric parameters. In addition to the “classical” statistical approach, based on variance analysis (revealing differences between animals, ejaculates, and straws), principal component (PC) analysis showed that the variables were grouped into PC1, related to size, and PC2 to shape. Subpopulation structure analysis showed four groups, namely, big, small, short, and narrow from their dominant characteristics, representing 31.0%, 27.3%, 24.1%, and 17.7% of the total population, respectively. The distributions varied between animals and ejaculates, but between straws, there were no differences in only four animals. This modern approach of considering an ejaculate sperm population as divided into subpopulations reflecting quantifiable parameters generated by CASA-Morph systems technology opens a new view on sperm function. This is the first study applying this approach to evaluate different ejaculates and straws from the same individual. More work must be done to improve seminal dose calculations in assisted reproductive programs. PMID:27678464

  17. Morphometry and subpopulation structure of Holstein bull spermatozoa: variations in ejaculates and cryopreservation straws.

    PubMed

    Valverde, Anthony; Arenán, Héctor; Sancho, María; Contell, Jesús; Yániz, Jesús; Fernández, Alejandro; Soler, Carles

    2016-01-01

    Sperm quality is evaluated for the calculation of sperm dosage in artificial reproductive programs. The most common parameter used is motility, but morphology has a higher potential as a predictor of genetic quality. Morphometry calculations from CASA-Morph technology improve morphological evaluation and allow mathematical approaches to the problem. Semen from 28 Holstein bulls was collected by artificial vagina, and several ejaculates were studied. After general evaluation, samples were diluted, packaged in 0.25 ml straws, and stored in liquid nitrogen. Two straws per sample were thawed, and slides were processed and stained with Diff-Quik. Samples were analyzed by a CASA-Morph system for eight morphometric parameters. In addition to the "classical" statistical approach, based on variance analysis (revealing differences between animals, ejaculates, and straws), principal component (PC) analysis showed that the variables were grouped into PC1, related to size, and PC2 to shape. Subpopulation structure analysis showed four groups, namely, big, small, short, and narrow from their dominant characteristics, representing 31.0%, 27.3%, 24.1%, and 17.7% of the total population, respectively. The distributions varied between animals and ejaculates, but between straws, there were no differences in only four animals. This modern approach of considering an ejaculate sperm population as divided into subpopulations reflecting quantifiable parameters generated by CASA-Morph systems technology opens a new view on sperm function. This is the first study applying this approach to evaluate different ejaculates and straws from the same individual. More work must be done to improve seminal dose calculations in assisted reproductive programs.

  18. A kernel regression approach to gene-gene interaction detection for case-control studies.

    PubMed

    Larson, Nicholas B; Schaid, Daniel J

    2013-11-01

    Gene-gene interactions are increasingly being addressed as a potentially important contributor to the variability of complex traits. Consequently, attention has moved beyond single-locus analysis of association to more complex genetic models. Although several single-marker approaches toward interaction analysis have been developed, such methods suffer from very high testing dimensionality and do not take advantage of existing information, notably the definition of genes as functional units. Here, we propose a comprehensive family of gene-level score tests for identifying genetic elements of disease risk, in particular pairwise gene-gene interactions. Using kernel machine methods, we devise score-based variance component tests under a generalized linear mixed model framework. We conducted simulations based upon coalescent genetic models to evaluate the performance of our approach under a variety of disease models. These simulations indicate that our methods are generally higher powered than alternative gene-level approaches and at worst competitive with exhaustive single-nucleotide polymorphism (SNP)-level analyses. Furthermore, we observe that simulated epistatic effects resulted in significant marginal testing results for the involved genes regardless of whether or not true main effects were present. We detail the benefits of our methods and discuss potential genome-wide analysis strategies for gene-gene interaction analysis in a case-control study design. © 2013 WILEY PERIODICALS, INC.
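    A kernel-based interaction score statistic of this general kind can be sketched as follows. This is an illustrative simplification (intercept-only null model, linear gene-level kernels combined by a Hadamard product); the paper's full mixed-model framework, covariate adjustment, and p-value computation are omitted.

    ```python
    import numpy as np

    def interaction_score_stat(y, G1, G2):
        # y: phenotype vector; G1, G2: (n_subjects, n_snps) genotype matrices
        # for two genes. The interaction kernel is the element-wise product of
        # the two linear gene-level kernels (PSD by the Schur product theorem),
        # and Q = r' K r is a score-type variance-component statistic.
        y = np.asarray(y, float)
        r = y - y.mean()              # residuals under an intercept-only null
        K1 = G1 @ G1.T                # linear kernel for gene 1
        K2 = G2 @ G2.T                # linear kernel for gene 2
        K = K1 * K2                   # Hadamard product captures pairwise interaction
        return float(r @ K @ r)
    ```

    In a full analysis, Q would be referred to its null distribution (a mixture of chi-squares) under the generalized linear mixed model rather than used as a raw score.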

  19. Dealing with Actors and Compliance in Intervention Operations in a Non-permissive Hybrid Environment

    DTIC Science & Technology

    2009-12-01

    Patton (2002). Qualitative research & evaluation methods (3rd ed.). California, London, New Delhi: Sage Publications, Inc. … non-compliant actors … the study follows a question-based approach with a general research question at the centre of interest: Which … Version 1.0, December 2009. Study authors: Col Dieter Muhr (AUT); Hon. Assoc. Prof. Dr. Andrea Riemer, Ph.D. (AUT).

  20. Recruitment of general practices: Is a standardised approach helpful in the involvement of healthcare professionals in research?

    PubMed

    Riis, Allan; Jensen, Cathrine E; Maindal, Helle T; Bro, Flemming; Jensen, Martin B

    2016-01-01

    Health service research often involves the active participation of healthcare professionals. However, their ability and commitment to research varies. This can cause recruitment difficulties and thereby prolong the study period and inflate budgets. Solberg has identified seven R-factors as determinants for successfully recruiting healthcare professionals: relationships, reputation, requirements, rewards, reciprocity, resolution, and respect. This is a process evaluation of the seven R-factors. We applied these factors to guide the design of our recruitment strategy as well as to make adjustments when recruiting general practices in a guideline implementation study. In the guideline implementation study, we studied the effect of outreach visits, quality reports, and new patient stratification tools for low back pain patients. During a period of 15 months, we recruited 60 practices, which was fewer than planned (100 practices). In this evaluation, five of Solberg's seven R-factors were successfully addressed and two factors were not. The need to involve (reciprocity) end users in the development of new software and the amount of time needed to conduct recruitment (resolution) were underestimated. The framework of the seven R-factors was a feasible tool in our recruitment process. However, we suggest further investigation in developing systematic approaches to support the recruitment of healthcare professionals to research.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, Huan; Yang, Xiu; Zheng, Bin

    Biomolecules exhibit conformational fluctuations near equilibrium states, inducing uncertainty in various biological properties in a dynamic way. We have developed a general method to quantify the uncertainty of target properties induced by conformational fluctuations. Using a generalized polynomial chaos (gPC) expansion, we construct a surrogate model of the target property with respect to varying conformational states. We also propose a method to increase the sparsity of the gPC expansion by defining a set of conformational “active space” random variables. With the increased sparsity, we employ the compressive sensing method to accurately construct the surrogate model. We demonstrate the performance of the surrogate model by evaluating fluctuation-induced uncertainty in solvent-accessible surface area for the bovine trypsin inhibitor protein system, and show that the new approach offers more accurate statistical information than standard Monte Carlo approaches. Furthermore, the constructed surrogate model also enables us to directly evaluate the target property under various conformational states, yielding a more accurate response surface than standard sparse grid collocation methods. In particular, the new method provides higher accuracy in high-dimensional systems, such as biomolecules, where sparse grid performance is limited by the accuracy of the computed quantity of interest. Finally, our new framework is generalizable and can be used to investigate the uncertainty of a wide variety of target properties in biomolecular systems.
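    A minimal one-dimensional sketch of the gPC surrogate idea. Illustrative only: the paper works in many dimensions and recovers sparse coefficients by compressive sensing, whereas this sketch fits a low-order probabilists' Hermite expansion by ordinary least squares.

    ```python
    import numpy as np

    def hermite_basis(x, order):
        # Probabilists' Hermite polynomials He_0..He_order via the recurrence
        # He_{n+1}(x) = x He_n(x) - n He_{n-1}(x), orthogonal under the
        # standard normal weight (the natural gPC basis for Gaussian inputs).
        x = np.asarray(x, float)
        H = [np.ones_like(x), x]
        for n in range(1, order):
            H.append(x * H[n] - n * H[n - 1])
        return np.column_stack(H[:order + 1])

    def gpc_surrogate(samples, values, order=3):
        # Fit gPC coefficients to sampled (input, property) pairs; returns a
        # callable surrogate. Least squares stands in for compressive sensing.
        Phi = hermite_basis(samples, order)
        coef, *_ = np.linalg.lstsq(Phi, np.asarray(values, float), rcond=None)
        return lambda x: hermite_basis(x, order) @ coef
    ```

    Once fitted, the surrogate is evaluated in place of the expensive property calculation, which is what makes uncertainty statistics cheap to estimate.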

  3. Impact of peer delivered wellness coaching.

    PubMed

    Swarbrick, Margaret; Gill, Kenneth J; Pratt, Carlos W

    2016-09-01

    People receiving publicly funded behavioral health services for severe mental disorders have shorter lifespans and significantly impaired health-related quality of life compared to the general population. The aim of this article was to explore how peer wellness coaching (PWC), a manualized approach for pursuing specific physical wellness goals, impacted goal attainment and overall health-related quality of life. Deidentified archival program evaluation data were examined to explore whether peer delivered wellness coaching had an impact on 33 service recipients with regard to goal attainment and health-related quality of life. Participants were served by 1 of 12 wellness coach trainees from a transformation transfer initiative grant who had been trained in the manualized approach. Coaching participants and their coaches reported significant progress toward the attainment of individually chosen goals, 2 to 4 weeks after establishing their goals. After 8 to 10 weeks of peer delivered wellness coaching, improvements were evident in the self-report of physical health, general health, and perceived health. These improvements were sustained 90 days later. PWC is potentially a promising practice for helping people choose and pursue individual goals and facilitating positive health and wellness changes. Rigorous controlled research with larger samples is needed to evaluate the benefits of peer delivered wellness coaching. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  4. Enhancing the quality and credibility of qualitative analysis.

    PubMed Central

    Patton, M Q

    1999-01-01

    Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems. PMID:10591279

  5. Computationally efficient confidence intervals for cross-validated area under the ROC curve estimates.

    PubMed

    LeDell, Erin; Petersen, Maya; van der Laan, Mark

    In binary classification problems, the area under the ROC curve (AUC) is commonly used to evaluate the performance of a prediction model. Often, it is combined with cross-validation in order to assess how the results will generalize to an independent data set. In order to evaluate the quality of an estimate for cross-validated AUC, we obtain an estimate of its variance. For massive data sets, the process of generating a single performance estimate can be computationally expensive. Additionally, when using a complex prediction method, the process of cross-validating a predictive model on even a relatively small data set can still require a large amount of computation time. Thus, in many practical settings, the bootstrap is a computationally intractable approach to variance estimation. As an alternative to the bootstrap, we demonstrate a computationally efficient influence curve based approach to obtaining a variance estimate for cross-validated AUC.
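    The influence-curve variance estimator that the abstract contrasts with the bootstrap can be illustrated for ordinary (non-cross-validated) AUC, where the influence-curve components reduce to the familiar DeLong "placement" values. The sketch below is that simpler case, not the authors' cross-validated estimator; the function and variable names are ours.

```python
def auc_and_ic_variance(scores_pos, scores_neg):
    """AUC with an influence-curve (DeLong-style) variance estimate."""
    m, n = len(scores_pos), len(scores_neg)

    def kernel(a, b):                      # Mann-Whitney kernel with tie credit
        return 1.0 if a > b else (0.5 if a == b else 0.0)

    auc = sum(kernel(p, q) for p in scores_pos for q in scores_neg) / (m * n)
    # per-observation "placements": the components of the influence curve
    v10 = [sum(kernel(p, q) for q in scores_neg) / n for p in scores_pos]
    v01 = [sum(kernel(p, q) for p in scores_pos) / m for q in scores_neg]
    s10 = sum((v - auc) ** 2 for v in v10) / (m - 1)
    s01 = sum((v - auc) ** 2 for v in v01) / (n - 1)
    return auc, s10 / m + s01 / n
```

    Because the variance comes from a single pass over the placements, its cost is quadratic in the sample size at worst, versus rerunning the entire model-fitting pipeline hundreds of times for a bootstrap.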

  6. Condition monitoring of an electro-magnetic brake using an artificial neural network

    NASA Astrophysics Data System (ADS)

    Gofran, T.; Neugebauer, P.; Schramm, D.

    2017-10-01

    This paper presents a data-driven approach to condition monitoring of electromagnetic brakes without the use of additional sensors. For safe and efficient operation of an electric motor, regular evaluation and replacement of the friction surface of the brake is required. One such evaluation method consists of direct or indirect sensing of the air-gap between the pressure plate and the magnet; a larger gap is generally indicative of worn surfaces. Traditionally this has been accomplished by the use of additional sensors, making existing systems complex, cost-sensitive and difficult to maintain. In this work a feed-forward Artificial Neural Network (ANN) is trained on the electrical data of the brake, using a supervised learning method, to estimate the air-gap. The ANN model is optimized on the training set and validated using the test set. The experimental results, with an estimated air-gap accuracy of over 95%, demonstrate the validity of the proposed approach.

  7. A new approach in measuring graduate employability skills

    NASA Astrophysics Data System (ADS)

    Zakaria, Mohd Hafiz; Yatim, Bidin; Ismail, Suzilah

    2014-06-01

    Globalization makes graduate recruitment more complex for organizations, because employers believe that a holistic workforce is the key to an organization's success. Although graduates are said to possess specific skills, they still lack employability skills, and this leads to increased training costs for governments and employers. Therefore, graduates' employability skills should be evaluated before they enter the labour market. In this study, a valid and reliable instrument embedding a new approach to measuring employability skills was developed using a Situational Judgment Test (SJT). The instrument comprises twelve (12) items measuring communication skill, professional ethics and morality, entrepreneurial skill, critical thinking in problem solving, and personal quality. The instrument's validity was established through expert opinion, and its reliability (in terms of stability) was based on the chi-square test for homogeneity. The instrument is beneficial to graduates, employers, government agencies, universities, and workforce recruitment agencies when evaluating the level of employability skills.
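    The reliability check mentioned above rests on the chi-square test for homogeneity: if response-category counts from two administrations of the instrument could plausibly come from the same distribution, the instrument is stable. A minimal version of the statistic, with illustrative data (the counts below are invented, not from the study):

```python
def chi_square_homogeneity(table):
    """Pearson chi-square statistic for homogeneity of two or more samples.

    Rows are independent samples (e.g. test and retest administrations);
    columns are response categories.
    """
    rows, cols = len(table), len(table[0])
    row_tot = [sum(r) for r in table]
    col_tot = [sum(table[i][j] for i in range(rows)) for j in range(cols)]
    total = sum(row_tot)
    stat = 0.0
    for i in range(rows):
        for j in range(cols):
            expected = row_tot[i] * col_tot[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    df = (rows - 1) * (cols - 1)
    return stat, df

# hypothetical test/retest counts across three response categories
stat, df = chi_square_homogeneity([[32, 41, 27], [35, 38, 27]])
```

    The statistic is then compared with the chi-square critical value (5.991 for df = 2 at the 5% level); a smaller value means the two administrations are statistically indistinguishable, supporting stability.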

  8. Interprofessional Education and Practice Guide No. 7: Development, implementation, and evaluation of a large-scale required interprofessional education foundational programme.

    PubMed

    Shrader, Sarah; Hodgkins, Renee; Laverentz, Delois; Zaudke, Jana; Waxman, Michael; Johnston, Kristy; Jernigan, Stephen

    2016-09-01

    Health profession educators and administrators are interested in how to develop an effective and sustainable interprofessional education (IPE) programme. We describe the approach used at the University of Kansas Medical Centre, Kansas City, United States. This approach is a foundational programme with multiple large-scale, half-day events each year. The programme is threaded with common curricular components that build in complexity over time and assures that each learner is exposed to IPE. In this guide, lessons learned and general principles related to the development of IPE programming are discussed. Important areas that educators should consider include curriculum development, engaging leadership, overcoming scheduling barriers, providing faculty development, piloting the programming, planning for logistical coordination, intentionally pairing IP facilitators, anticipating IP conflict, setting clear expectations for learners, publicising the programme, debriefing with faculty, planning for programme evaluation, and developing a scholarship and dissemination plan.

  9. Computationally efficient confidence intervals for cross-validated area under the ROC curve estimates

    PubMed Central

    Petersen, Maya; van der Laan, Mark

    2015-01-01

    In binary classification problems, the area under the ROC curve (AUC) is commonly used to evaluate the performance of a prediction model. Often, it is combined with cross-validation in order to assess how the results will generalize to an independent data set. In order to evaluate the quality of an estimate for cross-validated AUC, we obtain an estimate of its variance. For massive data sets, the process of generating a single performance estimate can be computationally expensive. Additionally, when using a complex prediction method, the process of cross-validating a predictive model on even a relatively small data set can still require a large amount of computation time. Thus, in many practical settings, the bootstrap is a computationally intractable approach to variance estimation. As an alternative to the bootstrap, we demonstrate a computationally efficient influence curve based approach to obtaining a variance estimate for cross-validated AUC. PMID:26279737

  10. The impact of a national mental health arts and film festival on stigma and recovery.

    PubMed

    Quinn, N; Shulman, A; Knifton, L; Byrne, P

    2011-01-01

    This study aims to evaluate the impact of a national mental health arts festival for the general public, encompassing a wide variety of art forms and themes. An evaluation was undertaken with 415 attendees from 20 different events, combining qualitative and quantitative approaches. The findings demonstrate a positive impact on the relationship between the arts and mental health. Events increased positive attitudes, including positive representations of people's contributions, capabilities and potential to recover. They did not decrease negative attitudes. Intended behaviour change was modest and one film event increased audience perceptions of dangerousness. The paper argues that the arts can change stigma by constructing shared meanings and engaging audiences on an emotional level. Carefully programmed, collaborative, community-based arts festivals should form an integral part of national programmes to address stigma and to promote mental health and wellbeing, alongside traditional social marketing and public education approaches. © 2010 John Wiley & Sons A/S.

  11. Automating a human factors evaluation of graphical user interfaces for NASA applications: An update on CHIMES

    NASA Technical Reports Server (NTRS)

    Jiang, Jian-Ping; Murphy, Elizabeth D.; Bailin, Sidney C.; Truszkowski, Walter F.

    1993-01-01

    Capturing human factors knowledge about the design of graphical user interfaces (GUI's) and applying this knowledge on-line are the primary objectives of the Computer-Human Interaction Models (CHIMES) project. The current CHIMES prototype is designed to check a GUI's compliance with industry-standard guidelines, general human factors guidelines, and human factors recommendations on color usage. Following the evaluation, CHIMES presents human factors feedback and advice to the GUI designer. The paper describes the approach to modeling human factors guidelines, the system architecture, a new method developed to convert quantitative RGB primaries into qualitative color representations, and the potential for integrating CHIMES with user interface management systems (UIMS). Both the conceptual approach and its implementation are discussed. This paper updates the presentation on CHIMES at the first International Symposium on Ground Data Systems for Spacecraft Control.
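    The abstract mentions a new method for converting quantitative RGB primaries into qualitative color representations but does not describe it. The sketch below is one plausible reconstruction, not the CHIMES method: convert RGB to HSV with the standard library and bin the hue into coarse color names. The bin boundaries and name set are arbitrary illustrative choices.

```python
import colorsys

def qualitative_color(r, g, b):
    """Map 8-bit RGB primaries to a coarse qualitative color name."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if v < 0.2:                      # too dark to have a perceivable hue
        return "black"
    if s < 0.15:                     # too desaturated: achromatic
        return "white" if v > 0.85 else "gray"
    hue = h * 360.0                  # hue angle in degrees
    bins = [(15, "red"), (45, "orange"), (75, "yellow"), (165, "green"),
            (195, "cyan"), (255, "blue"), (285, "purple"), (345, "magenta"),
            (360, "red")]            # red wraps around the hue circle
    for upper, name in bins:
        if hue < upper:
            return name
```

    A guideline checker built on such a mapping can then reason symbolically ("avoid red text on a blue background") instead of comparing raw RGB triples.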

  12. When East meets the West: differences in approach and mentality between old and new EU member countries

    NASA Astrophysics Data System (ADS)

    Matenco, Liviu

    2017-04-01

    There are marked differences in the management and organisation of research across various EU member countries. While the older member states have gradually learned about their mutual differences in research systems, approach and mentality, they generally know very little about the marked differences with the EU13 countries, which have a very strong impact on the design, organisation, evaluation and implementation of collaborative or European programmes. Such differences result from the organisation of the national research system, funding agencies, supervision and methods of evaluation. In particular, the latter impose a markedly different career management through the promotion of local or regional journals, institute and faculty organisation, and systems of promotion and management of research. In this contribution, I share my experiences in crossing the east-west mentality border, in particular discussing pre-conceived mentalities and underlying differences, and suggesting better ways of promoting integrated research across European systems.

  13. Intelligent data management for real-time spacecraft monitoring

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M.; Gasser, Les; Abramson, Bruce

    1992-01-01

    Real-time AI systems have begun to address the challenge of restructuring problem solving to meet real-time constraints by making key trade-offs that pursue less than optimal strategies with minimal impact on system goals. Several approaches for adapting to dynamic changes in system operating conditions are known. However, simultaneously adapting system decision criteria in a principled way has been difficult. Towards this end, a general technique for dynamically making such trade-offs using a combination of decision theory and domain knowledge has been developed. Multi-attribute utility theory (MAUT), a decision-theoretic approach for making one-time decisions, is discussed. Dynamic trade-off evaluation is then described as a knowledge-based extension of MAUT suitable for highly dynamic real-time environments, and an example is given of dynamic trade-off evaluation applied to a specific data management trade-off in a real-world spacecraft monitoring application.
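    The core of MAUT with a knowledge-based extension can be sketched in a few lines: an additive utility over scored attributes, plus a domain rule that reweights the attributes when operating conditions change. The attribute names, scores, and the backlog rule below are invented for illustration, not taken from the spacecraft application.

```python
def utility(option, weights):
    # additive multi-attribute utility; attribute scores already scaled to [0, 1]
    return sum(weights[a] * option[a] for a in weights)

def adjust_weights(weights, backlog):
    # knowledge-based rule (illustrative): when the telemetry backlog grows,
    # shift weight from completeness toward timeliness
    w = dict(weights)
    if backlog > 100:
        shift = min(0.2, w["completeness"])
        w["completeness"] -= shift
        w["timeliness"] += shift
    return w

# two hypothetical data-management options
base = {"timeliness": 0.4, "completeness": 0.6}
full_processing = {"timeliness": 0.2, "completeness": 0.9}
quick_look = {"timeliness": 0.9, "completeness": 0.4}
```

    Under normal conditions full processing wins; once the rule fires, the ranking flips to the quick-look option, which is exactly the kind of principled criterion adaptation the abstract describes.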

  14. Sensitivity evaluation of dynamic speckle activity measurements using clustering methods.

    PubMed

    Etchepareborda, Pablo; Federico, Alejandro; Kaufmann, Guillermo H

    2010-07-01

    We evaluate and compare the use of competitive neural networks, self-organizing maps, the expectation-maximization algorithm, K-means, and fuzzy C-means techniques as partitional clustering methods, when the sensitivity of the activity measurement of dynamic speckle images needs to be improved. The temporal history of the acquired intensity generated by each pixel is analyzed in a wavelet decomposition framework, and it is shown that the mean energy of its corresponding wavelet coefficients provides a suitable feature space for clustering purposes. The sensitivity obtained by using the evaluated clustering techniques is also compared with the well-known methods of Konishi-Fujii, weighted generalized differences, and wavelet entropy. The performance of the partitional clustering approach is evaluated using simulated dynamic speckle patterns and also experimental data.
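    The feature extraction step described above (mean energy of wavelet coefficients of a pixel's temporal intensity history) can be sketched with the simplest wavelet, the Haar basis. The choice of Haar, three levels, and the toy signals are our assumptions; the paper does not specify its wavelet.

```python
def haar_detail_energies(signal, levels=3):
    """Mean energy of Haar wavelet detail coefficients at each level."""
    approx = list(signal)
    energies = []
    for _ in range(levels):
        detail = [(approx[2 * i] - approx[2 * i + 1]) / 2 ** 0.5
                  for i in range(len(approx) // 2)]
        approx = [(approx[2 * i] + approx[2 * i + 1]) / 2 ** 0.5
                  for i in range(len(approx) // 2)]
        energies.append(sum(d * d for d in detail) / len(detail))
    return energies

# a rapidly fluctuating ("active") pixel vs. a static one
active = haar_detail_energies([1, -1] * 8)
static = haar_detail_energies([1] * 16)
```

    Each pixel's per-level energy vector is then the point fed to K-means, fuzzy C-means, or the other partitional clustering methods compared in the paper.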

  15. A note on evaluating VAN earthquake predictions

    NASA Astrophysics Data System (ADS)

    Tselentis, G.-Akis; Melis, Nicos S.

    The evaluation of the success level of an earthquake prediction method should not be based on approaches that apply generalized strict statistical laws and avoid the specific nature of the earthquake phenomenon. Fault rupture processes cannot be compared to gambling processes. The outcome of the present note is that even an ideal earthquake prediction method is still shown to be a matter of a “chancy” association between precursors and earthquakes if we apply the same procedure proposed by Mulargia and Gasperini [1992] in evaluating VAN earthquake predictions. Each individual VAN prediction has to be evaluated separately, taking always into account the specific circumstances and information available. The success level of epicenter prediction should depend on the earthquake magnitude, and magnitude and time predictions may depend on earthquake clustering and the tectonic regime respectively.

  16. Evaluating performance of stormwater sampling approaches using a dynamic watershed model.

    PubMed

    Ackerman, Drew; Stein, Eric D; Ritter, Kerry J

    2011-09-01

    Accurate quantification of stormwater pollutant levels is essential for estimating overall contaminant discharge to receiving waters. Numerous sampling approaches exist that attempt to balance accuracy against the costs associated with the sampling method. This study employs a novel and practical approach to evaluating the accuracy of different stormwater monitoring methodologies, using stormflows and constituent concentrations produced by a fully validated continuous-simulation watershed model. A major advantage of using a watershed model to simulate pollutant concentrations is that a large number of storms representing a broad range of conditions can be applied in testing the various sampling approaches. Seventy-eight distinct methodologies were evaluated by "virtual samplings" of 166 simulated storms of varying size, intensity and duration, representing 14 years of storms in Ballona Creek near Los Angeles, California. The 78 methods can be grouped into four general strategies: volume-paced compositing, time-paced compositing, pollutograph sampling, and microsampling. The performance of each sampling strategy was evaluated by comparing (1) the median relative error between the virtually sampled and the true modeled event mean concentration (EMC) of each storm (accuracy), (2) the median absolute deviation (MAD) about the median of the relative error (precision), and (3) the percentage of storms where sampling methods were within 10% of the true EMC (a combined measure of accuracy and precision). Finally, costs associated with site setup, sampling, and laboratory analysis were estimated for each method. Pollutograph sampling consistently outperformed the other three methods in terms of both accuracy and precision, but was the most costly method evaluated. Time-paced sampling consistently underestimated, while volume-paced sampling overestimated, the storm EMCs. Microsampling performance approached that of pollutograph sampling at a substantial cost savings. The most efficient method for routine stormwater monitoring, in terms of a balance between performance and cost, was volume-paced microsampling with variable sample pacing to ensure that the entirety of the storm was captured. Pollutograph sampling is recommended if the data are to be used for detailed analysis of runoff dynamics.
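    The quantities being compared above can be made concrete with a small sketch: the true flow-weighted EMC of a storm, and two composite estimates of it, one paced by time and one paced by cumulative volume. The discretization into equal time steps and the mid-increment aliquot rule are our simplifying assumptions, not the study's virtual-sampling protocol.

```python
def true_emc(flows, concs):
    # flow-weighted event mean concentration over equal time steps
    return sum(q * c for q, c in zip(flows, concs)) / sum(flows)

def time_paced_emc(concs, n_samples):
    # equal-time pacing: equal aliquots at fixed time intervals
    step = max(1, len(concs) // n_samples)
    picks = concs[::step][:n_samples]
    return sum(picks) / len(picks)

def volume_paced_emc(flows, concs, n_samples):
    # equal-volume pacing: grab an aliquot each time cumulative volume
    # passes the next target increment
    total = sum(flows)
    targets = [total * (k + 0.5) / n_samples for k in range(n_samples)]
    picks, cum, t = [], 0.0, 0
    for target in targets:
        while t < len(flows) - 1 and cum + flows[t] < target:
            cum += flows[t]
            t += 1
        picks.append(concs[t])
    return sum(picks) / len(picks)
```

    With uniform flow the two pacings agree; when concentration co-varies with flow, time pacing over-represents low-flow periods, which is the bias pattern the study reports.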

  17. The relationship between nurse staffing and failure to rescue: where does it matter most?

    PubMed

    Talsma, AkkeNeel; Jones, Katherine; Guo, Ying; Wilson, Deleise; Campbell, Darrell A

    2014-09-01

    This study further expands on the relationship between nurse staffing levels and patient outcomes, in particular failure to rescue (FTR). Many studies are based on single-site hospitals or single-year data, limiting the generalizability of the findings. The purpose was to evaluate, in a multisite, multiyear study, the relationship between unit-level nurse staffing and FTR mortality for ICU and non-ICU patients. Using administrative data and actual unit-level nurse staffing data, we applied the AHRQ Patient Safety Indicator (2003) software and matched results with the patient's discharge month. Fixed-effects multilevel logistic analyses were used to take into account the hierarchical structure of the database and patient clustering within units. We controlled for patient demographics, clinical conditions, and CCS categories. The majority (94%) of cases were discharged from general care units; ICUs reported higher nurse staffing levels based on patient complexity. Expired cases were 3 years older, male, and nonwhite. For general care discharges, the relationship between RN-level hours per patient day (HPPD) and FTR approached significance (P = 0.07), suggesting increased odds of higher FTR mortality with higher staffing levels. We did not observe any of the expected associations between the nurse staffing variables and FTR for either general care unit or ICU discharges. The comprehensive risk adjustments provided adequate "leveling of the playing field" to evaluate the impact of unit-based nurse staffing levels on FTR mortality. Future studies should evaluate the influence of unit environment and patient risk.

  18. Hotspot-Centric De Novo Design of Protein Binders

    PubMed Central

    Fleishman, Sarel J.; Corn, Jacob E.; Strauch, Eva-Maria; Whitehead, Timothy A.; Karanicolas, John; Baker, David

    2014-01-01

    Protein–protein interactions play critical roles in biology, and computational design of interactions could be useful in a range of applications. We describe in detail a general approach to de novo design of protein interactions based on computed, energetically optimized interaction hotspots, which was recently used to produce high-affinity binders of influenza hemagglutinin. We present several alternative approaches to identify and build the key hotspot interactions within both core secondary structural elements and variable loop regions and evaluate the method's performance in natural-interface recapitulation. We show that the method generates binding surfaces that are more conformationally restricted than previous design methods, reducing opportunities for off-target interactions. PMID:21945116

  19. Reply to ‘Comment on “On the Clausius equality and inequality”’

    NASA Astrophysics Data System (ADS)

    Anacleto, Joaquim; Pereira, Mário G.; Ferreira, J. M.

    2013-01-01

    We address Bizarro's comment on a paper by Anacleto (2011 Eur. J. Phys. 32 279). Bizarro claims that (i) Anacleto's approach is either incomplete or incorrect; (ii) one problem is the definition of dissipative work; and (iii) additional ambiguities and misconceptions may stem from his explanations. We contend that (i) both authors present exactly the same definition of dissipative work; and (ii) it is possible to obtain a more general expression to evaluate the entropy change that comprises the expressions developed by both authors—indicating that Anacleto's approach is correct and coherent, and that the criticism of the paper is therefore unfounded.

  20. Activity-based costing and its application in a Turkish university hospital.

    PubMed

    Yereli, Ayşe Necef

    2009-03-01

    Resource management in hospitals is of increasing importance in today's global economy. Traditional accounting systems have become inadequate for managing hospital resources and accurately determining service costs. Conversely, the activity-based costing approach to hospital accounting is an effective cost management model that determines costs and evaluates financial performance across departments. Obtaining costs that are more accurate can enable hospitals to analyze and interpret costing decisions and make more accurate budgeting decisions. Traditional and activity-based costing approaches were compared using a cost analysis of gall bladder surgeries in the general surgery department of one university hospital in Manisa, Turkey. Copyright (c) AORN, Inc, 2009.
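    The contrast between the two costing models can be sketched directly: traditional costing allocates overhead by a single volume measure, while activity-based costing charges each procedure at per-activity rates driven by its actual resource consumption. The activity pools, driver units, and all figures below are invented for illustration, not the Manisa hospital's data.

```python
def traditional_cost(overhead, volume_share):
    # traditional allocation: one overhead pool spread by a volume measure
    return overhead * volume_share

def abc_cost(activity_pools, driver_usage):
    # ABC: cost = sum over activities of (pool rate) x (driver units consumed)
    total = 0.0
    for activity, pool in activity_pools.items():
        rate = pool["cost"] / pool["driver_total"]   # cost per driver unit
        total += rate * driver_usage.get(activity, 0.0)
    return total

# hypothetical department overhead split into activity pools
pools = {"nursing": {"cost": 1000.0, "driver_total": 100.0},   # nursing hours
         "lab":     {"cost": 500.0,  "driver_total": 50.0}}    # lab tests
# a procedure consuming 10 nursing hours and 5 lab tests
usage = {"nursing": 10.0, "lab": 5.0}
```

    If the same procedure accounts for 20% of volume, traditional costing charges it 300 of the 1500 overhead, versus 150 under ABC: the kind of distortion that makes ABC-based budgeting decisions more accurate.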

  1. Dynamic resource allocation in a hierarchical multiprocessor system: A preliminary study

    NASA Technical Reports Server (NTRS)

    Ngai, Tin-Fook

    1986-01-01

    An integrated system approach to dynamic resource allocation is proposed. Some of the problems in dynamic resource allocation and the relationship of these problems to system structures are examined. A general dynamic resource allocation scheme is presented. A hierarchical system architecture which dynamically maps between processor structure and programs at multiple levels of instantiation is described. Simulation experiments were conducted to study dynamic resource allocation on the proposed system. Preliminary evaluation based on simple dynamic resource allocation algorithms indicates that with the proposed system approach, the complexity of dynamic resource management could be significantly reduced while achieving reasonably effective dynamic resource allocation.

  2. Fuzzy-Rough Nearest Neighbour Classification

    NASA Astrophysics Data System (ADS)

    Jensen, Richard; Cornelis, Chris

    A new fuzzy-rough nearest neighbour (FRNN) classification algorithm is presented in this paper, as an alternative to Sarkar's fuzzy-rough ownership function (FRNN-O) approach. In contrast to the latter, our method uses the nearest neighbours to construct lower and upper approximations of decision classes, and classifies test instances based on their membership to these approximations. In the experimental analysis, we evaluate our approach with both classical fuzzy-rough approximations (based on an implicator and a t-norm), as well as with the recently introduced vaguely quantified rough sets. Preliminary results are very good, and in general FRNN outperforms FRNN-O, as well as the traditional fuzzy nearest neighbour (FNN) algorithm.
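    The lower/upper-approximation classification rule described above can be sketched for crisp classes, using Lukasiewicz connectives as the implicator and t-norm. The similarity function (1 minus normalized Euclidean distance), the choice of Lukasiewicz operators, and averaging lower and upper memberships are illustrative assumptions; the paper evaluates several operator choices.

```python
def frnn_classify(train, labels, query, k=3):
    """Fuzzy-rough nearest-neighbour sketch with Lukasiewicz connectives."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

    pts = train + [query]
    max_d = max(dist(a, b) for a in pts for b in pts) or 1.0
    nn = sorted(range(len(train)), key=lambda i: dist(train[i], query))[:k]

    def implicator(a, b):          # Lukasiewicz implicator I(a, b)
        return min(1.0, 1.0 - a + b)

    def t_norm(a, b):              # Lukasiewicz t-norm T(a, b)
        return max(0.0, a + b - 1.0)

    best_class, best_score = None, -1.0
    for c in set(labels):
        # lower approximation: how certainly the neighbourhood supports c
        lower = min(implicator(1.0 - dist(train[i], query) / max_d,
                               1.0 if labels[i] == c else 0.0) for i in nn)
        # upper approximation: how possibly the neighbourhood supports c
        upper = max(t_norm(1.0 - dist(train[i], query) / max_d,
                           1.0 if labels[i] == c else 0.0) for i in nn)
        score = (lower + upper) / 2.0
        if score > best_score:
            best_class, best_score = c, score
    return best_class
```

    A query near one cluster gets lower-approximation membership close to 1 for that cluster's class and close to 0 for the other, so the averaged score picks the right label.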

  3. The forensic psychiatric report.

    PubMed

    Norko, Michael A; Buchanan, Mar Alec

    2015-01-01

    The construction of a written forensic report is a core component of forensic practice, demonstrating the evaluator's skill in conducting the evaluation and in communicating relevant information to the legal audience in an effective manner. Although communication skills and quality of written documentation are important in clinical psychiatry generally, they form the sine qua non of successful forensic work, which consists in telling complex stories in a coherent and compelling fashion. High quality forensic reports require careful preparation from the earliest stages of work on a case. They generally follow an expected structure, which permits the evaluator to provide all the data necessary to form a carefully reasoned opinion that addresses the legal questions posed. Formats and content of reports vary according to the type of case and the circumstances of the evaluation and so require flexibility within customary frameworks. The style and quality of writing are critical to the crafting of forensic reports. The effects on legal decision-makers of various approaches to the presentation of information in reports have not been studied empirically, but guidance from experienced forensic psychiatrists is available. There is a small body of research on quality improvement in forensic writing, and further empirical study is warranted.

  4. Phase Transitions in Planning Problems: Design and Analysis of Parameterized Families of Hard Planning Problems

    NASA Technical Reports Server (NTRS)

    Hen, Itay; Rieffel, Eleanor G.; Do, Minh; Venturelli, Davide

    2014-01-01

    There are two common ways to evaluate algorithms: performance on benchmark problems derived from real applications and analysis of performance on parametrized families of problems. The two approaches complement each other, each having its advantages and disadvantages. The planning community has concentrated on the first approach, with few ways of generating parametrized families of hard problems known prior to this work. Our group's main interest is in comparing approaches to solving planning problems using a novel type of computational device - a quantum annealer - to existing state-of-the-art planning algorithms. Because only small-scale quantum annealers are available, we must compare on small problem sizes. Small problems are primarily useful for comparison only if they are instances of parametrized families of problems for which scaling analysis can be done. In this technical report, we discuss our approach to the generation of hard planning problems from classes of well-studied NP-complete problems that map naturally to planning problems or to aspects of planning problems that many practical planning problems share. These problem classes exhibit a phase transition between easy-to-solve and easy-to-show-unsolvable planning problems. The parametrized families of hard planning problems lie at the phase transition. The exponential scaling of hardness with problem size is apparent in these families even at very small problem sizes, thus enabling us to characterize even very small problems as hard. The families we developed will prove generally useful to the planning community in analyzing the performance of planning algorithms, providing a complementary approach to existing evaluation methods. We illustrate the hardness of these problems and their scaling with results on four state-of-the-art planners, observing significant differences between these planners on these problem families. 
Finally, we describe two general, and quite different, mappings of planning problems to QUBOs, the form of input required for a quantum annealing machine such as the D-Wave II.
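    The phase-transition behaviour that the problem families above exploit can be demonstrated on random 3-SAT, the classic NP-complete problem: instances with a low clause-to-variable ratio are almost always satisfiable, instances well above the critical ratio (around 4.27) almost never are. The sketch below brute-forces tiny instances to show the effect; problem sizes, seed, and trial counts are illustrative, and this is not the report's QUBO mapping.

```python
import itertools
import random

def random_3sat(n_vars, n_clauses, rng):
    # each clause: three distinct variables, each with a random polarity
    return [[(v, rng.random() < 0.5) for v in rng.sample(range(n_vars), 3)]
            for _ in range(n_clauses)]

def satisfiable(n_vars, clauses):
    # exhaustive search; only viable for very small n_vars
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[v] == pol for v, pol in clause) for clause in clauses):
            return True
    return False

def sat_fraction(n_vars, ratio, trials, seed=0):
    rng = random.Random(seed)
    hits = sum(satisfiable(n_vars, random_3sat(n_vars, int(ratio * n_vars), rng))
               for _ in range(trials))
    return hits / trials
```

    Sweeping the ratio and plotting `sat_fraction` traces the easy-SAT / easy-UNSAT transition; the hardest instances, the ones the report's parametrized families target, sit near the crossover.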

  5. Operational Soil Moisture Retrieval Techniques: Theoretical Comparisons in the Context of Improving the NASA Standard Approach

    NASA Astrophysics Data System (ADS)

    Mladenova, I. E.; Jackson, T. J.; Bindlish, R.; Njoku, E. G.; Chan, S.; Cosh, M. H.

    2012-12-01

    We are currently evaluating potential improvements to the standard NASA global soil moisture product derived using observations acquired from the Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E). A major component of this effort is a thorough review of the theoretical basis of available passive-based soil moisture retrieval algorithms suitable for operational implementation. Several agencies provide routine soil moisture products. Our research focuses on five well-established techniques that are capable of carrying out global retrieval using the same AMSR-E data set as the NASA approach (i.e. X-band brightness temperature data). In general, most passive-based algorithms include two major components: radiative transfer modeling, which provides the smooth surface reflectivity properties of the soil surface, and a complex dielectric constant model of the soil-water mixture. These two components are related through the Fresnel reflectivity equations. Furthermore, the land surface temperature, vegetation, roughness and soil properties need to be adequately accounted for in the radiative transfer and dielectric modeling. All of the available approaches we have examined follow the general data processing flow described above; however, the actual solutions as well as the final products can be very different. This is primarily a result of the assumptions, number of sensor variables utilized, the selected ancillary data sets and approaches used to account for the effect of the additional geophysical variables impacting the measured signal. The operational NASA AMSR-E-based retrievals have been shown to have a dampened temporal response and sensitivity range. Two possible approaches to addressing these issues are being evaluated: enhancing the theoretical basis of the existing algorithm, if feasible, or directly adjusting the dynamic range of the final soil moisture product.
Both of these aspects are being actively investigated and will be discussed in our talk. Improving the quality and reliability of the global soil moisture product would result in greater acceptance and utilization in the related applications. USDA is an equal opportunity provider and employer.
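The Fresnel step mentioned in the abstract can be sketched directly: smooth-surface reflectivity at each polarization follows from the complex dielectric constant of the soil-water mixture and the incidence angle. The following is a minimal illustration only, not the operational algorithm; the dielectric value is a hypothetical placeholder, and the 55-degree angle is the approximate AMSR-E viewing geometry.

```python
import numpy as np

def fresnel_reflectivity(eps_r, theta_deg):
    """Smooth-surface power reflectivity at H and V polarization from the
    complex dielectric constant eps_r of the soil-water mixture."""
    theta = np.deg2rad(theta_deg)
    cos_t = np.cos(theta)
    root = np.sqrt(eps_r - np.sin(theta) ** 2)  # complex square root
    r_h = np.abs((cos_t - root) / (cos_t + root)) ** 2                   # horizontal pol.
    r_v = np.abs((eps_r * cos_t - root) / (eps_r * cos_t + root)) ** 2   # vertical pol.
    return r_h, r_v

# Illustrative (hypothetical) dielectric constant for moist soil,
# evaluated near the AMSR-E incidence angle of about 55 degrees.
r_h, r_v = fresnel_reflectivity(15 - 3j, 55.0)
```

At off-nadir angles the horizontal-polarization reflectivity exceeds the vertical one, which is why the two channels carry complementary information about surface moisture.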

  6. Complex evaluation of the loft-style of retrieval as a type of building conversion

    NASA Astrophysics Data System (ADS)

    Chulkov, V.; Kazaryan, R.; Kuzina, O.; Maloyan, G.; Efimenko, A.

    2017-10-01

    Construction reorganization is part of a basic reorganization cycle in which four phases (states, technological stages) are implemented in sequence: device, disorganization, reorganization and co-organization. Our research concerns the reconstruction phase. One variety of building reconstruction is retrieval (from the English retrieve, to restore or find): bringing the reorganized object into working condition by attaching a new functional system to the old building system. Retrieval makes it possible to replace elements of the new system locally or as a whole (implementing the "assembly-disassembly" principle) and provides for eliminating the obsolescence of the building while maintaining normal operation of the facility. The construction and transport industries present a large number of multiparameter tasks that require a systematic approach and the definition of a single integrated indicator of operational effectiveness. Such tasks can be solved with a variety of approaches; one of them, the method of integral evaluation based on stellar infographic models, is considered here.

  7. Forecasting the probability of future groundwater levels declining below specified low thresholds in the conterminous U.S.

    USGS Publications Warehouse

    Dudley, Robert W.; Hodgkins, Glenn A.; Dickinson, Jesse

    2017-01-01

    We present a logistic regression approach for forecasting the probability of future groundwater levels declining below, or remaining below, specified groundwater-level thresholds. We tested our approach on 102 groundwater wells in different climatic regions and aquifers of the United States that are part of the U.S. Geological Survey Groundwater Climate Response Network. We evaluated the importance of current groundwater levels, precipitation, streamflow, seasonal variability, Palmer Drought Severity Index, and atmosphere/ocean indices for developing the logistic regression equations. Several diagnostics of model fit were used to evaluate the regression equations, including testing of autocorrelation of residuals, goodness-of-fit metrics, and bootstrap validation testing. The probabilistic predictions were most successful at wells with high persistence (low month-to-month variability) in their groundwater records and at wells where the groundwater level remained below the defined low threshold for sustained periods (generally three months or longer). The model fit was weakest at wells with strong seasonal variability in levels and with shorter duration low-threshold events. We identified challenges in deriving probabilistic-forecasting models and possible approaches for addressing those challenges.
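The forecasting idea reduces to evaluating a fitted logistic equation: predictors describing current conditions are mapped through the logistic function to a probability of a below-threshold level. The sketch below is illustrative only; the coefficients and the two predictors are hypothetical placeholders, not values fitted in the study.

```python
import math

def below_threshold_probability(level_pctl, precip_anom,
                                b0=-1.0, b1=-4.0, b2=-2.0):
    """Probability that the groundwater level will be below the low
    threshold at the forecast horizon, from a hypothetical fitted
    logistic regression.

    level_pctl  -- current level as a percentile of record (0-1, high = wet)
    precip_anom -- standardized precipitation anomaly
    b0..b2      -- illustrative coefficients, not from the study
    """
    z = b0 + b1 * level_pctl + b2 * precip_anom
    return 1.0 / (1.0 + math.exp(-z))

p_dry = below_threshold_probability(0.10, -1.0)  # low level, dry conditions
p_wet = below_threshold_probability(0.90, 1.0)   # high level, wet conditions
```

With negative coefficients on both predictors, drier antecedent conditions push the forecast probability toward 1, matching the qualitative behavior described in the abstract.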

  8. A Bayesian approach for parameter estimation and prediction using a computationally intensive model

    DOE PAGES

    Higdon, Dave; McDonnell, Jordan D.; Schunck, Nicolas; ...

    2015-02-05

    Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model $\eta(\theta)$, where θ denotes the uncertain, best input setting. Hence the statistical model is of the form $y=\eta(\theta)+\epsilon$, where $\epsilon$ accounts for measurement, and possibly other, error sources. When nonlinearity is present in $\eta(\cdot)$, the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model $\eta(\cdot)$. This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. Lastly, we also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory.
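The emulator-plus-MCMC workflow can be sketched on a toy problem. Everything below is illustrative: the stand-in "expensive" model, the polynomial response surface, and the Metropolis settings are placeholders, while the study itself calibrates a density functional theory model with a more elaborate statistical formulation. The key point the sketch preserves is that the MCMC loop evaluates only the cheap emulator, never the expensive model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the "expensive" physics model eta(theta).
def eta(theta):
    return theta ** 2

# 1) Ensemble of model runs at design points (the only eta evaluations used).
design = np.linspace(0.0, 2.0, 9)
runs = eta(design)

# 2) Emulator: a cheap response surface fitted to the ensemble.
emulator = np.poly1d(np.polyfit(design, runs, deg=2))

# 3) A physical measurement y = eta(theta_true) + eps, eps ~ N(0, sigma^2).
theta_true, sigma = 1.5, 0.1
y_obs = eta(theta_true) + 0.05

def log_post(theta):
    if not 0.0 <= theta <= 2.0:          # flat prior on [0, 2]
        return -np.inf
    return -0.5 * ((y_obs - emulator(theta)) / sigma) ** 2

# 4) Metropolis MCMC driven entirely by the emulator.
samples, theta_cur, lp_cur = [], 1.0, log_post(1.0)
for _ in range(6000):
    prop = theta_cur + 0.1 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp_cur:
        theta_cur, lp_cur = prop, lp_prop
    samples.append(theta_cur)
posterior = np.array(samples[1000:])     # discard burn-in
```

In practice the emulator is usually a Gaussian process rather than a polynomial, which additionally supplies emulator uncertainty that can be folded into the posterior.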

  9. Sensory approaches in mental health: A scoping review.

    PubMed

    Scanlan, Justin Newton; Novak, Theresa

    2015-10-01

    Sensory approaches in mental health are designed to assist consumers to regulate physiological and emotional arousal. They have been highlighted as non-invasive, self-directed and empowering interventions that may support recovery-oriented and trauma-informed mental health practice and may assist in efforts to reduce the use of seclusion and restraint. Over recent years, there has been a substantial increase in research in this area. However, there has not yet been any attempt to map and summarise this literature. A five-stage scoping review was conducted. Four databases were searched for literature evaluating sensory interventions implemented in mental health settings. A total of 17 studies were included in the final review. A range of sensory approaches was evaluated and a range of outcomes measured. In general, consumers reported reductions in distress associated with engaging in sensory interventions. Results in terms of reduction of seclusion and restraint were mixed, with some studies reporting a decrease, others reporting no change and one reporting an increase. Methodological limitations in the studies reviewed mean that results should be interpreted with caution. Although there is emerging evidence for the usefulness of sensory approaches in supporting consumers' self-management of distress, there is less evidence for sensory approaches supporting reductions in seclusion and restraint when used in isolation. More research is necessary, but sensory approaches do appear safe and effective. Services wishing to reduce seclusion and restraint should implement sensory approaches in conjunction with other strategies to achieve this important outcome. © 2015 Occupational Therapy Australia.

  10. The Fragility of Individual-Based Explanations of Social Hierarchies: A Test Using Animal Pecking Orders

    PubMed Central

    2016-01-01

    The standard approach in accounting for hierarchical differentiation in biology and the social sciences considers a hierarchy as a static distribution of individuals possessing differing amounts of some valued commodity, assumes that the hierarchy is generated by micro-level processes involving individuals, and attempts to reverse engineer the processes that produced the hierarchy. However, sufficient experimental and analytical results are available to evaluate this standard approach in the case of animal dominance hierarchies (pecking orders). Our evaluation using evidence from hierarchy formation in small groups of both hens and cichlid fish reveals significant deficiencies in the three tenets of the standard approach in accounting for the organization of dominance hierarchies. In consequence, we suggest that a new approach is needed to explain the organization of pecking orders and, very possibly, by implication, for other kinds of social hierarchies. We develop an example of such an approach that considers dominance hierarchies to be dynamic networks, uses dynamic sequences of interaction (dynamic network motifs) to explain the organization of dominance hierarchies, and derives these dynamic sequences directly from observation of hierarchy formation. We test this dynamical explanation using computer simulation and find a good fit with actual dynamics of hierarchy formation in small groups of hens. We hypothesize that the same dynamic sequences are used in small groups of many other animal species forming pecking orders, and we discuss the data required to evaluate our hypothesis. Finally, we briefly consider how our dynamic approach may be generalized to other kinds of social hierarchies using the example of the distribution of empty gastropod (snail) shells occupied in populations of hermit crabs. PMID:27410230

  11. Further evaluation of traditional icing scaling methods

    NASA Technical Reports Server (NTRS)

    Anderson, David N.

    1996-01-01

    This report provides additional evaluations of two methods to scale icing test conditions; it also describes a hybrid technique for use when scaled conditions are outside the operating envelope of the test facility. The first evaluation is of the Olsen method which can be used to scale the liquid-water content in icing tests, and the second is the AEDC (Ruff) method which is used when the test model is less than full size. Equations for both scaling methods are presented in the paper, and the methods were evaluated by performing icing tests in the NASA Lewis Icing Research Tunnel (IRT). The Olsen method was tested using 53 cm chord NACA 0012 airfoils. Tests covered liquid-water contents that varied by as much as a factor of 1.8. The Olsen method was generally effective in giving scale ice shapes which matched the reference shapes for these tests. The AEDC method was tested with NACA 0012 airfoils with chords from 18 cm to 53 cm. The 53 cm chord airfoils were used in reference tests, and 1/2 and 1/3 scale tests were made at conditions determined by applying the AEDC scaling method. The scale and reference airspeeds were matched in these tests. The AEDC method was found to provide fairly effective scaling for 1/2 size tests, but for 1/3 size models, scaling was generally less effective. In addition to these two scaling methods, a hybrid approach was also tested in which the Olsen method was used to adjust the LWC after size was scaled using the constant Weber number method. This approach was found to be an effective way to test when scaled conditions would otherwise be outside the capability of the test facility.

  12. Adiabatic corrections to density functional theory energies and wave functions.

    PubMed

    Mohallem, José R; Coura, Thiago de O; Diniz, Leonardo G; de Castro, Gustavo; Assafrão, Denise; Heine, Thomas

    2008-09-25

    The adiabatic finite-nuclear-mass-correction (FNMC) to the electronic energies and wave functions of atoms and molecules is formulated for density-functional theory and implemented in the deMon code. The approach is tested for a series of local and gradient corrected density functionals, using MP2 results and diagonal-Born-Oppenheimer corrections from the literature for comparison. In the evaluation of absolute energy corrections of nonorganic molecules the LDA PZ81 functional works surprisingly better than the others. For organic molecules the GGA BLYP functional has the best performance. FNMC with GGA functionals, mainly BLYP, shows a good performance in the evaluation of relative corrections, except for nonorganic molecules containing H atoms. The PW86 functional stands out with the best evaluation of the barrier to linearity of H2O and the isotopic dipole moment of HDO. In general, DFT functionals display an accuracy superior to common belief, and because the corrections are based on a change of the electronic kinetic energy they are here ranked in a new, appropriate way. The approach is applied to obtain the adiabatic correction for full atomization of the alkanes C(n)H(2n+2), n = 4-10. The corrections approach the 1 mHartree level, justifying their insertion into DFT.

  13. Knowledge translation on dementia: a cluster randomized trial to compare a blended learning approach with a "classical" advanced training in GP quality circles

    PubMed Central

    Vollmar, Horst C; Butzlaff, Martin E; Lefering, Rolf; Rieger, Monika A

    2007-01-01

    Background Thus far, important findings regarding the dementia syndrome have been implemented into patients' medical care only inadequately. A professional training program accounting for both general practitioners' (GP) needs and learning preferences and care-relevant aspects could be a major step towards improving medical care. In the WIDA study, entitled "Knowledge translation on dementia in general practice", two different training concepts are developed, implemented and evaluated. Both concepts build on an evidence-based, GP-related dementia guideline and communicate the guideline's essential insights. Methods/Design Both development and implementation emphasize a procedure that is well accepted in practice and can thus achieve a high degree of external validity. This is particularly ensured through the preparation of training material and the fact that general practitioners' quality circles (QC) are addressed. The evaluation of the two training concepts is carried out by comparing two groups of GPs to which several quality circles have been randomly assigned. The primary outcome is the GPs' knowledge gain. Secondary outcomes are designed to indicate the training's potential effects on the GPs' practical actions. In the first training concept (study arm A), GPs participate in a structured case discussion prepared for by internet-based learning material ("blended learning" approach). The second training concept (study arm B) relies on frontal medical training in the form of a slide presentation and follow-up discussion ("classical" approach). Discussion This paper presents the outline of a cluster-randomized trial which has been peer reviewed, is supported by a national funding organization, the Federal Ministry of Education and Research (BMBF), and is approved by an ethics commission. The data collection started in August 2006 and the results will be published independently of the study's outcome. 
Trial Registration Current Controlled Trials [ISRCTN36550981] PMID:17587452

  14. Two aspects of black hole entropy in Lanczos-Lovelock models of gravity

    NASA Astrophysics Data System (ADS)

    Kolekar, Sanved; Kothawala, Dawood; Padmanabhan, T.

    2012-03-01

    We consider two specific approaches to evaluate the black hole entropy which are known to produce correct results in the case of Einstein’s theory and generalize them to Lanczos-Lovelock models. In the first approach (which could be called extrinsic), we use a procedure motivated by earlier work by Pretorius, Vollick, and Israel, and by Oppenheim, and evaluate the entropy of a configuration of densely packed gravitating shells on the verge of forming a black hole in Lanczos-Lovelock theories of gravity. We find that this matter entropy is not equal to (it is less than) Wald entropy, except in the case of Einstein theory, where they are equal. The matter entropy is proportional to the Wald entropy if we consider a specific mth-order Lanczos-Lovelock model, with the proportionality constant depending on the spacetime dimensions D and the order m of the Lanczos-Lovelock theory as (D-2m)/(D-2). Since the proportionality constant depends on m, the proportionality between matter entropy and Wald entropy breaks down when we consider a sum of Lanczos-Lovelock actions involving different m. In the second approach (which could be called intrinsic), we generalize a procedure, previously introduced by Padmanabhan in the context of general relativity, to study off-shell entropy of a class of metrics with horizon using a path integral method. We consider the Euclidean action of Lanczos-Lovelock models for a class of metrics off shell and interpret it as a partition function. We show that in the case of spherically symmetric metrics, one can interpret the Euclidean action as the free energy and read off both the entropy and energy of a black hole spacetime. Surprisingly enough, this leads to exactly the Wald entropy and the energy of the spacetime in Lanczos-Lovelock models obtained by other methods. We comment on possible implications of the result.
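Written out, the proportionality stated for a pure mth-order Lanczos-Lovelock model is

```latex
S_{\text{matter}} \;=\; \frac{D-2m}{D-2}\, S_{\text{Wald}} ,
```

so the two entropies coincide only for Einstein gravity (m = 1), and the proportionality is lost when actions with different m are summed, since each term carries a different prefactor.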

  15. The Evaluation of Bivariate Mixed Models in Meta-analyses of Diagnostic Accuracy Studies with SAS, Stata and R.

    PubMed

    Vogelgesang, Felicitas; Schlattmann, Peter; Dewey, Marc

    2018-05-01

    Meta-analyses require a thoroughly planned procedure to obtain unbiased overall estimates. From a statistical point of view not only model selection but also model implementation in the software affects the results. The present simulation study investigates the accuracy of different implementations of general and generalized bivariate mixed models in SAS (using proc mixed, proc glimmix and proc nlmixed), Stata (using gllamm, xtmelogit and midas) and R (using reitsma from package mada and glmer from package lme4). Both models incorporate the relationship between sensitivity and specificity - the two outcomes of interest in meta-analyses of diagnostic accuracy studies - utilizing random effects. Model performance is compared in nine meta-analytic scenarios reflecting the combination of three sizes for meta-analyses (89, 30 and 10 studies) with three pairs of sensitivity/specificity values (97%/87%; 85%/75%; 90%/93%). The evaluation of accuracy in terms of bias, standard error and mean squared error reveals that all implementations of the generalized bivariate model calculate sensitivity and specificity estimates with deviations less than two percentage points. proc mixed, which together with reitsma implements the general bivariate mixed model proposed by Reitsma, showed convergence problems. The random effect parameters are in general underestimated. This study shows that flexibility and simplicity of model specification together with convergence robustness should influence implementation recommendations, as the accuracy in terms of bias was acceptable in all implementations using the generalized approach. Schattauer GmbH.

  16. Exploring improvements in patient logistics in Dutch hospitals with a survey.

    PubMed

    van Lent, Wineke A M; Sanders, E Marloes; van Harten, Wim H

    2012-08-01

    Research showed that promising approaches such as benchmarking, operations research, lean management and six sigma could be adopted to improve patient logistics in healthcare. To our knowledge, little research has been conducted to obtain an overview of the use, combination and effects of approaches to improve patient logistics in hospitals. We therefore examined the approaches and tools used to improve patient logistics in Dutch hospitals, the reported effects of these approaches on performance, the applied support structure and the methods used to evaluate the effects. A survey was conducted among experts on patient logistics in 94 Dutch hospitals, and the survey data were analysed using cross tables. Forty-eight percent of all hospitals participated. Ninety-eight percent reported having used multiple approaches; 39% of them used five or more. Care pathways were the preferred approach, used by 43% of the hospitals, followed by business process re-engineering and lean six sigma (both 13%). Flowcharts were the most commonly used tool; 94% of the hospitals used them on a regular basis. Less than 10% of the hospitals used data envelopment analysis and critical path analysis on a regular basis. Most hospitals (68%) relied on external support for process analyses and education on patient logistics; only 24% had permanent internal training programs on patient logistics. Approximately 50% of the hospitals that evaluated the effects of approaches on efficiency, throughput times and financial results reported that they had accomplished their goals. Goal accomplishment in general hospitals ranged from 63% to 67%, in academic teaching hospitals from 0% to 50%, and in teaching hospitals from 25% to 44%. More than 86% performed an evaluation, and 53% performed a post-intervention measurement. 
Patient logistics appeared to be a rather new subject as most hospitals had not selected a single approach, they relied on external support and they did not have permanent training programs. Hospitals used a combination of approaches and tools, about half of the hospitals reported goal accomplishment and no approach seemed to outperform the others. To make improvement efforts more successful, research should be conducted into the selection and application of approaches, their contingency factors, and goal-setting procedures.

  17. Exploring improvements in patient logistics in Dutch hospitals with a survey

    PubMed Central

    2012-01-01

    Background Research showed that promising approaches such as benchmarking, operations research, lean management and six sigma could be adopted to improve patient logistics in healthcare. To our knowledge, little research has been conducted to obtain an overview of the use, combination and effects of approaches to improve patient logistics in hospitals. We therefore examined the approaches and tools used to improve patient logistics in Dutch hospitals, the reported effects of these approaches on performance, the applied support structure and the methods used to evaluate the effects. Methods A survey was conducted among experts on patient logistics in 94 Dutch hospitals. The survey data were analysed using cross tables. Results Forty-eight percent of all hospitals participated. Ninety-eight percent reported having used multiple approaches; 39% of them used five or more. Care pathways were the preferred approach, used by 43% of the hospitals, followed by business process re-engineering and lean six sigma (both 13%). Flowcharts were the most commonly used tool; 94% of the hospitals used them on a regular basis. Less than 10% of the hospitals used data envelopment analysis and critical path analysis on a regular basis. Most hospitals (68%) relied on external support for process analyses and education on patient logistics; only 24% had permanent internal training programs on patient logistics. Approximately 50% of the hospitals that evaluated the effects of approaches on efficiency, throughput times and financial results reported that they had accomplished their goals. Goal accomplishment in general hospitals ranged from 63% to 67%, in academic teaching hospitals from 0% to 50%, and in teaching hospitals from 25% to 44%. More than 86% performed an evaluation, and 53% performed a post-intervention measurement. 
Conclusions Patient logistics appeared to be a rather new subject as most hospitals had not selected a single approach, they relied on external support and they did not have permanent training programs. Hospitals used a combination of approaches and tools, about half of the hospitals reported goal accomplishment and no approach seemed to outperform the others. To make improvement efforts more successful, research should be conducted into the selection and application of approaches, their contingency factors, and goal-setting procedures. PMID:22852880

  18. Evaluating impacts using a BACI design, ratios, and a Bayesian approach with a focus on restoration.

    PubMed

    Conner, Mary M; Saunders, W Carl; Bouwes, Nicolaas; Jordan, Chris

    2015-10-01

    Before-after-control-impact (BACI) designs are an effective method to evaluate natural and human-induced perturbations on ecological variables when treatment sites cannot be randomly chosen. While effect sizes of interest can be tested with frequentist methods, using Bayesian Markov chain Monte Carlo (MCMC) sampling methods, probabilities of effect sizes, such as a ≥20% increase in density after restoration, can be directly estimated. Although BACI and Bayesian methods are used widely for assessing natural and human-induced impacts for field experiments, the application of hierarchical Bayesian modeling with MCMC sampling to BACI designs is less common. Here, we combine these approaches and extend the typical presentation of results with an easy to interpret ratio, which provides an answer to the main study question: "How much impact did a management action or natural perturbation have?" As an example of this approach, we evaluate the impact of a restoration project, which implemented beaver dam analogs, on survival and density of juvenile steelhead. Results indicated the probabilities of a ≥30% increase were high for survival and density after the dams were installed, 0.88 and 0.99, respectively, while probabilities for a higher increase of ≥50% were variable, 0.17 and 0.82, respectively. This approach demonstrates a useful extension of Bayesian methods that can easily be generalized to other study designs from simple (e.g., single factor ANOVA, paired t test) to more complicated block designs (e.g., crossover, split-plot). This approach is valuable for estimating the probabilities of restoration impacts or other management actions.
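The ratio-probability computation described above reduces to counting posterior draws. The draws below are simulated placeholders standing in for real MCMC output from a fitted BACI model; the lognormal location and scale are illustrative values, not the study's results.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-ins for MCMC draws of the (after / before) density ratio:
# an illustrative lognormal posterior, not the study's actual chains.
n_draws = 10_000
log_ratio = rng.normal(loc=np.log(1.3), scale=0.10, size=n_draws)
ratio = np.exp(log_ratio)

# Probability of at least a 20% (and 50%) increase, read straight
# from the posterior draws: the fraction of draws exceeding the cutoff.
p_20 = np.mean(ratio >= 1.2)
p_50 = np.mean(ratio >= 1.5)
```

This is the appeal of the Bayesian formulation: once the chains exist, any effect-size probability of interest is a one-line summary of the draws rather than a new hypothesis test.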

  19. Ash deposits - Initiating the change from empiricism to generic engineering. Part 1: The generic approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagoner, C.L.; Wessel, R.A.

    1986-01-01

    Empiricism has traditionally been used to relate laboratory and pilot-scale measurements of fuel characteristics with the design, performance, and the slagging and fouling behavior of steam generators. Currently, a new engineering approach is being evaluated. The goal is to develop and use calculations and measurements from several engineering disciplines that exceed the demonstrated limitations of present empirical techniques for predicting slagging/fouling behavior. In Part 1 of this paper, the generic approach to deposits and boiler performance is defined and a matrix of engineering concepts is described. General relationships are presented for assessing the effects of deposits and sootblowing on the real-time performance of heat transfer surfaces in pilot- and commercial-scale steam generators.

  20. Personality and culture: demarcating between the common and the unique.

    PubMed

    Poortinga, Y H; Van Hemert, D A

    2001-12-01

    Four traditions in research on personality and culture are distinguished: (i) the culture-and-personality school and recent relativistic perspectives, (ii) the trait approach, (iii) interactionistic orientations, and (iv) situationist approaches. Next, the first two of these traditions are evaluated to ascertain how much variance is explained by culture. Thereafter, it is argued that the (questionable) focus on explanations with a high level of inclusiveness or generality is a major reason for the near absence of situationist interpretation of cross-cultural differences. Finally, three possible strategies are discussed to bridge the gap between relativism (emphasizing differences) and universalism (assuming basic similarities). A suggestion is made as to how both approaches can be valuable when unexplainable as well as explainable variance in cross-cultural personality research is taken seriously.
