Sample records for case method approach

  1. A Mixed Methods Sampling Methodology for a Multisite Case Study

    ERIC Educational Resources Information Center

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  2. A Comparison of Trajectory Optimization Methods for the Impulsive Minimum Fuel Rendezvous Problem

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Mailhe, Laurie M.; Guzman, Jose J.

    2003-01-01

    In this paper we present a comparison of trajectory optimization approaches for the minimum fuel rendezvous problem. Both indirect and direct methods are compared for a variety of test cases. The indirect approach is based on primer vector theory. The direct approaches are implemented numerically and include Sequential Quadratic Programming (SQP), Quasi-Newton, and Nelder-Mead Simplex. Several cost function parameterizations are considered for the direct approach. We choose one direct approach that appears to be the most flexible. Both the direct and indirect methods are applied to a variety of test cases which are chosen to demonstrate the performance of each method in different flight regimes. The first test case is a simple circular-to-circular coplanar rendezvous. The second test case is an elliptic-to-elliptic line of apsides rotation. The final test case is an orbit phasing maneuver sequence in a highly elliptic orbit. For each test case we present a comparison of the performance of all methods we consider in this paper.
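
    The first test case above has a classical closed-form benchmark: for a circular-to-circular coplanar transfer, the two-impulse Hohmann solution gives the minimum-fuel cost that any direct or indirect optimizer should reproduce. A minimal Python sketch (the radii and gravitational parameter are illustrative values, not taken from the paper):

```python
import math

MU_EARTH = 398600.4418  # km^3/s^2, Earth's gravitational parameter

def hohmann_delta_v(r1_km: float, r2_km: float, mu: float = MU_EARTH) -> float:
    """Total delta-v (km/s) of the two-impulse Hohmann transfer
    between coplanar circular orbits of radii r1 and r2."""
    v1 = math.sqrt(mu / r1_km)                           # circular speed at r1
    v2 = math.sqrt(mu / r2_km)                           # circular speed at r2
    a_t = 0.5 * (r1_km + r2_km)                          # transfer-ellipse semi-major axis
    v_dep = math.sqrt(mu * (2.0 / r1_km - 1.0 / a_t))    # transfer speed at departure
    v_arr = math.sqrt(mu * (2.0 / r2_km - 1.0 / a_t))    # transfer speed at arrival
    return abs(v_dep - v1) + abs(v2 - v_arr)             # sum of the two impulses

# Example: LEO (7000 km) to GEO (42164 km)
print(f"total delta-v: {hohmann_delta_v(7000.0, 42164.0):.4f} km/s")
```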

  3. Methodology or method? A critical review of qualitative case study reports.

    PubMed

    Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia

    2014-01-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.

  4. Multiple imputation of missing data in nested case-control and case-cohort studies.

    PubMed

    Keogh, Ruth H; Seaman, Shaun R; Bartlett, Jonathan W; Wood, Angela M

    2018-06-05

    The nested case-control and case-cohort designs are two main approaches for carrying out a substudy within a prospective cohort. This article adapts multiple imputation (MI) methods for handling missing covariates in full-cohort studies for nested case-control and case-cohort studies. We consider data missing by design and data missing by chance. MI analyses that make use of full-cohort data and MI analyses based on substudy data only are described, alongside an intermediate approach in which the imputation uses full-cohort data but the analysis uses only the substudy. We describe adaptations to two imputation methods: the approximate method (MI-approx) of White and Royston and the "substantive model compatible" (MI-SMC) method of Bartlett et al. We also apply the "MI matched set" approach of Seaman and Keogh to nested case-control studies, which does not require any full-cohort information. The methods are investigated using simulation studies and all perform well when their assumptions hold. Substantial gains in efficiency can be made by imputing data missing by design using the full-cohort approach or by imputing data missing by chance in analyses using the substudy only. The intermediate approach brings greater gains in efficiency relative to the substudy approach and is more robust to imputation model misspecification than the full-cohort approach. The methods are illustrated using the ARIC Study cohort. Supplementary Materials provide R and Stata code. © 2018, The International Biometric Society.
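
    The paper's Supplementary Materials provide the actual R and Stata code; as a generic illustration of the impute-then-pool workflow the abstract describes (not the authors' MI-approx or MI-SMC implementations), the following Python sketch imputes missing covariates m times and pools logistic-regression estimates with Rubin's rules:

```python
import numpy as np
import statsmodels.api as sm
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

def mi_logistic(X, y, m=10, seed=0):
    """Impute missing covariates m times, fit a logistic regression on each
    completed data set, and pool the estimates with Rubin's rules."""
    coefs, variances = [], []
    for i in range(m):
        imp = IterativeImputer(sample_posterior=True, random_state=seed + i)
        X_full = sm.add_constant(imp.fit_transform(X))
        fit = sm.Logit(y, X_full).fit(disp=0)
        coefs.append(fit.params)
        variances.append(np.diag(fit.cov_params()))
    coefs, variances = np.array(coefs), np.array(variances)
    q_bar = coefs.mean(axis=0)            # pooled point estimates
    u_bar = variances.mean(axis=0)        # within-imputation variance
    b = coefs.var(axis=0, ddof=1)         # between-imputation variance
    t = u_bar + (1 + 1 / m) * b           # total variance (Rubin's rules)
    return q_bar, np.sqrt(t)

# Hypothetical demo data: np.nan marks covariates missing by design/chance.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
y = rng.integers(0, 2, 300)
X[rng.random(X.shape) < 0.3] = np.nan
est, se = mi_logistic(X, y)
print(est, se)
```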

  5. Case Method Teaching as Science and Art: A Metaphoric Approach and Curricular Application

    ERIC Educational Resources Information Center

    Greenhalgh, Anne M.

    2007-01-01

    The following article takes a metaphoric approach to case method teaching to shed light on one of our most important practices. The article hinges on the dual comparison of case method as science and as art. The dominant, scientific view of cases is that they are neutral descriptions of real-life business problems, subject to rigorous analysis.…

  6. Methodology or method? A critical review of qualitative case study reports

    PubMed Central

    Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia

    2014-01-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners. PMID:24809980

  7. The "four quadrants" approach to clinical ethics case analysis; an application and review.

    PubMed

    Sokol, D K

    2008-07-01

    In 1982, Jonsen, Siegler and Winslade published Clinical Ethics, in which they described the "four quadrants" approach, a new method of analysing clinical ethics cases. Although the book is now in its 6th edition, a literature search has revealed only one academic paper demonstrating the method at work. This paper is an attempt to start filling this gap. As a way of describing and testing the approach, I apply the four quadrants method to a detailed clinical ethics case. The analysis is interspersed with reflections on the method itself. It is hoped that this experiment will encourage ethicists and clinicians to devote more attention to this neglected approach.

  8. A retrospective likelihood approach for efficient integration of multiple omics factors in case-control association studies.

    PubMed

    Balliu, Brunilda; Tsonaka, Roula; Boehringer, Stefan; Houwing-Duistermaat, Jeanine

    2015-03-01

    Integrative omics, the joint analysis of outcome and multiple types of omics data, such as genomics, epigenomics, and transcriptomics data, constitutes a promising approach for powerful and biologically relevant association studies. These studies often employ a case-control design, and often include nonomics covariates, such as age and gender, that may modify the underlying omics risk factors. An open question is how to best integrate multiple omics and nonomics information to maximize statistical power in case-control studies that ascertain individuals based on the phenotype. Recent work on integrative omics has used prospective approaches, modeling case-control status conditional on omics and nonomics risk factors. Compared to univariate approaches, jointly analyzing multiple risk factors with a prospective approach increases power in nonascertained cohorts. However, these prospective approaches often lose power in case-control studies. In this article, we propose a novel statistical method for integrating multiple omics and nonomics factors in case-control association studies. Our method is based on a retrospective likelihood function that models the joint distribution of omics and nonomics factors conditional on case-control status. The new method provides accurate control of the Type I error rate and has increased efficiency over prospective approaches in both simulated and real data. © 2015 Wiley Periodicals, Inc.

  9. Developing the DESCARTE Model: The Design of Case Study Research in Health Care.

    PubMed

    Carolan, Clare M; Forbat, Liz; Smith, Annetta

    2016-04-01

    Case study is a long-established research tradition which predates the recent surge in mixed-methods research. Although a myriad of nuanced definitions of case study exist, seminal case study authors agree that the use of multiple data sources typifies this research approach. The expansive case study literature demonstrates a lack of clarity and guidance in designing and reporting this approach to research. Informed by two reviews of the current health care literature, we posit that methodological description in case studies principally focuses on description of case study typology, which impedes the construction of methodologically clear and rigorous case studies. We draw from the case study and mixed-methods literature to develop the DESCARTE model as an innovative approach to the design, conduct, and reporting of case studies in health care. We examine how case study fits within the overall enterprise of qualitatively driven mixed-methods research, and the potential strengths of the model are considered. © The Author(s) 2015.

  10. Case Study Research Methodology in Nursing Research.

    PubMed

    Cope, Diane G

    2015-11-01

    Through data collection methods using a holistic approach that focuses on variables in a natural setting, qualitative research methods seek to understand participants' perceptions and interpretations. Common qualitative research methods include ethnography, phenomenology, grounded theory, and historic research. Another type of methodology that has a similar qualitative approach is case study research, which seeks to understand a phenomenon or case from multiple perspectives within a given real-world context.

  11. Using Student-Centered Cases in the Classroom: An Action Inquiry Approach to Leadership Development

    ERIC Educational Resources Information Center

    Foster, Pacey; Carboni, Inga

    2009-01-01

    This article addresses the concern that business schools are not adequately developing the practical leadership skills that are required in the real world of management. The article begins by discussing the limitations of traditional case methods for teaching behavioral skills. This approach is contrasted with an alternative case method drawn from…

  12. Implementation of Performance-Based Acquisition in Non-Western Countries

    DTIC Science & Technology

    2009-03-01

    narratives, phenomenologies, ethnographies, grounded theory studies, or case studies. The researcher collects...are biography, phenomenological study, grounded theory study, ethnography, and case study. The approach used for qualitative data collection method... qualitative methods, such as the grounded theory approach to

  13. Real Options in Defense R and D: A Decision Tree Analysis Approach for Options to Defer, Abandon, and Expand

    DTIC Science & Technology

    2016-12-01

    chosen rather than complex ones, and responds to the criticism of the DTA approach. Chapter IV provides three separate case studies in defense R&D...defense R&D projects. To this end, the first section describes the case study method and the advantages of using simple models over more complex ones...the analysis lacked empirical data and relied on subjective data, the analysis successfully combined the DTA approach with the case study method and

  14. Learning the Lessons of Leadership: Case Method Teaching with Interactive Computer-Based Tools and Film-Based Cases

    DTIC Science & Technology

    2008-03-01

    report describes how the AXL system capitalizes on the best practices of traditional case method instruction and addresses some of the limitations of...system were addressed in the AXL system, producing an innovative technology solution for delivering case method instruction. Several case method best ...approaches for addressing such problems. The report also documents how case method best practices in traditional classroom environments can be translated into

  15. A Homogenization Approach for Design and Simulation of Blast Resistant Composites

    NASA Astrophysics Data System (ADS)

    Sheyka, Michael

    Structural composites have been used in aerospace and structural engineering due to their high strength-to-weight ratio. Composite laminates have been successfully and extensively used in blast mitigation. This dissertation examines the use of the homogenization approach to design and simulate blast resistant composites. Three case studies are performed to examine the usefulness of different methods that may be used in designing and optimizing composite plates for blast resistance. The first case study utilizes a single-degree-of-freedom system to simulate the blast, together with a reliability-based approach; it examines homogeneous plates, and the optimal stacking sequence and plate thicknesses are determined. The second and third case studies use the homogenization method to calculate the properties of a composite unit cell made of two different materials. The methods are integrated with dynamic simulation environments and advanced optimization algorithms. The second case study is 2-D and uses an implicit blast simulation, while the third case study is 3-D and simulates blast using the explicit blast method. Both case studies 2 and 3 rely on multi-objective genetic algorithms for the optimization process, and Pareto optimal solutions are determined in each. Case study 3 is an integrative method for determining optimal stacking sequence, microstructure, and plate thicknesses. The validity of the different methods, such as homogenization, reliability, explicit blast modeling, and multi-objective genetic algorithms, is discussed. Possible extension of the methods to include strain rate effects and parallel computation is also examined.
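
    The second and third case studies report Pareto optimal solutions from multi-objective genetic algorithms. The core of any such method is non-dominated filtering; a small sketch with hypothetical objectives (plate mass and peak deflection, both minimized):

```python
def pareto_front(points):
    """Return the non-dominated subset of `points`, where each point is a
    tuple of objectives to be minimized (e.g., plate mass, peak deflection).
    A point is dropped if another point is at least as good everywhere."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (mass kg, deflection mm) pairs for five candidate designs.
designs = [(2.1, 8.5), (2.4, 7.9), (2.1, 9.0), (3.0, 7.5), (2.8, 8.0)]
print(pareto_front(designs))  # -> [(2.1, 8.5), (2.4, 7.9), (3.0, 7.5)]
```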

  16. Impact of Active Teaching Methods Implemented on Therapeutic Chemistry Module: Performance and Impressions of First-Year Pharmacy Students

    ERIC Educational Resources Information Center

    Derfoufi, Sanae; Benmoussa, Adnane; El Harti, Jaouad; Ramli, Youssef; Taoufik, Jamal; Chaouir, Souad

    2015-01-01

    This study investigates the positive impact of the Case Method implemented during a 4-hour tutorial in the "therapeutic chemistry" module. We view the Case Method as one particular approach within the broader spectrum of problem-based or inquiry-based learning approaches. Sixty students were included in data analysis. A pre-test and…

  17. Case Study Observational Research: A Framework for Conducting Case Study Research Where Observation Data Are the Focus.

    PubMed

    Morgan, Sonya J; Pullon, Susan R H; Macdonald, Lindsay M; McKinlay, Eileen M; Gray, Ben V

    2017-06-01

    Case study research is a comprehensive method that incorporates multiple sources of data to provide detailed accounts of complex research phenomena in real-life contexts. However, current models of case study research do not particularly distinguish the unique contribution observation data can make. Observation methods have the potential to reach beyond other methods that rely largely or solely on self-report. This article describes the distinctive characteristics of case study observational research, a modified form of Yin's 2014 model of case study research, which the authors used in a study exploring interprofessional collaboration in primary care. In this approach, observation data are positioned as the central component of the research design. Case study observational research offers a promising approach for researchers in a wide range of health care settings seeking more complete understandings of complex topics, where contextual influences are of primary concern. Future research is needed to refine and evaluate the approach.

  18. Estimating time-varying exposure-outcome associations using case-control data: logistic and case-cohort analyses.

    PubMed

    Keogh, Ruth H; Mangtani, Punam; Rodrigues, Laura; Nguipdop Djomo, Patrick

    2016-01-05

    Traditional analyses of standard case-control studies using logistic regression do not allow estimation of time-varying associations between exposures and the outcome. We present two approaches which allow this. The motivation is a study of vaccine efficacy as a function of time since vaccination. Our first approach is to estimate time-varying exposure-outcome associations by fitting a series of logistic regressions within successive time periods, reusing controls across periods. Our second approach treats the case-control sample as a case-cohort study, with the controls forming the subcohort. In the case-cohort analysis, controls contribute information at all times they are at risk. Extensions allow left truncation, frequency matching and, using the case-cohort analysis, time-varying exposures. Simulations are used to investigate the methods. The simulation results show that both methods give correct estimates of time-varying effects of exposures using standard case-control data. Using the logistic approach there are efficiency gains by reusing controls over time and care should be taken over the definition of controls within time periods. However, using the case-cohort analysis there is no ambiguity over the definition of controls. The performance of the two analyses is very similar when controls are used most efficiently under the logistic approach. Using our methods, case-control studies can be used to estimate time-varying exposure-outcome associations where they may not previously have been considered. The case-cohort analysis has several advantages, including that it allows estimation of time-varying associations as a continuous function of time, while the logistic regression approach is restricted to assuming a step function form for the time-varying association.
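
    A minimal sketch of the first (logistic) approach described above: fit a separate logistic regression within successive time periods, assigning cases to the period of their event time while reusing all controls in every period. The data and column names below are simulated stand-ins, not the vaccine study's data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
# One row per subject: case status, binary exposure (e.g., vaccinated),
# and event/sampling time in years since exposure.
df = pd.DataFrame({
    "case":    rng.binomial(1, 0.3, n),
    "exposed": rng.binomial(1, 0.5, n),
    "time":    rng.uniform(0, 15, n),
})

periods = [(0, 5), (5, 10), (10, 15)]
for lo, hi in periods:
    # Cases belong to the period containing their event time; all controls
    # are reused in every period, as in the first approach above.
    sub = df[(df["case"] == 0) | ((df["time"] >= lo) & (df["time"] < hi))]
    fit = sm.Logit(sub["case"], sm.add_constant(sub["exposed"])).fit(disp=0)
    print(f"period [{lo},{hi}): log-OR = {fit.params['exposed']:.3f}")
```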

  19. Teaching and the Case Method. Text, Cases, and Readings. Third Edition.

    ERIC Educational Resources Information Center

    Barnes, Louis B.; And Others

    This volume includes text, cases, and readings for a college faculty seminar to develop the knowledge, skills, and attitudes necessary for utilization of the case method approach to instruction. It builds on a long-term clinical research effort on the dynamics of the case method of teaching and application at Harvard Business School. In addition…

  20. Teaching with the Case Method Online: Pure versus Hybrid Approaches

    ERIC Educational Resources Information Center

    Webb, Harold W.; Gill, Grandon; Poe, Gary

    2005-01-01

    The impact of hybrid classroom/distance education approaches is examined in the context of the case method. Four distinct semester-long treatments, which varied mixes of classroom and online discussion, were used to teach a graduate MIS survey course. Specific findings suggest that by using Web technology, college instructors may offer students…

  21. Teaching ethics in the clinic. The theory and practice of moral case deliberation.

    PubMed

    Molewijk, A C; Abma, T; Stolper, M; Widdershoven, G

    2008-02-01

    A traditional approach to teaching medical ethics aims to provide knowledge about ethics. This is in line with an epistemological view on ethics in which moral expertise is assumed to be located in theoretical knowledge and not in the moral experience of healthcare professionals. The aim of this paper is to present an alternative, contextual approach to teaching ethics, which is grounded in a pragmatic-hermeneutical and dialogical ethics. This approach is called moral case deliberation. Within moral case deliberation, healthcare professionals bring in their actual moral questions during a structured dialogue. The ethicist facilitates the learning process by using various conversation methods in order to find answers to the case and to develop moral competencies. The case deliberations are not unique events, but are a structural part of the professional training on the work floor within healthcare institutions. This article presents the underlying theory on (teaching) ethics and illustrates this approach with an example of a moral case deliberation project in a Dutch psychiatric hospital. The project was evaluated using the method of responsive evaluation. This method provided us with rich information about the implementation process and its effects; the research process itself also lent support to the process of implementation.

  22. Using Case Studies in the Introductory Public Relations Course.

    ERIC Educational Resources Information Center

    Adams, William C.

    The case study method has received increased attention at both the graduate and undergraduate levels in a number of public relations programs. Unlike the Harvard managerial-oriented case studies, the approach useful in large, introductory public relations courses stems from a simplified team approach to classroom projects, case studies in the…

  23. Analysis of case-only studies accounting for genotyping error.

    PubMed

    Cheng, K F

    2007-03-01

    The case-only design provides one approach to assess possible interactions between genetic and environmental factors. It has been shown that if these factors are conditionally independent, then a case-only analysis is not only valid but also very efficient. However, a drawback of the case-only approach is that its conclusions may be biased by genotyping errors. In this paper, our main aim is to propose a method for analysis of case-only studies when these errors occur. We show that the bias can be adjusted through the use of internal validation data, which are obtained by genotyping some sampled individuals twice. Our analysis is based on a simple and yet highly efficient conditional likelihood approach. Simulation studies considered in this paper confirm that the new method has acceptable performance under genotyping errors.
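
    For context, the uncorrected case-only estimator is simple: under gene-environment independence in the source population, regressing genotype on exposure among cases alone estimates the multiplicative interaction. The sketch below shows only that baseline analysis; the paper's genotyping-error adjustment additionally requires the internal validation data and its conditional likelihood:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 800                                 # cases only
exposure = rng.binomial(1, 0.4, n)      # environmental factor E
genotype = rng.binomial(1, 0.3, n)      # susceptibility genotype G (stand-in)

# Among cases, the G-E association estimates the multiplicative G-by-E
# interaction, given G-E independence in the source population;
# exp(coefficient) is the case-only interaction odds ratio.
fit = sm.Logit(genotype, sm.add_constant(exposure)).fit(disp=0)
print("case-only interaction OR:", np.exp(fit.params[1]))
```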

  24. Case studies approach for an undergraduate astrobiology course

    NASA Astrophysics Data System (ADS)

    Burko, Lior M.; Enger, Sandra

    2013-04-01

    The case study method is well known and widely used in law schools, medical schools, and business schools, but relatively little used in physics or astronomy courses. We developed an astrobiology course based strongly on the case studies approach, and after teaching it first at the University of Alabama in Huntsville, we have adapted it and are now teaching it at Alabama A&M University, an HBCU. The case studies approach uses several well tested and successful teaching methods - including group work, peer instruction, current interest topics, just-in-time teaching, &c. We have found that certain styles of cases are more popular among students than other styles, and will revise our cases to reflect such student preferences. We chose astrobiology -- an inherently multidisciplinary field -- because of the popularity of the subject matter, its frequent appearance in the popular media (news stories about searches for life in the universe, the discovery of Earth-like exoplanets, etc., in addition to SciFi movies and novels), and the rapid current progress in the field. In this talk we review briefly the case studies method, the styles of cases used in our astrobiology course, and student response to the course as found in our assessment analysis.

  25. [Bath Plug Closure Method for Cerebrospinal Fluid Leakage by Endoscopic Endonasal Approach: Cooperative Treatment by Neurosurgeons and Otolaryngologists].

    PubMed

    Kawaguchi, Tomohiro; Arakawa, Kazuya; Nomura, Kazuhiro; Ogawa, Yoshikazu; Katori, Yukio; Tominaga, Teiji

    2017-12-01

    Endoscopic endonasal surgery, an innovative surgical technique, is used to approach sinus lesions, lesions of the skull base, and intradural tumors. The cooperation of experienced otolaryngologists and neurosurgeons is important to achieve safe and reliable surgical results. The bath plug closure method is a treatment option for patients with cerebrospinal fluid (CSF) leakage. Although it includes dural and/or intradural procedures, surgery tends to be performed by otolaryngologists because its indications, detailed maneuvers, and pitfalls are not well recognized by neurosurgeons. We reviewed the cases of patients with CSF leakage treated by using the bath plug closure method with an endoscopic endonasal approach at our institution. Three patients were treated using the bath plug closure method. CSF leakage was caused by a meningocele in two cases and trauma in one case. No postoperative intracranial complications or recurrence of CSF leakage were observed. The bath plug closure method is an effective treatment strategy and allows neurosurgeons to gain in-depth knowledge of the treatment options for CSF leakage by using an endoscopic endonasal approach.

  26. Model-free methods to study membrane environmental probes: a comparison of the spectral phasor and generalized polarization approaches

    PubMed Central

    Malacrida, Leonel; Gratton, Enrico; Jameson, David M

    2016-01-01

    In this note, we present a discussion of the advantages and scope of model-free analysis methods applied to the popular solvatochromic probe LAURDAN, which is widely used as an environmental probe to study dynamics and structure in membranes. In particular, we compare and contrast the generalized polarization approach with the spectral phasor approach. To illustrate our points we utilize several model membrane systems containing pure lipid phases and, in some cases, cholesterol or surfactants. We demonstrate that the spectral phasor method offers definitive advantages in the case of complex systems. PMID:27182438
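
    Both quantities compared in the note have compact forms. For LAURDAN, generalized polarization is conventionally GP = (I440 - I490)/(I440 + I490), while the spectral phasor maps the whole emission spectrum to its first Fourier harmonic coordinates (g, s). A sketch under those conventional definitions; the wavelength grid and spectrum are toy values, not the paper's data:

```python
import numpy as np

def generalized_polarization(i_blue: float, i_red: float) -> float:
    """LAURDAN GP from intensities at ~440 nm (gel-like) and ~490 nm
    (liquid-crystalline) emission bands."""
    return (i_blue - i_red) / (i_blue + i_red)

def spectral_phasor(wavelengths: np.ndarray, intensities: np.ndarray):
    """Map a full emission spectrum to first-harmonic phasor coordinates
    (g, s); model-free like GP, but using every spectral channel."""
    phase = 2 * np.pi * (wavelengths - wavelengths[0]) / (
        wavelengths[-1] - wavelengths[0])
    total = intensities.sum()
    g = (intensities * np.cos(phase)).sum() / total
    s = (intensities * np.sin(phase)).sum() / total
    return g, s

wl = np.linspace(400, 550, 151)
spectrum = np.exp(-0.5 * ((wl - 440) / 20) ** 2)  # toy gel-phase-like band
print(generalized_polarization(1.0, 0.4))
print(spectral_phasor(wl, spectrum))
```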

  27. Developing comparative criminology and the case of China: an introduction.

    PubMed

    Liu, Jianhong

    2007-02-01

    Although comparative criminology has developed significantly during the past decade or so, systematic empirical research has developed along only a few topics. Comparative criminology has never occupied a central position in criminology. This article analyzes the major theoretical and methodological impediments to the development of comparative criminology. It stresses a need to shift methodology from a conventional primary approach that uses the nation as the unit of analysis to an in-depth case study method as a primary methodological approach. The article maintains that the case study method can overcome the limitation of its descriptive tradition and become a promising methodological approach for comparative criminology.

  28. Evidence Arguments for Using Formal Methods in Software Certification

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Pai, Ganesh

    2013-01-01

    We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example, an unmanned aircraft system, where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.

  29. Comparison of linear and nonlinear programming approaches for "worst case dose" and "minmax" robust optimization of intensity-modulated proton therapy dose distributions.

    PubMed

    Zaghian, Maryam; Cao, Wenhua; Liu, Wei; Kardar, Laleh; Randeniya, Sharmalee; Mohan, Radhe; Lim, Gino

    2017-03-01

    Robust optimization of intensity-modulated proton therapy (IMPT) takes uncertainties into account during spot weight optimization and leads to dose distributions that are resilient to uncertainties. Previous studies demonstrated benefits of linear programming (LP) for IMPT in terms of delivery efficiency by considerably reducing the number of spots required for the same quality of plans. However, a reduction in the number of spots may lead to loss of robustness. The purpose of this study was to evaluate and compare the performance in terms of plan quality and robustness of two robust optimization approaches using LP and nonlinear programming (NLP) models. The so-called "worst case dose" and "minmax" robust optimization approaches and the conventional planning target volume (PTV)-based optimization approach were applied to designing IMPT plans for five patients: two with prostate cancer, one with skull-based cancer, and two with head and neck cancer. For each approach, both LP and NLP models were used. Thus, for each case, six sets of IMPT plans were generated and assessed: LP-PTV-based, NLP-PTV-based, LP-worst case dose, NLP-worst case dose, LP-minmax, and NLP-minmax. The four robust optimization methods behaved differently from patient to patient, and no method emerged as superior to the others in terms of nominal plan quality and robustness against uncertainties. The plans generated using LP-based robust optimization were more robust regarding patient setup and range uncertainties than were those generated using NLP-based robust optimization for the prostate cancer patients. However, the robustness of plans generated using NLP-based methods was superior for the skull-based and head and neck cancer patients. Overall, LP-based methods were suitable for the less challenging cancer cases in which all uncertainty scenarios were able to satisfy tight dose constraints, while NLP performed better in more difficult cases in which tight dose limits were hard to meet under most uncertainty scenarios. For robust optimization, the worst case dose approach was less sensitive to uncertainties than was the minmax approach for the prostate and skull-based cancer patients, whereas the minmax approach was superior for the head and neck cancer patients. The robustness of the IMPT plans was remarkably better after robust optimization than after PTV-based optimization, and the NLP-PTV-based optimization outperformed the LP-PTV-based optimization regarding robustness of clinical target volume coverage. In addition, plans generated using LP-based methods had notably fewer scanning spots than did those generated using NLP-based methods. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
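
    As a toy illustration of the "minmax" formulation in LP form (not the authors' clinical implementation): choose spot weights minimizing the worst-case deviation of voxel doses from prescription over all uncertainty scenarios. The dose-influence matrices below are random stand-ins:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_voxels, n_spots, n_scen = 40, 12, 3
D = rng.uniform(0.0, 1.0, (n_scen, n_voxels, n_spots))  # stand-in dose-influence matrices
d = np.full(n_voxels, 60.0)                             # prescribed dose per voxel (Gy)

# Variables: spot weights x (>= 0) and worst-case deviation t.
# Minimize t  subject to  |D_s x - d| <= t  for every scenario s and voxel.
c = np.zeros(n_spots + 1)
c[-1] = 1.0
rows, rhs = [], []
for s in range(n_scen):
    rows.append(np.hstack([D[s], -np.ones((n_voxels, 1))]))   #  D_s x - t <= d
    rhs.append(d)
    rows.append(np.hstack([-D[s], -np.ones((n_voxels, 1))]))  # -D_s x - t <= -d
    rhs.append(-d)
res = linprog(c, A_ub=np.vstack(rows), b_ub=np.hstack(rhs),
              bounds=[(0, None)] * (n_spots + 1))
print("worst-case dose deviation:", res.x[-1])
```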

  30. A Comparison of Trajectory Optimization Methods for the Impulsive Minimum Fuel Rendezvous Problem

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Mailhe, Laurie M.; Guzman, Jose J.

    2002-01-01

    In this paper we present a comparison of optimization approaches to the minimum fuel rendezvous problem. Both indirect and direct methods are compared for a variety of test cases. The indirect approach is based on primer vector theory. The direct approaches are implemented numerically and include Sequential Quadratic Programming (SQP), Quasi-Newton, Simplex, Genetic Algorithms, and Simulated Annealing. Each method is applied to a variety of test cases including circular-to-circular coplanar orbits, LEO to GEO, and orbit phasing in highly elliptic orbits. We also compare different constrained optimization routines on complex orbit rendezvous problems with complicated, highly nonlinear constraints.

  31. Determining health-care facility catchment areas in Uganda using data on malaria-related visits

    PubMed Central

    Charland, Katia; Kigozi, Ruth; Dorsey, Grant; Kamya, Moses R; Buckeridge, David L

    2014-01-01

    Objective: To illustrate the use of a new method for defining the catchment areas of health-care facilities based on their utilization. Methods: The catchment areas of six health-care facilities in Uganda were determined using the cumulative case ratio: the ratio of the observed to expected utilization of a facility for a particular condition by patients from small administrative areas. The cumulative case ratio for malaria-related visits to these facilities was determined using data from the Uganda Malaria Surveillance Project. Catchment areas were also derived using various straight-line and road-network distances from the facility. Subsequently, the 1-year cumulative malaria case rate was calculated for each catchment area, as determined using the three methods. Findings: The 1-year cumulative malaria case rate varied considerably with the method used to define the catchment areas. With the cumulative case ratio approach, the catchment area could include noncontiguous areas. With the distance approaches, the denominator increased substantially with distance, whereas the numerator increased only slightly. The largest cumulative case rate per 1000 population was for the Kamwezi facility: 234.9 (95% confidence interval, CI: 226.2–243.8) for a straight-line distance of 5 km, 193.1 (95% CI: 186.8–199.6) for the cumulative case ratio approach and 156.1 (95% CI: 150.9–161.4) for a road-network distance of 5 km. Conclusion: Use of the cumulative case ratio for malaria-related visits to determine health-care facility catchment areas was feasible. Moreover, this approach took into account patients’ actual addresses, whereas using distance from the facility did not. PMID:24700977
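
    The cumulative case ratio itself is a simple observed-to-expected calculation per small administrative area. A sketch with made-up numbers, using population share as the expected-utilization model (the study's actual expectation model may differ):

```python
import pandas as pd

areas = pd.DataFrame({
    "area":       ["A", "B", "C", "D"],
    "population": [12000, 8000, 20000, 5000],
    "visits":     [300,   40,   260,  150],   # malaria-related visits to one facility
})

# Expected visits if utilization were proportional to population alone.
total = areas["visits"].sum()
areas["expected"] = total * areas["population"] / areas["population"].sum()
areas["ccr"] = areas["visits"] / areas["expected"]   # cumulative case ratio

# One simple rule: areas with CCR above a threshold form the catchment,
# which need not be contiguous (as the abstract notes).
print(areas[areas["ccr"] > 1.0])
```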

  32. Treating alcoholism through a narrative approach. Case study and rationale.

    PubMed Central

    Kaminsky, D.; Rabinowitz, S.; Kasan, R.

    1996-01-01

    A case study illustrates the narrative or story-telling approach to treating alcoholism. We discuss the rationale for this method and describe how it could be useful in family practice for treating people with alcohol problems. PMID:8653035

  33. A path-level exact parallelization strategy for sequential simulation

    NASA Astrophysics Data System (ADS)

    Peredo, Oscar F.; Baeza, Daniel; Ortiz, Julián M.; Herrero, José R.

    2018-01-01

    Sequential Simulation is a well-known method in geostatistical modelling. Following the Bayesian approach for simulation of conditionally dependent random events, the Sequential Indicator Simulation (SIS) method draws simulated values for K categories (categorical case) or classes defined by K different thresholds (continuous case). Similarly, the Sequential Gaussian Simulation (SGS) method draws simulated values from a multivariate Gaussian field. In this work, a path-level approach to parallelize the SIS and SGS methods is presented. A first stage of re-arrangement of the simulation path is performed, followed by a second stage of parallel simulation for non-conflicting nodes. A key advantage of the proposed parallelization method is that it generates realizations identical to those of the original non-parallelized methods. Case studies are presented using two sequential simulation codes from GSLIB: SISIM and SGSIM. Execution time and speedup results are shown for large-scale domains, with many categories and maximum kriging neighbours in each case, achieving high speedup results in the best scenarios using 16 threads of execution on a single machine.
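
    The central constraint in parallelizing a sequential simulation path is that concurrently simulated nodes must not lie inside one another's kriging neighbourhoods. The sketch below illustrates only that conflict test, via greedy distance-based batching; unlike the paper's path-level strategy, it does not preserve the ordering needed to reproduce the exact sequential realization:

```python
import numpy as np

def conflict_free_batches(path_coords: np.ndarray, radius: float):
    """Greedily split an ordered simulation path into batches whose nodes
    are pairwise farther apart than `radius`, so each batch could be
    simulated in parallel without touching another node's neighbourhood."""
    batches = []
    for p in path_coords:
        placed = False
        for batch in batches:
            if all(np.linalg.norm(p - q) > radius for q in batch):
                batch.append(p)
                placed = True
                break
        if not placed:
            batches.append([p])
    return batches

rng = np.random.default_rng(0)
path = rng.uniform(0, 100, size=(200, 2))   # a random 2-D simulation path
batches = conflict_free_batches(path, radius=10.0)
print(len(batches), "batches; largest has", max(len(b) for b in batches), "nodes")
```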

  34. Community-Involved Learning to Expand Possibilities for Vulnerable Children: A Critical Communicative, Sen's Capability, and Action Research Approach

    ERIC Educational Resources Information Center

    Kim, Kyung Hi

    2014-01-01

    This research, based on a case study of vulnerable children in Korea, used a mixed methods transformative approach to explore strategies to support and help disadvantaged children. The methodological approach includes three phases: a mixed methods contextual analysis, a qualitative dominant analysis based on Sen's capability approach and critical…

  35. Green University Initiatives in China: A Case of Tsinghua University

    ERIC Educational Resources Information Center

    Zhao, Wanxia; Zou, Yonghua

    2015-01-01

    Purpose: The purpose of this paper is to examine green university initiatives in the context of China, using Tsinghua University, which is China's green university pioneer, as a case study. Design/methodology/approach: The research method used for this paper is a case study based on participant observation and document analysis. The approach to…

  36. The Case Study Method: Guidelines, Practices, and Applications for Vocational Education. Research and Development Series No. 189.

    ERIC Educational Resources Information Center

    Spirer, Janet E.

    In comparison with traditional experimental design, which is concerned with what happened, a case study approach is more appropriate for answering the question of why or how something happened. As an alternative, complementary approach to vocational education evaluation, the case study attempts to describe and analyze some program in comprehensive…

  37. SU-F-E-15: Initial Experience Implementing a Case Method Teaching Approach to Radiation Oncology Physics Residents, Graduate Students and Doctorate of Medical Physics Students

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, A

    Purpose: The Case Method Teaching approach is a teaching tool commonly used in business schools to challenge students with real-world situations, i.e., cases. The students are placed in the role of the decision maker and have to provide a solution based on the multitude of information provided. Specifically, students must develop an ability to quickly make sense of a complex problem, provide a solution incorporating all of the objectives (at times conflicting) and constraints, and communicate that solution in a succinct, professional and effective manner. The validity of the solution is highly dependent on the auxiliary information provided in the case and the basic didactic knowledge of the student. A Case Method Teaching approach was developed and implemented into an on-going course focused on AAPM Task Group reports at UTHSCSA. Methods: A current course at UTHSCSA reviews and discusses 15 AAPM Task Group reports per semester. The course is structured into three topic modules: Imaging QA, Stereotactic Radiotherapy, and Special Patient Measurements, i.e., pacemakers and fetal dose. After a topic module is complete, the students are divided into groups (2–3 people) and are asked to review a case study related to the module topic. Students then provide a solution presented in an executive summary and class presentation. Results: Case studies were created to address each module topic. Through team work and whole-class discussion, a collaborative learning environment was established. Students additionally learned concepts such as vendor relations, financial negotiations, capital project management, and competitive strategy. Conclusion: The Case Method Teaching approach is an effective teaching tool to further enhance the learning experience of radiation oncology physics students by presenting them with thought-provoking dilemmas that require students to distinguish pertinent from peripheral information, formulate strategies and recommendations for action, and confront obstacles to implementation.

  38. Short-term solar flare prediction using image-case-based reasoning

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Fu; Li, Fei; Zhang, Huai-Peng; Yu, Da-Ren

    2017-10-01

    Solar flares strongly influence space weather and human activities, and their prediction is highly complex. Existing solutions such as data-based approaches and model-based approaches have a common shortcoming: the lack of human engagement in the forecasting process. An image-case-based reasoning method is introduced to address this. The image case library is composed of SOHO/MDI longitudinal magnetograms, from which the maximum horizontal gradient, the length of the neutral line, and the number of singular points are extracted for retrieving similar image cases. Genetic optimization algorithms are employed to optimize the weight assignment for image features and the number of similar image cases retrieved. Similar image cases, and prediction results derived by majority voting over these similar image cases, are output and shown to the forecaster in order to integrate his/her experience into the final prediction results. Experimental results demonstrate that the case-based reasoning approach has slightly better performance than other methods, and is more efficient, with forecasts improved by humans.
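
    The retrieval step can be pictured as weighted nearest-neighbour search over the three magnetogram features named above, followed by majority voting. In this sketch the features, labels, and weights are random or fixed stand-ins; in the paper the weights and the number of retrieved cases are tuned by genetic algorithms:

```python
import numpy as np

def retrieve_and_vote(query, library, labels, weights, k=5):
    """Weighted nearest-neighbour retrieval over magnetogram features
    (max horizontal gradient, neutral-line length, singular-point count),
    then majority voting over the retrieved cases' flare labels."""
    diffs = library - query
    dist = np.sqrt((weights * diffs**2).sum(axis=1))    # weighted Euclidean
    nearest = np.argsort(dist)[:k]
    votes = labels[nearest]
    return int(votes.sum() * 2 > k), nearest            # 1 = flare predicted

rng = np.random.default_rng(2)
library = rng.normal(size=(300, 3))        # stand-in, normalized feature vectors
labels = rng.binomial(1, 0.25, 300)        # 1 = flare followed within the window
weights = np.array([0.5, 0.3, 0.2])        # illustrative, not GA-optimized
pred, cases = retrieve_and_vote(rng.normal(size=3), library, labels, weights)
print("prediction:", pred, "based on library cases", cases)
```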

  39. Developing a new case based computer-aided detection scheme and an adaptive cueing method to improve performance in detecting mammographic lesions

    PubMed Central

    Tan, Maxine; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin

    2017-01-01

    The purpose of this study is to evaluate a new method to improve performance of computer-aided detection (CAD) schemes of screening mammograms with two approaches. In the first approach, we developed a new case based CAD scheme using a set of optimally selected global mammographic density, texture, spiculation, and structural similarity features computed from all four full-field digital mammography (FFDM) images of the craniocaudal (CC) and mediolateral oblique (MLO) views by using a modified fast and accurate sequential floating forward selection feature selection algorithm. Selected features were then applied to a “scoring fusion” artificial neural network (ANN) classification scheme to produce a final case based risk score. In the second approach, we combined the case based risk score with the conventional lesion based scores of a conventional lesion based CAD scheme using a new adaptive cueing method that is integrated with the case based risk scores. We evaluated our methods using a ten-fold cross-validation scheme on 924 cases (476 cancer and 448 recalled or negative), whereby each case had all four images from the CC and MLO views. The area under the receiver operating characteristic curve was AUC = 0.793±0.015 and the odds ratio monotonically increased from 1 to 37.21 as CAD-generated case based detection scores increased. Using the new adaptive cueing method, the region based and case based sensitivities of the conventional CAD scheme at a false positive rate of 0.71 per image increased by 2.4% and 0.8%, respectively. The study demonstrated that supplementary information can be derived by computing global mammographic density image features to improve CAD-cueing performance on the suspicious mammographic lesions. PMID:27997380
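
    A rough sketch of the first approach's pipeline shape, feature selection feeding a neural-network case scorer, using sklearn's plain sequential forward selector (sklearn has no floating variant, so this is not the authors' modified SFFS) with random stand-in features:

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 30))   # stand-in global image features, one row per case
y = rng.integers(0, 2, 400)      # 1 = cancer case, 0 = recalled/negative

# Forward feature selection (wrapped around a fast linear scorer) followed
# by a small ANN that emits the case-based risk score.
selector = SequentialFeatureSelector(
    LogisticRegression(max_iter=1000), n_features_to_select=8)
ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model = make_pipeline(StandardScaler(), selector, ann)
model.fit(X, y)
case_risk = model.predict_proba(X)[:, 1]   # case-based risk score per case
print(case_risk[:5])
```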

  40. The Perioperative Experience of the Ambulatory Surgery Patient

    DTIC Science & Technology

    2000-10-01

    qualitative research approaches used by nurses and human science researchers are phenomenology, grounded theory, ethnography, history, case studies, and... Qualitative Method; Rationale for Phenomenological Approach; Philosophical Underpinnings of Qualitative Research Methods; Method of Inquiry...integrated theory of the phenomenon under investigation. Theories in

  41. A Mixed-Methods Exploration of an Environment for Learning Computer Programming

    ERIC Educational Resources Information Center

    Mather, Richard

    2015-01-01

    A mixed-methods approach is evaluated for exploring collaborative behaviour, acceptance and progress surrounding an interactive technology for learning computer programming. A review of literature reveals a compelling case for using mixed-methods approaches when evaluating technology-enhanced-learning environments. Here, ethnographic approaches…

  42. Cultural Identity and Regional Security in the Western Balkans

    DTIC Science & Technology

    2013-06-13

    possible. Case Study as Qualitative Approach: Creswell and other experts of social research methodology suggest at least five forms of...descriptive research approach, and the main method is a case study of the Western Balkans. This thesis utilizes the analytical frameworks of securitization

  43. The intervals method: a new approach to analyse finite element outputs using multivariate statistics

    PubMed Central

    De Esteban-Trivigno, Soledad; Püschel, Thomas A.; Fortuny, Josep

    2017-01-01

    Background: In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods: The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results: Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion: We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches. PMID:29043107
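
    The derived variables are straightforward to compute: bin each element's stress value, accumulate element areas per bin, normalize to percentages, and hand the rows to any multivariate method. A sketch with simulated per-element stresses and areas (the gamma-distributed values are stand-ins, not armadillo data):

```python
import numpy as np
from sklearn.decomposition import PCA

def interval_variables(stress, area, n_intervals=10, s_max=None):
    """Express a finite element stress field as the percentage of total
    model area falling in each stress interval (the intervals' method's
    derived variables). Values above s_max go into the top interval."""
    s_max = stress.max() if s_max is None else s_max
    edges = np.linspace(0.0, s_max, n_intervals + 1)
    idx = np.clip(np.digitize(stress, edges) - 1, 0, n_intervals - 1)
    pct = np.array([area[idx == k].sum() for k in range(n_intervals)])
    return 100.0 * pct / area.sum()

rng = np.random.default_rng(4)
# Stand-in: 6 models, each with per-element von Mises stress and area.
rows = []
for _ in range(6):
    stress = rng.gamma(2.0, 2.0, size=5000)   # hypothetical stresses (MPa)
    area = rng.uniform(0.5, 1.5, size=5000)   # element areas (mm^2)
    rows.append(interval_variables(stress, area, s_max=20.0))
scores = PCA(n_components=2).fit_transform(np.array(rows))  # multivariate step
print(scores)
```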

  44. Case-based explanation of non-case-based learning methods.

    PubMed Central

    Caruana, R.; Kangarloo, H.; Dionisio, J. D.; Sinha, U.; Johnson, D.

    1999-01-01

    We show how to generate case-based explanations for non-case-based learning methods such as artificial neural nets or decision trees. The method uses the trained model (e.g., the neural net or the decision tree) as a distance metric to determine which cases in the training set are most similar to the case that needs to be explained. This approach is well suited to medical domains, where it is important to understand predictions made by complex machine learning models, and where training and clinical practice makes users adept at case interpretation. PMID:10566351
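
    With a decision tree, one concrete way to use the trained model itself as a distance metric is to count the internal nodes two cases traverse in common. A sketch on a public dataset (this is one plausible reading of the approach, not necessarily the authors' exact metric):

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X, y)

# The trained model as a distance metric: two cases are similar if they
# traverse the same internal nodes of the fitted tree.
paths = tree.decision_path(X)           # sparse node-indicator matrix
query = tree.decision_path(X[:1])       # the case to be explained
shared = (paths @ query.T).toarray().ravel()   # shared node count per case

order = np.argsort(-shared)
nearest = [i for i in order if i != 0][:3]     # top matches, excluding the query
print("explain prediction", tree.predict(X[:1]), "via training cases", nearest)
```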

  45. Integrating qualitative research into occupational health: a case study among hospital workers.

    PubMed

    Gordon, Deborah R; Ames, Genevieve M; Yen, Irene H; Gillen, Marion; Aust, Birgit; Rugulies, Reiner; Frank, John W; Blanc, Paul D

    2005-04-01

    We sought to better use qualitative approaches in occupational health research and integrate them with quantitative methods. We systematically reviewed, selected, and adapted qualitative research methods as part of a multisite study of the predictors and outcomes of work-related musculoskeletal disorders among hospital workers in two large urban tertiary hospitals. The methods selected included participant observation; informal, open-ended, and semistructured interviews with individuals or small groups; and archival study. The nature of the work and social life of the hospitals and the foci of the study all favored using more participant observation methods in the case study than initially anticipated. Exploiting the full methodological spectrum of qualitative methods in occupational health is increasingly relevant. Although labor-intensive, these approaches may increase the yield of established quantitative approaches otherwise used in isolation.

  46. An approach to checking case-crossover analyses based on equivalence with time-series methods.

    PubMed

    Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L

    2008-03-01

    The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.
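
    The equivalence invoked above means a time-stratified case-crossover estimate can be reproduced by a log-linear (Poisson) time-series model with one fixed effect per time stratum, which is where standard GLM diagnostics become available. A simplified sketch with simulated daily counts and year-month strata (real analyses typically also adjust for day of week and weather):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
days = pd.date_range("1995-01-01", periods=730, freq="D")
pollution = rng.gamma(4.0, 5.0, len(days))                 # stand-in PM series
deaths = rng.poisson(np.exp(2.0 + 0.002 * pollution))      # simulated daily counts

df = pd.DataFrame({"deaths": deaths, "pollution": pollution,
                   # time stratum = year-month, as in the time-stratified design
                   "stratum": days.strftime("%Y-%m")})

# Log-linear time-series model with a fixed effect per stratum; its pollution
# coefficient matches the time-stratified case-crossover conditional-logistic
# estimate, and GLM residual/leverage diagnostics support model checking.
fit = smf.glm("deaths ~ pollution + C(stratum)", data=df,
              family=sm.families.Poisson()).fit()
print(fit.params["pollution"], fit.bse["pollution"])
```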

  47. On some methods for assessing earthquake predictions

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.; Peresan, A.

    2017-09-01

    A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with classical hypothesis testing, a new game approach, the so-called parimutuel gambling (PG) method, is examined. The PG, originally proposed for the evaluation of probabilistic earthquake forecasts, has been recently adapted for the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore it deserves careful examination and theoretical analysis. We show that the PG alarm-based version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.
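
    For alarm-based predictions such as CN, the conventional summary that the PG scoring competes with is the error-diagram point: the fraction of time covered by alarms and the fraction of target events missed. A minimal sketch with simulated alarms and event days:

```python
import numpy as np

def error_diagram_point(alarm: np.ndarray, event_days: np.ndarray):
    """Fraction of time covered by alarms (tau) and fraction of target
    events missed (nu) for a daily 0/1 alarm sequence."""
    tau = alarm.mean()
    nu = float(np.mean(alarm[event_days] == 0))
    return tau, nu

rng = np.random.default_rng(6)
alarm = (rng.random(3650) < 0.2).astype(int)           # alarms on ~20% of days
event_days = rng.choice(3650, size=8, replace=False)   # 8 large earthquakes
print(error_diagram_point(alarm, event_days))          # (tau, nu)
```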

  48. Thermodynamic heuristics with case-based reasoning: combined insights for RNA pseudoknot secondary structure.

    PubMed

    Al-Khatib, Ra'ed M; Rashid, Nur'Aini Abdul; Abdullah, Rosni

    2011-08-01

    The secondary structure of RNA pseudoknots has been extensively inferred and scrutinized by computational approaches. Experimental methods for determining RNA structure are time consuming and tedious; therefore, predictive computational approaches are required. Predicting the most accurate and energy-stable pseudoknot RNA secondary structure has been proven to be an NP-hard problem. In this paper, a new RNA folding approach, termed MSeeker, is presented; it includes KnotSeeker (a heuristic method) and Mfold (a thermodynamic algorithm). The global optimization of this thermodynamic heuristic approach was further enhanced by using a case-based reasoning technique as a local optimization method. MSeeker is a proposed algorithm for predicting RNA pseudoknot structure from individual sequences, especially long ones. This research demonstrates that MSeeker improves the sensitivity and specificity of existing RNA pseudoknot structure predictions. The performance and structural results from this proposed method were evaluated against seven other state-of-the-art pseudoknot prediction methods. The MSeeker method had better sensitivity than the DotKnot, FlexStem, HotKnots, pknotsRG, ILM, NUPACK and pknotsRE methods, with 79% of the predicted pseudoknot base-pairs being correct.

  49. Implementing a Flipped Classroom Approach in a University Numerical Methods Mathematics Course

    ERIC Educational Resources Information Center

    Johnston, Barbara M.

    2017-01-01

    This paper describes and analyses the implementation of a "flipped classroom" approach, in an undergraduate mathematics course on numerical methods. The approach replaced all the lecture contents by instructor-made videos and was implemented in the consecutive years 2014 and 2015. The sequential case study presented here begins with an…

  50. Natural Environment Exploration Approach: The Case Study in Department of Biology, Universitas Negeri Semarang

    ERIC Educational Resources Information Center

    Alimah, Siti; Susilo, Herawati; Amin, Moh

    2016-01-01

    The study reports the evaluation and analysis of the implementation of the Nature Environment Exploration approach in the Department of Biology, Universitas Negeri Semarang. The method used was the survey method. The results showed that the implementation of the Nature Environment Exploration approach was still far from optimal…

  51. [Introduction of active learning and student readership in teaching by the pharmaceutical faculty].

    PubMed

    Sekiguchi, Masaki; Yamato, Ippei; Kato, Tetsuta; Torigoe, Kojyun

    2005-07-01

    We have introduced improvements and new approaches into our teaching methods by exploiting 4 active learning methods for first-year pharmacy students. The 4 teaching methods, applied in each lesson or as take-home assignments, are as follows: 1) problem-based learning (clinical case) including a student presentation of the clinical case, 2) schematic drawings of the human organs, one drawing done in 15-20 min during the week following a lecture and a second drawing done with reference to a professional textbook, 3) learning of professional themes in take-home assignments, and 4) a short test to confirm the understanding of technical terms, using paper or computer. These improvements and new methods provide active approaches for pharmacy students (as opposed to passive memorization of words and image study). In combination, they have proven useful as a learning method for acquiring expert knowledge and for converting pharmacy students from a passive to an active learning approach in the classroom.

  52. The Effectiveness of Teaching Methods Used in Graphic Design Pedagogy in Both Analogue and Digital Education Systems

    ERIC Educational Resources Information Center

    Alhajri, Salman

    2016-01-01

    Purpose: This paper investigates the effectiveness of teaching methods used in graphic design pedagogy in both analogue and digital education systems. Methodology and approach: The paper is based on a theoretical study using a qualitative, case study approach. A comparison between the digital teaching methods and traditional teaching methods was…

  53. The Case Method for Teaching College Economics

    ERIC Educational Resources Information Center

    Griffith, John R., Jr.

    1971-01-01

    The author explains the use of the case method of applying the principles of economics to current economic policy issues. This approach allows the classroom economics teacher to teach the student how to apply economic principles to real life problems. (Author)

  14. Monte Carlo methods and their analysis for Coulomb collisions in multicomponent plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bobylev, A.V., E-mail: alexander.bobylev@kau.se; Potapenko, I.F., E-mail: firena@yandex.ru

    2013-08-01

    Highlights: •A general approach to Monte Carlo methods for multicomponent plasmas is proposed. •We show numerical tests for the two-component (electrons and ions) case. •An optimal choice of parameters for speeding up the computations is discussed. •A rigorous estimate of the error of approximation is proved. -- Abstract: A general approach to Monte Carlo methods for Coulomb collisions is proposed. Its key idea is an approximation of Landau–Fokker–Planck equations by Boltzmann equations of quasi-Maxwellian kind, meaning that the total collision frequency for the corresponding Boltzmann equation does not depend on the velocities. This makes the simulation process very simple, since collision pairs can be chosen arbitrarily, without restriction. It is shown that this approach includes the well-known methods of Takizuka and Abe (1977) [12] and Nanbu (1997) as particular cases, and generalizes the approach of Bobylev and Nanbu (2000). The numerical scheme of this paper is simpler than the schemes by Takizuka and Abe [12] and by Nanbu. We derive it for the general case of multicomponent plasmas and show some numerical tests for the two-component (electrons and ions) case. An optimal choice of parameters for speeding up the computations is also discussed. It is also proved that the order of approximation is not worse than O(√ε), where ε is a parameter of approximation equivalent to the time step Δt in earlier methods. A similar estimate is obtained for the methods of Takizuka and Abe and Nanbu.
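
    The key simplification named in the abstract, a velocity-independent total collision frequency, is what lets collision pairs be drawn uniformly at random. The sketch below illustrates that pairing-and-scattering loop in a deliberately schematic form: the 2-D rotation, the equal-mass momentum split, and the scattering-angle variance nu*dt are illustrative placeholders, not the calibrated coefficients of the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      def collide(v_a, v_b, nu, dt):
          """One quasi-Maxwellian Monte Carlo collision step (2-D velocities).

          Because the total collision frequency nu is velocity-independent,
          pairs are drawn uniformly at random. The Gaussian scattering angle
          with variance nu*dt echoes Takizuka-Abe-type schemes; the equal-mass
          momentum split and all constants here are illustrative, not calibrated.
          """
          n = min(len(v_a), len(v_b))
          for i, j in zip(rng.permutation(len(v_a))[:n],
                          rng.permutation(len(v_b))[:n]):
              u = v_a[i] - v_b[j]                        # relative velocity
              th = rng.normal(0.0, np.sqrt(nu * dt))     # scattering angle
              c, s = np.cos(th), np.sin(th)
              u_new = np.array([c * u[0] - s * u[1], s * u[0] + c * u[1]])
              dv = 0.5 * (u_new - u)                     # equal-mass split
              v_a[i] += dv
              v_b[j] -= dv

      electrons = rng.normal(0.0, 1.0, size=(1000, 2))
      ions = rng.normal(0.0, 0.1, size=(1000, 2))
      collide(electrons, ions, nu=5.0, dt=0.01)          # one time step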

  15. A qualitative and quantitative assessment for a bone marrow harvest simulator.

    PubMed

    Machado, Liliane S; Moraes, Ronei M

    2009-01-01

    Several approaches to performing assessment in training simulators based on virtual reality have been proposed. There are two kinds of assessment methods: offline and online. The main requirements for online training assessment methodologies applied to virtual reality systems are low computational complexity and high accuracy. Several approaches for general cases that satisfy these requirements can be found in the literature. A drawback of those approaches is that they provide unsatisfactory solutions for specific cases, as in some medical procedures where both quantitative and qualitative information is available to perform the assessment. In this paper, we present an approach to online training assessment based on a Modified Naive Bayes classifier which can manipulate qualitative and quantitative variables simultaneously. A special medical case was simulated in a bone marrow harvest simulator. The results obtained were satisfactory and evidenced the applicability of the method.
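
    The abstract does not spell out the modification, so the sketch below shows only the generic pattern it builds on: a Naive Bayes whose quantitative features get Gaussian class-conditional densities and whose qualitative features get smoothed categorical tables, combined in one log-posterior. All names and data are illustrative.

      import numpy as np

      class MixedNaiveBayes:
          """Naive Bayes mixing Gaussian (quantitative) and categorical features."""

          def fit(self, Xq, Xc, y):
              # Xq: (n, dq) floats; Xc: (n, dc) category codes; y: (n,) labels.
              self.classes = np.unique(y)
              self.stats = {}
              for k in self.classes:
                  m = y == k
                  tables = []
                  for j in range(Xc.shape[1]):
                      vals, counts = np.unique(Xc[m, j], return_counts=True)
                      n_cats = len(np.unique(Xc[:, j]))
                      floor = 1.0 / (m.sum() + n_cats)   # Laplace-smoothed unseen
                      table = {v: (c + 1) / (m.sum() + n_cats)
                               for v, c in zip(vals, counts)}
                      tables.append((table, floor))
                  self.stats[k] = (m.mean(), Xq[m].mean(axis=0),
                                   Xq[m].var(axis=0) + 1e-9, tables)
              return self

          def predict(self, xq, xc):
              best, best_lp = None, -np.inf
              for k in self.classes:
                  prior, mu, var, tables = self.stats[k]
                  lp = np.log(prior)                     # class prior
                  lp -= 0.5 * np.sum(np.log(2 * np.pi * var) + (xq - mu) ** 2 / var)
                  for (table, floor), v in zip(tables, xc):
                      lp += np.log(table.get(v, floor))  # categorical likelihood
                  if lp > best_lp:
                      best, best_lp = k, lp
              return best

      rng = np.random.default_rng(1)
      Xq = rng.normal(size=(200, 2))                     # e.g. time, precision
      Xc = rng.integers(0, 3, size=(200, 1))             # e.g. a graded rating
      y = (Xq[:, 0] + 0.5 * Xc[:, 0] > 0.8).astype(int)  # toy pass/fail label
      model = MixedNaiveBayes().fit(Xq, Xc, y)
      print(model.predict(np.array([1.2, 0.0]), np.array([2])))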

  16. CASE STUDY CRITIQUE; UPPER CLINCH CASE STUDY

    EPA Science Inventory

    Case study critique: Upper Clinch case study (from Research on Methods for Integrating Ecological Economics and Ecological Risk Assessment: A Trade-off Weighted Index Approach to Integrating Economics and Ecological Risk Assessment). This critique answers the questions: 1) does ...

  17. Leadership Learning through Student-Centered and Inquiry-Focused Approaches to Teaching Adaptive Leadership

    ERIC Educational Resources Information Center

    Haber-Curran, Paige; Tillapaugh, Daniel

    2013-01-01

    This qualitative study examines student learning about leadership across three sections of a capstone course in an undergraduate leadership minor. Qualitative methods were informed by exploratory case study analysis and phenomenology. Student-centered and inquiry-focused pedagogical approaches, including case-in-point, action inquiry, and…

  18. A Case Study Using Child-Centered Play Therapy Approach to Treat Enuresis and Encopresis.

    ERIC Educational Resources Information Center

    Cuddy-Casey, Maria

    1997-01-01

    Demonstrates an alternative method (nondirective child-centered therapy) in treating enuresis and encopresis resulting from emotional disturbances. Examines various etiologies and approaches to treating these conditions. Provides a case study example. Claims that professionals must differentiate between primary and secondary occurrences of these…

  19. A simulation study to compare three self-controlled case series approaches: correction for violation of assumption and evaluation of bias.

    PubMed

    Hua, Wei; Sun, Guoying; Dodd, Caitlin N; Romio, Silvana A; Whitaker, Heather J; Izurieta, Hector S; Black, Steven; Sturkenboom, Miriam C J M; Davis, Robert L; Deceuninck, Genevieve; Andrews, N J

    2013-08-01

    The assumption that the occurrence of the outcome event must not alter the subsequent exposure probability is critical for preserving the validity of the self-controlled case series (SCCS) method. This assumption is violated in scenarios in which the event constitutes a contraindication for exposure. In this simulation study, we compared the performance of the standard SCCS approach and two alternative approaches when the event-independent exposure assumption was violated. Using the 2009 H1N1 and seasonal influenza vaccines and Guillain-Barré syndrome as a model, we simulated a scenario in which an individual may encounter multiple unordered exposures and each exposure may be contraindicated by the occurrence of the outcome event. The degree of contraindication was varied at 0%, 50%, and 100%. The first alternative approach used only cases occurring after exposure, with follow-up time starting from exposure. The second used a pseudo-likelihood method. When the event-independent exposure assumption was satisfied, the standard SCCS approach produced nearly unbiased relative incidence estimates. When this assumption was partially or completely violated, two alternative SCCS approaches could be used. While the post-exposure-cases-only approach could handle only one exposure, the pseudo-likelihood approach was able to correct bias for both exposures. Violation of the event-independent exposure assumption leads to an overestimation of relative incidence, which can be corrected by alternative SCCS approaches. In multiple-exposure situations, the pseudo-likelihood approach is optimal; the post-exposure-cases-only approach is limited in handling a second exposure and may introduce additional bias, and thus should be used with caution. Copyright © 2013 John Wiley & Sons, Ltd.

  20. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    PubMed

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

    The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper, we propose, in the 2-D case, the use of an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang's EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean-envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Recently, several 2-D extensions of the EMD method have been proposed. Despite some effort, 2-D versions of EMD perform poorly and are very time-consuming. So in this paper, an extension to the 2-D space of the PDE-based approach is extensively described. This approach has been applied to both signal and image decomposition. The obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data. Some results are provided for the case of image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.
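
    For orientation, the classic algorithmic sifting loop that the PDE formulation replaces can be sketched in a few lines: repeatedly interpolate upper and lower envelopes through the local extrema and subtract their mean until an intrinsic mode function (IMF) remains. This 1-D sketch uses simple endpoint pinning for the envelopes; production EMD codes use more careful boundary handling and stopping criteria.

      import numpy as np
      from scipy.signal import argrelextrema
      from scipy.interpolate import CubicSpline

      def sift(x, t, n_iter=10):
          """Extract one IMF by the classic sifting loop (mean-envelope removal)."""
          h = x.copy()
          for _ in range(n_iter):
              maxima = argrelextrema(h, np.greater)[0]
              minima = argrelextrema(h, np.less)[0]
              if len(maxima) < 2 or len(minima) < 2:
                  break                                  # too few extrema
              # Pin the envelopes to the endpoints to tame edge effects.
              upper = CubicSpline(np.r_[t[0], t[maxima], t[-1]],
                                  np.r_[h[0], h[maxima], h[-1]])
              lower = CubicSpline(np.r_[t[0], t[minima], t[-1]],
                                  np.r_[h[0], h[minima], h[-1]])
              h = h - 0.5 * (upper(t) + lower(t))        # subtract mean envelope
          return h

      t = np.linspace(0.0, 1.0, 1000)
      x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
      imf1 = sift(x, t)                                  # ~ the 40 Hz component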

  1. Developing Emotion-Based Case Formulations: A Research-Informed Method.

    PubMed

    Pascual-Leone, Antonio; Kramer, Ueli

    2017-01-01

    New research-informed methods for case conceptualization that cut across traditional therapy approaches are increasingly popular. This paper presents a trans-theoretical approach to case formulation based on research observations of emotion. The sequential model of emotional processing (Pascual-Leone & Greenberg, 2007) is a process research model that provides concrete markers for therapists to observe the emerging emotional development of their clients. We illustrate how this model can be used by clinicians to track change and how it provides a 'clinical map' by which therapists may orient themselves in-session and plan treatment interventions. Emotional processing offers a trans-theoretical framework for therapists who wish to conduct emotion-based case formulations. First, we present criteria for why this research model translates well into practice. Second, two contrasting case studies are presented to demonstrate the method. The model bridges research with practice by using client emotion as an axis of integration. Key Practitioner Message: Process research on emotion can offer a template for therapists to make case formulations while using a range of treatment approaches. The sequential model of emotional processing provides a 'process map' of concrete markers for therapists to (1) observe the emerging emotional development of their clients, and (2) develop a treatment plan. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Case Writing as a Signature Pedagogy in Education Leadership

    ERIC Educational Resources Information Center

    Meyer, Heinz-Dieter; Shannon, Brenda

    2010-01-01

    Purpose: The purpose of this paper is to propose, as a candidate for a signature pedagogy, a method centered on case writing and peer review. Design/methodology/approach: In this method, aspiring education leaders use the writing of case studies--frequently featuring themselves as an actor in a narrative of organizational development or change--to…

  3. Administering the Business School Case Method with a Goal-Based Scenario.

    ERIC Educational Resources Information Center

    Foster, David A.; Bareiss, Ray

    This paper discusses some of the shortcomings of the business case method of undergraduate and graduate business education and examines the merits of a multimedia software system that is designed to teach topics and skills in financial accounting. It argues that the traditional case-based approach provides only limited assistance to students as…

  4. The Improving Way of Logistics Management in Korean Army

    DTIC Science & Technology

    2011-03-01

    case study, ethnography, phenomenological study, grounded theory study and content analysis. The case study deals with a... there are five kinds of qualitative research approaches. To decide which method should be chosen, one should consider the purpose, focus, method of data... qualitative research, case study is good for finding the answer when the researcher has no

  5. Two Reconfigurable Flight-Control Design Methods: Robust Servomechanism and Control Allocation

    NASA Technical Reports Server (NTRS)

    Burken, John J.; Lu, Ping; Wu, Zheng-Lu; Bahm, Cathy

    2001-01-01

    Two methods for control system reconfiguration have been investigated. The first method is a robust servomechanism control approach (optimal tracking problem) that is a generalization of the classical proportional-plus-integral control to multiple input-multiple output systems. The second method is a control-allocation approach based on a quadratic programming formulation. A globally convergent fixed-point iteration algorithm has been developed to make onboard implementation of this method feasible. These methods have been applied to reconfigurable entry flight control design for the X-33 vehicle. Examples presented demonstrate simultaneous tracking of angle-of-attack and roll angle commands during failures of the right body flap actuator. Although simulations demonstrate success of the first method in most cases, the control-allocation method appears to provide uniformly better performance in all cases.
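
    The control-allocation step described above can be read as a bounded least-squares problem, a special case of quadratic programming: choose surface deflections u within position limits so that the effectiveness matrix B maps them as closely as possible onto the commanded moments. A minimal sketch of that reading, with an illustrative B, command, and limits (not the X-33 model), using SciPy's bounded least-squares solver rather than the paper's fixed-point iteration:

      import numpy as np
      from scipy.optimize import lsq_linear

      # Illustrative effectiveness matrix B (3 moments x 5 surfaces) and a
      # commanded moment d; real values would come from the vehicle model.
      B = np.array([[1.0, -1.0, 0.2,  0.0, 0.5],
                    [0.3,  0.3, 1.0, -1.0, 0.0],
                    [0.0,  0.0, 0.4,  0.4, 1.0]])
      d = np.array([0.8, -0.2, 0.1])

      # Deflection limits; a failed actuator is modeled by pinching surface 2's
      # bounds to (essentially) zero.
      lb = np.array([-0.5, -0.5, -1e-9, -0.5, -0.5])
      ub = np.array([ 0.5,  0.5,  1e-9,  0.5,  0.5])

      res = lsq_linear(B, d, bounds=(lb, ub))  # min ||B u - d||^2, lb <= u <= ub
      print(res.x, np.linalg.norm(B @ res.x - d))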

  6. On mathematical modelling of aeroelastic problems with finite element method

    NASA Astrophysics Data System (ADS)

    Sváček, Petr

    2018-06-01

    This paper is concerned with the solution of two-dimensional aeroelastic problems. Two mathematical models are compared for a benchmark problem. First, the classical approach of linearized aerodynamic forces is described, which determines the aeroelastic instability and the aeroelastic response in terms of frequency and damping coefficient. This approach is compared to the coupled fluid-structure model solved with the aid of the finite element method used for approximation of the incompressible Navier-Stokes equations. The finite element approximations are coupled to the non-linear equations of motion of a flexibly supported airfoil. Both methods are first compared for the case of small displacements, where the linearized approach can be well adopted. The influence of nonlinearities in the case of the post-critical regime is discussed.
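
    In the linearized setting, frequency and damping fall out of an eigenvalue analysis of the structural equations with the aerodynamic terms folded into the damping and stiffness matrices. A minimal pitch-plunge sketch with illustrative matrices (not the benchmark's values); a negative damping ratio at some airspeed would signal the onset of flutter:

      import numpy as np

      # Illustrative linearized pitch-plunge model M q'' + C q' + K q = 0,
      # with aerodynamic terms assumed folded into C and K at one airspeed.
      M = np.array([[1.0, 0.1], [0.1, 0.5]])
      C = np.array([[0.02, 0.0], [0.0, 0.01]])
      K = np.array([[50.0, -4.0], [-4.0, 80.0]])

      # First-order form z' = A z with z = [q, q'].
      A = np.block([[np.zeros((2, 2)), np.eye(2)],
                    [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])

      for lam in np.linalg.eigvals(A):
          if lam.imag > 0:                               # one per oscillatory mode
              freq = abs(lam) / (2 * np.pi)              # frequency [Hz]
              zeta = -lam.real / abs(lam)                # damping ratio
              print(f"f = {freq:.2f} Hz, zeta = {zeta:.4f}")  # zeta < 0: flutter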

  7. Cost-Effectiveness Analysis of Three Leprosy Case Detection Methods in Northern Nigeria

    PubMed Central

    Ezenduka, Charles; Post, Erik; John, Steven; Suraj, Abdulkarim; Namadi, Abdulahi; Onwujekwe, Obinna

    2012-01-01

    Background Despite several leprosy control measures in Nigeria, the proportions of child cases and grade-2 disability cases remain high, while new case numbers have not significantly reduced, suggesting continuous spread of the disease. Hence, there is a need to review detection methods to enhance identification of early cases for effective control and prevention of permanent disability. This study evaluated the cost-effectiveness of three leprosy case detection methods in Northern Nigeria to identify the most cost-effective approach. Methods A cross-sectional study was carried out to evaluate the additional benefits of using several case detection methods in addition to routine practice in two north-eastern states of Nigeria. Primary and secondary data were collected from routine practice records and the Nigerian Tuberculosis and Leprosy Control Programme of 2009. The methods evaluated were the Rapid Village Survey (RVS), Household Contact Examination (HCE) and Traditional Healers incentive method (TH). Effectiveness was measured as the number of new leprosy cases detected, and cost-effectiveness was expressed as cost per case detected. Costs were measured from both providers' and patients' perspectives. Additional costs and effects of each method were estimated by comparing each method against routine practice and expressed as an incremental cost-effectiveness ratio (ICER). All costs were converted to U.S. dollars at the 2010 exchange rate. Univariate sensitivity analysis was used to evaluate uncertainties around the ICER. Results The ICER for HCE was $142 per additional case detected at all contact levels, and it was the most cost-effective method. At an ICER of $194 per additional case detected, the TH method detected more cases at a lower cost than the RVS, which was not cost-effective at $313 per additional case detected. Sensitivity analysis showed that varying the proportion of shared costs and the subsistence wage for valuing unpaid time did not significantly change the results. Conclusion Complementing routine practice with household contact examination is the most cost-effective approach to identify new leprosy cases, and we recommend that, depending on acceptability and feasibility, this intervention be introduced for improved case detection in Northern Nigeria. PMID:23029580
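
    The ICER arithmetic behind those figures is simply the extra cost of a method over routine practice divided by the extra cases it detects. A minimal sketch with made-up cost and case counts chosen only so the ratio lands near the reported $142 per case for HCE:

      def icer(cost_new, cases_new, cost_routine, cases_routine):
          """Incremental cost-effectiveness ratio: extra cost per extra case."""
          return (cost_new - cost_routine) / (cases_new - cases_routine)

      # Invented totals chosen so the ratio lands near the reported $142/case.
      print(icer(cost_new=15_000, cases_new=120,
                 cost_routine=8_900, cases_routine=77))  # ~142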

  8. Piecewise exponential survival times and analysis of case-cohort data.

    PubMed

    Li, Yan; Gail, Mitchell H; Preston, Dale L; Graubard, Barry I; Lubin, Jay H

    2012-06-15

    Case-cohort designs select a random sample of a cohort to be used as controls, with cases arising from the follow-up of the cohort. Analyses of case-cohort studies with time-varying exposures that use Cox partial likelihood methods can be computationally intensive. We propose a piecewise-exponential approach in which Poisson regression model parameters are estimated from a pseudolikelihood and the corresponding variances are derived by applying the Taylor linearization methods that are used in survey research. The proposed approach is evaluated using Monte Carlo simulations. An illustration is provided using data from the Alpha-Tocopherol, Beta-Carotene Cancer Prevention Study of male smokers in Finland, in which a case-cohort study of serum glucose level and pancreatic cancer was analyzed. Copyright © 2012 John Wiley & Sons, Ltd.
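
    The piecewise-exponential idea is that, once follow-up is split into intervals, event counts behave like Poisson observations with a log person-time offset, so the hazard model reduces to a Poisson regression. A toy sketch of that reduction on invented person-interval data; the case-cohort pseudolikelihood weighting and Taylor-linearized variances of the paper are omitted:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Toy person-interval data: one row per exposure group x time interval,
      # with event counts and accumulated person-time (all numbers invented).
      df = pd.DataFrame({
          "interval": [0, 0, 1, 1, 2, 2],
          "exposed":  [0, 1, 0, 1, 0, 1],
          "events":   [3, 5, 4, 9, 2, 6],
          "pt":       [900.0, 850.0, 800.0, 700.0, 600.0, 480.0],
      })

      X = pd.get_dummies(df["interval"], prefix="iv", drop_first=True, dtype=float)
      X["exposed"] = df["exposed"].astype(float)
      X = sm.add_constant(X)

      # Piecewise-exponential hazard == Poisson counts with log person-time offset.
      fit = sm.GLM(df["events"], X, family=sm.families.Poisson(),
                   offset=np.log(df["pt"])).fit()
      print(np.exp(fit.params["exposed"]))               # estimated rate ratio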

  9. An efficient approach for surveillance of childhood diabetes by type derived from electronic health record data: the SEARCH for Diabetes in Youth Study

    PubMed Central

    Zhong, Victor W; Obeid, Jihad S; Craig, Jean B; Pfaff, Emily R; Thomas, Joan; Jaacks, Lindsay M; Beavers, Daniel P; Carey, Timothy S; Lawrence, Jean M; Dabelea, Dana; Hamman, Richard F; Bowlby, Deborah A; Pihoker, Catherine; Saydah, Sharon H

    2016-01-01

    Objective To develop an efficient surveillance approach for childhood diabetes by type across 2 large US health care systems, using phenotyping algorithms derived from electronic health record (EHR) data. Materials and Methods Presumptive diabetes cases <20 years of age from 2 large independent health care systems were identified as those having ≥1 of the 5 indicators in the past 3.5 years: elevated HbA1c, elevated blood glucose, diabetes-related billing codes, a patient problem list entry, and outpatient anti-diabetic medications. EHRs of all presumptive cases were manually reviewed, and true diabetes status and diabetes type were determined. Algorithms for identifying diabetes cases overall and classifying diabetes type were either prespecified or derived from classification and regression tree analysis. A surveillance approach was developed based on the best algorithms identified. Results We developed a stepwise surveillance approach using billing code–based prespecified algorithms and targeted manual EHR review, which efficiently and accurately ascertained and classified diabetes cases by type in both health care systems. The sensitivity and positive predictive values in both systems were approximately ≥90% for ascertaining diabetes cases overall and classifying cases with type 1 or type 2 diabetes. About 80% of the cases with "other" type were also correctly classified. This stepwise surveillance approach resulted in a >70% reduction in the number of cases requiring manual validation compared to traditional surveillance methods. Conclusion EHR data may be used to establish an efficient approach for large-scale surveillance of childhood diabetes by type, although some manual effort is still needed. PMID:27107449
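
    The first screening step, flagging anyone with at least one of the five indicators, is a simple disjunction over EHR-derived fields. A sketch with a hypothetical record layout; the field names and the numeric cutoffs (standard diagnostic values of 6.5% HbA1c and 200 mg/dL glucose) are assumptions, as the abstract only says "elevated":

      def presumptive_diabetes(record):
          """Flag a record with >= 1 of the five indicators named in the study.

          `record` is a hypothetical dict; the field names and the numeric
          cutoffs (standard diagnostic values) are illustrative only.
          """
          indicators = (
              record.get("max_hba1c", 0.0) >= 6.5,          # elevated HbA1c (%)
              record.get("max_glucose", 0.0) >= 200,        # elevated glucose (mg/dL)
              bool(record.get("diabetes_billing_codes")),   # billing codes
              bool(record.get("problem_list_diabetes")),    # problem list entry
              bool(record.get("antidiabetic_rx")),          # outpatient medications
          )
          return any(indicators)

      print(presumptive_diabetes({"max_hba1c": 7.1}))       # True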

  10. Method of the active contour for segmentation of bone systems on bitmap images

    NASA Astrophysics Data System (ADS)

    Vu, Hai Anh; Safonov, Roman A.; Kolesnikova, Anna S.; Kirillova, Irina V.; Kossovich, Leonid U.

    2018-02-01

    An approach is developed within the active contour method that allows the contour of an object in an image to be extracted during segmentation. This approach is faster than the parametric method while conceding nothing to it in accuracy. The proposed approach allows a contour to be extracted from the image with high accuracy and more quickly than the parametric active contour method.

  11. Reconfigurable Flight Control Designs With Application to the X-33 Vehicle

    NASA Technical Reports Server (NTRS)

    Burken, John J.; Lu, Ping; Wu, Zhenglu

    1999-01-01

    Two methods for control system reconfiguration have been investigated. The first method is a robust servomechanism control approach (optimal tracking problem) that is a generalization of the classical proportional-plus-integral control to multiple input-multiple output systems. The second method is a control-allocation approach based on a quadratic programming formulation. A globally convergent fixed-point iteration algorithm has been developed to make onboard implementation of this method feasible. These methods have been applied to reconfigurable entry flight control design for the X-33 vehicle. Examples presented demonstrate simultaneous tracking of angle-of-attack and roll angle commands during failures of the right body flap actuator. Although simulations demonstrate success of the first method in most cases, the control-allocation method appears to provide uniformly better performance in all cases.

  12. Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.

    PubMed

    Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen

    2017-11-01

    A new method was developed and implemented into an Excel Visual Basic for Applications (VBA) algorithm utilizing trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of a preceding recession segment. The new method and algorithm continue the development of methods and algorithms for the generation of MRCs, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R², while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRCs using the trigonometry approach is implemented into a spreadsheet tool (MRCTools v3.0, written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free of charge software. © 2017, National Ground Water Association.
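
    The translation rule in the abstract, sliding each new segment horizontally until its vertex sits on the line through the preceding segment's measurement points, is easy to sketch outside Excel. A minimal numpy version under the assumption that each segment's values decrease monotonically; the data are invented:

      import numpy as np

      def build_mrc(segments):
          """Chain recession segments (t, y), y strictly decreasing, into an MRC."""
          mrc_t, mrc_y = [], []
          prev_t = prev_y = None
          for t, y in segments:
              if prev_t is not None:
                  # Time on the preceding (already shifted) segment where its
                  # value equals this segment's vertex y[0]; arrays are reversed
                  # so np.interp sees an increasing grid.
                  t_hit = np.interp(y[0], prev_y[::-1], prev_t[::-1])
                  t = t + (t_hit - t[0])                 # horizontal translation
              mrc_t.extend(t)
              mrc_y.extend(y)
              prev_t, prev_y = t, y
          order = np.argsort(mrc_t)
          return np.asarray(mrc_t)[order], np.asarray(mrc_y)[order]

      seg1 = (np.array([0.0, 1.0, 2.0, 3.0]), np.array([10.0, 8.0, 6.5, 5.5]))
      seg2 = (np.array([0.0, 1.0, 2.0]), np.array([7.0, 6.0, 5.2]))
      t_mrc, y_mrc = build_mrc([seg1, seg2])             # seg2 shifted to ~1.67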

  13. A collaborative approach to developing an electronic health record phenotyping algorithm for drug-induced liver injury

    PubMed Central

    Overby, Casey Lynnette; Pathak, Jyotishman; Gottesman, Omri; Haerian, Krystl; Perotte, Adler; Murphy, Sean; Bruce, Kevin; Johnson, Stephanie; Talwalkar, Jayant; Shen, Yufeng; Ellis, Steve; Kullo, Iftikhar; Chute, Christopher; Friedman, Carol; Bottinger, Erwin; Hripcsak, George; Weng, Chunhua

    2013-01-01

    Objective To describe a collaborative approach for developing an electronic health record (EHR) phenotyping algorithm for drug-induced liver injury (DILI). Methods We analyzed types and causes of differences in DILI case definitions provided by two institutions—Columbia University and Mayo Clinic; harmonized two EHR phenotyping algorithms; and assessed the performance, measured by sensitivity, specificity, positive predictive value, and negative predictive value, of the resulting algorithm at three institutions except that sensitivity was measured only at Columbia University. Results Although these sites had the same case definition, their phenotyping methods differed by selection of liver injury diagnoses, inclusion of drugs cited in DILI cases, laboratory tests assessed, laboratory thresholds for liver injury, exclusion criteria, and approaches to validating phenotypes. We reached consensus on a DILI phenotyping algorithm and implemented it at three institutions. The algorithm was adapted locally to account for differences in populations and data access. Implementations collectively yielded 117 algorithm-selected cases and 23 confirmed true positive cases. Discussion Phenotyping for rare conditions benefits significantly from pooling data across institutions. Despite the heterogeneity of EHRs and varied algorithm implementations, we demonstrated the portability of this algorithm across three institutions. The performance of this algorithm for identifying DILI was comparable with other computerized approaches to identify adverse drug events. Conclusions Phenotyping algorithms developed for rare and complex conditions are likely to require adaptive implementation at multiple institutions. Better approaches are also needed to share algorithms. Early agreement on goals, data sources, and validation methods may improve the portability of the algorithms. PMID:23837993

  14. Developing a statistically powerful measure for quartet tree inference using phylogenetic identities and Markov invariants.

    PubMed

    Sumner, Jeremy G; Taylor, Amelia; Holland, Barbara R; Jarvis, Peter D

    2017-12-01

    Recently there has been renewed interest in phylogenetic inference methods based on phylogenetic invariants, alongside the related Markov invariants. Broadly speaking, both these approaches give rise to polynomial functions of sequence site patterns that, in expectation value, either vanish for particular evolutionary trees (in the case of phylogenetic invariants) or have well understood transformation properties (in the case of Markov invariants). While both approaches have been valued for their intrinsic mathematical interest, it is not clear how they relate to each other, and to what extent they can be used as practical tools for inference of phylogenetic trees. In this paper, by focusing on the special case of binary sequence data and quartets of taxa, we are able to view these two different polynomial-based approaches within a common framework. To motivate the discussion, we present three desirable statistical properties that we argue any invariant-based phylogenetic method should satisfy: (1) sensible behaviour under reordering of input sequences; (2) stability as the taxa evolve independently according to a Markov process; and (3) explicit dependence on the assumption of a continuous-time process. Motivated by these statistical properties, we develop and explore several new phylogenetic inference methods. In particular, we develop a statistically bias-corrected version of the Markov invariants approach which satisfies all three properties. We also extend previous work by showing that the phylogenetic invariants can be implemented in such a way as to satisfy property (3). A simulation study shows that, in comparison to other methods, our new proposed approach based on bias-corrected Markov invariants is extremely powerful for phylogenetic inference. The binary case is of particular theoretical interest as, in this case only, the Markov invariants can be expressed as linear combinations of the phylogenetic invariants. A wider implication of this is that, for models with more than two states (for example, DNA sequence alignments with four-state models), we find that methods which rely on phylogenetic invariants are incapable of satisfying all three of the stated statistical properties. This is because in these cases the relevant Markov invariants belong to a class of polynomials independent from the phylogenetic invariants.

  15. Connection of Environmental Education with Application of Experiential Teaching Methods: A Case Study from Greece

    ERIC Educational Resources Information Center

    Koutsoukos, Marios; Fragoulis, Iosif; Valkanos, Euthimios

    2015-01-01

    The main objective of this case study is to examine secondary education teachers' opinions concerning the connection of environmental education with the use of experiential teaching methods. Exploring whether the application of experiential methods can upgrade the learning procedure, leading to a more holistic approach, the research focuses on…

  16. Exploring Barriers to Effective E-Learning: Case Study of DNPA

    ERIC Educational Resources Information Center

    Annansingh, Fenio; Bright, Ali

    2010-01-01

    Purpose: The purpose of this paper is to discuss a case study which examines and analyses an information communication technology training programme conducted using an e-learning platform at the Dartmoor National Park Authority, UK. Design/methodology/approach: The research adopted a mixed method approach which involved the use of questionnaires…

  17. Institutional Expansion: The Case of Grid Computing

    ERIC Educational Resources Information Center

    Kertcher, Zack

    2010-01-01

    Evolutionary and revolutionary approaches have dominated the study of scientific, technological and institutional change. Yet, being focused on change within a single field, these approaches have been mute about a third, pervasive process. This process is found in a variety of cases that range from open source software to the Monte Carlo method to…

  18. How and Why of User Studies: RLG's RedLightGreen as a Case Study

    ERIC Educational Resources Information Center

    Proffitt, Merrilee

    2006-01-01

    This article documents a lifecycle approach to employing user-centered design, covering both qualitative and quantitative data gathering methods in support of using this approach for product design, usability testing, and market research. The author provides specific case studies of usability studies, focus groups, interviews, ethnographic…

  19. A KPI-based process monitoring and fault detection framework for large-scale processes.

    PubMed

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
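
    For the static case, the least squares-based idea sketched above amounts to fitting a linear map from process variables to the KPI on fault-free data and then testing new residuals against a threshold. A toy sketch with synthetic data and a simple 3-sigma limit; the paper's actual test statistics and the dynamic kernel-representation step are not reproduced here:

      import numpy as np

      rng = np.random.default_rng(0)

      # Fault-free training data: process variables X and a KPI y (synthetic).
      X = rng.normal(size=(500, 4))
      theta_true = np.array([0.8, -0.3, 0.0, 1.2])
      y = X @ theta_true + 0.05 * rng.normal(size=500)

      theta, *_ = np.linalg.lstsq(X, y, rcond=None)      # static LS model
      threshold = 3.0 * (y - X @ theta).std()            # 3-sigma residual limit

      def kpi_fault(x_new, y_new):
          """Flag a KPI-relevant fault when the LS residual exceeds the limit."""
          return abs(y_new - x_new @ theta) > threshold

      x = rng.normal(size=4)
      print(kpi_fault(x, x @ theta_true + 2.0))          # injected fault -> True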

  20. [Methods for exposure of recurrent laryngeal nerve in thyroid surgery].

    PubMed

    Ma, Xiangdong; Han, Xilin; Liu, Tao; Kou, Changhua

    2014-10-01

    To evaluate different methods of exposing the recurrent laryngeal nerve (RLN), based on the location of the thyroid disease and the anatomic path of the RLN, in order to avoid RLN damage in thyroid surgery. A total of 755 cases undergoing total lobectomy were studied retrospectively. The RLN was exposed in each case. A total of 963 RLNs were exposed in 755 cases; among those, 658 RLNs were exposed by the lateral approach, 106 by the inferior approach, and 199 by the superior approach. It was shown that 694 RLNs traveled deep to the inferior thyroid artery, 119 superficial to the artery, 98 passed between two branches of the artery, and 62 crossed the artery with interleaving nerve and artery branches. Before entering the larynx, 578 RLNs gave off branches and 385 had no branches. Non-recurrent laryngeal nerves were found in 2 cases. There were 6 cases of hoarseness after thyroidectomy that underwent reexploration; among them, the RLN had been ligated in 4 cases and severed in 2 cases. The anatomic relation of the RLN is relatively complicated. The lateral, inferior or superior approach may be used for exposure of the RLN to decrease the risk of injury to the nerve.

  1. Fast approximate delivery of fluence maps for IMRT and VMAT

    NASA Astrophysics Data System (ADS)

    Balvert, Marleen; Craft, David

    2017-02-01

    In this article we provide a method to generate the trade-off between delivery time and fluence map matching quality for dynamically delivered fluence maps. At the heart of our method lies a mathematical programming model that, for a given duration of delivery, optimizes leaf trajectories and dose rates such that the desired fluence map is reproduced as well as possible. We begin with the single fluence map case and then generalize the model and the solution technique to the delivery of sequential fluence maps. The resulting large-scale, non-convex optimization problem was solved using a heuristic approach. We test our method using a prostate case and a head and neck case, and present the resulting trade-off curves. Analysis of the leaf trajectories reveals that short time plans have larger leaf openings in general than longer delivery time plans. Our method allows one to explore the continuum of possibilities between coarse, large segment plans characteristic of direct aperture approaches and narrow field plans produced by sliding window approaches. Exposing this trade-off will allow for an informed choice between plan quality and solution time. Further research is required to speed up the optimization process to make this method clinically implementable.

  2. A Case Study of the Degree of Collaboration Between Various Levels in the Reparable Chain in the United States Air Force

    DTIC Science & Technology

    2005-03-01

    qualitative research methods, a case study approach was selected to conduct this research. "A case study can be defined as an empirical study ... qualitative in nature, and also described the qualitative research method chosen as a case study. From there, data collection was focused upon... qualitative nature of the research, a qualitative design was used to conduct the

  3. Evaluating the Impact of the U.S. National Toxicology Program: A Case Study on Hexavalent Chromium

    PubMed Central

    Xie, Yun; Holmgren, Stephanie; Andrews, Danica M. K.; Wolfe, Mary S.

    2016-01-01

    Background: Evaluating the impact of federally funded research with a broad, methodical, and objective approach is important to ensure that public funds advance the mission of federal agencies. Objectives: We aimed to develop a methodical approach that would yield a broad assessment of National Toxicology Program’s (NTP’s) effectiveness across multiple sectors and demonstrate the utility of the approach through a case study. Methods: A conceptual model was developed with defined activities, outputs (products), and outcomes (proximal, intermediate, distal) and applied retrospectively to NTP’s research on hexavalent chromium (CrVI). Proximal outcomes were measured by counting views of and requests for NTP’s products by external stakeholders. Intermediate outcomes were measured by bibliometric analysis. Distal outcomes were assessed through Web and LexisNexis searches for documents related to legislation or regulation changes. Results: The approach identified awareness of NTP’s work on CrVI by external stakeholders (proximal outcome) and citations of NTP’s research in scientific publications, reports, congressional testimonies, and legal and policy documents (intermediate outcome). NTP’s research was key to the nation’s first-ever drinking water standard for CrVI adopted by California in 2014 (distal outcome). By applying this approach to a case study, the utility and limitations of the approach were identified, including challenges to evaluating the outcomes of a research program. Conclusions: This study identified a broad and objective approach for assessing NTP’s effectiveness, including methodological needs for more thorough and efficient impact assessments in the future. Citation: Xie Y, Holmgren S, Andrews DMK, Wolfe MS. 2017. Evaluating the impact of the U.S. National Toxicology Program: a case study on hexavalent chromium. Environ Health Perspect 125:181–188; http://dx.doi.org/10.1289/EHP21 PMID:27483499

  4. Improvement of Coordination in the Multi-National Military Coordination Center of the Nepal Army in Respond to Disasters

    DTIC Science & Technology

    2017-06-09

    primary question. This thesis has used the case study research methodology with a Capability-Based Assessment (CBA) approach. My engagement in this... effective coordinating mechanism. The research follows the case study method utilizing the Capability-Based Analysis (CBA) approach to scrutinize the

  5. BETASEQ: a powerful novel method to control type-I error inflation in partially sequenced data for rare variant association testing.

    PubMed

    Yan, Song; Li, Yun

    2014-02-15

    Despite its great capability to detect rare variant associations, next-generation sequencing is still prohibitively expensive when applied to large samples. In case-control studies, it is thus appealing to sequence only a subset of cases to discover variants and to genotype the identified variants in controls and the remaining cases, under the reasonable assumption that causal variants are usually enriched among cases. However, this approach leads to inflated type-I error if analyzed naively for rare variant association. Several methods have been proposed in recent literature to control type-I error at the cost of either excluding some sequenced cases or correcting the genotypes of discovered rare variants. All of these approaches thus suffer from some degree of information loss and are underpowered. We propose a novel method (BETASEQ), which corrects the inflation of type-I error by supplementing pseudo-variants while keeping the original sequence and genotype data intact. Extensive simulations and real data analysis demonstrate that, in most practical situations, BETASEQ leads to higher testing power than existing approaches with guaranteed (controlled or conservative) type-I error. BETASEQ and associated R files, including documentation and examples, are available at http://www.unc.edu/~yunmli/betaseq

  6. Innovations in the Teaching of Behavioral Sciences in the Preclinical Curriculum

    ERIC Educational Resources Information Center

    Mack, Kevin

    2005-01-01

    Objective: In problem-based learning curricula, cases are usually clustered into identified themes or organ systems. While this method of aggregating cases presents clear advantages in terms of resource alignment and student focus, an alternative "hidden cluster" approach provides rich opportunities for content integration. Method: The author…

  7. Laryngopyocele: report of a rare case and systematic review.

    PubMed

    Al-Yahya, Syarifah N; Baki, Marina Mat; Saad, Sakina Mohd; Azman, Mawaddah; Mohamad, Abdullah Sani

    2016-01-01

    A systematic review of laryngopyoceles across the Ovid, PubMed, and Google Scholar databases was conducted. A total of 61 papers published between 1952 and 2015 were found. Of these, 23 cases written in English, which described the number of cases, surgical approaches, resort to tracheostomy, complications, and outcomes, were shortlisted. Four cases of laryngopyoceles were managed endoscopically using a cold instrument, microdebrider, or laser. Eighteen cases were operated on via an external approach, and 1 case used both approaches. One of the 4 endoscopic and 10 of the 18 external approaches involved tracheostomy. The present study aimed to report a case of a large mixed laryngopyocele that was successfully drained and marsupialized endoscopically using suction diathermy without requiring tracheostomy. Management using suction diathermy for excision and marsupialization of a laryngopyocele has never been reported and can be recommended as a feasible method due to its widespread availability. In the presence of a large laryngopyocele impeding the airway, tracheostomy may be averted in a controlled setting.

  8. A Novel Method of Case Representation and Retrieval in CBR for E-Learning

    ERIC Educational Resources Information Center

    Khamparia, Aditya; Pandey, Babita

    2017-01-01

    In this paper we discuss a novel method developed for the representation and retrieval of cases in case-based reasoning (CBR) as part of an e-learning system based on various student features. In this approach we have integrated an Artificial Neural Network (ANN) with Data Mining (DM) and CBR. The ANN is used to find the…
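
    Whatever similarity model the paper's ANN learns, the retrieval step of CBR itself is just nearest-neighbour search over stored cases. A deliberately small sketch with invented student-feature vectors and stored solutions, using plain Euclidean distance in place of the learned measure:

      import numpy as np

      # Toy case base: rows are stored student-feature vectors, with the
      # solution kept for each case (features and labels invented).
      case_features = np.array([[0.9, 0.2, 3.0],
                                [0.4, 0.8, 1.0],
                                [0.1, 0.5, 2.0]])
      case_solutions = ["advanced module", "visual tutorial", "revision quiz"]

      def retrieve(query, k=1):
          """Return the k most similar stored cases (Euclidean distance)."""
          d = np.linalg.norm(case_features - query, axis=1)
          return [case_solutions[i] for i in np.argsort(d)[:k]]

      print(retrieve(np.array([0.8, 0.3, 3.0])))         # ['advanced module']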

  9. The Physics of Music with Interdisciplinary Approach: A Case of Prospective Music Teachers

    ERIC Educational Resources Information Center

    Turna, Özge; Bolat, Mualla

    2016-01-01

    Physics of music is an area that is covered by interdisciplinary approach. In this study it is aimed to determine prospective music teachers' level of association with physics concepts which are related to music. The research is a case study which combines qualitative and quantitative methods. Eighty-four students who were studying at the…

  10. Ischemic stroke enhancement in computed tomography scans using a computational approach

    NASA Astrophysics Data System (ADS)

    Alves, Allan F. F.; Pavan, Ana L. M.; Jennane, Rachid; Miranda, José R. A.; Freitas, Carlos C. M.; Abdala, Nitamar; Pina, Diana R.

    2018-03-01

    In this work, a novel approach was proposed to enhance the visual perception of ischemic stroke in computed tomography scans. Through different image processing techniques, we enabled less experienced physicians to reliably detect early signs of stroke. A set of 40 retrospective CT scans of patients was used, divided into two groups: 25 cases of acute ischemic stroke and 15 normal cases used as a control group. All scans were obtained within 4 hours of symptom onset. Our approach was based on a variational decomposition model and three different segmentation methods. A test determined observers' performance in correctly diagnosing stroke cases. The Expectation Maximization method provided the best results among all observers. The overall sensitivity of the observers' analysis was 64% and increased to 79%. The overall specificity was 67% and increased to 78%. These results show the importance of a computational tool to assist neuroradiology decisions, especially in critical situations such as the diagnosis of ischemic stroke.
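
    Of the segmentation methods compared, Expectation Maximization is the one that can be sketched generically: fit a Gaussian mixture to the intensity histogram and read the segmentation off the posterior labels. A toy illustration on a synthetic "slice" with a subtly hypodense region; the variational decomposition stage and the study's actual pipeline are not reproduced:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)

      # Synthetic "slice": normal tissue plus a subtly hypodense region that
      # stands in for early ischemic change (intensities invented).
      img = rng.normal(40.0, 4.0, size=(128, 128))
      img[40:70, 50:90] = rng.normal(34.0, 4.0, size=(30, 40))

      # EM fits a two-component Gaussian mixture to the intensities; posterior
      # component labels give the segmentation.
      gmm = GaussianMixture(n_components=2, random_state=0).fit(img.reshape(-1, 1))
      labels = gmm.predict(img.reshape(-1, 1)).reshape(img.shape)
      lesion = labels == np.argmin(gmm.means_.ravel())   # the darker component
      print(lesion[50:60, 60:70].mean())                 # mostly True in the lesion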

  11. Comparative study: TQ and Lean Production ownership models in health services

    PubMed Central

    Eiro, Natalia Yuri; Torres-Junior, Alvair Silveira

    2015-01-01

    Objective: to compare the application of Total Quality (TQ) models used in the processes of a health service with lean healthcare cases and with literature from another institution that has also applied the lean model. Method: this is qualitative research that was conducted through a descriptive case study. Results: through critical analysis of the institutions studied, it was possible to compare the traditional quality approach observed in one case with the theoretical and practical lean production approach used in the other case, as described below. Conclusion: the research identified that the lean model was better suited to people who work systemically and generate flow. It also pointed towards some potential challenges in the introduction and implementation of lean methods in health. PMID:26487134

  12. An integrated lean-methods approach to hospital facilities redesign.

    PubMed

    Nicholas, John

    2012-01-01

    Lean production methods for eliminating waste and improving processes in manufacturing are now being applied in healthcare. As the author shows, the methods are appropriate for redesigning hospital facilities. When used in an integrated manner and employing teams of mostly clinicians, the methods produce facility designs that are custom-fit to patient needs and caregiver work processes, and reduce operational costs. The author reviews lean methods and an approach for integrating them in the redesign of hospital facilities. A case example of the redesign of an emergency department shows the feasibility and benefits of the approach.

  13. Theory of the dynamical thermal conductivity of metals

    NASA Astrophysics Data System (ADS)

    Bhalla, Pankaj; Kumar, Pradeep; Das, Nabyendu; Singh, Navinder

    2016-09-01

    Mori's projection method, known as the memory function method, is an important theoretical formalism for studying various transport coefficients. In the present work, we calculate the dynamical thermal conductivity of metals using the memory function formalism. We introduce thermal memory functions for the first time and discuss the behavior of the thermal conductivity in both the zero-frequency limit and the case of nonzero frequencies. We compare our results for the zero-frequency case with the results obtained by the Bloch-Boltzmann kinetic approach and find that both approaches agree with each other. Motivated by some recent experimental advancements, we obtain several new results for the ac, or dynamical, thermal conductivity.

  14. Laparoscopic Approach for Metachronous Cecal and Sigmoid Volvulus

    PubMed Central

    Greenstein, Alexander J.; Zisman, Sharon R.

    2010-01-01

    Background: Metachronous colonic volvulus is a rare event that has never been approached laparoscopically. Methods: Here we discuss the case of a 63-year-old female with a metachronous sigmoid and cecal volvulus. Results: The patient underwent 2 separate successful laparoscopic resections. Discussion and Conclusion: The following is a discussion of the case and the laparoscopic technique, accompanied by a brief review of colonic volvulus. In experienced hands, laparoscopy is a safe approach for acute colonic volvulus. PMID:21605523

  15. Beyond Compliance Checking: A Situated Approach to Visual Research Ethics.

    PubMed

    Lenette, Caroline; Botfield, Jessica R; Boydell, Katherine; Haire, Bridget; Newman, Christy E; Zwi, Anthony B

    2018-03-19

    Visual research methods like photography and digital storytelling are increasingly used in health and social sciences research as participatory approaches that benefit participants, researchers, and audiences. Visual methods involve a number of additional ethical considerations such as using identifiable content and ownership of creative outputs. As such, ethics committees should use different assessment frameworks to consider research protocols with visual methods. Here, we outline the limitations of ethics committees in assessing projects with a visual focus and highlight the sparse knowledge on how researchers respond when they encounter ethical challenges in the practice of visual research. We propose a situated approach in relation to visual methodologies that encompasses a negotiated, flexible approach, given that ethical issues usually emerge in relation to the specific contexts of individual research projects. Drawing on available literature and two case studies, we identify and reflect on nuanced ethical implications in visual research, like tensions between aesthetics and research validity. The case studies highlight strategies developed in-situ to address the challenges two researchers encountered when using visual research methods, illustrating that some practice implications are not necessarily addressed using established ethical clearance procedures. A situated approach can ensure that visual research remains ethical, engaging, and rigorous.

  16. A three-dimensional quality-guided phase unwrapping method for MR elastography

    NASA Astrophysics Data System (ADS)

    Wang, Huifang; Weaver, John B.; Perreard, Irina I.; Doyley, Marvin M.; Paulsen, Keith D.

    2011-07-01

    Magnetic resonance elastography (MRE) uses accumulated phases that are acquired at multiple, uniformly spaced relative phase offsets, to estimate harmonic motion information. Heavily wrapped phase occurs when the motion is large and unwrapping procedures are necessary to estimate the displacements required by MRE. Two unwrapping methods were developed and compared in this paper. The first method is a sequentially applied approach. The three-dimensional MRE phase image block for each slice was processed by two-dimensional unwrapping followed by a one-dimensional phase unwrapping approach along the phase-offset direction. This unwrapping approach generally works well for low noise data. However, there are still cases where the two-dimensional unwrapping method fails when noise is high. In this case, the baseline of the corrupted regions within an unwrapped image will not be consistent. Instead of separating the two-dimensional and one-dimensional unwrapping in a sequential approach, an interleaved three-dimensional quality-guided unwrapping method was developed to combine both the two-dimensional phase image continuity and one-dimensional harmonic motion information. The quality of one-dimensional harmonic motion unwrapping was used to guide the three-dimensional unwrapping procedures and it resulted in stronger guidance than in the sequential method. In this work, in vivo results generated by the two methods were compared.
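
    The 1-D half of the sequential approach, unwrapping the accumulated phase along the phase-offset direction, is the easy part to show: provided the true phase changes by less than pi between adjacent offsets, numpy's unwrap recovers it from the wrapped samples. A single-voxel toy example (amplitude and noise invented); the quality-guided 3-D interleaving is the paper's contribution and is not sketched here:

      import numpy as np

      rng = np.random.default_rng(0)

      # Accumulated phase of one voxel at 8 uniformly spaced phase offsets:
      # a harmonic large enough to wrap, plus a little noise (values invented).
      offsets = np.arange(8) * 2 * np.pi / 8
      true_phase = 3.8 * np.sin(offsets + 0.4)
      wrapped = np.angle(np.exp(1j * (true_phase + 0.02 * rng.normal(size=8))))

      # The 1-D step of the sequential method: unwrap along the offset axis.
      # Valid while the true phase changes by less than pi between offsets.
      unwrapped = np.unwrap(wrapped)
      print(np.round(unwrapped - true_phase, 2))         # ~0 everywhere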

  17. Meta-Synthetic Support Frameworks for Reuse of Government Information Resources on City Travel and Traffic: The Case of Beijing

    ERIC Educational Resources Information Center

    An, Xiaomi; Xu, Shaotong; Mu, Yong; Wang, Wei; Bai, Xian Yang; Dawson, Andy; Han, Hongqi

    2012-01-01

    Purpose: The purpose of this paper is to propose meta-synthetic ideas and knowledge asset management approaches to build a comprehensive strategic framework for Beijing City in China. Design/methodology/approach: Methods include a review of relevant literature in both English and Chinese, case studies of different types of support frameworks in…

  18. Implementing the Project Approach: A Case Study of Hybrid Pedagogy in a Hong Kong Kindergarten

    ERIC Educational Resources Information Center

    Chen, Jennifer J.; Li, Hui; Wang, Jing-ying

    2017-01-01

    The Project Approach has been promoted in Hong Kong kindergartens since the 1990s. However, the dynamic processes and underlying mechanisms involved in the teachers' implementation of this pedagogical method there have not yet been fully investigated. This case study of one typical kindergarten in Hong Kong documented how and why eight teachers…

  19. A Qualitative Case Study Approach To Examine Information Resources Management. (Utilisation d'une Approche Qualitative par Methode de cas pour Etudier la Gestion des Ressources D'information).

    ERIC Educational Resources Information Center

    Bergeron, Pierrette

    1997-01-01

    Illustrates how a qualitative approach was used to study the complex and poorly defined concept of information resources management. Explains the general approach to data collection, its advantages and limitations, and the process used to analyze the data. Presents results, along with lessons learned through using the method. (Author/AEF)

  20. Simple F Test Reveals Gene-Gene Interactions in Case-Control Studies

    PubMed Central

    Chen, Guanjie; Yuan, Ao; Zhou, Jie; Bentley, Amy R.; Adeyemo, Adebowale; Rotimi, Charles N.

    2012-01-01

    Missing heritability is still a challenge for Genome Wide Association Studies (GWAS). Gene-gene interactions may partially explain this residual genetic influence and contribute broadly to complex disease. To analyze gene-gene interactions in case-control studies of complex disease, we propose a simple, non-parametric method that utilizes the F-statistic. This approach consists of three steps. First, we examine the joint distribution of a pair of SNPs in cases and controls separately. Second, an F-test is used to evaluate the ratio of dependence in cases to that in controls. Finally, results are adjusted for multiple tests. This method was used to evaluate gene-gene interactions associated with the risk of Type 2 Diabetes among African Americans in the Howard University Family Study. We identified 18 gene-gene interactions (P < 0.0001). Compared with the commonly used logistic regression method, we demonstrate that the F-ratio test is an efficient approach to measuring gene-gene interactions, especially for studies with limited sample size. PMID:22837643
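
    The paper's analytic F-statistic and multiple-testing step cannot be reconstructed from the abstract alone, but the underlying comparison can be sketched: measure the dependence of a SNP pair separately in cases and controls, take the ratio, and calibrate it, here by label permutation rather than an F reference distribution. Squared correlation as the dependence measure and all data are illustrative assumptions:

      import numpy as np

      rng = np.random.default_rng(0)

      def dependence(g1, g2):
          """Squared correlation between two genotype vectors (0/1/2 coding)."""
          return np.corrcoef(g1, g2)[0, 1] ** 2

      def ratio_stat(cases, controls):
          """Ratio of SNP-pair dependence in cases to that in controls."""
          return dependence(*cases) / max(dependence(*controls), 1e-12)

      # Toy data: the SNP pair is correlated among cases only.
      g1_cases = rng.integers(0, 3, 400)
      g2_cases = np.clip(g1_cases + rng.integers(-1, 2, 400), 0, 2)
      g1_ctrl, g2_ctrl = rng.integers(0, 3, (2, 400))

      obs = ratio_stat((g1_cases, g2_cases), (g1_ctrl, g2_ctrl))

      # Calibrate by shuffling case/control labels (permutation reference).
      g1, g2 = np.r_[g1_cases, g1_ctrl], np.r_[g2_cases, g2_ctrl]
      n = len(g1_cases)
      hits = 0
      for _ in range(999):
          idx = rng.permutation(len(g1))
          s = ratio_stat((g1[idx[:n]], g2[idx[:n]]), (g1[idx[n:]], g2[idx[n:]]))
          hits += s >= obs
      print(obs, (1 + hits) / 1000)                      # statistic and p-value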

  1. Applying Real Options for Evaluating Investments in ERP Systems

    NASA Astrophysics Data System (ADS)

    Nakagane, Jun; Sekozawa, Teruji

    This paper verifies the effectiveness of the real options approach for evaluating investments in Enterprise Resource Planning (ERP) systems and shows how important it is to disclose shadow options potentially embedded in an ERP investment. The net present value (NPV) method is principally adopted to evaluate the value of ERP. However, the NPV method assumes that no uncertainties exist in the object, which does not fit current business circumstances, filled as they are with dynamic issues. Since the 1990s, the effectiveness of option pricing models for resolving the issues of the NPV method in Information System (IS) investment has been discussed in the IS literature. This paper presents 3 business cases to review the practical advantages of such techniques for IS investments, especially ERP investments. The first case is EDI development; we evaluate the project with a new approach that brings to light one of its shadow options, EDI implementation. In the second case we reveal that an ERP investment has an "expanding option," in a case of eliminating redundancy. The third case describes an option to contract which is deliberately slotted into ERP development in preparation for transferring a manufacturing facility.
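
    The contrast the paper draws can be made concrete with the standard textbook device: value the option to defer an investment as a European call on the project's benefits, with the investment cost as strike. The numbers below are invented; the point is only that a project rejected on static NPV can still carry positive option value under uncertainty:

      from math import erf, exp, log, sqrt

      def norm_cdf(x):
          return 0.5 * (1.0 + erf(x / sqrt(2.0)))

      def npv(benefit_pv, cost):
          return benefit_pv - cost

      def deferral_option(benefit_pv, cost, sigma, t, r):
          """Black-Scholes value of waiting: a call on the project's benefits
          with the investment cost as strike (all figures invented)."""
          d1 = (log(benefit_pv / cost) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
          d2 = d1 - sigma * sqrt(t)
          return benefit_pv * norm_cdf(d1) - cost * exp(-r * t) * norm_cdf(d2)

      print(npv(9.5, 10.0))                              # -0.5: static NPV rejects
      print(deferral_option(9.5, 10.0, sigma=0.4, t=2.0, r=0.03))  # ~2.1: wait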

  2. Inferring imagined speech using EEG signals: a new approach using Riemannian manifold features

    NASA Astrophysics Data System (ADS)

    Nguyen, Chuong H.; Karavas, George K.; Artemiadis, Panagiotis

    2018-02-01

    Objective. In this paper, we investigate the suitability of imagined speech for brain-computer interface (BCI) applications. Approach. A novel method based on covariance matrix descriptors, which lie on a Riemannian manifold, and the relevance vector machines classifier is proposed. The method is applied to electroencephalographic (EEG) signals and tested on multiple subjects. Main results. The method is shown to outperform other approaches in the field with respect to accuracy and robustness. The algorithm is validated on various categories of speech, such as imagined pronunciation of vowels, short words and long words. The classification accuracy of our methodology is in all cases significantly above chance level, reaching a maximum of 70% for cases where we classify three words and 95% for cases of two words. Significance. The results reveal certain aspects that may affect the success of speech imagery classification from EEG signals, such as sound, meaning and word complexity. This can potentially extend the capability of utilizing speech imagery in future BCI applications. The dataset of speech imagery collected from a total of 15 subjects is also published.
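
    The covariance-descriptor idea can be sketched independently of the classifier: each EEG epoch becomes a symmetric positive-definite (SPD) spatial covariance matrix, and epochs are compared with the affine-invariant Riemannian distance, the root sum of squared log-eigenvalues of A⁻¹B. A toy sketch (channel count and data invented; the relevance vector machine stage is omitted):

      import numpy as np
      from scipy.linalg import eigh

      def spd_cov(epoch):
          """Regularized spatial covariance of an (n_channels, n_samples) epoch."""
          c = np.cov(epoch)
          return c + 1e-6 * np.eye(c.shape[0])

      def riemann_dist(A, B):
          """Affine-invariant Riemannian distance between SPD matrices:
          the root sum of squared log generalized eigenvalues of (B, A)."""
          w = eigh(B, A, eigvals_only=True)  # real, positive for SPD inputs
          return np.sqrt(np.sum(np.log(w) ** 2))

      rng = np.random.default_rng(0)
      epoch_a = rng.normal(size=(8, 256))        # toy 8-channel EEG epochs
      epoch_b = 1.5 * rng.normal(size=(8, 256))
      print(riemann_dist(spd_cov(epoch_a), spd_cov(epoch_b)))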

  3. Who's in and why? A typology of stakeholder analysis methods for natural resource management.

    PubMed

    Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C

    2009-04-01

    Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships.

  4. A Tale of Two Methods: Chart and Interview Methods for Identifying Delirium

    PubMed Central

    Saczynski, Jane S.; Kosar, Cyrus M.; Xu, Guoquan; Puelle, Margaret R.; Schmitt, Eva; Jones, Richard N.; Marcantonio, Edward R.; Wong, Bonnie; Isaza, Ilean; Inouye, Sharon K.

    2014-01-01

    Background Interview and chart-based methods for identifying delirium have been validated. However, the relative strengths and limitations of each method have not been described, nor has a combined approach (using both interview and chart) been systematically examined. Objectives To compare chart and interview-based methods for identification of delirium. Design, Setting and Participants Participants were 300 patients aged 70+ undergoing major elective surgery (the majority orthopedic surgery) interviewed daily during hospitalization for delirium using the Confusion Assessment Method (CAM; interview-based method) and whose medical charts were reviewed for delirium using a validated chart-review method (chart-based method). We examined the rate of agreement between the two methods and the patient characteristics of those identified using each approach. Predictive validity for clinical outcomes (length of stay, postoperative complications, discharge disposition) was compared. In the absence of a gold standard, predictive value could not be calculated. Results The cumulative incidence of delirium was 23% (n=68) by the interview-based method, 12% (n=35) by the chart-based method and 27% (n=82) by the combined approach. Overall agreement was 80%; kappa was 0.30. The methods differed in detection of psychomotor features and time of onset. The chart-based method missed delirium in CAM-identified patients lacking features of psychomotor agitation or inappropriate behavior. The CAM-based method missed chart-identified cases occurring during the night shift. The combined method had high predictive validity for all clinical outcomes. Conclusions Interview and chart-based methods have specific strengths for identification of delirium. A combined approach captures the largest number and the broadest range of delirium cases. PMID:24512042

  5. Stochastic and deterministic multiscale models for systems biology: an auxin-transport case study.

    PubMed

    Twycross, Jamie; Band, Leah R; Bennett, Malcolm J; King, John R; Krasnogor, Natalio

    2010-03-26

    Stochastic and asymptotic methods are powerful tools in developing multiscale systems biology models; however, little has been done in this context to compare the efficacy of these methods. The majority of current systems biology modelling research, including that of auxin transport, uses numerical simulations to study the behaviour of large systems of deterministic ordinary differential equations, with little consideration of alternative modelling frameworks. In this case study, we solve an auxin-transport model using analytical methods, deterministic numerical simulations and stochastic numerical simulations. Although the three approaches in general predict the same behaviour, the approaches provide different information that we use to gain distinct insights into the modelled biological system. We show in particular that the analytical approach readily provides straightforward mathematical expressions for the concentrations and transport speeds, while the stochastic simulations naturally provide information on the variability of the system. Our study provides a constructive comparison which highlights the advantages and disadvantages of each of the considered modelling approaches. This will prove helpful to researchers when weighing up which modelling approach to select. In addition, the paper goes some way to bridging the gap between these approaches, which in the future we hope will lead to integrative hybrid models.
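    The deterministic-versus-stochastic contrast drawn above can be illustrated on a toy two-compartment transport chain (not the auxin model itself): the ODE solution gives the mean behaviour, while a Gillespie simulation of the same rates exposes the variability that the deterministic model cannot show.

```python
# Hedged toy contrast: dC1/dt = -k*C1, dC2/dt = k*C1 - k*C2 solved
# deterministically, alongside one exact stochastic (Gillespie) realization.
import numpy as np
from scipy.integrate import solve_ivp

k = 0.5
ode = solve_ivp(lambda t, c: [-k * c[0], k * c[0] - k * c[1]], (0, 10), [100, 0])

rng = np.random.default_rng(9)
def gillespie(n1, n2, t_end=10.0):
    t = 0.0
    while t < t_end:
        rates = np.array([k * n1, k * n2])    # hop out of cell 1, out of cell 2
        if rates.sum() == 0:
            break
        t += rng.exponential(1 / rates.sum())
        if rng.random() < rates[0] / rates.sum():
            n1 -= 1; n2 += 1
        else:
            n2 -= 1
    return n1, n2

print(ode.y[:, -1], gillespie(100, 0))        # mean vs. one stochastic realization
```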

  6. Induction of Lactation in the Biological Mother After Gestational Surrogacy of Twins: A Novel Approach and Review of Literature.

    PubMed

    Farhadi, Roya; Philip, Roy K

    One of the important challenges in surrogate pregnancies is the early bonding of the genetic mother with her infant and the establishment of breastfeeding. A combination of pharmacological and nonpharmacological methods is often used for the induction of lactation. Reports of induced lactation in surrogacy are limited and scattered. In this report, we present a case of induced lactation and initiation of breastfeeding in preterm twins by the genetic mother, through her novel approach, after a gestational surrogate pregnancy. A thematic account of the mother's experience is summarized with context and rigor. We review the reported literature on induced lactation in similar cases with the aim of addressing the various methods adopted.

  7. Improved Bone Graft Method for Upper Cervical Surgery with Posterior Approach: Technical Description and Report of 52 Cases.

    PubMed

    Wang, Yong-Li; Wang, Xiang-Yang

    2018-06-01

    We sought to report the minimum 12-month follow-up results of our improved bone graft method for upper cervical surgery with the posterior approach. Among 52 consecutive cases, odontoid nonunion occurred in 33 patients, atlantoaxial instability in 11 patients, and occipitocervical deformity in 8 patients; these patients underwent posterior C1-C2 transarticular screw/screw-rod internal fixation (41 cases) or occipitocervical fusion (11 cases) with the improved bone graft technique. Each surgical procedure was performed by the same senior spine surgeon. We took lateral cervical standing roentgenograms before surgery and immediately after surgery, then conducted craniocerebral computed tomography examination with reconstruction at 3, 6, 12, and 24 months and annually thereafter. Postoperative follow-up ranged from 12 to 38 months. All cases showed satisfactory screw fixation on radiographic examination, and there were no postoperative neurologic complications. One case had postoperative retropharyngeal infection after transoral release and posterior reduction by pedicle screw instrumentation. All patients achieved solid fusion by the 3-month follow-up, and no pseudarthrosis occurred. A good bone graft bed, sufficient bone graft material, solid local fixation, and an effective bone graft method are prerequisites for a successful bone graft. Based on the postoperative follow-up of the consecutive cases in this study, our bone graft method, which describes a new bone graft structure, is a reliable posterior fusion technique. It is worth considering, and further research is needed. Copyright © 2018. Published by Elsevier Inc.

  8. An investigative framework to facilitate epidemiological thinking during herd problem-solving.

    PubMed

    More, Simon J; Doherty, Michael L; O'Grady, Luke

    2017-01-01

    Veterinary clinicians and students commonly use diagnostic approaches appropriate for individual cases when conducting herd problem-solving. However, these approaches can be problematic, in part because they make limited use of epidemiological principles and methods, which have clear application during the investigation of herd problems. In this paper, we provide an overview of diagnostic approaches that are used when investigating individual animal cases, and the challenges faced when these approaches are directly translated from the individual to the herd. Further, we propose an investigative framework to facilitate epidemiological thinking during herd problem-solving. A number of different approaches are used when making a diagnosis on an individual animal, including pattern recognition, hypothetico-deductive reasoning, and the key abnormality method. Methods commonly applied to individuals are often adapted for herd problem-solving: 'comparison with best practice' being a herd-level adaptation of pattern recognition, and 'differential diagnoses' a herd-level adaptation of hypothetico-deductive reasoning. These approaches can be effective; however, challenges can arise. Herds are complex: a collection of individual cows, overlaid with additional layers relating to environment, management, feeding, etc. It is unrealistic to expect seamless translation of diagnostic approaches from the individual to the herd. Comparison with best practice is time-consuming and prioritisation of actions can be problematic, whereas differential diagnoses can lead to 'pathogen hunting', particularly in complex cases. Epidemiology is the science of understanding disease in populations. The focus is on the population, underpinned by principles and utilising methods that seek to allow us to generate solid conclusions from apparently uncontrolled situations. In this paper, we argue for the inclusion of epidemiological principles and methods as an additional tool for herd problem-solving, and outline an investigative framework, with examples, to effectively incorporate these principles and methods with other diagnostic approaches during herd problem-solving. Relevant measures of performance are identified, and measures of case frequencies are calculated and compared across time, in space and among animal groupings, to identify patterns, clues and plausible hypotheses, consistent with potential biological processes. With this knowledge, the subsequent investigation (relevant on-farm activities, diagnostic testing and other examinations) can be focused, and actions prioritised (specifically, those actions that are likely to make the greatest difference in addressing the problem if enacted). In our experience, this investigative framework is an effective teaching tool, facilitating epidemiological thinking among students during herd problem-solving. It is a generic and robust process, suited to many herd-based problems.
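    As a minimal illustration of the pattern-finding step (comparing case frequencies across time and animal groupings), consider the following sketch; the herd records are invented.

```python
# Minimal sketch: case frequencies compared across months and animal groups.
import pandas as pd

records = pd.DataFrame({
    "month":   ["Jan", "Jan", "Feb", "Feb", "Mar", "Mar"],
    "group":   ["heifers", "cows", "heifers", "cows", "heifers", "cows"],
    "cases":   [2, 1, 5, 1, 6, 2],
    "at_risk": [40, 120, 40, 120, 40, 120],
})

freq = records.groupby(["month", "group"], sort=False)[["cases", "at_risk"]].sum()
freq["frequency"] = freq["cases"] / freq["at_risk"]
print(freq)   # a rising frequency in heifers would focus the investigation there
```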

  9. Improvement of a stability-indicating method by Quality-by-Design versus Quality-by-Testing: a case of a learning process.

    PubMed

    Hubert, C; Lebrun, P; Houari, S; Ziemons, E; Rozet, E; Hubert, Ph

    2014-01-01

    The understanding of the method is a major concern when developing a stability-indicating method, and even more so when dealing with impurity assays from complex matrices. In the case study presented, a Quality-by-Design approach was applied in order to optimize a routinely used method: an analytical issue occurring at the last stage of a long-term stability study, in which unexpected impurities perturbed the monitoring of characterized impurities, needed to be resolved. A compliant Quality-by-Design (QbD) methodology based on a Design of Experiments (DoE) approach was evaluated within the framework of a Liquid Chromatography (LC) method. This approach allows the investigation of the Critical Process Parameters (CPPs) that have an impact on Critical Quality Attributes (CQAs) and, consequently, on LC selectivity. Using polynomial regression response modeling as well as Monte Carlo simulations for error propagation, the Design Space (DS) was computed in order to determine robust working conditions for the developed stability-indicating method. This QbD-compliant development was conducted in two phases, allowing the Design Space knowledge acquired during the first phase to define the experimental domain of the second phase, which constitutes a learning process. The selected working condition was then fully validated using accuracy profiles based on statistical tolerance intervals in order to evaluate the reliability of the results generated by this LC/ESI-MS stability-indicating method. A comparison was made between the traditional Quality-by-Testing (QbT) approach and the QbD strategy, highlighting the benefit of the QbD strategy in the case of an unexpected-impurities issue. On this basis, the advantages of a systematic use of the QbD methodology are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
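    A hedged sketch of the design-space step follows: a quadratic response model (assumed already fitted from the DoE) links two hypothetical CPPs (pH and gradient time) to a resolution CQA, and Monte Carlo propagation of the coefficient uncertainty gives the probability of meeting the specification at candidate operating points. The model and all numbers are illustrative, not the paper's.

```python
# Hedged sketch: Monte Carlo error propagation over a fitted response model
# to map where P(resolution >= spec) is high enough, i.e., the design space.
import numpy as np

rng = np.random.default_rng(0)
coef_mean = np.array([1.2, 0.40, 0.05, -0.01, -0.03])       # illustrative fit
coef_cov = np.diag([0.02, 0.005, 0.0005, 0.0002, 0.0005])   # its uncertainty

def resolution(ph, gt, c):
    return c[0] + c[1]*ph + c[2]*gt + c[3]*ph*gt + c[4]*ph**2

def prob_in_spec(ph, gt, spec=1.5, n_draws=5000):
    draws = rng.multivariate_normal(coef_mean, coef_cov, n_draws)
    return (resolution(ph, gt, draws.T) >= spec).mean()

# the design space is the region where the assurance exceeds, say, 0.95
for ph in (2.5, 3.0, 3.5):
    print(ph, prob_in_spec(ph, gt=20.0))
```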

  10. An Adaptive Unstructured Grid Method by Grid Subdivision, Local Remeshing, and Grid Movement

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    1999-01-01

    An unstructured grid adaptation technique has been developed and successfully applied to several three dimensional inviscid flow test cases. The approach is based on a combination of grid subdivision, local remeshing, and grid movement. For solution adaptive grids, the surface triangulation is locally refined by grid subdivision, and the tetrahedral grid in the field is partially remeshed at locations of dominant flow features. A grid redistribution strategy is employed for geometric adaptation of volume grids to moving or deforming surfaces. The method is automatic and fast and is designed for modular coupling with different solvers. Several steady state test cases with different inviscid flow features were tested for grid/solution adaptation. In all cases, the dominant flow features, such as shocks and vortices, were accurately and efficiently predicted with the present approach. A new and robust method of moving tetrahedral "viscous" grids is also presented and demonstrated on a three-dimensional example.

  11. Charting the Learning Journey of a Group of Adults Returning to Education

    ERIC Educational Resources Information Center

    Mooney, Des

    2011-01-01

    Using a qualitative case study method, the researcher studied a group of adults returning to education who were completing a childcare course. Methods used included focus groups, a questionnaire and observations. Using a holistic analysis approach to the case (Yin 2003), the researcher then focused on a number of key issues. From this analysis the themes of…

  12. Functional Assessment-Based Interventions for Students with or At-Risk for High-Incidence Disabilities: Field Testing Single-Case Synthesis Methods

    ERIC Educational Resources Information Center

    Common, Eric Alan; Lane, Kathleen Lynne; Pustejovsky, James E.; Johnson, Austin H.; Johl, Liane Elizabeth

    2017-01-01

    This systematic review investigated one systematic approach to designing, implementing, and evaluating functional assessment-based interventions (FABI) for use in supporting school-age students with or at-risk for high-incidence disabilities. We field tested several recently developed methods for single-case design syntheses. First, we appraised…

  13. [A new methodological approach for leptospira persistence studies in case of mixed leptospirosis].

    PubMed

    Samsonova, A P; Petrov, E M; Vyshivkina, N V; Anan'ina, Iu V

    2003-01-01

    A new methodological approach for Leptospira persistence studies in cases of mixed leptospirosis, based on the use of PCR test systems with different taxonomic specificity for the indication and identification of leptospires, was developed. Two PCR test systems (G and B) were used in experiments on BALB/c white mice to study patterns of the development of mixed infection caused by leptospires of serovar poi (genomospecies L. borgpeterseni) and grippotyphosa (genomospecies L. kirschneri). The authors conclude that this method holds good promise for studies of the symbiotic relationships of leptospires both in vivo and in vitro.

  14. A simple, stable, and accurate linear tetrahedral finite element for transient, nearly, and fully incompressible solid dynamics: A dynamic variational multiscale approach [A simple, stable, and accurate tetrahedral finite element for transient, nearly incompressible, linear and nonlinear elasticity: A dynamic variational multiscale approach

    DOE PAGES

    Scovazzi, Guglielmo; Carnes, Brian; Zeng, Xianyi; ...

    2015-11-12

    Here, we propose a new approach for the stabilization of linear tetrahedral finite elements in the case of nearly incompressible transient solid dynamics computations. Our method is based on a mixed formulation, in which the momentum equation is complemented by a rate equation for the evolution of the pressure field, approximated with piece-wise linear, continuous finite element functions. The pressure equation is stabilized to prevent spurious pressure oscillations in computations. Incidentally, it is also shown that many stabilized methods previously developed for the static case do not generalize easily to transient dynamics. Extensive tests in the context of linear and nonlinear elasticity are used to corroborate the claim that the proposed method is robust, stable, and accurate.
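    Schematically, the mixed rate form described above couples the momentum balance with an evolution equation for the pressure, with κ denoting the bulk modulus; the precise signs and the added stabilization terms vary across references, so this is only a sketch of the structure:

$$\rho\,\dot{\mathbf{v}} = \nabla \cdot \boldsymbol{\sigma}(\mathbf{u},p) + \rho\,\mathbf{b}, \qquad \frac{\dot{p}}{\kappa} + \nabla \cdot \mathbf{v} = 0,$$

where the pressure equation is discretized with piece-wise linear, continuous finite elements and augmented with a stabilization term to suppress spurious pressure oscillations.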

  15. A simple, stable, and accurate linear tetrahedral finite element for transient, nearly, and fully incompressible solid dynamics: A dynamic variational multiscale approach [A simple, stable, and accurate tetrahedral finite element for transient, nearly incompressible, linear and nonlinear elasticity: A dynamic variational multiscale approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scovazzi, Guglielmo; Carnes, Brian; Zeng, Xianyi

    Here, we propose a new approach for the stabilization of linear tetrahedral finite elements in the case of nearly incompressible transient solid dynamics computations. Our method is based on a mixed formulation, in which the momentum equation is complemented by a rate equation for the evolution of the pressure field, approximated with piece-wise linear, continuous finite element functions. The pressure equation is stabilized to prevent spurious pressure oscillations in computations. Incidentally, it is also shown that many stabilized methods previously developed for the static case do not generalize easily to transient dynamics. Extensive tests in the context of linear and nonlinear elasticity are used to corroborate the claim that the proposed method is robust, stable, and accurate.

  16. Advanced scatter search approach and its application in a sequencing problem of mixed-model assembly lines in a case company

    NASA Astrophysics Data System (ADS)

    Liu, Qiong; Wang, Wen-xi; Zhu, Ke-ren; Zhang, Chao-yong; Rao, Yun-qing

    2014-11-01

    Mixed-model assembly line sequencing is significant in reducing the production time and overall cost of production. To improve production efficiency, a mathematical model aiming simultaneously to minimize overtime, idle time and total set-up costs is developed. To obtain high-quality and stable solutions, an advanced scatter search approach is proposed. In the proposed algorithm, a new diversification generation method based on a genetic algorithm is presented to generate a set of potentially diverse and high-quality initial solutions. Many methods, including reference set update, subset generation, solution combination and improvement methods, are designed to maintain the diversification of populations and to obtain high-quality ideal solutions. The proposed model and algorithm are applied and validated in a case company. The results indicate that the proposed advanced scatter search approach is significant for mixed-model assembly line sequencing in this company.
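    A hedged skeleton of the scatter search loop named above (diversification, reference set update, subset generation, combination, improvement) is sketched below for a generic permutation-sequencing objective; the GA-based diversification and the paper's cost model are simplified away, and all parameters are illustrative.

```python
# Hedged skeleton of a scatter search over job sequences (permutations).
import random

def scatter_search(cost, n_jobs, ref_size=10, iters=50, pool=100):
    population = [random.sample(range(n_jobs), n_jobs) for _ in range(pool)]  # diversification
    refset = sorted(population, key=cost)[:ref_size]          # reference set update
    for _ in range(iters):
        a, b = random.sample(refset, 2)                       # subset generation
        cut = random.randrange(1, n_jobs)                     # combination: order crossover
        child = a[:cut] + [j for j in b if j not in a[:cut]]
        i, j = sorted(random.sample(range(n_jobs), 2))        # improvement: swap move
        trial = child[:]; trial[i], trial[j] = trial[j], trial[i]
        best_child = min((child, trial), key=cost)
        refset = sorted(refset + [best_child], key=cost)[:ref_size]
    return refset[0]

# toy objective: penalize jobs sequenced far from their "due position"
cost = lambda seq: sum(abs(pos - job) for pos, job in enumerate(seq))
print(scatter_search(cost, n_jobs=8))
```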

  17. Using Case Studies: An International Approach

    ERIC Educational Resources Information Center

    McClam, Tricia; Woodside, Marianne

    2005-01-01

    Case studies as an instructional strategy have been used in many disciplines, including law, teacher education, science, medicine, and business. Among the benefits of this method of instruction are involving students in learning, developing their critical thinking skills, promoting communication, and engaging in critical analysis. Case studies are…

  18. Comparing Management Approaches for Automatic Test Systems: A Strategic Missile Case Study

    DTIC Science & Technology

    2005-03-01

    ground up, and is commonly conducted following five methods: ethnography, grounded theory, case study, phenomenological study, and biography...traditions frequently used (Creswell, 1998:5). The five traditions are biography, phenomenological study, grounded theory study, ethnography, and...

  19. Feminist Approaches to Triangulation: Uncovering Subjugated Knowledge and Fostering Social Change in Mixed Methods Research

    ERIC Educational Resources Information Center

    Hesse-Biber, Sharlene

    2012-01-01

    This article explores the deployment of triangulation in the service of uncovering subjugated knowledge and promoting social change for women and other oppressed groups. Feminist approaches to mixed methods praxis create a tight link between the research problem and the research design. An analysis of selected case studies of feminist praxis…

  20. Multiple Enactments of Method, Divergent Hinterlands and Production of Multiple Realities in Educational Research

    ERIC Educational Resources Information Center

    Rimpiläinen, Sanna

    2015-01-01

    What do different research methods and approaches "do" in practice? The article seeks to discuss this point by drawing upon socio-material research approaches and empirical examples taken from the early stages of an extensive case study on an interdisciplinary project between two multidisciplinary fields of study, education and computer…

  1. Proposing a sequential comparative analysis for assessing multilateral health agency transformation and sustainable capacity: exploring the advantages of institutional theory

    PubMed Central

    2014-01-01

    Background This article proposes an approach to comparing and assessing the adaptive capacity of multilateral health agencies in meeting country and individual healthcare needs. Most studies comparing multilateral health agencies have failed to clearly propose a method for conducting agency comparisons. Methods This study adopted a qualitative case study methodological approach, in which secondary and primary case study literature was used to conduct case study comparisons of multilateral health agencies. Results Through the proposed Sequential Comparative Analysis (SCA), the author found a more effective way to justify the selection of cases, compare and assess organizational transformative capacity, and learn from agency success in policy sustainability processes. Conclusions To more effectively understand and explain why some multilateral health agencies are more capable of adapting to country and individual healthcare needs, SCA provides a methodological approach that may help to better understand why these agencies are so different and what we can learn from successful reform processes. As funding challenges continue to hamper these agencies' adaptive capacity, learning from each other will become increasingly important. PMID:24886283

  2. Masked Visual Analysis: Minimizing Type I Error in Visually Guided Single-Case Design for Communication Disorders

    PubMed Central

    Hitchcock, Elaine R.; Ferron, John

    2017-01-01

    Purpose Single-case experimental designs are widely used to study interventions for communication disorders. Traditionally, single-case experiments follow a response-guided approach, where design decisions during the study are based on participants' observed patterns of behavior. However, this approach has been criticized for its high rate of Type I error. In masked visual analysis (MVA), response-guided decisions are made by a researcher who is blinded to participants' identities and treatment assignments. MVA also makes it possible to conduct a hypothesis test assessing the significance of treatment effects. Method This tutorial describes the principles of MVA, including both how experiments can be set up and how results can be used for hypothesis testing. We then report a case study showing how MVA was deployed in a multiple-baseline across-subjects study investigating treatment for residual errors affecting rhotics. Strengths and weaknesses of MVA are discussed. Conclusions Given their important role in the evidence base that informs clinical decision making, it is critical for single-case experimental studies to be conducted in a way that allows researchers to draw valid inferences. As a method that can increase the rigor of single-case studies while preserving the benefits of a response-guided approach, MVA warrants expanded attention from researchers in communication disorders. PMID:28595354

  3. Effect of costing methods on unit cost of hospital medical services.

    PubMed

    Riewpaiboon, Arthorn; Malaroje, Saranya; Kongsawatt, Sukalaya

    2007-04-01

    To explore the variance of unit costs of hospital medical services due to the different costing methods employed in the analysis. Retrospective and descriptive study at Kaengkhoi District Hospital, Saraburi Province, Thailand, in the fiscal year 2002. The process started with a calculation of unit costs of medical services as a base case. After that, the unit costs were re-calculated using various methods. Finally, the variations between the results obtained from the various methods and the base case were computed and compared. The total annualized capital cost of buildings and capital items calculated by the accounting-based approach (averaging the capital purchase prices throughout their useful life) was 13.02% lower than that calculated by the economic-based approach (combination of depreciation cost and interest on the undepreciated portion over the useful life). A change of discount rate from 3% to 6% resulted in a 4.76% increase in the hospital's total annualized capital cost. When the useful life of durable goods was changed from 5 to 10 years, the total annualized capital cost of the hospital decreased by 17.28% from that of the base case. Regarding alternative criteria for indirect cost allocation, unit costs of medical services changed within a range of -6.99% to +4.05%. We also explored the effect on the unit costs of medical services in one department: across the various costing methods, including departmental allocation methods, results ranged between -85% and +32% of those of the base case. Based on the variation analysis, the economic-based approach was suitable for capital cost calculation. For the useful life of capital items, an appropriate duration should be studied and standardized. Regarding allocation criteria, single-output criteria might be more efficient than combined-output and more complicated ones. Among the departmental allocation methods, the micro-costing method was the most suitable at the time of the study. These different costing methods should be standardized and developed as guidelines, since they could affect implementation of the national health insurance scheme and health financing management.
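    The two annualization approaches compared above can be made concrete with a short sketch: straight-line averaging versus the equivalent annual cost, which charges interest on the undepreciated portion. The price, useful life, and rates below are illustrative, not the hospital's figures.

```python
# Minimal sketch: accounting-based vs. economic-based annualized capital cost.
def accounting_annual_cost(price, useful_life):
    return price / useful_life                        # straight-line average

def economic_annual_cost(price, useful_life, r):
    return price * r / (1 - (1 + r) ** -useful_life)  # annuitized at rate r

price, life = 1_000_000, 5
for r in (0.03, 0.06):
    print(f"r={r:.0%}: accounting {accounting_annual_cost(price, life):,.0f}, "
          f"economic {economic_annual_cost(price, life, r):,.0f}")
```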

  4. Open-Ended Cases in Agroecology: Farming and Food Systems in the Nordic Region and the US Midwest

    ERIC Educational Resources Information Center

    Francis, Charles; King, James; Lieblein, Geir; Breland, Tor Arvid; Salomonsson, Lennart; Sriskandarajah, Nadarajah; Porter, Paul; Wiedenhoeft, Mary

    2009-01-01

    Our aim is to describe open-ended case studies for learning real-life problem solving skills, and relate this approach to conventional, closed-ended decision case studies. Teaching methods are open-ended cases in agroecology, an alternative to traditional strategies that lead students through prepared materials and structured discussions to…

  5. Bewitching Ideas Influence Learning: An Evaluation of an Interdisciplinary Teaching Experience

    ERIC Educational Resources Information Center

    Nava-Whitehead, Susan M.; Augusto, Kerri W.; Gow, Joan-Beth

    2011-01-01

    This column provides original articles on innovations in case study teaching, assessment of the method, as well as case studies with teaching notes. In this month's issue the authors describe an interdisciplinary approach to case study teaching that addresses the demand to balance the goals of process and content. The case study, Salem's Secrets…

  6. The clustering-based case-based reasoning for imbalanced business failure prediction: a hybrid approach through integrating unsupervised process with supervised process

    NASA Astrophysics Data System (ADS)

    Li, Hui; Yu, Jun-Ling; Yu, Le-An; Sun, Jie

    2014-05-01

    Case-based reasoning (CBR) is one of the main forecasting methods in business forecasting; it predicts well and can provide explanations for its results. In business failure prediction (BFP), the number of failed enterprises is relatively small compared with the number of non-failed ones, yet the loss is huge when an enterprise fails. Therefore, it is necessary to develop methods, trained on imbalanced samples, which forecast well for the small proportion of failed enterprises while remaining accurate overall. Commonly used methods constructed on the assumption of balanced samples do not perform well in predicting the minority samples of imbalanced data sets consisting of the minority/failed enterprises and the majority/non-failed ones. This article develops a new method called clustering-based CBR (CBCBR), which integrates clustering analysis, an unsupervised process, with CBR, a supervised process, to enhance the efficiency of retrieving information from both the minority and the majority in CBR. In CBCBR, case classes are first generated through hierarchical clustering of the stored experienced cases, and class centres are calculated by integrating the information of the cases within each clustered class. When predicting the label of a target case, its nearest clustered case class is first retrieved by ranking the similarities between the target case and each clustered case class centre. Then, the nearest neighbours of the target case within the determined clustered case class are retrieved. Finally, the labels of the nearest experienced cases are used in prediction. In an empirical experiment with two imbalanced samples from China, the performance of CBCBR was compared with classical CBR, a support vector machine, a logistic regression and a multivariate discriminant analysis. The results show that, compared with the other four methods, CBCBR performed significantly better in terms of sensitivity for identifying the minority samples while generating high total accuracy. The proposed approach makes CBR useful in imbalanced forecasting.
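    A minimal sketch of the CBCBR retrieval pipeline follows: hierarchical clustering of stored cases, retrieval of the nearest cluster centre, then nearest neighbours inside that cluster. The cluster count, k, and data are illustrative, not the paper's settings.

```python
# Hedged sketch: clustering-based CBR retrieval and prediction.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import cdist

def cbcbr_predict(X_train, y_train, target, n_clusters=5, k=3):
    labels = fcluster(linkage(X_train, method="ward"), n_clusters, criterion="maxclust")
    centres = np.array([X_train[labels == c].mean(axis=0)
                        for c in range(1, n_clusters + 1)])
    best = np.argmin(cdist(target[None, :], centres)) + 1   # nearest cluster centre
    members = np.where(labels == best)[0]
    d = cdist(target[None, :], X_train[members])[0]
    nearest = members[np.argsort(d)[:k]]                    # k-NN inside the cluster
    return int(round(y_train[nearest].mean()))              # majority label

rng = np.random.default_rng(2)
X = rng.standard_normal((200, 6))
y = (X[:, 0] > 1.0).astype(int)                             # imbalanced labels
print(cbcbr_predict(X, y, rng.standard_normal(6)))
```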

  7. Is probabilistic bias analysis approximately Bayesian?

    PubMed Central

    MacLehose, Richard F.; Gustafson, Paul

    2011-01-01

    Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
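    For readers unfamiliar with the mechanics, here is a minimal sketch of the iterative sampling the abstract refers to: sensitivity and specificity are drawn from priors, the observed 2x2 counts are back-corrected with the standard matrix method, and the distribution of corrected odds ratios is summarized. Counts and priors are hypothetical.

```python
# Hedged sketch: probabilistic bias analysis for exposure misclassification.
import numpy as np

rng = np.random.default_rng(3)
a, b, c, d = 120, 380, 80, 420   # exposed/unexposed cases; exposed/unexposed controls

ors = []
for _ in range(10_000):
    se = rng.beta(80, 20)        # prior on sensitivity (~0.8)
    sp = rng.beta(95, 5)         # prior on specificity (~0.95)
    # back-correction: observed = Se*E_true + (1 - Sp)*(N - E_true)
    e_case = (a - (1 - sp) * (a + b)) / (se + sp - 1)
    e_ctrl = (c - (1 - sp) * (c + d)) / (se + sp - 1)
    if 0 < e_case < a + b and 0 < e_ctrl < c + d:            # discard impossible draws
        ors.append((e_case * (c + d - e_ctrl)) / ((a + b - e_case) * e_ctrl))

print(np.percentile(ors, [2.5, 50, 97.5]))   # simulation interval for the corrected OR
```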

  8. Political Science, The Judicial Process, and A Legal Education

    ERIC Educational Resources Information Center

    Funston, Richard

    1975-01-01

    Application of the behavioral approach to the study of the judicial process is examined including methodological approaches used, typical findings, and "behavioralists'" rejection of the case method of studying law. The author concludes that the behavioral approach to the study of judicial politics has not been substantially productive. (JT)

  9. Mechanics of cantilever beam: Implementation and comparison of FEM and MLPG approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trobec, Roman

    2016-06-08

    Two weak form solution approaches for partial differential equations, the well known meshbased finite element method and the newer meshless local Petrov Galerkin method are described and compared on a standard test case - mechanics of cantilever beam. The implementation, solution accuracy and calculation complexity are addressed for both approaches. We found out that FEM is superior in most standard criteria, but MLPG has some advantages because of its flexibility that results from its general formulation.

  10. Global Search Capabilities of Indirect Methods for Impulsive Transfers

    NASA Astrophysics Data System (ADS)

    Shen, Hong-Xin; Casalino, Lorenzo; Luo, Ya-Zhong

    2015-09-01

    An optimization method which combines an indirect method with a homotopic approach is proposed and applied to impulsive trajectories. Minimum-fuel, multiple-impulse solutions, with either fixed or open time, are obtained. The homotopic approach at hand is relatively straightforward to implement and does not require an initial guess of the adjoints, unlike previous adjoint estimation methods. A multiple-revolution Lambert solver is used to find multiple starting solutions for the homotopic procedure; this approach guarantees obtaining multiple local solutions without relying on the user's intuition, thus efficiently exploring the solution space to find the global optimum. The indirect/homotopic approach proves to be quite effective and efficient in finding optimal solutions, and outperforms the joint use of evolutionary algorithms and deterministic methods in the test cases.
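    The homotopic idea can be sketched generically: embed the target system F(x) = 0 in a family H(x, s) = (1 - s)G(x) + sF(x) whose s = 0 member has a known root (playing the role of the Lambert starting solution here), and continue the Newton solution as s steps to 1. The toy system below is purely illustrative, not the paper's optimality conditions.

```python
# Hedged sketch: homotopy continuation from an easy problem to a hard one.
import numpy as np
from scipy.optimize import fsolve

def F(x):                        # "hard" target equations (toy stand-in)
    return np.array([x[0]**3 + x[1] - 2.0, x[1]**3 - x[0] - 1.0])

x0 = np.array([1.0, 1.0])        # known root of the easy problem G(x) = x - x0
x = x0.copy()
for s in np.linspace(0.0, 1.0, 21):
    # solve H(x, s) = 0, warm-started from the previous continuation step
    x = fsolve(lambda z: (1 - s) * (z - x0) + s * F(z), x)

print(x, F(x))                   # root of the target system at s = 1
```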

  11. A Reconstruction Approach to High-Order Schemes Including Discontinuous Galerkin for Diffusion

    NASA Technical Reports Server (NTRS)

    Huynh, H. T.

    2009-01-01

    We introduce a new approach to high-order accuracy for the numerical solution of diffusion problems by solving the equations in differential form using a reconstruction technique. The approach has the advantages of simplicity and economy. It results in several new high-order methods including a simplified version of discontinuous Galerkin (DG). It also leads to new definitions of common value and common gradient quantities at each interface shared by the two adjacent cells. In addition, the new approach clarifies the relations among the various choices of new and existing common quantities. Fourier stability and accuracy analyses are carried out for the resulting schemes. Extensions to the case of quadrilateral meshes are obtained via tensor products. For the two-point boundary value problem (steady state), it is shown that these schemes, which include most popular DG methods, yield exact common interface quantities as well as exact cell average solutions for nearly all cases.

  12. Speckle: tool for diagnosis assistance

    NASA Astrophysics Data System (ADS)

    Carvalho, O.; Guyot, S.; Roy, L.; Benderitter, M.; Clairac, B.

    2006-09-01

    In this paper, we present a new approach to the speckle phenomenon. This method is based on fractal Brownian motion theory and allows the extraction of three stochastic parameters to characterize the speckle pattern. For the first time, we present the results of this method applied to the discrimination of healthy vs. pathologic skin. We also demonstrate, in the case of scleroderma, that this method is more accurate than the classical frequency-domain approach.

  13. Validity of using ad hoc methods to analyze secondary traits in case-control association studies.

    PubMed

    Yung, Godwin; Lin, Xihong

    2016-12-01

    Case-control association studies often collect from their subjects information on secondary phenotypes. Reusing the data and studying the association between genes and secondary phenotypes provides an attractive and cost-effective approach that can lead to the discovery of new genetic associations. A number of approaches have been proposed, including simple and computationally efficient ad hoc methods that ignore ascertainment or stratify on case-control status. Justification for these approaches relies on the assumption of no covariates and the correct specification of the primary disease model as a logistic model. Both might not hold in practice, for example, in the presence of population stratification or when the primary disease model follows a probit model. In this paper, we investigate the validity of ad hoc methods in the presence of covariates and possible disease model misspecification. We show that in taking an ad hoc approach, it may be desirable to include covariates that affect the primary disease in the secondary phenotype model, even though these covariates are not necessarily associated with the secondary phenotype. We also show that when the disease is rare, ad hoc methods can lead to severely biased estimation and inference if the true disease model follows a probit model instead of a logistic model. Our results are justified theoretically and via simulations. Applied to real data analysis of genetic associations with cigarette smoking, ad hoc methods collectively identified as highly significant (P < 10^-5) single nucleotide polymorphisms from over 10 genes that had been identified in previous studies of smoking cessation. © 2016 WILEY PERIODICALS, INC.

  14. Bound state solution of Dirac equation for 3D harmonics oscillator plus trigonometric scarf noncentral potential using SUSY QM approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cari, C., E-mail: carinln@yahoo.com; Suparmi, A., E-mail: carinln@yahoo.com

    2014-09-30

    The Dirac equation for the 3D harmonic oscillator plus trigonometric Scarf non-central potential in the spin-symmetric case is solved using the supersymmetric quantum mechanics approach. For exact spin symmetry, the Dirac equation reduces to a Schrödinger-like equation. The relativistic energy and wave function for the spin-symmetric case are simply obtained using the SUSY quantum mechanics method and the idea of shape invariance.

  15. Exponentially Stabilizing Robot Control Laws

    NASA Technical Reports Server (NTRS)

    Wen, John T.; Bayard, David S.

    1990-01-01

    A new class of exponentially stabilizing laws for joint-level control of robotic manipulators is introduced. In the case of set-point control, the approach offers the simplicity of the proportional/derivative (PD) control architecture. In the case of tracking control, the approach provides several important alternatives to the computed-torque method with respect to computational requirements and convergence. The new control laws can be modified in a simple fashion to obtain asymptotically stable adaptive control when the robot model and/or payload mass properties are unknown.
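    A hedged sketch of a PD-style set-point law of the kind alluded to above follows; the gains and the toy gravity model are placeholders, not the paper's specific control laws.

```python
# Hedged sketch: PD set-point control with gravity compensation.
import numpy as np

def setpoint_torque(q, q_dot, q_des, Kp, Kd, gravity):
    """Joint torque command driving q toward the constant set-point q_des."""
    return -Kp @ (q - q_des) - Kd @ q_dot + gravity(q)

Kp = np.diag([50.0, 30.0])                                 # proportional gains
Kd = np.diag([10.0, 6.0])                                  # derivative gains
gravity = lambda q: np.array([9.81 * np.cos(q[0]), 0.0])   # toy 2-link gravity term
tau = setpoint_torque(np.array([0.2, -0.1]), np.zeros(2), np.zeros(2), Kp, Kd, gravity)
print(tau)
```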

  16. A Comparison of Case Study and Traditional Teaching Methods for Improvement of Oral Communication and Critical-Thinking Skills

    ERIC Educational Resources Information Center

    Noblitt, Lynnette; Vance, Diane E.; Smith, Michelle L. DePoy

    2010-01-01

    This study compares a traditional paper presentation approach and a case study method for the development and improvement of oral communication skills and critical-thinking skills in a class of junior forensic science majors. A rubric for rating performance in these skills was designed on the basis of the oral communication competencies developed…

  17. A Generalized Least Squares Regression Approach for Computing Effect Sizes in Single-Case Research: Application Examples

    ERIC Educational Resources Information Center

    Maggin, Daniel M.; Swaminathan, Hariharan; Rogers, Helen J.; O'Keeffe, Breda V.; Sugai, George; Horner, Robert H.

    2011-01-01

    A new method for deriving effect sizes from single-case designs is proposed. The strategy is applicable to small-sample time-series data with autoregressive errors. The method uses Generalized Least Squares (GLS) to model the autocorrelation of the data and estimate regression parameters to produce an effect size that represents the magnitude of…
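    A minimal sketch of the GLS idea on simulated AB (baseline/intervention) data with AR(1) errors: fit a level-shift model with an autoregressive error structure and scale the estimated shift by the residual SD. The simulation and the effect-size scaling are illustrative, not the authors' exact estimator.

```python
# Hedged sketch: GLS with AR(1) errors for a single-case AB time series.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n_base, n_treat = 10, 15
phase = np.r_[np.zeros(n_base), np.ones(n_treat)]    # 0 = baseline, 1 = treatment
e = np.zeros(n_base + n_treat)
for t in range(1, e.size):                           # AR(1) errors, rho = 0.4
    e[t] = 0.4 * e[t - 1] + rng.standard_normal()
y = 2.0 + 1.5 * phase + e                            # true level shift of 1.5

X = sm.add_constant(phase)
fit = sm.GLSAR(y, X, rho=1).iterative_fit(maxiter=10)
effect_size = fit.params[1] / np.sqrt(fit.scale)     # shift in residual-SD units
print(f"standardized effect size: {effect_size:.2f}")
```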

  18. Walking the Razor's Edge: Risks and Rewards for Students and Faculty Adopting Case in Point Teaching and Learning Approaches

    ERIC Educational Resources Information Center

    Wildermuth, Cristina de-Mello-e-Souza; Smith-Bright, Elaine; Noll-Wilson, Sarah; Fink, Alex

    2015-01-01

    Case in Point (CIP) is an interactive leadership development method pioneered by Ronald Heifetz. CIP instructors follow a fluid class structure, in which group dynamics and student concerns become catalysts for learning. CIP proponents defend the method's potential to help students experience real life leadership challenges. To date, however, very…

  19. Integrating Qualitative and Quantitative Research in Organizations.

    DTIC Science & Technology

    1981-07-01

    Qualitative research using the traditional case study was the most popular method during the early empirical investigations of...what is now known as qualitative methods (Van Maanen, 1979). Some researchers have recently argued that restricting case studies to exploratory work... phenomenological approaches at the subjective end of the continuum. A few researchers have suggested ways in which quantitative and

  20. Case Study: A Picture Worth a Thousand Words? Making a Case for Video Case Studies

    ERIC Educational Resources Information Center

    Pai, Aditi

    2014-01-01

    A picture, they say, is worth a thousand words. If a mere picture is worth a thousand words, how much more are "moving pictures" or videos worth? The author poses this not merely as a rhetorical question, but because she wishes to make a case for using videos in the traditional case study method. She recommends four main approaches of…

  1. The Assessment of Math Learning Difficulties in a Primary Grade-4 Child with High Support Needs: Mixed Methods Approach

    ERIC Educational Resources Information Center

    Mundia, Lawrence

    2012-01-01

    This mixed-methods study incorporated elements of survey, case study and action research approaches in investigating an at-risk child. Using an in-take interview, a diagnostic test, an error analysis, and a think-aloud clinical interview, the study identified the child's major presenting difficulties. These included: inability to use the four…

  2. Online market research panel members as controls in case-control studies to investigate gastrointestinal disease outbreaks: early experiences and lessons learnt from the UK.

    PubMed

    Mook, P; McCormick, J; Kanagarajah, S; Adak, G K; Cleary, P; Elson, R; Gobin, M; Hawker, J; Inns, T; Sinclair, C; Trienekens, S C M; Vivancos, R; McCarthy, N D

    2018-03-01

    Established methods of recruiting population controls for case-control studies to investigate gastrointestinal disease outbreaks can be time consuming, resulting in delays in identifying the source or vehicle of infection. After an initial evaluation of using online market research panel members as controls in a case-control study to investigate a Salmonella outbreak in 2013, this method was applied in four further studies in the UK between 2014 and 2016. We used data from all five studies and interviews with members of each outbreak control team and market research panel provider to review operational issues, evaluate risk of bias in this approach and consider methods to reduce confounding and bias. The investigators of each outbreak reported likely time and cost savings from using market research controls. There were systematic differences between case and control groups in some studies but no evidence that conclusions on the likely source or vehicle of infection were incorrect. Potential selection biases introduced by using this sampling frame and the low response rate are unclear. Methods that might reduce confounding and some bias should be balanced with concerns for overmatching. Further evaluation of this approach using comparisons with traditional methods and population-based exposure survey data is recommended.

  3. Determination Plastic Properties of a Material by Spherical Indentation Base on the Representative Stress Approach

    NASA Astrophysics Data System (ADS)

    Budiarsa, I. N.; Gde Antara, I. N.; Dharma, Agus; Karnata, I. N.

    2018-04-01

    Under indentation, a material undergoes complex deformation. One of the most effective ways to analyse indentation has been the representative stress method. The concept, coupled with finite element (FE) modelling, has been used successfully in analysing sharp indenters. It is of great importance to extend this method to spherical indentation and the associated hardness systems. One particular case is the Rockwell B test, where the hardness is determined by two points on the P-h curve of a spherical indenter. In this case, an established link between material parameters and P-h curves can naturally lead to direct hardness estimation from the material parameters (e.g. yield stress (σy) and work hardening coefficient (n)). This could provide a useful tool for both research and industrial applications. Two methods of predicting the P-h curve in spherical indentation have been established: one uses a C1-C2 polynomial equation approach, and the other a depth-based approach. Both approaches have been applied successfully. An effective method of representing the P-h curves using a normalized representative stress concept was established. The concept and methodology developed are used to predict hardness (HRB) values of materials through direct analysis and are validated with experimental data on selected samples of steel.

  4. CASE STUDY RESEARCH: THE VIEW FROM COMPLEXITY SCIENCE

    PubMed Central

    Anderson, Ruth; Crabtree, Benjamin F.; Steele, David J.; McDaniel, Reuben R.

    2005-01-01

    Many wonder why there has been so little change in care quality, despite substantial quality improvement efforts. Questioning why current approaches are not making true changes draws attention to the organization as a source of answers. We bring together the case study method and complexity science to suggest new ways to study health care organizations. The case study provides a method for studying systems. Complexity theory suggests that keys to understanding the system are contained in patterns of relationships and interactions among the system’s agents. We propose some of the “objects” of study that are implicated by complexity theory and discuss how studying these using case methods may provide useful maps of the system. We offer complexity theory, partnered with case study method, as a place to begin the daunting task of studying a system as an integrated whole. PMID:15802542

  5. The use of scenarios for long-range planning by investor-owned electric utilities in the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Lyons, John V.

    Scenario planning is a method of organizing and understanding large amounts of quantitative and qualitative data for leaders to make better strategic decisions. There is a lack of academic research about scenario planning with a subsequent shortage of definitions and theories. This study utilized a case study methodology to analyze scenario planning by investor-owned electric utilities in the Pacific Northwest in their integrated resource planning (IRP) process. The cases include Avista Corporation, Idaho Power, PacifiCorp, Portland General Electric, and Puget Sound Energy. This study sought to determine how scenario planning was used, what scenario approach was used, the scenario outcomes, and the similarities and differences in the scenario planning processes. The literature review of this study covered the development of scenario planning, common definitions and theories, approaches to scenario development, and scenario outcomes. A research methodology was developed to classify the scenario development approach into intuitive, hybrid, or quantitative approaches; and scenario outcomes of changed thinking, stories of plausible futures, improved decision making, and enhanced organizational learning. The study found all three forms of scenario planning in the IRPs. All of the cases used a similar approach to IRP development. All of the cases had at least improved decision making as an outcome of scenario planning. Only one case demonstrated all four scenario outcomes. A critical finding was a correlation between the use of the intuitive approach and the use of all scenario outcomes. Another major finding was the unique use of predetermined elements, which are normally consistent across scenarios, but became critical uncertainties in some of the scenarios in the cases for this study. This finding will need to be confirmed by future research as unique to the industry or an aberration. An unusually high number of scenarios were found for cases using the hybrid approach, which was unexpected based on the literature. This work expanded the methods for studying scenario planning, enhanced the body of scholarly works on scenario planning, and provided a starting point for additional research concerning the use of scenario planning by electric utilities.

  6. Consistency of extreme flood estimation approaches

    NASA Astrophysics Data System (ADS)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in different methods. In this study, the following three different approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested on two different Swiss catchments. The results and some intermediate variables are used to assess the potential strengths and weaknesses of each method, as well as to evaluate the consistency of these methods.
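    The purely statistical approach mentioned above can be illustrated in a few lines: fit a GEV distribution to annual maximum discharges and read off a low-probability quantile. The record below is simulated, not a Swiss catchment series.

```python
# Hedged sketch: ordinary extreme value statistics for a low-probability flood.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)
annual_maxima = genextreme.rvs(c=-0.1, loc=300, scale=80, size=60, random_state=rng)

params = genextreme.fit(annual_maxima)       # (shape, loc, scale)
q1000 = genextreme.isf(1 / 1000, *params)    # 1000-year flood estimate
print(f"estimated 1000-year flood: {q1000:.0f} m^3/s")
```

    Note the short record relative to the target return period: the sampling uncertainty of such estimates is exactly why the study compares this approach against simulation-based and deterministic ones.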

  7. A case-association cluster detection and visualisation tool with an application to Legionnaires’ disease

    PubMed Central

    Sansom, P; Copley, V R; Naik, F C; Leach, S; Hall, I M

    2013-01-01

    Statistical methods used in spatio-temporal surveillance of disease are able to identify abnormal clusters of cases but typically do not provide a measure of the degree of association between one case and another. Such a measure would facilitate the assignment of cases to common groups and be useful in outbreak investigations of diseases that potentially share the same source. This paper presents a model-based approach, which on the basis of available location data, provides a measure of the strength of association between cases in space and time and which is used to designate and visualise the most likely groupings of cases. The method was developed as a prospective surveillance tool to signal potential outbreaks, but it may also be used to explore groupings of cases in outbreak investigations. We demonstrate the method by using a historical case series of Legionnaires’ disease amongst residents of England and Wales. PMID:23483594
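    A hedged sketch of a pairwise space-time association score of the kind described follows: case pairs close in both space and time score high, and single-linkage grouping above a cutoff designates putative common-source groups. The kernel scales and data are illustrative, not the paper's model.

```python
# Hedged sketch: pairwise space-time association and case grouping.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(8)
xy = rng.uniform(0, 50, (30, 2))                # case locations (km)
t = rng.uniform(0, 90, 30)                      # onset day

d_space = pdist(xy)
d_time = pdist(t[:, None])
assoc = np.exp(-d_space / 5.0) * np.exp(-d_time / 14.0)    # pairwise association
groups = fcluster(linkage(1 - assoc, method="single"), 0.8, criterion="distance")
print(groups)                                   # putative common-source groups
```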

  8. A Tensor-Based Structural Damage Identification and Severity Assessment

    PubMed Central

    Anaissi, Ali; Makki Alamdari, Mehrisadat; Rakotoarivelo, Thierry; Khoa, Nguyen Lu Dang

    2018-01-01

    Early damage detection is critical for a large set of global ageing infrastructure. Structural Health Monitoring systems provide a sensor-based quantitative and objective approach to continuously monitor these structures, as opposed to traditional engineering visual inspection. Analysing these sensed data is one of the major Structural Health Monitoring (SHM) challenges. This paper presents a novel algorithm to detect and assess damage in structures such as bridges. This method applies tensor analysis for data fusion and feature extraction, and further uses one-class support vector machine on this feature to detect anomalies, i.e., structural damage. To evaluate this approach, we collected acceleration data from a sensor-based SHM system, which we deployed on a real bridge and on a laboratory specimen. The results show that our tensor method outperforms a state-of-the-art approach using the wavelet energy spectrum of the measured data. In the specimen case, our approach succeeded in detecting 92.5% of induced damage cases, as opposed to 61.1% for the wavelet-based approach. While our method was applied to bridges, its algorithm and computation can be used on other structures or sensor-data analysis problems, which involve large series of correlated data from multiple sensors. PMID:29301314
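    The fusion-then-detection pipeline can be sketched minimally: unfold a windows-by-sensors-by-features tensor, project onto a low-rank basis as the fused feature (a plain SVD stands in for the paper's tensor analysis), and train a one-class SVM on healthy-state windows. Shapes and parameters are illustrative.

```python
# Hedged sketch: tensor unfolding + low-rank features + one-class SVM.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(6)
healthy = rng.standard_normal((100, 8, 32))      # windows x sensors x features

def unfold(tensor):
    return tensor.reshape(tensor.shape[0], -1)   # mode-1 unfolding

u, s, vt = np.linalg.svd(unfold(healthy), full_matrices=False)
basis = vt[:5].T                                 # low-rank feature basis from healthy data

detector = OneClassSVM(nu=0.05, gamma="scale").fit(unfold(healthy) @ basis)
damaged = rng.standard_normal((10, 8, 32)) + 1.0 # shifted windows stand in for damage
print(detector.predict(unfold(damaged) @ basis)) # -1 flags anomalies
```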

  9. Analysis and 3D reconstruction of heterogeneity in malignant brain tumors: an interdisciplinary case study using a novel computational visualization approach.

    PubMed

    Mojsilovic, Aleksandra; Rogowitz, Bernice; Gomes, Jose; Deisboeck, Thomas S

    2002-06-01

    To explore how a multidisciplinary approach, combining modern visualization and image processing techniques with innovative experimental studies, can augment the understanding of tumor development. We analyzed histologic sections of a microscopic brain tumor and reconstructed these slices into a 3D representation. We processed these slices to: (1) identify tumor boundaries, (2) isolate proliferating tumor cells, and (3) segment the tumor into regions based on the density of proliferating cells. We then reconstructed the 3D shape of the tumor using a constrained deformable surface approach. This novel method allows the analyst to (1) see specific properties of histologic slices in the 3D environment with animation, (2) switch 2D "views" dynamically, and (3) see relationships between the 3D structure and structure on a plane. Using this method to analyze a specific "case," we were also able to shed light on the limitations of a widely held assumption about the shape of expanding microscopic solid tumors as well as find more indications that such tumors behave as adaptive biosystems. Implications of these case study results, as well as future applications of the method for tumor biology research, are discussed.

  10. Ontology-Based Method for Fault Diagnosis of Loaders.

    PubMed

    Xu, Feixiang; Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei

    2018-02-28

    This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. This method contains the following components: (1) An ontology-based fault diagnosis model is proposed to achieve the integrating, sharing and reusing of fault diagnosis knowledge for loaders; (2) combined with ontology, CBR (case-based reasoning) is introduced to realize effective and accurate fault diagnoses following four steps (feature selection, case-retrieval, case-matching and case-updating); and (3) in order to compensate for the shortcomings of the CBR method when relevant cases are lacking, ontology-based RBR (rule-based reasoning) is put forward through building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods to assist in finding the fault causes, fault locations and maintenance measures of loaders. In addition, the program is validated through analyzing a case study.
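    The four CBR steps named above can be sketched in a few lines; the cases, features, similarity measure, and threshold are hypothetical, and the ontology/SWRL rule layer is only indicated by the fallback branch.

```python
# Hedged sketch: retrieve -> match -> (fallback to rules) -> update.
import numpy as np

case_base = [  # symptom vector -> diagnosis (hypothetical cases)
    (np.array([1, 0, 1, 0]), "hydraulic pump wear"),
    (np.array([0, 1, 0, 1]), "transmission overheating"),
]

def diagnose(symptoms, threshold=0.6):
    sims = [np.dot(symptoms, f) / (np.linalg.norm(symptoms) * np.linalg.norm(f))
            for f, _ in case_base]                       # case retrieval
    best = int(np.argmax(sims))
    if sims[best] >= threshold:                          # case matching
        return case_base[best][1]
    return "no match: fall back to rule-based reasoning" # RBR stands in here

new_case = np.array([1, 0, 1, 1])
label = diagnose(new_case)
case_base.append((new_case, label))   # case updating (with the confirmed diagnosis in practice)
print(label)
```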

  11. Ontology-Based Method for Fault Diagnosis of Loaders

    PubMed Central

    Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei

    2018-01-01

    This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. This method contains the following components: (1) an ontology-based fault diagnosis model is proposed to achieve the integration, sharing and reuse of fault diagnosis knowledge for loaders; (2) combined with ontology, CBR (case-based reasoning) is introduced to realize effective and accurate fault diagnoses following four steps (feature selection, case retrieval, case matching and case updating); and (3) in order to compensate for the shortcomings of the CBR method when relevant cases are lacking, ontology-based RBR (rule-based reasoning) is put forward through building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods to assist in finding the fault causes, fault locations and maintenance measures of loaders. In addition, the program is validated through analyzing a case study. PMID:29495646

  12. TRASYS form factor matrix normalization

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.

    1992-01-01

    A method has been developed for adjusting a TRASYS enclosure form factor matrix to unity. This approach is not limited to closed geometries and, in fact, is primarily intended for use with open geometries. The purpose of this approach is to prevent overly optimistic form factors to space. In this method, nodal form factor sums are calculated to within 0.05 of unity using TRASYS, although deviations as large as 0.10 may be acceptable, and a process is then employed to distribute the difference amongst the nodes. A specific example has been analyzed with this method, and a comparison was performed with a standard approach for calculating radiation conductors. In this comparison, hot and cold case temperatures were determined. Exterior nodes exhibited temperature differences as large as 7 C and 3 C for the hot and cold cases, respectively, when compared with the standard approach, while interior nodes demonstrated temperature differences from 0 C to 5 C. These results indicate that temperature predictions can be artificially biased if the form factor computation error is lumped into the individual form factors to space.
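
    A small NumPy sketch of the row-sum adjustment idea. The abstract does not specify the exact distribution rule, so the proportional redistribution below is one plausible reading, not necessarily the TRASYS algorithm.

    ```python
    # Sketch: redistribute each node's form-factor deficit so row sums
    # reach unity. Proportional scaling is an assumption for illustration.
    import numpy as np

    def normalize_form_factors(F, tol=0.05, warn_tol=0.10):
        """Scale each row of an enclosure form-factor matrix to sum to 1."""
        sums = F.sum(axis=1)
        if np.any(np.abs(sums - 1.0) > warn_tol):
            raise ValueError("row-sum deviation exceeds 0.10: geometry suspect")
        if np.any(np.abs(sums - 1.0) > tol):
            print("warning: some deviations exceed 0.05 but are within 0.10")
        # distribute the difference in proportion to each entry's magnitude
        return F / sums[:, None]

    F = np.array([[0.00, 0.55, 0.42],
                  [0.50, 0.00, 0.47],
                  [0.40, 0.56, 0.00]])
    print(normalize_form_factors(F).sum(axis=1))   # every row now sums to 1.0
    ```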

  13. Automated Peak Detection and Matching Algorithm for Gas Chromatography–Differential Mobility Spectrometry

    PubMed Central

    Fong, Sim S.; Rearden, Preshious; Kanchagar, Chitra; Sassetti, Christopher; Trevejo, Jose; Brereton, Richard G.

    2013-01-01

    A gas chromatography–differential mobility spectrometer (GC-DMS) combines gas chromatography with a portable, selective analyzer that can be applied to chemical detection in the field. Existing approaches examine whole profiles and do not attempt to resolve peaks. A new approach for peak detection in the 2D GC-DMS chromatograms is reported. This method is demonstrated on three case studies: a simulated case study; a case study of headspace gas analysis of Mycobacterium tuberculosis (MTb) cultures consisting of three matching GC-DMS and GC-MS chromatograms; and a case study consisting of 41 GC-DMS chromatograms of headspace gas analysis of MTb culture and media. PMID:21204557

  14. Examining Preservice Teachers' Decision Behaviors and Individual Differences in Three Online Case-Based Approaches

    ERIC Educational Resources Information Center

    Cevik, Yasemin Demiraslan; Andre, Thomas

    2013-01-01

    This study compared the impact of three types of case-based methods (case-based reasoning, worked example, and faded worked example) on preservice teachers' (n = 71) interaction with decision tasks and whether decision related measures (task difficulty, mental effort, decision making performance) were associated with the differences in student…

  15. Developing, Implementing and Evaluating Case Studies in Materials Science

    ERIC Educational Resources Information Center

    Davis, Claire; Wilcock, Elizabeth

    2005-01-01

    The use of case studies to teach materials science undergraduates is an exciting and interesting educational approach. As well as helping learners to connect theory and practice, the case method is also useful for creating an active learning environment, developing key skills and catering for a range of different learning styles. This paper…

  16. A Case-Based Curriculum for Introductory Geology

    ERIC Educational Resources Information Center

    Goldsmith, David W.

    2011-01-01

    For the past 5 years I have been teaching my introductory geology class using a case-based method that promotes student engagement and inquiry. This article presents an explanation of how a case-based curriculum differs from a more traditional approach to the material. It also presents a statistical analysis of several years' worth of student…

  17. Approaches to Child Protection Case Management for Cases Involving People with Disabilities

    ERIC Educational Resources Information Center

    Lightfoot, Elizabeth B.; LaLiberte, Traci L.

    2006-01-01

    Objectives: This exploratory study examines the delivery of child protection services by county child protection agencies involving cases with a family member with a disability. Method: Telephone surveys were conducted with the directors or their designees of 89% of the child protection agencies in a Midwestern state. Respondents were asked about…

  18. Complexity, Representation and Practice: Case Study as Method and Methodology

    ERIC Educational Resources Information Center

    Miles, Rebecca

    2015-01-01

    While case study is considered a common approach to examining specific and particular examples in research disciplines such as law, medicine and psychology, in the social sciences case study is often treated as a lesser, flawed or undemanding methodology which is less valid, reliable or theoretically rigorous than other methodologies. Building on…

  19. Analytic approximations of Von Kármán plate under arbitrary uniform pressure—equations in integral form

    NASA Astrophysics Data System (ADS)

    Zhong, XiaoXu; Liao, ShiJun

    2018-01-01

    Analytic approximations of the Von Kármán plate equations in integral form for a circular plate under external uniform pressure of arbitrary magnitude are successfully obtained by means of the homotopy analysis method (HAM), an analytic approximation technique for highly nonlinear problems. Two HAM-based approaches are proposed, for either a given external uniform pressure Q or a given central deflection, respectively. Both of them are valid for uniform pressure of arbitrary magnitude by choosing proper values of the so-called convergence-control parameters c1 and c2 in the frame of the HAM. Besides, it is found that the HAM-based iteration approaches generally converge much faster than the interpolation iterative method. Furthermore, we prove that the interpolation iterative method is a special case of the first-order HAM iteration approach for a given external uniform pressure Q when c1 = -θ and c2 = -1, where θ denotes the interpolation iterative parameter. Therefore, according to the convergence theorem of Zheng and Zhou about the interpolation iterative method, the HAM-based approaches are valid for uniform pressure of arbitrary magnitude at least in the special case c1 = -θ and c2 = -1. In addition, we prove that the HAM approach for the Von Kármán plate equations in differential form is just a special case of the HAM for the Von Kármán plate equations in integral form mentioned in this paper. All of these illustrate the validity and great potential of the HAM for highly nonlinear problems, and its superiority over perturbation techniques.

  20. Service-Learning in Communication Education: A Case Study Investigation in Support of a Prisoners' Human Rights Organization

    ERIC Educational Resources Information Center

    Novek, Eleanor

    2009-01-01

    This article offers a case study of a graduate class in communication research methods with a service-learning approach. Students were engaged in evaluating the public information campaign of a nonprofit organization exposing human rights abuses in US prisons. They gained hands-on experience in the use of a variety of basic research methods and…

  1. Teaching Business Demography Using Case Studies.

    PubMed

    Swanson, David A; Morrison, Peter A

    2010-02-01

    Many faculty members consider using case studies but not all end up using them. We provide a brief review of what cases are intended to do and identify three ways in which they can be used. We then use an example to illustrate how we have used the case study method in teaching business demography. Among other benefits, we note that the case study method not only encourages the acquisition of skills by students, but can be used to promote "deep structure learning," an approach that naturally accommodates other features associated with the case study method: the development of critical thinking skills, the use of real-world problems, the emphasis on concepts over mechanics, writing and presentation skills, active cooperative learning, and the "worthwhileness" of a course. As noted by others, we understand the limitations of the case study method. However, given its strengths, we believe it has a place in the instructional toolbox for courses in business demography. The fact that we continue to use it in the courses we teach is a testament to our belief in the efficacy of this tool.

  2. Improving the performance of lesion-based computer-aided detection schemes of breast masses using a case-based adaptive cueing method

    NASA Astrophysics Data System (ADS)

    Tan, Maxine; Aghaei, Faranak; Wang, Yunzhi; Qian, Wei; Zheng, Bin

    2016-03-01

    Current commercialized CAD schemes have high false-positive (FP) detection rates and also have high correlations in positive lesion detection with radiologists. Thus, we recently investigated a new approach to improve the efficacy of applying CAD to assist radiologists in reading and interpreting screening mammograms. Namely, we developed a new global feature based CAD approach/scheme that can cue the warning sign on the cases with high risk of being positive. In this study, we investigate the possibility of fusing global feature or case-based scores with the local or lesion-based CAD scores using an adaptive cueing method. We hypothesize that the information from the global feature extraction (features extracted from the whole breast regions) is different from, and can provide supplementary information to, the locally-extracted features (computed from the segmented lesion regions only). On a large and diverse full-field digital mammography (FFDM) testing dataset with 785 cases (347 negative and 438 cancer cases with masses only), we ran our lesion-based and case-based CAD schemes "as is" on the whole dataset. To assess the supplementary information provided by the global features, we used an adaptive cueing method to adaptively adjust the original CAD-generated detection score (S_org) of a detected suspicious mass region based on the computed case-based score (S_case) of the case associated with this detected region. Using the adaptive cueing method, better sensitivity results were obtained at lower FP rates (≤ 1 FP per image). Namely, increases of sensitivities (in the FROC curves) of up to 6.7% and 8.2% were obtained for the ROI and case-based results, respectively.
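
    A sketch of what such an adaptive cueing rule could look like: each lesion-level score is nudged up or down by the case-level score. The linear form, alpha = 0.5 and the score values are hypothetical; the study's actual adjustment function is not reproduced here.

    ```python
    # Hypothetical fusion rule illustrating adaptive cueing: raise lesion
    # scores on high-risk cases, lower them on low-risk cases.
    import numpy as np

    def adaptive_cue(s_org, s_case, alpha=0.5):
        """Adjust lesion-based scores S_org by case-based scores S_case."""
        adjusted = s_org * (1.0 + alpha * (s_case - 0.5))
        return np.clip(adjusted, 0.0, 1.0)

    s_org = np.array([0.40, 0.40, 0.90])   # lesion-based CAD scores
    s_case = np.array([0.90, 0.10, 0.50])  # case-based (global) scores
    print(adaptive_cue(s_org, s_case))     # -> [0.48 0.32 0.9 ]
    ```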

  3. Outcomes of Interventions Via a Transradial Approach for Dysfunctional Brescia-Cimino Fistulas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen Shyhming; Hang Chiling; Yip Honkan

    2009-09-15

    The transradial artery approach to angioplasty has rarely been reported as a method for treating dysfunctional Brescia-Cimino fistulas. This study evaluated the feasibility, safety, and 1-year efficacy of this method for treating dysfunctional Brescia-Cimino fistulas. We retrospectively evaluated 154 consecutive procedures in 131 patients (age, 58.3 ± 11.6 years; male, 48.1%) who underwent the transradial approach for dysfunctional Brescia-Cimino fistulas, with follow-up over the 1-year period after the procedure. The operator determined the use of a regular or a cutting balloon (two cases) in combination with urokinase injection (one case) or catheter thromboaspiration. Radial artery puncture was successful in all cases. Fifty-two cases (33.8%) had totally occluded fibrotic lesions. The overall anatomic success rate and clinical success rate were 61% (94/154) and 81.1% (125/154), respectively. In cases with a totally occluded fibrotic lesion, the clinical success rate was 46%. Successful intervention was associated with a significant reduction in the radial arterial systolic and diastolic pressures. There were no complications of symptomatic arterial embolization or pulmonary embolism, and one complication of venous rupture was successfully treated by compression. The primary patency rates based on intention-to-treat were 75.3% at 30 days and 39.0% at 1 year after the procedure. Excluding the cases with a totally occluded lesion, the clinical success rate was 99% (101/102) and the primary patency rates were 84.3% (86/102) and 52.0% (53/102) at 3 months and 1 year after the procedure, respectively. In conclusion, the transradial approach is a feasible, safe, and effective alternative for catheter intervention for dysfunctional Brescia-Cimino fistulas. Its success rate in cases with a totally occluded fibrotic lesion is unsatisfactory.

  4. A Hybrid One-Way ANOVA Approach for the Robust and Efficient Estimation of Differential Gene Expression with Multiple Patterns

    PubMed Central

    Mollah, Mohammad Manir Hossain; Jamal, Rahman; Mokhtar, Norfilza Mohd; Harun, Roslan; Mollah, Md. Nurul Haque

    2015-01-01

    Background: Identifying genes that are differentially expressed (DE) between two or more conditions with multiple patterns of expression is one of the primary objectives of gene expression data analysis. Several statistical approaches, including one-way analysis of variance (ANOVA), are used to identify DE genes. However, most of these methods provide misleading results for two or more conditions with multiple patterns of expression in the presence of outlying genes. In this paper, an attempt is made to develop a hybrid one-way ANOVA approach that unifies the robustness and efficiency of estimation using the minimum β-divergence method to overcome some problems that arise in the existing robust methods for both small- and large-sample cases with multiple patterns of expression. Results: The proposed method relies on a β-weight function, which produces values between 0 and 1. The β-weight function with β = 0.2 is used as a measure of outlier detection. It assigns smaller weights (close to 0) to outlying expressions and larger weights (close to 1) to typical expressions. The distribution of the β-weights is used to calculate the cut-off point, which is compared to the observed β-weight of an expression to determine whether that gene expression is an outlier. This weight function plays a key role in unifying the robustness and efficiency of estimation in one-way ANOVA. Conclusion: Analyses of simulated gene expression profiles revealed that all eight methods (ANOVA, SAM, LIMMA, EBarrays, eLNN, KW, robust BetaEB and the proposed method) perform almost identically for m = 2 conditions in the absence of outliers. However, the robust BetaEB method and the proposed method exhibited considerably better performance than the other six methods in the presence of outliers. In this case, the BetaEB method exhibited slightly better performance than the proposed method for the small-sample cases, but the proposed method exhibited much better performance than the BetaEB method for both the small- and large-sample cases in the presence of more than 50% outlying genes. The proposed method also exhibited better performance than the other methods for m > 2 conditions with multiple patterns of expression, for which the BetaEB method has not been extended. Therefore, the proposed approach would be more suitable and reliable on average for the identification of DE genes between two or more conditions with multiple patterns of expression. PMID:26413858
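
    A NumPy sketch of the β-weight idea. The unnormalized Gaussian-power form of the weight and the 5th-percentile cut-off below are assumptions chosen to illustrate the description above; consult the paper for the exact weight function and cut-off construction.

    ```python
    # Sketch of a beta-weight function used as an outlier detector.
    # The weight form and quantile cut-off are illustrative assumptions.
    import numpy as np

    def beta_weights(x, beta=0.2):
        """Weights in (0, 1]: near 1 for typical values, near 0 for outliers."""
        mu, sigma = np.median(x), np.std(x)
        return np.exp(-0.5 * beta * ((x - mu) / sigma) ** 2)

    rng = np.random.default_rng(1)
    expr = np.concatenate([rng.normal(5.0, 1.0, 95),
                           [25.0, 30.0, -10.0, 28.0, 26.0]])  # planted outliers
    w = beta_weights(expr)
    cutoff = np.quantile(w, 0.05)           # data-driven cut-off on the weights
    print("flagged outlying expressions:", np.where(w < cutoff)[0])
    ```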

  5. Alignment of sensor arrays in optical instruments using a geometric approach.

    PubMed

    Sawyer, Travis W

    2018-02-01

    Alignment of sensor arrays in optical instruments is critical to maximize the instrument's performance. While many commercial systems use standardized mounting threads for alignment, custom systems require specialized equipment and alignment procedures. These alignment procedures can be time-consuming, dependent on operator experience, and have low repeatability. Furthermore, each alignment solution must be considered on a case-by-case basis, leading to additional time and resource cost. Here I present a method to align a sensor array using geometric analysis. By imaging a grid pattern of dots, I show that it is possible to calculate the misalignment for a sensor in five degrees of freedom simultaneously. I first test the approach by simulating different cases of misalignment using Zemax before applying the method to experimentally acquired data of sensor misalignment for an echelle spectrograph. The results show that the algorithm effectively quantifies misalignment in five degrees of freedom for an F/5 imaging system, accurate to within ±0.87 deg in rotation and ±0.86 μm in translation. Furthermore, the results suggest that the method can also be applied to non-imaging systems with a small penalty to precision. This general approach can potentially improve the alignment of sensor arrays in custom instruments by offering an accurate, quantitative approach to calculating misalignment in five degrees of freedom simultaneously.
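
    The in-plane part of this calculation can be illustrated with a least-squares rigid fit (Kabsch/Procrustes) between the nominal dot grid and the imaged dot centroids. This is a sketch under the assumption that the centroids have already been extracted; the paper's full method also recovers out-of-plane degrees of freedom.

    ```python
    # Sketch: recover in-plane rotation + translation from grid-dot centroids
    # via a least-squares rigid (Kabsch) fit. Illustrative only; the paper
    # solves five degrees of freedom simultaneously.
    import numpy as np

    def rigid_fit(reference, measured):
        """Best-fit R, t with measured ~ R @ reference + t (row vectors)."""
        ref_c = reference - reference.mean(axis=0)
        mea_c = measured - measured.mean(axis=0)
        U, _, Vt = np.linalg.svd(mea_c.T @ ref_c)
        R = U @ Vt
        if np.linalg.det(R) < 0:               # guard against reflections
            U[:, -1] *= -1
            R = U @ Vt
        t = measured.mean(axis=0) - R @ reference.mean(axis=0)
        return R, t

    grid = np.array([[x, y] for x in range(5) for y in range(5)], float)
    theta = np.deg2rad(0.87)                   # simulated misalignment
    R_true = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
    dots = grid @ R_true.T + np.array([0.86e-3, 0.0])

    R, t = rigid_fit(grid, dots)
    print("rotation (deg):", np.rad2deg(np.arctan2(R[1, 0], R[0, 0])),
          "shift:", t)
    ```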

  6. A Rawlsian approach to distribute responsibilities in networks.

    PubMed

    Doorn, Neelke

    2010-06-01

    Due to their non-hierarchical structure, socio-technical networks are prone to the occurrence of the problem of many hands. In the present paper an approach is introduced in which people's opinions on responsibility are empirically traced. The approach is based on the Rawlsian concept of Wide Reflective Equilibrium (WRE), in which people's considered judgments on a case are reflectively weighed against moral principles and background theories, ideally leading to a state of equilibrium. Application of the method to a hypothetical case with an artificially constructed network showed that it is possible to uncover the relevant data to assess a consensus amongst people in terms of their individual WRE. It appeared that the moral background theories people endorse are not predictive of their actual distribution of responsibilities, but that they indicate ways of reasoning and justifying outcomes. Two ways of ascribing responsibilities were discerned, corresponding to two requirements of a desirable responsibility distribution: fairness and completeness. Applying the method triggered learning effects, both with regard to conceptual clarification and moral considerations, and in the sense that it led to some convergence of opinions. It is recommended to apply the method to a real engineering case in order to see whether this approach leads to an overlapping consensus on a responsibility distribution which is justifiable to all and in which no responsibilities are left unfulfilled, thereby contributing to the solution of the problem of many hands.

  7. Evaluating the Impact of the U.S. National Toxicology Program: A Case Study on Hexavalent Chromium.

    PubMed

    Xie, Yun; Holmgren, Stephanie; Andrews, Danica M K; Wolfe, Mary S

    2017-02-01

    Evaluating the impact of federally funded research with a broad, methodical, and objective approach is important to ensure that public funds advance the mission of federal agencies. We aimed to develop a methodical approach that would yield a broad assessment of National Toxicology Program's (NTP's) effectiveness across multiple sectors and demonstrate the utility of the approach through a case study. A conceptual model was developed with defined activities, outputs (products), and outcomes (proximal, intermediate, distal) and applied retrospectively to NTP's research on hexavalent chromium (CrVI). Proximal outcomes were measured by counting views of and requests for NTP's products by external stakeholders. Intermediate outcomes were measured by bibliometric analysis. Distal outcomes were assessed through Web and LexisNexis searches for documents related to legislation or regulation changes. The approach identified awareness of NTP's work on CrVI by external stakeholders (proximal outcome) and citations of NTP's research in scientific publications, reports, congressional testimonies, and legal and policy documents (intermediate outcome). NTP's research was key to the nation's first-ever drinking water standard for CrVI adopted by California in 2014 (distal outcome). By applying this approach to a case study, the utility and limitations of the approach were identified, including challenges to evaluating the outcomes of a research program. This study identified a broad and objective approach for assessing NTP's effectiveness, including methodological needs for more thorough and efficient impact assessments in the future. Citation: Xie Y, Holmgren S, Andrews DMK, Wolfe MS. 2017. Evaluating the impact of the U.S. National Toxicology Program: a case study on hexavalent chromium. Environ Health Perspect 125:181-188; http://dx.doi.org/10.1289/EHP21.

  8. A cut-cell finite volume – finite element coupling approach for fluid–structure interaction in compressible flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasquariello, Vito, E-mail: vito.pasquariello@tum.de; Hammerl, Georg; Örley, Felix

    2016-02-15

    We present a loosely coupled approach for the solution of fluid–structure interaction problems between a compressible flow and a deformable structure. The method is based on staggered Dirichlet–Neumann partitioning. The interface motion in the Eulerian frame is accounted for by a conservative cut-cell Immersed Boundary method. The present approach enables sub-cell resolution by considering individual cut-elements within a single fluid cell, which guarantees an accurate representation of the time-varying solid interface. The cut-cell procedure inevitably leads to non-matching interfaces, demanding special treatment. A Mortar method is chosen in order to obtain a conservative and consistent load transfer. We validate our method by investigating two-dimensional test cases comprising a shock-loaded rigid cylinder and a deformable panel. Moreover, the aeroelastic instability of a thin plate structure is studied with a focus on the prediction of flutter onset. Finally, we propose a three-dimensional fluid–structure interaction test case of a flexible inflated thin shell interacting with a shock wave involving large and complex structural deformations.

  9. An adaptive multi-moment FVM approach for incompressible flows

    NASA Astrophysics Data System (ADS)

    Liu, Cheng; Hu, Changhong

    2018-04-01

    In this study, a multi-moment finite volume method (FVM) based on block-structured adaptive Cartesian mesh is proposed for simulating incompressible flows. A conservative interpolation scheme following the idea of the constrained interpolation profile (CIP) method is proposed for the prolongation operation of the newly created mesh. A sharp immersed boundary (IB) method is used to model the immersed rigid body. A moving least squares (MLS) interpolation approach is applied for reconstruction of the velocity field around the solid surface. An efficient method for discretization of Laplacian operators on adaptive meshes is proposed. Numerical simulations on several test cases are carried out for validation of the proposed method. For the case of viscous flow past an impulsively started cylinder (Re = 3000, 9500), the computed surface vorticity coincides with the result of the body-fitted method. For the case of a fast pitching NACA 0015 airfoil at moderate Reynolds numbers (Re = 10000, 45000), the predicted drag coefficient (C_D) and lift coefficient (C_L) agree well with other numerical or experimental results. For 2D and 3D simulations of viscous flow past a pitching plate with prescribed motions (Re = 5000, 40000), the predicted C_D, C_L and C_M (moment coefficient) are in good agreement with those obtained by other numerical methods.

  10. Supplementary search methods were more effective and offered better value than bibliographic database searching: A case study from public health and environmental enhancement.

    PubMed

    Cooper, Chris; Lovell, Rebecca; Husk, Kerryn; Booth, Andrew; Garside, Ruth

    2018-06-01

    We undertook a systematic review to evaluate the health benefits of environmental enhancement and conservation activities. We were concerned that a conventional process of study identification, focusing on exhaustive searches of bibliographic databases as the primary search method, would be ineffective, offering limited value. The focus of this study is comparing study identification methods. We compare (1) an approach led by searches of bibliographic databases with (2) an approach led by supplementary search methods. We retrospectively assessed the effectiveness and value of both approaches. Effectiveness was determined by comparing (1) the total number of studies identified and screened and (2) the number of includable studies uniquely identified by each approach. Value was determined by comparing included study quality and by using qualitative sensitivity analysis to explore the contribution of studies to the synthesis. The bibliographic databases approach identified 21 409 studies to screen and 2 included qualitative studies were uniquely identified. Study quality was moderate, and contribution to the synthesis was minimal. The supplementary search approach identified 453 studies to screen and 9 included studies were uniquely identified. Four quantitative studies were poor quality but made a substantive contribution to the synthesis; 5 studies were qualitative: 3 studies were good quality, one was moderate quality, and 1 study was excluded from the synthesis due to poor quality. All 4 included qualitative studies made significant contributions to the synthesis. This case study found value in aligning primary methods of study identification to maximise location of relevant evidence. Copyright © 2017 John Wiley & Sons, Ltd.

  11. Efficient strategy for detecting gene × gene joint action and its application in schizophrenia.

    PubMed

    Won, Sungho; Kwon, Min-Seok; Mattheisen, Manuel; Park, Suyeon; Park, Changsoon; Kihara, Daisuke; Cichon, Sven; Ophoff, Roel; Nöthen, Markus M; Rietschel, Marcella; Baur, Max; Uitterlinden, Andre G; Hofmann, A; Lange, Christoph

    2014-01-01

    We propose a new approach to detect gene × gene joint action in genome-wide association studies (GWASs) for case-control designs. This approach offers an exhaustive search for all two-way joint action (including, as a special case, single gene action) that is computationally feasible at the genome-wide level and has reasonable statistical power under most genetic models. We found that the presence of any gene × gene joint action may imply differences in three types of genetic components: the minor allele frequencies and the amounts of Hardy-Weinberg disequilibrium may differ between cases and controls, and the degree of linkage disequilibrium between the two genetic loci may also differ between cases and controls. Using Fisher's method, it is possible to combine these different sources of genetic information in an overall test for detecting gene × gene joint action. The proposed statistical analysis is efficient and its simplicity makes it applicable to GWASs. In the current study, we applied the proposed approach to a GWAS on schizophrenia and found several potential gene × gene interactions. Our application illustrates the practical advantage of the proposed method. © 2013 WILEY PERIODICALS, INC.
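
    The final combination step is standard and easy to sketch, assuming Python with SciPy; the three component p-values below are made up for illustration.

    ```python
    # Combine independent p-values from the three genetic components
    # (allele-frequency, Hardy-Weinberg disequilibrium and linkage-
    # disequilibrium differences) with Fisher's method.
    from scipy.stats import combine_pvalues

    p_maf, p_hwd, p_ld = 0.04, 0.20, 0.01     # hypothetical component tests
    stat, p_joint = combine_pvalues([p_maf, p_hwd, p_ld], method="fisher")
    # Fisher's statistic: -2 * sum(log p_i) ~ chi-square with 2k df under H0
    print(f"chi2 = {stat:.2f}, joint p-value = {p_joint:.4f}")
    ```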

  12. Convergence of methods for coupling of microscopic and mesoscopic reaction-diffusion simulations

    NASA Astrophysics Data System (ADS)

    Flegg, Mark B.; Hellander, Stefan; Erban, Radek

    2015-05-01

    In this paper, three multiscale methods for coupling of mesoscopic (compartment-based) and microscopic (molecular-based) stochastic reaction-diffusion simulations are investigated. Two of the three methods that will be discussed in detail have been previously reported in the literature: the two-regime method (TRM) and the compartment-placement method (CPM). The third method that is introduced and analysed in this paper is called the ghost cell method (GCM), since it works by constructing a "ghost cell" in which molecules can disappear and jump into the compartment-based simulation. A comparison of the sources of error is presented. The convergence properties of this error are studied as the time step Δt (for updating the molecular-based part of the model) approaches zero. It is found that the error behaviour depends on another fundamental computational parameter h, the compartment size in the mesoscopic part of the model. Two important limiting cases, which appear in applications, are considered: (i) Δt → 0 with h fixed; and (ii) Δt → 0 and h → 0 such that √Δt/h is fixed. The error for previously developed approaches (the TRM and CPM) converges to zero only in the limiting case (ii), but not in case (i). It is shown that the error of the GCM converges in the limiting case (i). Thus the GCM is superior to previous coupling techniques if the mesoscopic description is much coarser than the microscopic part of the model.

  13. Empirical evaluation of the market price of risk using the CIR model

    NASA Astrophysics Data System (ADS)

    Bernaschi, M.; Torosantucci, L.; Uboldi, A.

    2007-03-01

    We describe a simple but effective method for the estimation of the market price of risk. The basic idea is to compare the results obtained by following two different approaches in the application of the Cox-Ingersoll-Ross (CIR) model. In the first case, we apply the non-linear least squares method to cross-sectional data (i.e., all rates of a single day). In the second case, we consider the short rate obtained by means of the first procedure as a proxy of the real market short rate. Starting from this new proxy, we evaluate the parameters of the CIR model by means of martingale estimation techniques. The estimate of the market price of risk is provided by comparing the results obtained with these two techniques, since this approach makes it possible to isolate the market price of risk and evaluate, under the Local Expectations Hypothesis, the risk premium given by the market for different maturities. As a test case, we apply the method to data from the European Fixed Income Market.
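
    The first step (cross-sectional non-linear least squares) can be sketched with the closed-form CIR zero-coupon yield, assuming Python with SciPy; the maturities, yields and starting values below are invented for illustration.

    ```python
    # Fit CIR parameters (kappa, theta, sigma) and the short rate r0 to a
    # single day's zero-coupon yield curve by non-linear least squares.
    import numpy as np
    from scipy.optimize import least_squares

    def cir_yield(tau, kappa, theta, sigma, r0):
        """CIR zero-coupon yield y(tau) = (B*r0 - ln A) / tau."""
        g = np.sqrt(kappa**2 + 2.0 * sigma**2)
        denom = (g + kappa) * (np.exp(g * tau) - 1.0) + 2.0 * g
        B = 2.0 * (np.exp(g * tau) - 1.0) / denom
        A = (2.0 * g * np.exp((g + kappa) * tau / 2.0) / denom) \
            ** (2.0 * kappa * theta / sigma**2)
        return (B * r0 - np.log(A)) / tau

    tau = np.array([0.25, 0.5, 1, 2, 5, 10, 30])     # maturities in years
    y_obs = np.array([.021, .022, .024, .027, .031, .034, .036])

    fit = least_squares(lambda p: cir_yield(tau, *p) - y_obs,
                        x0=[0.5, 0.04, 0.1, 0.02],
                        bounds=([1e-4] * 4, [5.0, 0.2, 1.0, 0.2]))
    kappa, theta, sigma, r0 = fit.x
    print("kappa, theta, sigma, r0 =", fit.x)
    ```

    The fitted r0 would then serve as the short-rate proxy for the martingale-estimation step described above.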

  14. Participation rates in the selection of population controls in a case-control study of colorectal cancer using two recruitment methods.

    PubMed

    Castaño-Vinyals, Gemma; Nieuwenhuijsen, Mark J; Moreno, Víctor; Carrasco, Estela; Guinó, Elisabet; Kogevinas, Manolis; Villanueva, Cristina M

    2011-01-01

    Low participation rates in the selection of population controls are an increasing concern for the validity of case-control studies worldwide. We conducted a pilot study to assess two approaches to recruiting population controls in a study of colorectal cancer, including a face-to-face interview and blood sample collection. In the first approach, persons identified through a population roster were invited to participate through a telephone call by an interviewer telephoning on behalf of our research center. In the second approach, individuals were identified from the lists of selected family practitioners and were telephoned on behalf of the family practitioner. When the second method was used, participation rates increased from 42% to 57% and the percentage of refusals decreased from 47% to 13%. The reasons for refusing to participate did not differ significantly between the two methods. Contact through the family practitioner yielded higher response rates in population controls in the study area. 2010 SESPAS. Published by Elsevier España. All rights reserved.

  15. Simplex-based optimization of numerical and categorical inputs in early bioprocess development: Case studies in HT chromatography.

    PubMed

    Konstantinidis, Spyridon; Titchener-Hooker, Nigel; Velayudhan, Ajoy

    2017-08-01

    Bioprocess development studies often involve the investigation of numerical and categorical inputs via the adoption of Design of Experiments (DoE) techniques. An attractive alternative is the deployment of a grid compatible Simplex variant which has been shown to yield optima rapidly and consistently. In this work, the method is combined with dummy variables and it is deployed in three case studies wherein the search spaces comprise both categorical and numerical inputs, a situation intractable for traditional Simplex methods. The first study employs in silico data and lays out the dummy variable methodology. The latter two employ experimental data from chromatography-based studies performed with the filter-plate and miniature column High Throughput (HT) techniques. The solute of interest in the former case study was a monoclonal antibody, whereas the latter dealt with the separation of a binary system of model proteins. The implemented approach prevented the stranding of the Simplex method at local optima, due to the arbitrary handling of the categorical inputs, and allowed for the concurrent optimization of numerical and categorical, multilevel and/or dichotomous, inputs. The deployment of the Simplex method, combined with dummy variables, was therefore entirely successful in identifying and characterizing global optima in all three case studies. The Simplex-based method was further shown to be of equivalent efficiency to a DoE-based approach, represented here by D-Optimal designs. Such an approach failed, however, to both capture trends and identify optima, and led to poor operating conditions. It is suggested that the Simplex variant is suited to development activities involving numerical and categorical inputs in early bioprocess development. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
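
    A sketch of the dummy-variable idea: give the categorical factor a numeric coordinate that is snapped to the nearest valid level inside the objective. Nelder-Mead stands in here for the paper's grid-compatible Simplex variant, and the chromatography-like objective is entirely synthetic.

    ```python
    # Mixed numerical + categorical optimization via a dummy coordinate.
    # The response surface, levels and starting point are made up.
    import numpy as np
    from scipy.optimize import minimize

    RESINS = ["resin_A", "resin_B", "resin_C"]     # categorical input levels

    def neg_yield(x):
        ph, salt, resin_code = x
        resin = RESINS[int(np.clip(round(resin_code), 0, len(RESINS) - 1))]
        base = {"resin_A": 0.70, "resin_B": 0.85, "resin_C": 0.60}[resin]
        # synthetic response: best near pH 7.2, salt 0.15 M on resin_B
        return -(base - (ph - 7.2) ** 2 - 4.0 * (salt - 0.15) ** 2)

    res = minimize(neg_yield, x0=[6.5, 0.30, 0.0], method="Nelder-Mead")
    ph, salt, code = res.x
    best_resin = RESINS[int(np.clip(round(code), 0, 2))]
    print("best:", round(ph, 2), round(salt, 3), best_resin, -res.fun)
    ```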

  16. Instanton approach to large N Harish-Chandra-Itzykson-Zuber integrals.

    PubMed

    Bun, J; Bouchaud, J P; Majumdar, S N; Potters, M

    2014-08-15

    We reconsider the large N asymptotics of Harish-Chandra-Itzykson-Zuber integrals. We provide, using Dyson's Brownian motion and the method of instantons, an alternative, transparent derivation of the Matytsin formalism for the unitary case. Our method is easily generalized to the orthogonal and symplectic ensembles. We obtain an explicit solution of Matytsin's equations in the case of Wigner matrices, as well as a general expansion method in the dilute limit, when the spectrum of eigenvalues spreads over very wide regions.

  17. A Health Systems Approach to Integrated Community Case Management of Childhood Illness: Methods and Tools

    PubMed Central

    McGorman, Laura; Marsh, David R.; Guenther, Tanya; Gilroy, Kate; Barat, Lawrence M.; Hammamy, Diaa; Wansi, Emmanuel; Peterson, Stefan; Hamer, Davidson H.; George, Asha

    2012-01-01

    Integrated community case management (iCCM) of childhood illness is an increasingly popular strategy to expand life-saving health services to underserved communities. However, community health approaches vary widely across countries and do not always distribute resources evenly across local health systems. We present a harmonized framework, developed through interagency consultation and review, which supports the design of iCCM by using a systems approach. To verify that the framework produces results, we also suggest a list of complementary indicators, including nine global metrics, and a menu of 39 country-specific measures. When used by program managers and evaluators, we propose that the framework and indicators can facilitate the design, implementation, and evaluation of community case management. PMID:23136280

  18. Computational modeling of RNA 3D structures, with the aid of experimental restraints

    PubMed Central

    Magnus, Marcin; Matelska, Dorota; Łach, Grzegorz; Chojnowski, Grzegorz; Boniecki, Michal J; Purta, Elzbieta; Dawson, Wayne; Dunin-Horkawicz, Stanislaw; Bujnicki, Janusz M

    2014-01-01

    In addition to mRNAs whose primary function is transmission of genetic information from DNA to proteins, numerous other classes of RNA molecules exist, which are involved in a variety of functions, such as catalyzing biochemical reactions or performing regulatory roles. In analogy to proteins, the function of RNAs depends on their structure and dynamics, which are largely determined by the ribonucleotide sequence. Experimental determination of high-resolution RNA structures is both laborious and difficult, and therefore, the majority of known RNAs remain structurally uncharacterized. To address this problem, computational structure prediction methods were developed that simulate either the physical process of RNA structure formation (“Greek science” approach) or utilize information derived from known structures of other RNA molecules (“Babylonian science” approach). All computational methods suffer from various limitations that make them generally unreliable for structure prediction of long RNA sequences. However, in many cases, the limitations of computational and experimental methods can be overcome by combining these two complementary approaches with each other. In this work, we review computational approaches for RNA structure prediction, with emphasis on implementations (particular programs) that can utilize restraints derived from experimental analyses. We also list experimental approaches, whose results can be relatively easily used by computational methods. Finally, we describe case studies where computational and experimental analyses were successfully combined to determine RNA structures that would remain out of reach for each of these approaches applied separately. PMID:24785264

  19. The typological approach in child and family psychology: a review of theory, methods, and research.

    PubMed

    Mandara, Jelani

    2003-06-01

    The purpose of this paper was to review the theoretical underpinnings, major concepts, and methods of the typological approach. It was argued that the typological approach offers a systematic, empirically rigorous and reliable way to synthesize the nomothetic variable-centered approach with the idiographic case-centered approach. Recent advances in cluster analysis validation make it a promising method for uncovering natural typologies. This paper also reviewed findings from personality and family studies that have revealed 3 prototypical personalities and parenting styles: Adjusted/Authoritative, Overcontrolled/Authoritarian, and Undercontrolled/Permissive. These prototypes are theorized to be synonymous with attractor basins in psychological state space. The connection between family types and personality structure as well as future directions of typological research were also discussed.

  20. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    Continuing studies associated with the development of the quasi-analytical (QA) sensitivity method for three-dimensional transonic flow about wings are presented. Furthermore, initial results using the quasi-analytical approach were obtained and compared to those computed using the finite difference (FD) approach. The basic goals achieved were: (1) carrying out various debugging operations pertaining to the quasi-analytical method; (2) addition of section design variables to the sensitivity equation in the form of multiple right hand sides; (3) reconfiguring the analysis/sensitivity package in order to facilitate the execution of analysis/FD/QA test cases; and (4) enhancing the display of output data to allow careful examination of the results and to permit various comparisons of sensitivity derivatives obtained using the FD/QA methods to be conducted easily and quickly. In addition to discussing the above goals, the results of executing subcritical and supercritical test cases are presented.

  1. Undertaking Individual Transdisciplinary PhD Research for Sustainable Development: Case Studies from South Africa

    ERIC Educational Resources Information Center

    van Breda, John; Musango, Josephine; Brent, Alan

    2016-01-01

    Purpose: This paper aims to improve the understanding of individual transdisciplinary PhD research in a developing country context, focusing on three individual PhD case studies in South Africa. Design/Methodology/Approach: Multiple-case method was used, and three completed transdisciplinary PhD research efforts undertaken at the Stellenbosch…

  2. Multidisciplinary Approaches to Educational Research: Case Studies from Europe and the Developing World. Routledge Research in Education

    ERIC Educational Resources Information Center

    Rizvi, Sadaf, Ed.

    2011-01-01

    This book provides an original perspective on a range of controversial issues in educational and social research through case studies of multi-disciplinary and mixed-method research involving children, teachers, schools and communities in Europe and the developing world. These case studies from researchers "across continents" and…

  3. Indigenous Crisis Counseling in Taiwan: An Exploratory Qualitative Case Study of an Expert Therapist

    ERIC Educational Resources Information Center

    Kuo, Ben C. H.; Hsu, Wei-Su; Lai, Nien-Hwa

    2011-01-01

    In this study, we adopted a single qualitative case study method to explore and examine indigenous approaches to crisis counseling in Taiwan, through the distinct lens of an expert Taiwanese counseling psychologist. In-depth, open-ended interviews were conducted with the psychologist (as the case) to document her lived clinical experiences…

  4. Ballet as Somatic Practice: A Case Study Exploring the Integration of Somatic Practices in Ballet Pedagogy

    ERIC Educational Resources Information Center

    Berg, Tanya

    2017-01-01

    This case study explores one teacher's integration of Alexander Technique and the work of neuromuscular retrainer Irene Dowd in ballet pedagogy to establish a somatic approach to teaching, learning, and performing ballet technique. This case study highlights the teacher's unique teaching method called IMAGE TECH for dancers (ITD) and offers…

  5. Evaluation of an Interactive Case-Based Online Network (ICON) in a Problem Based Learning Environment

    ERIC Educational Resources Information Center

    Nathoo, Arif N.; Goldhoff, Patricia; Quattrochi, James J.

    2005-01-01

    Purpose: This study sought to assess the introduction of a web-based innovation in medical education that complements traditional problem-based learning curricula. Utilizing the case method as its fundamental educational approach, the Interactive Case-based Online Network (ICON) allows students to interact with each other, faculty and a virtual…

  6. Mapping Resource Selection Functions in Wildlife Studies: Concerns and Recommendations

    PubMed Central

    Morris, Lillian R.; Proffitt, Kelly M.; Blackburn, Jason K.

    2018-01-01

    Predicting the spatial distribution of animals is an important and widely used tool with applications in wildlife management, conservation, and population health. Wildlife telemetry technology coupled with the availability of spatial data and GIS software have facilitated advancements in species distribution modeling. There are also challenges related to these advancements, including the accurate and appropriate implementation of species distribution modeling methodology. Resource Selection Function (RSF) modeling is a commonly used approach for understanding species distributions and habitat usage, and mapping the RSF results can enhance study findings and make them more accessible to researchers and wildlife managers. Currently, there is no consensus in the literature on the most appropriate method for mapping RSF results, methods are frequently not described, and mapping approaches are not always related to accuracy metrics. We conducted a systematic review of the RSF literature to summarize the methods used to map RSF outputs, discussed the relationship between mapping approaches and accuracy metrics, performed a case study on the implications of employing different mapping methods, and provide recommendations as to appropriate mapping techniques for RSF studies. We found extensive variability in methodology for mapping RSF results. Our case study revealed that the most commonly used approaches for mapping RSF results led to notable differences in the visual interpretation of RSF results, and there is a concerning disconnect between accuracy metrics and mapping methods. We make 5 recommendations for researchers mapping the results of RSF studies, which are focused on carefully selecting and describing the method used to map RSF results, and relating mapping approaches to accuracy metrics. PMID:29887652
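
    A sketch of one common RSF workflow, assuming Python with scikit-learn: fit a used-versus-available logistic regression, predict relative selection, then bin the predictions into ten quantile classes for mapping. The covariate names, coefficients and the ten-bin choice are illustrative; the review's point is that whatever binning is used should be reported and tied to an accuracy metric.

    ```python
    # Used-vs-available RSF fit followed by quantile binning for a map.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 2000
    X = np.column_stack([rng.normal(size=n),        # e.g. elevation (scaled)
                         rng.normal(size=n)])       # e.g. distance to cover
    logit = 1.2 * X[:, 0] - 0.8 * X[:, 1]           # synthetic selection
    used = rng.random(n) < 1 / (1 + np.exp(-logit)) # 1 = used, 0 = available

    model = LogisticRegression().fit(X, used)
    rsf = model.predict_proba(X)[:, 1]              # relative selection values

    # quantile binning: class 1 (lowest selection) ... class 10 (highest)
    edges = np.quantile(rsf, np.linspace(0, 1, 11))
    classes = np.clip(np.searchsorted(edges, rsf, side="right"), 1, 10)
    print("pixels per class:", np.bincount(classes)[1:])
    ```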

  7. A framework for the social valuation of ecosystem services.

    PubMed

    Felipe-Lucia, María R; Comín, Francisco A; Escalera-Reyes, Javier

    2015-05-01

    Methods to assess ecosystem services using ecological or economic approaches are considerably better defined than methods for the social approach. To identify why the social approach remains unclear, we reviewed current trends in the literature. We found two main reasons: (i) the cultural ecosystem services are usually used to represent the whole social approach, and (ii) the economic valuation based on social preferences is typically included in the social approach. Next, we proposed a framework for the social valuation of ecosystem services that provides alternatives to economics methods, enables comparison across studies, and supports decision-making in land planning and management. The framework includes the agreements emerged from the review, such as considering spatial-temporal flows, including stakeholders from all social ranges, and using two complementary methods to value ecosystem services. Finally, we provided practical recommendations learned from the application of the proposed framework in a case study.

  8. Multiple Imputation of a Randomly Censored Covariate Improves Logistic Regression Analysis.

    PubMed

    Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A

    2016-01-01

    Randomly censored covariates arise frequently in epidemiologic studies. The most commonly used methods, including complete case and single imputation or substitution, suffer from inefficiency and bias. They make strong parametric assumptions or they consider limit of detection censoring only. We employ multiple imputation, in conjunction with semi-parametric modeling of the censored covariate, to overcome these shortcomings and to facilitate robust estimation. We develop a multiple imputation approach for randomly censored covariates within the framework of a logistic regression model. We use the non-parametric estimate of the covariate distribution or the semiparametric Cox model estimate in the presence of additional covariates in the model. We evaluate this procedure in simulations, and compare its operating characteristics to those from the complete case analysis and a survival regression approach. We apply the procedures to an Alzheimer's study of the association between amyloid positivity and maternal age of onset of dementia. Multiple imputation achieves lower standard errors and higher power than the complete case approach under heavy and moderate censoring and is comparable under light censoring. The survival regression approach achieves the highest power among all procedures, but does not produce interpretable estimates of association. Multiple imputation offers a favorable alternative to complete case analysis and ad hoc substitution methods in the presence of randomly censored covariates within the framework of logistic regression.
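
    A sketch of the multiple-imputation idea with Rubin's rules for pooling, assuming Python with NumPy and statsmodels. The imputation step below draws each censored value from the observed values exceeding its censoring point, a crude stand-in for the paper's Kaplan-Meier or semiparametric Cox imputation models; the data are simulated.

    ```python
    # Multiple imputation of a randomly censored covariate in a logistic
    # regression, pooled with Rubin's rules. Imputation model is a stand-in.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n, m = 500, 20                                  # sample size, imputations
    age = rng.normal(70, 8, n)                      # true covariate
    y = (rng.random(n) < 1 / (1 + np.exp(-0.05 * (age - 70)))).astype(float)
    censored = rng.random(n) < 0.3                  # randomly censored
    obs_age = np.where(censored, age - rng.uniform(0, 10, n), age)

    betas, variances = [], []
    for _ in range(m):
        imp = obs_age.copy()
        for i in np.where(censored)[0]:
            pool = obs_age[~censored][obs_age[~censored] > obs_age[i]]
            if pool.size:                           # draw from the risk set
                imp[i] = rng.choice(pool)
        fit = sm.Logit(y, sm.add_constant(imp)).fit(disp=0)
        betas.append(fit.params[1])
        variances.append(fit.bse[1] ** 2)

    beta_bar = np.mean(betas)                       # Rubin's rules pooling
    W, B = np.mean(variances), np.var(betas, ddof=1)
    total_var = W + (1 + 1 / m) * B
    print(f"pooled beta = {beta_bar:.3f}, SE = {np.sqrt(total_var):.3f}")
    ```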

  9. Bridging the Gaps between Learning and Teaching through Recognition of Students' Learning Approaches: A Case Study

    ERIC Educational Resources Information Center

    Malie, Senian; Akir, Oriah

    2012-01-01

    Learning approaches, learning methods and learning environments have different effects on students' academic performance. However, they are not the sole factors that impact students' academic achievement. The aims of this research are three-fold: to determine the learning approaches preferred by most students and the impact of the learning…

  10. Self-Monitoring Approach for the Modification of Smoking Behavior: A Case Study.

    ERIC Educational Resources Information Center

    Faherty, John K.

    This paper presents a review of relevant literature on treatment approaches for the modification of smoking behavior, followed by an outline of an approach developed by the author to decrease his own rate of cigarette smoking. Studies are reviewed which have used various treatment methods: use of electric shock, satiation and/or use of cigarette…

  11. Creation of Exercises for Team-Based Learning in Business

    ERIC Educational Resources Information Center

    Timmerman, John E.; Morris, R. Franklin, Jr.

    2015-01-01

    Team-based learning (TBL) is an approach that builds on both the case method and problem-based learning and has been widely adopted in the sciences and healthcare disciplines. In recent years business disciplines have also discovered the value of this approach. One of the key characteristics of the team-based learning approach consists of…

  12. A stochastic approach for automatic generation of urban drainage systems.

    PubMed

    Möderl, M; Butler, D; Rauch, W

    2009-01-01

    Typically, performance evaluation of newly developed methodologies is based on one or more case studies. The investigation of multiple real-world case studies is tedious and time consuming. Moreover, extrapolating conclusions from individual investigations to a general basis is arguable and sometimes even wrong. In this article a stochastic approach is presented to evaluate newly developed methodologies on a broader basis. For the approach the Matlab tool "Case Study Generator" is developed, which generates a variety of different virtual urban drainage systems automatically using boundary conditions (e.g., length of the urban drainage system, slope of the catchment surface) as input. The layout of the sewer system is based on an adapted Galton-Watson branching process. The sub-catchments are allocated considering a digital terrain model. Sewer system components are designed according to standard values. In total, 10,000 different virtual case studies of urban drainage systems are generated and simulated. Consequently, simulation results are evaluated using a performance indicator for surface flooding. Comparison between results of the virtual case studies and two real-world case studies indicates the promise of the method. The novelty of the approach is that it is possible to draw more general conclusions, in contrast to traditional evaluations with few case studies.
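
    A minimal sketch of generating a virtual sewer layout with a Galton-Watson branching process: starting at the outlet, each pipe spawns 0-2 upstream branches until a depth limit is reached. The offspring probabilities and depth limit are illustrative boundary conditions, not values from the paper.

    ```python
    # Galton-Watson branching sketch for a virtual sewer network layout.
    import random

    random.seed(4)
    OFFSPRING_P = [0.2, 0.5, 0.3]        # P(0, 1, 2 upstream branches)

    def grow_network(node_id=0, depth=0, max_depth=8, edges=None):
        """Return a list of (upstream, downstream) pipe segments."""
        if edges is None:
            edges = []
        if depth >= max_depth:
            return edges
        n_children = random.choices([0, 1, 2], weights=OFFSPRING_P)[0]
        for _ in range(n_children):
            child = len(edges) + 1       # simple unique id scheme
            edges.append((child, node_id))
            grow_network(child, depth + 1, max_depth, edges)
        return edges

    network = grow_network()
    print(f"{len(network)} pipes draining to the outlet (node 0)")
    ```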

  13. IT: An Effective Pedagogic Tool in the Teaching of Quantitative Methods in Management.

    ERIC Educational Resources Information Center

    Nadkami, Sanjay M.

    1998-01-01

    Examines the possibility of supplementing conventional pedagogic methods with information technology-based teaching aids in the instruction of quantitative methods to undergraduate students. Considers the case for a problem-based learning approach, and discusses the role of information technology. (Author/LRW)

  14. Creating an Equitable Classroom Environment: A Case Study of a Preservice Elementary Teacher Learning What It Means to "Do Inquiry"

    ERIC Educational Resources Information Center

    Villa, Elsa Q.; Baptiste, H. Prentice

    2014-01-01

    In this article, the authors present a case study of a preservice teacher who participated in a two-semester course sequence of elementary science and mathematics methods spanning one academic year. These two courses were taught by the first author and embedded a pedagogical approach grounded in inquiry methods. The purpose of this study was to…

  15. Traditional and Alternative Approaches to the Method of Situational Analysis in Russia: Evidence from the Case Study "Istanbul in the Life and Works of Martiros Saryan"

    ERIC Educational Resources Information Center

    Fedotova, Olga; Ermakov, Pavel; Latun, Vladimir; Hovhannisyan, Haykaz; Avanesyan, Grant

    2017-01-01

    The article analyzes the transformation of the methodological toolkit for teaching humanities and sciences in the Russian Federation. The method of case study, being widely spread in modern higher education research, is used as an example to illustrate the attempts to implement the best practices of foreign educational technology into tertiary…

  16. Feasibility of the AML profiler (Skyline™ Array) for patient risk stratification in a multicentre trial: a preliminary comparison with the conventional approach.

    PubMed

    Nomdedéu, Josep F; Puigdecanet, Eulalia; Bussaglia, Elena; Hernández, Juan José; Carricondo, Maite; Estivill, Camino; Martí-Tutusaus, Josep Maria; Tormo, Mar; Zamora, Lurdes; Serrano, Elena; Perea, Granada; de Llano, Maria Paz Queipo; García, Antoni; Sánchez-Ortega, Isabel; Ribera, Josep Maria; Nonell, Lara; Aventin, Anna; Solé, Francesc; Brunet, Maria Salut; Sierra, Jorge

    2017-12-01

    Deoxyribonucleic acid microarrays allow researchers to measure mRNA levels of thousands of genes in a single experiment and could be useful for diagnostic purposes in patients with acute myeloid leukaemia (AML). We assessed the feasibility of the AML profiler (Skyline™ Array) in genetic stratification of patients with de novo AML and compared the results with those obtained using the standard cytogenetic and molecular approach. Diagnostic bone marrow from 31 consecutive de novo AML cases was used to test MLL-PTD, FLT3-ITD and TKD, NPM1 and CEBPAdm mutations. Purified RNA was used to assess RUNX1-RUNX1T1, PML-RARα and CBFβ-MYH11 rearrangements. RNA remnants underwent gene expression profiling analysis using the AML profiler, which detects chromosomal aberrations: t(8;21), t(15;17), inv(16), mutations (CEBPAdm, ABD-NPM1) and BAALC and EVI1 expression. Thirty cases were successfully analysed with both methods. Five cases had FLT3-ITD. In one case, a t(8;21) was correctly detected by both methods. Four cases had inv(16); in one, the RNA quality was unsatisfactory and it was not hybridized, and in the other three, the AML profiler detected the genetic lesion - this being a rare type I translocation in one case. Two cases with acute promyelocytic leukaemia were diagnosed by both methods. Results for NPM1 mutations were concordant in all but two cases (2/11, non-ABD mutations). Analysis of costs and turnaround times showed that the AML profiler was no more expensive than the conventional molecular approach. These results suggest that the AML profiler could be useful in multicentre trials to rapidly identify patients with AML with a good prognosis. Copyright © 2016 John Wiley & Sons, Ltd.

  17. A kernel regression approach to gene-gene interaction detection for case-control studies.

    PubMed

    Larson, Nicholas B; Schaid, Daniel J

    2013-11-01

    Gene-gene interactions are increasingly being addressed as a potentially important contributor to the variability of complex traits. Consequently, attention has moved beyond single-locus analysis of association to more complex genetic models. Although several single-marker approaches toward interaction analysis have been developed, such methods suffer from very high testing dimensionality and do not take advantage of existing information, notably the definition of genes as functional units. Here, we propose a comprehensive family of gene-level score tests for identifying genetic elements of disease risk, in particular pairwise gene-gene interactions. Using kernel machine methods, we devise score-based variance component tests under a generalized linear mixed model framework. We conducted simulations based upon coalescent genetic models to evaluate the performance of our approach under a variety of disease models. These simulations indicate that our methods are generally higher powered than alternative gene-level approaches and at worst competitive with exhaustive SNP-level (where SNP is single-nucleotide polymorphism) analyses. Furthermore, we observe that simulated epistatic effects resulted in significant marginal testing results for the involved genes regardless of whether or not true main effects were present. We detail the benefits of our methods and discuss potential genome-wide analysis strategies for gene-gene interaction analysis in a case-control study design. © 2013 WILEY PERIODICALS, INC.
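
    A sketch of a variance-component score statistic for a gene-gene interaction, using an element-wise (Hadamard) product of per-gene linear kernels and a permutation p-value. The product-kernel construction and permutation calibration are common choices in this literature but are assumptions here, not the paper's exact test; the genotypes are simulated.

    ```python
    # SKAT-style score statistic with a product kernel for interaction.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 300
    G1 = rng.integers(0, 3, (n, 10)).astype(float)   # genotypes, gene 1 SNPs
    G2 = rng.integers(0, 3, (n, 8)).astype(float)    # genotypes, gene 2 SNPs
    y = rng.integers(0, 2, n).astype(float)          # case-control status

    K = (G1 @ G1.T) * (G2 @ G2.T)                    # interaction kernel
    r = y - y.mean()                                 # null model: intercept only

    def score_stat(resid):
        return resid @ K @ resid                     # quadratic-form Q statistic

    q_obs = score_stat(r)
    perms = np.array([score_stat(rng.permutation(r)) for _ in range(999)])
    p_value = (1 + np.sum(perms >= q_obs)) / (999 + 1)
    print(f"Q = {q_obs:.1f}, permutation p = {p_value:.3f}")
    ```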

  18. SARP: a value-based approach to hospice admissions triage.

    PubMed

    MacDonald, D

    1995-01-01

    As hospices become established and case referrals increase, many programs are faced with the necessity of instituting waiting lists. Prioritizing cases for order of admission requires a triage method that is rational, fair, and consistent. This article describes the SARP method of hospice admissions triage, which evaluates prospective cases according to seniority, acuity, risk, and political significance. SARP's essential features, operative assumptions, advantages, and limitations are discussed, as well as the core hospice values which underlie its use. The article concludes with a call for trial and evaluation of SARP in other hospice settings.
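
    As an illustration of the kind of prioritization SARP implies, here is a hypothetical scoring sketch; the 0-3 rating scales and dimension weights are invented for illustration and are not the published instrument.

```python
# Hypothetical SARP-style triage scoring; weights and 0-3 scales are
# illustrative only, not the published instrument.
WEIGHTS = {"seniority": 1.0, "acuity": 2.0, "risk": 1.5, "political": 0.5}

def sarp_score(case):
    """case: dict with 0-3 ratings for each SARP dimension."""
    return sum(WEIGHTS[k] * case[k] for k in WEIGHTS)

waiting_list = [
    {"id": "A", "seniority": 2, "acuity": 3, "risk": 1, "political": 0},
    {"id": "B", "seniority": 3, "acuity": 1, "risk": 2, "political": 1},
]
for c in sorted(waiting_list, key=sarp_score, reverse=True):  # admit highest first
    print(c["id"], sarp_score(c))
```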

  19. Benefit Indicators for Flood Regulation Services of Wetlands: A Modeling Approach

    EPA Science Inventory

    This report describes a method for developing indicators of the benefits of flood regulation services of wetlands and presents a companion case study. We demonstrate our approach through an application to the Woonasquatucket River watershed in northern Rhode Island. This work is ...

  20. Value Contestations in Development Intervention: Community Development and Sustainable Livelihoods Approaches.

    ERIC Educational Resources Information Center

    Arce, Alberto

    2003-01-01

    Both community development and sustainable livelihood approaches ignore value contestations that underlie people's interests and experiences. A case from Bolivia demonstrates that local values, social relations, actions, and language strategies must underlie policy and method in development. (Contains 28 references.) (SK)

  1. Is Science Logical?

    ERIC Educational Resources Information Center

    Pease, Craig M.; Bull, J. J.

    1992-01-01

    Offers a concise, abstract description of the scientific method different from the historical, philosophical, and case-study approaches, which lead to comprehension of this method. Discusses features of scientific models, dynamic interactions underlying scientific progress, ways that scientists successfully understand nature, mechanisms for…

  2. [Cochlear implantation through the middle fossa approach].

    PubMed

    Szyfter, W; Colletti, V; Pruszewicz, A; Kopeć, T; Szymiec, E; Kawczyński, M; Karlik, M

    2001-01-01

    The internal component of a cochlear implant is classically inserted into the inner ear through the mastoid and middle ear, the method used in the majority of cochlear implant centres worldwide. This method is not suitable in cases of chronic otitis media or middle ear malformation. For such cases, Colletti proposed the middle fossa approach, inserting the cochlear implant while bypassing the middle ear structures. We report a patient with bilateral chronic otitis media who had undergone several ear operations without achieving a dry postoperative cavity; cochlear implantation through the middle fossa approach was performed in this patient. A bone window was cut, the temporal lobe was retracted, and the upper surface of the petrous pyramid was exposed. After the greater superficial petrosal nerve, the facial nerve and the arcuate eminence were localised, the cochlea was opened at the basal turn and the electrode array was inserted. The patient achieved good results in postoperative speech rehabilitation. This confirms Colletti's thesis that the deeper electrode insertion achieved with the middle fossa approach enables the use of low and middle frequencies, which are very important for speech understanding.

  3. Case Managers' Perspectives On What They Need To Do Their Job

    PubMed Central

    Eack, Shaun M.; Greeno, Catherine G.; Christian-Michaels, Stephen; Dennis, Amy; Anderson, Carol M.

    2013-01-01

    Objective: To identify the perceived training needs of case managers working on community support teams in a community mental health center serving a semi-rural/suburban area. Methods: Semi-structured interviews were conducted with 18 case managers and 3 supervisors to inquire about areas of training need in case management. Interviews were coded and analyzed for common themes regarding training needs and methods of training improvement. Results: Identified training needs called for a hands-on, back-to-basics approach that included education on the symptoms of severe mental illness, co-morbid substance use problems, and methods of engaging consumers. A mentoring model was proposed as a potential vehicle for disseminating knowledge in these domains. Conclusions: Case managers identify significant training needs that would address their basic understanding of severe mental illness. Programs targeting these needs may result in improved outcomes for case managers and the individuals with psychiatric disabilities. PMID:19346211

  4. A Comparison of Two Approaches to Safety Analysis Based on Use Cases

    NASA Astrophysics Data System (ADS)

    Stålhane, Tor; Sindre, Guttorm

    Engineering has a long tradition in analyzing the safety of mechanical, electrical and electronic systems. Important methods like HazOp and FMEA have also been adopted by the software engineering community. The misuse case method, on the other hand, has been developed by the software community as an alternative to FMEA and preliminary HazOp for software development. To compare the two methods, misuse cases and FMEA, we ran a small experiment involving 42 third-year software engineering students. In the experiment, the students were asked to identify and analyze failure modes for one of the use cases of a commercial electronic patient journal system. The results of the experiment show that, on average, the group that used misuse cases identified and analyzed more user-related failure modes than those using FMEA. In addition, the participants who used misuse cases scored better on perceived ease of use and intention to use.

  5. Prediction of a service demand using combined forecasting approach

    NASA Astrophysics Data System (ADS)

    Zhou, Ling

    2017-08-01

    Forecasting facilitates cutting operational and management costs while ensuring service levels for a logistics service provider. Our case study investigates how to forecast short-term logistics demand for a less-than-truckload (LTL) carrier. A combined approach draws on several forecasting methods simultaneously instead of a single method; it can offset the weakness of one forecasting method with the strength of another, which can improve the precision of the prediction. The main issues in combined forecast modeling are how to select the methods to combine and how to determine the weight coefficients among them. The principles of method selection are that each method should be applicable to the forecasting problem itself, and that the methods should differ in their categorical features as much as possible. Based on these principles, exponential smoothing, ARIMA and a neural network are chosen to form the combined approach. The least squares technique is then employed to determine the optimal weight coefficients among the forecasting methods. Simulation results show the advantage of the combined approach over the three single methods. The work done in this paper helps managers select prediction methods in practice.
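
    A minimal sketch of the weight-finding step, assuming in-sample forecasts from the three methods are already available (fitting the individual models is outside the sketch); note that some combined-forecast formulations additionally constrain the weights to be non-negative and to sum to one.

```python
# Least-squares combination of three forecasts. Columns of F hold in-sample
# forecasts from (say) exponential smoothing, ARIMA and a neural network;
# the data values are illustrative.
import numpy as np

y = np.array([102., 98., 110., 105., 120., 116.])      # observed demand
F = np.array([                                          # columns: ES, ARIMA, NN
    [100., 103., 101.],
    [ 99.,  97., 100.],
    [108., 111., 112.],
    [104., 106., 103.],
    [118., 121., 119.],
    [115., 117., 118.],
])
w, *_ = np.linalg.lstsq(F, y, rcond=None)               # optimal weights
combined = F @ w
print("weights:", w.round(3))
print("in-sample RMSE:", np.sqrt(np.mean((y - combined) ** 2)).round(3))
```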

  6. Scientists' attitudes on science and values: Case studies and survey methods in philosophy of science.

    PubMed

    Steel, Daniel; Gonnerman, Chad; O'Rourke, Michael

    2017-06-01

    This article examines the relevance of survey data of scientists' attitudes about science and values to case studies in philosophy of science. We describe two methodological challenges confronting such case studies: 1) small samples, and 2) potential for bias in selection, emphasis, and interpretation. Examples are given to illustrate that these challenges can arise for case studies in the science and values literature. We propose that these challenges can be mitigated through an approach in which case studies and survey methods are viewed as complementary, and use data from the Toolbox Dialogue Initiative to illustrate this claim. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Improved Shaping Approach to the Preliminary Design of Low-Thrust Trajectories

    NASA Astrophysics Data System (ADS)

    Novak, D. M.; Vasile, M.

    2011-01-01

    This paper presents a general framework for the development of shape-based approaches to low-thrust trajectory design. A novel shaping method, based on a three-dimensional description of the trajectory in spherical coordinates, is developed within this general framework. Both the exponential sinusoid and the inverse polynomial shaping are demonstrated to be particular two-dimensional cases of the spherical one. The pseudoequinoctial shaping is revisited within the new framework, and the nonosculating nature of the pseudoequinoctial elements is analyzed. A two-step approach is introduced to solve the time of flight constraint, related to the design of low-thrust arcs with boundary constraints for both spherical and pseudoequinoctial shaping. The solution derived from the shaping approach is improved with a feedback linear-quadratic controller and compared against a direct collocation method based on finite elements in time. The new shaping approach and the combination of shaping and linear-quadratic controller are tested on four case studies: a mission to Mars, a mission to asteroid 1989ML, a mission to comet Tempel-1, and a mission to Neptune.
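
    For reference, the two-dimensional exponential sinusoid mentioned above shapes the orbital radius as r(theta) = k0 * exp[k1 * sin(k2*theta + phi)]; a small sketch with illustrative parameter values:

```python
# Exponential sinusoid shape function; the parameter values below are
# illustrative only, not from any of the paper's case studies.
import numpy as np

def exp_sinusoid(theta, k0=1.0, k1=0.2, k2=0.1, phi=0.0):
    """Radius r(theta) = k0 * exp(k1 * sin(k2*theta + phi))."""
    return k0 * np.exp(k1 * np.sin(k2 * theta + phi))

theta = np.linspace(0.0, 6 * np.pi, 7)   # three revolutions, coarsely sampled
print(exp_sinusoid(theta).round(4))
```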

  8. Problem-based learning within endocrine physiology lectures.

    PubMed

    Walters, M R

    2001-12-01

    Methods were needed to improve the interest of medical students in the 10-lecture Endocrine Physiology block at the end of the second semester of study. Other incentives for improvement included the possibility of attracting students into endocrine research electives and the pressure to improve teaching approaches that results from the high tuition they pay. The principal approach adopted was that of whole class problem-based learning sessions (PBLS) in which the lecture period begins with a brief overview of one to three simplified cases, followed by the usual didactic lecture. At the end of the lecture, each PBL case is read in detail, with several questions posed to the students. Their answers are then used to reinforce concepts from the lecture material. This method can also provide some continuity between lectures, either by using a case in several lectures to illustrate different points, or by posing a question at the beginning of class that illustrates a point from the prior lecture. The outcome of this approach has been very successful: student evaluations of the lecture block and their attendance have significantly improved.

  9. Inverse transonic airfoil design methods including boundary layer and viscous interaction effects

    NASA Technical Reports Server (NTRS)

    Carlson, L. A.

    1979-01-01

    The development and incorporation into TRANDES of a fully conservative analysis method utilizing the artificial compressibility approach is described. The method allows for lifting cases and finite thickness airfoils and utilizes a stretched coordinate system. Wave drag and massive separation studies are also discussed.

  10. An Analysis of Class II Supplies Requisitions in the Korean Army’s Organizational Supply

    DTIC Science & Technology

    2009-03-26

    …five methods for qualitative research: case study, ethnography, phenomenological study, grounded theory, and content analysis. Table 9 provides a brief overview of the five methods.

  11. On Correspondence of BRST-BFV, Dirac, and Refined Algebraic Quantizations of Constrained Systems

    NASA Astrophysics Data System (ADS)

    Shvedov, O. Yu.

    2002-11-01

    The correspondence between BRST-BFV, Dirac, and refined algebraic (group averaging, projection operator) approaches to quantizing constrained systems is analyzed. For the closed-algebra case, it is shown that the component of the BFV wave function corresponding to the maximal (minimal) number of ghosts and antighosts in the Schrödinger representation may be viewed as a wave function in the refined algebraic (Dirac) quantization approach. The Giulini-Marolf group averaging formula for the inner product in the refined algebraic quantization approach is obtained from the Batalin-Marnelius prescription for the BRST-BFV inner product, which should generally be modified due to topological problems. The considered prescription for the correspondence of states is observed to be applicable to the open-algebra case. The refined algebraic quantization approach is then generalized to the case of nontrivial structure functions. A simple example is discussed. The correspondence of observables for different quantization methods is also investigated.

  12. Spatial variation in mortality risk for haematological malignancies near a petrochemical refinery: a population-based case-control study

    PubMed Central

    Di Salvo, Francesca; Meneghini, Elisabetta; Vieira, Veronica; Baili, Paolo; Mariottini, Mauro; Baldini, Marco; Micheli, Andrea; Sant, Milena

    2015-01-01

    Introduction The study investigated the geographic variation of mortality risk for hematological malignancies (HMs) in order to identify potential high-risk areas near an Italian petrochemical refinery. Material and methods A population-based case-control study was conducted and residential histories for 171 cases and 338 sex- and age-matched controls were collected. Confounding factors were obtained from interviews with consenting relatives for 109 HM deaths and 267 controls. To produce risk mortality maps, two different approaches were applied. We mapped (1) adptive kernel density relative risk estimation (KDE) for case-control studies which estimates a spatial relative risk function using the ratio between cases and controls’ densities, and (2) estimated odds ratios for case-control study data using generalized additive models (GAMs) to smooth the effect of location, a proxy for exposure, while adjusting for confounding variables. Results No high-risk areas for HM mortality were identified among all subjects (men and women combined), by applying both approaches. Using the adaptive KDE approach, we found a significant increase in death risk only among women in a large area 2–6 km southeast of the refinery and the application of GAMs also identified a similarly-located significant high-risk area among women only (global p-value<0.025). Potential confounding risk factors we considered in the GAM did not alter the results. Conclusion Both approaches identified a high-risk area close to the refinery among women only. Those spatial methods are useful tools for public policy management to determine priority areas for intervention. Our findings suggest several directions for further research in order to identify other potential environmental exposures that may be assessed in forthcoming studies based on detailed exposure modeling. PMID:26073202
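
    A simplified sketch of approach (1) on synthetic coordinates follows; note that scipy's gaussian_kde is fixed-bandwidth, whereas the study used an adaptive-bandwidth KDE, so this only illustrates the ratio-of-densities idea.

```python
# Kernel-density relative-risk surface for case-control locations.
# Assumptions: fixed-bandwidth gaussian_kde as a stand-in for the adaptive
# KDE used in the study; coordinates are synthetic, in km from the refinery.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
cases = rng.normal(loc=[2.0, 1.0], scale=0.8, size=(171, 2)).T    # (2, n) arrays
controls = rng.normal(loc=[0.0, 0.0], scale=1.2, size=(338, 2)).T

f_cases = gaussian_kde(cases)
f_controls = gaussian_kde(controls)

# Log relative-risk surface on a grid around the refinery (origin)
xx, yy = np.meshgrid(np.linspace(-3, 5, 50), np.linspace(-3, 5, 50))
grid = np.vstack([xx.ravel(), yy.ravel()])
log_rr = np.log(f_cases(grid) / f_controls(grid)).reshape(xx.shape)
print("max log relative risk:", log_rr.max().round(2))
```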

  13. Merging for Particle-Mesh Complex Particle Kinetic Modeling of the Multiple Plasma Beams

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.

    2011-01-01

    We suggest a merging procedure for the Particle-Mesh Complex Particle Kinetic (PMCPK) method in the case of inter-penetrating flows (multiple plasma beams). We examine the standard particle-in-cell (PIC) and the PMCPK methods in the case of particle acceleration by shock surfing for a wide range of the control numerical parameters. The plasma dynamics is described by a hybrid (particle-ion, fluid-electron) model. Note that a mesh may be needed if the modeling includes computation of an electromagnetic field. Our calculations use specified, time-independent electromagnetic fields for the shock, rather than self-consistently generated fields. While the particle-mesh method is a well-verified approach, the CPK method seems to be a good approach for multiscale modeling that includes multiple regions with various particle/fluid plasma behavior. However, the CPK method still needs verification for studying basic plasma phenomena: particle heating and acceleration by collisionless shocks, magnetic field reconnection, beam dynamics, etc.

  14. Testing independence of bivariate interval-censored data using modified Kendall's tau statistic.

    PubMed

    Kim, Yuneung; Lim, Johan; Park, DoHwan

    2015-11-01

    In this paper, we study a nonparametric procedure to test independence of bivariate interval-censored data, for both current status data (case 1 interval-censored data) and case 2 interval-censored data. To do this, we propose a score-based modification of Kendall's tau statistic for bivariate interval-censored data. Our modification defines the Kendall's tau statistic in terms of the expected numbers of concordant and discordant pairs. The performance of the modified approach is illustrated by simulation studies and an application to an AIDS study. We compare our method to alternative approaches such as the two-stage estimation method of Sun et al. (Scandinavian Journal of Statistics, 2006) and the multiple imputation method of Betensky and Finkelstein (Statistics in Medicine, 1999b). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
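
    The flavour of "expected" concordance for interval-censored pairs can be illustrated with a Monte Carlo stand-in; the paper's score-based modification is analytic, and sampling latent event times uniformly within their observed intervals is an assumption made here purely for illustration.

```python
# Monte Carlo illustration of an "expected" Kendall's tau for interval-
# censored pairs: draw latent times uniformly within each observed interval
# and average the concordance indicator over draws and pairs.
import numpy as np

rng = np.random.default_rng(2)

def expected_tau(x_int, y_int, n_draws=2000):
    n = len(x_int)
    n_pairs = n * (n - 1) / 2
    tau = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            xi = rng.uniform(*x_int[i], n_draws); xj = rng.uniform(*x_int[j], n_draws)
            yi = rng.uniform(*y_int[i], n_draws); yj = rng.uniform(*y_int[j], n_draws)
            s = np.sign((xi - xj) * (yi - yj))   # +1 concordant, -1 discordant
            tau += s.mean() / n_pairs            # E[concordant] - E[discordant]
    return tau

x_intervals = [(0, 2), (1, 3), (2, 5), (4, 6)]   # observation intervals for X
y_intervals = [(0, 1), (1, 4), (3, 5), (5, 7)]   # observation intervals for Y
print(round(expected_tau(x_intervals, y_intervals), 3))
```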

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, Blake; Chakraborty, Sudipta; Lauss, Georg

    This paper presents a concise description of state-of-the-art real-time simulation-based testing methods and demonstrates how they can be used independently and/or in combination as an integrated development and validation approach for smart grid DERs and systems. A three-part case study demonstrating the application of this integrated approach at the different stages of development and validation of a system-integrated smart photovoltaic (PV) inverter is also presented. Laboratory testing results and perspectives from two international research laboratories are included in the case study.

  16. Case-Based Teaching in a Bilingual Context: Perceptions of Business Faculty in Hong Kong

    ERIC Educational Resources Information Center

    Jackson, Jane

    2004-01-01

    Case methods of teaching are now common in business education programs worldwide. This problem-based approach, however, can pose unique challenges in bilingual contexts, especially if the students are more familiar with transmission modes of learning. This paper focuses on an investigation of case-based teaching in Hong Kong. By way of surveys and…

  17. Investigation of Geobase Implementation Issues: Case Study of Information Resource Management

    DTIC Science & Technology

    2004-03-01

    …case study, ethnography, phenomenological study, and grounded theory. “Of all the research designs [described] … a grounded theory study is the one that is least likely to begin from …” Use this approach if … the treatments explained, the case study method best matches the objectives of this research (Leedy & Ormrod, 2001). Yin’s criteria for …

  18. Surgical approach to the intrathoracic goiter

    PubMed Central

    Bekerman, Inessa; Basel, Jabarin; Peer, Michael

    2018-01-01

    Objective: In a retrospective study, the authors analyzed the surgical approach to the intrathoracic goiter with the aim of avoiding sternotomy or thoracotomy. Methods: We selected 70 intrathoracic cases of multinodular goiter out of 988 cases of thyroidectomy and compared them with cervical goiter cases. Surgical technique, results, and postsurgical complications were assessed. Results: The analyzed cases comprised retrosternal goiter (n = 53; 75.7%), retrotracheal goiter (n = 9; 12.8%), and retroesophageal goiter (n = 8; 11.4%). Chest pressure or discomfort was specific to intrathoracic cases (50%; 35 of 70). All goiters except one were removed via cervical incision. The surgeons used head reclination and isthmus dissection when removing sizable goiters. Mean weight of the goiters was 180 g. The recurrent laryngeal nerve was more often temporarily damaged in intrathoracic cases than in cervical cases (4.3% vs. 2.8%, P = .04), but the difference in permanent injury was less significant (P = .09). The incidence of temporary hypoparathyroidism was significantly higher in intrathoracic cases (P = .01). Conclusion: In most cases of multinodular goiter, goiters of various extensions can be successfully removed via a cervical incision even if they occupy the retrosternal, retrotracheal, or retroesophageal position. Transthoracic approaches and sternotomy might be justified in malignant cases. Level of Evidence: 4. PMID:29721546

  19. A comparison of methods for estimating the random effects distribution of a linear mixed model.

    PubMed

    Ghidey, Wendimagegn; Lesaffre, Emmanuel; Verbeke, Geert

    2010-12-01

    This article reviews various recently suggested approaches to estimate the random effects distribution in a linear mixed model: (1) the smoothing by roughening approach of Shen and Louis (1), (2) the semi-non-parametric approach of Zhang and Davidian (2), (3) the heterogeneity model of Verbeke and Lesaffre (3), and (4) the flexible approach of Ghidey et al. (4). These four approaches are compared via an extensive simulation study. We conclude that for the considered cases, the approach of Ghidey et al. (4) often has the smallest integrated mean squared error for estimating the random effects distribution. An analysis of a longitudinal dental data set illustrates the performance of the methods in a practical example.

  20. Surgical Therapy of Cervical Spine Fracture in Patients With Ankylosing Spondylitis

    PubMed Central

    Ma, Jun; Wang, Ce; Zhou, Xuhui; Zhou, Shengyuan; Jia, Lianshun

    2015-01-01

    The present study aimed to explore surgical treatments and assess their effects based on the features of cervical spine fracture in patients with ankylosing spondylitis (AS), and to summarize experiences in perioperative management. Retrospective analysis was performed on 25 AS patients with cervical spine fracture treated in our hospital from January 2011 to December 2013. The patients were divided according to fracture segment: 4 cases at C4 to C5, 8 cases at C5 to C6, and 13 cases at C6 to C7. Among them, 12 were type I, 5 type II, and 8 type III based on the improved classification method for AS cervical spine fracture. The Subaxial Cervical Spine Injury Classification score for these patients was 7.2 ± 1.3, and assessment of their neurological function showed 6 patients (24%) in American Spinal Injury Association (ASIA) grade A, 1 (4%) in ASIA grade B, 3 (12%) in ASIA grade C, 12 (48%) in ASIA grade D, and 3 (12%) in ASIA grade E. Surgical methods comprised the anterior approach alone, the posterior approach alone, and a combined posterior–anterior or anterior–posterior approach. The average duration of hospital stay was 38.6 ± 37.6, and the surgical methods were as follows: anterior approach alone in 6 cases, posterior surgery alone in 9 cases, and combined posterior–anterior or anterior–posterior approach in 10 patients. The mean number of fixed and fused segments was 4.1 ± 1.4. Thirteen patients developed complications. During 2 to 36 months of postoperative follow-up, 1 patient died of respiratory failure caused by pulmonary infections 2 months after leaving hospital. At the end of the follow-up, bone graft fusion was achieved in the remaining patients, and no obvious loosening or migration of internal fixation was observed. In addition, the preoperative neurological injury in 12 patients (54.5%) improved to varying degrees. AS cervical spine fracture, an unstable fracture, should be treated with operation, and satisfactory effects can be achieved with individualized surgical treatment according to the improved classification method for AS cervical spine fracture. PMID:26554765

  1. Project Management Consultancy (PMC) procurement approach: Supplier's evaluation and selection dilemma

    NASA Astrophysics Data System (ADS)

    Nawi, Mohd Nasrun Mohd; Azimi, Mohd Azrulfitri; Pozin, Mohd Affendi Ahmad; Osman, Wan Nadri; Anuar, Herman Shah

    2016-08-01

    Project Management Consultancy (PMC) is part of the management-oriented procurement method in which a sole consultant is hired by the client to deal with the contractors in place of the client. Appointing contractors under this approach is interesting because the client can play a pivotal role in evaluating and selecting the supplier/contractor for the work package. In some cases, the client gives the PMC the authority to hire the supplier/contractor of its choice, while in other cases the client makes the decision. This research paper investigates the dilemma arising from this situation; a real case study was examined to assess the impact of the dilemma on project performance. Recommendations on how to tackle the dilemma are also addressed in the later part of this paper.

  2. Dependability analysis of parallel systems using a simulation-based approach. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Sawyer, Darren Charles

    1994-01-01

    The analysis of dependability in large, complex, parallel systems executing real applications or workloads is examined in this thesis. To effectively demonstrate the wide range of dependability problems that can be analyzed through simulation, the analysis of three case studies is presented. For each case, the organization of the simulation model used is outlined, and the results from simulated fault injection experiments are explained, showing the usefulness of this method in dependability modeling of large parallel systems. The simulation models are constructed using DEPEND and C++. Where possible, methods to increase dependability are derived from the experimental results. Another interesting facet of all three cases is the presence of some kind of workload or application executing in the simulation while faults are injected. This provides a completely new dimension to this type of study, one not possible to model accurately with analytical approaches.

  3. Individualized drug dosing using RBF-Galerkin method: Case of anemia management in chronic kidney disease.

    PubMed

    Mirinejad, Hossein; Gaweda, Adam E; Brier, Michael E; Zurada, Jacek M; Inanc, Tamer

    2017-09-01

    Anemia is a common comorbidity in patients with chronic kidney disease (CKD) and is frequently associated with a decreased physical component of quality of life, as well as adverse cardiovascular events. Current treatment methods for renal anemia are mostly population-based approaches treating individual patients with a one-size-fits-all model. However, FDA recommendations stipulate individualized anemia treatment with precise control of the hemoglobin concentration and minimal drug utilization. In accordance with these recommendations, this work presents an individualized drug dosing approach to anemia management by leveraging the theory of optimal control. A Multiple Receding Horizon Control (MRHC) approach based on the RBF-Galerkin optimization method is proposed for individualized anemia management in CKD patients. Recently developed by the authors, the RBF-Galerkin method uses radial basis function approximation along with Galerkin error projection to solve constrained optimal control problems numerically. The proposed approach is applied to generate optimal dosing recommendations for individual patients. Performance of the proposed approach is compared in silico to that of a population-based anemia management protocol and an individualized multiple model predictive control method for two case scenarios: hemoglobin measurement with and without observational errors. The in silico comparison indicates that the hemoglobin concentration under the MRHC method shows the least variation among the methods, especially in the presence of measurement errors. In addition, the average achieved hemoglobin level from the MRHC is significantly closer to the target hemoglobin than that of the other two methods, according to an analysis of variance (ANOVA) statistical test. Furthermore, drug dosages recommended by the MRHC are more stable and accurate and reach the steady-state value notably faster than those generated by the other two methods. The proposed method is highly efficient for the control of hemoglobin levels and provides accurate dosage adjustments in the treatment of CKD anemia. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Bayesian analysis of time-series data under case-crossover designs: posterior equivalence and inference.

    PubMed

    Li, Shi; Mukherjee, Bhramar; Batterman, Stuart; Ghosh, Malay

    2013-12-01

    Case-crossover designs are widely used to study short-term exposure effects on the risk of acute adverse health events. While the frequentist literature on this topic is vast, there is no Bayesian work in this general area. The contribution of this paper is twofold. First, the paper establishes Bayesian equivalence results that require characterization of the set of priors under which the posterior distributions of the risk ratio parameters based on a case-crossover and time-series analysis are identical. Second, the paper studies inferential issues under case-crossover designs in a Bayesian framework. Traditionally, a conditional logistic regression is used for inference on risk-ratio parameters in case-crossover studies. We consider instead a more general full likelihood-based approach which makes less restrictive assumptions on the risk functions. Formulation of a full likelihood leads to growth in the number of parameters proportional to the sample size. We propose a semi-parametric Bayesian approach using a Dirichlet process prior to handle the random nuisance parameters that appear in a full likelihood formulation. We carry out a simulation study to compare the Bayesian methods based on full and conditional likelihood with the standard frequentist approaches for case-crossover and time-series analysis. The proposed methods are illustrated through the Detroit Asthma Morbidity, Air Quality and Traffic study, which examines the association between acute asthma risk and ambient air pollutant concentrations. © 2013, The International Biometric Society.

  5. The Case for Individualizing Behavior Management Approaches in Inclusive Classrooms

    ERIC Educational Resources Information Center

    Grossman, Herbert

    2005-01-01

    In today's heterogeneous classrooms, one-method-fits-all-students behavior management approaches are ineffective and often harmful. To succeed with all of their students, teachers should determine whether students have emotional disorders, conduct/behavior disorders, robust male-typical behavior patterns, culturally influenced behavior, learning…

  6. Linear time-dependent reference intervals where there is measurement error in the time variable-a parametric approach.

    PubMed

    Gillard, Jonathan

    2015-12-01

    This article re-examines parametric methods for the calculation of time specific reference intervals where there is measurement error present in the time covariate. Previous published work has commonly been based on the standard ordinary least squares approach, weighted where appropriate. In fact, this is an incorrect method when there are measurement errors present, and in this article, we show that the use of this approach may, in certain cases, lead to referral patterns that may vary with different values of the covariate. Thus, it would not be the case that all patients are treated equally; some subjects would be more likely to be referred than others, hence violating the principle of equal treatment required by the International Federation for Clinical Chemistry. We show, by using measurement error models, that reference intervals are produced that satisfy the requirement for equal treatment for all subjects. © The Author(s) 2011.
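
    To see why ordinary least squares misleads here, a sketch contrasting it with a Deming (measurement-error) regression fit follows, assuming the ratio of error variances is known; the data, parameter values, and the simple mean ± 1.96 SD interval construction are all illustrative simplifications.

```python
# Deming regression for a time-specific reference interval when the time
# covariate is measured with error. Assumption: lam = var(y errors)/var(x
# errors) is known; OLS on the same data would attenuate the slope.
import numpy as np

def deming_slope(x, y, lam=1.0):
    sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
    sxy = np.cov(x, y)[0, 1]
    d = syy - lam * sxx
    return (d + np.sqrt(d * d + 4 * lam * sxy ** 2)) / (2 * sxy)

rng = np.random.default_rng(6)
t_true = rng.uniform(10, 40, 300)             # true gestational age (weeks)
t_obs = t_true + rng.normal(0, 1.5, 300)      # observed age, with error
y = 2.0 + 0.5 * t_true + rng.normal(0, 1.5, 300)

b = deming_slope(t_obs, y, lam=1.0)
a = y.mean() - b * t_obs.mean()
resid_sd = np.std(y - (a + b * t_obs), ddof=2)
t0 = 30.0
lo, hi = a + b * t0 - 1.96 * resid_sd, a + b * t0 + 1.96 * resid_sd
print(f"95% reference interval at t={t0:.0f} weeks: ({lo:.2f}, {hi:.2f})")
```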

  7. Space-Based Identification of Archaeological Illegal Excavations and a New Automatic Method for Looting Feature Extraction in Desert Areas

    NASA Astrophysics Data System (ADS)

    Lasaponara, Rosa; Masini, Nicola

    2018-06-01

    The identification and quantification of disturbance of archaeological sites has generally been approached by visual inspection of optical aerial or satellite pictures. In this paper, we briefly summarize the state of the art of traditional satellite-based approaches to looting identification and propose a new automatic method for archaeological looting feature extraction (ALFEA). It is based on three steps: enhancement using spatial autocorrelation, unsupervised classification, and segmentation. ALFEA was applied to Google Earth images of two test areas selected in desert environs in Syria (Dura Europos) and in Peru (Cahuachi-Nasca). The reliability of ALFEA was assessed through field surveys in Peru and visual inspection for the Syrian case study. Results from the evaluation showed satisfactory performance in both test cases, with a success rate higher than 90%.
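
    A toy sketch of the three steps on a single-band array follows; the local-deviation enhancement is a simple proxy for the spatial autocorrelation statistic used by the authors, and the rule that the darkest cluster corresponds to looting pits is a site-specific assumption.

```python
# Three ALFEA-style steps: (1) local spatial enhancement, (2) unsupervised
# k-means classification, (3) segmentation into candidate looting features.
import numpy as np
from scipy.ndimage import uniform_filter, label
from sklearn.cluster import KMeans

def alfea_sketch(img, n_classes=3, win=5):
    local_mean = uniform_filter(img, size=win)
    local_var = uniform_filter(img ** 2, size=win) - local_mean ** 2
    enhanced = (img - local_mean) / np.sqrt(local_var + 1e-9)   # step 1

    labels = KMeans(n_clusters=n_classes, n_init=10).fit_predict(
        enhanced.reshape(-1, 1)).reshape(img.shape)             # step 2

    # Assumption: darkest cluster corresponds to pit shadows (site-specific)
    pit_class = np.argmin([img[labels == k].mean() for k in range(n_classes)])
    segments, n_segments = label(labels == pit_class)           # step 3
    return segments, n_segments

img = np.random.default_rng(3).normal(size=(64, 64))            # stand-in image
_, n = alfea_sketch(img)
print("candidate features:", n)
```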

  8. A comparative assessment of statistical methods for extreme weather analysis

    NASA Astrophysics Data System (ADS)

    Schlögl, Matthias; Laaha, Gregor

    2017-04-01

    Extreme weather exposure assessment is of major importance for scientists and practitioners alike. We compare different extreme value approaches and fitting methods with respect to their value for assessing extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series over the standardly used annual maxima series in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, results question the general assumption of the threshold excess approach (employing partial duration series, PDS) being superior to the block maxima approach (employing annual maxima series, AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas an opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may outperform the possible gain of information from including additional extreme events by far. This effect was neither visible from the square-root criterion, nor from standardly used graphical diagnosis (mean residual life plot), but from a direct comparison of AMS and PDS in synoptic quantile plots. We therefore recommend performing AMS and PDS approaches simultaneously in order to select the best suited approach. This will make the analyses more robust, in cases where threshold selection and dependency introduces biases to the PDS approach, but also in cases where the AMS contains non-extreme events that may introduce similar biases. For assessing the performance of extreme events we recommend conditional performance measures that focus on rare events only in addition to standardly used unconditional indicators. The findings of this study are of relevance for a broad range of environmental variables, including meteorological and hydrological quantities.
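
    The AMS/PDS contrast can be sketched on synthetic daily precipitation as follows; scipy fits by maximum likelihood (the study found L-moments more robust), and the fixed 99.5% quantile threshold stands in for the careful threshold selection the authors discuss.

```python
# Return levels from annual maxima (GEV) versus threshold excesses (GPD).
# Synthetic 30-year daily series; ML fitting via scipy as a simplification.
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(4)
daily = rng.gamma(shape=0.4, scale=8.0, size=365 * 30)   # mm/day, 30 years

# AMS / block maxima -> GEV
ams = daily.reshape(30, 365).max(axis=1)
c, loc, scale = genextreme.fit(ams)
T = 100.0
rl_ams = genextreme.isf(1.0 / T, c, loc, scale)

# PDS / threshold excesses -> GPD
u = np.quantile(daily, 0.995)                 # threshold choice matters!
exc = daily[daily > u] - u
lam = exc.size / 30.0                         # mean exceedances per year
c2, _, scale2 = genpareto.fit(exc, floc=0.0)
rl_pds = u + genpareto.isf(1.0 / (lam * T), c2, loc=0.0, scale=scale2)

print(f"100-yr return level  AMS/GEV: {rl_ams:.1f} mm   PDS/GPD: {rl_pds:.1f} mm")
```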

  9. BRST-BFV method for nonstationary systems

    NASA Astrophysics Data System (ADS)

    García, J. Antonio; Vergara, J. David; Urrutia, Luis F.

    1995-05-01

    Starting from an associated reparametrization-invariant action, the generalization of the BRST-BFV method to the case of nonstationary systems is constructed. The extension of the Batalin-Tyutin conversion approach is also considered in the nonstationary case. In order to illustrate these ideas, the propagator for the time-dependent two-dimensional rotor is calculated by reformulating the problem as a system with only first-class constraints and subsequently using the BRST-BFV prescription previously obtained.

  10. Comparative Issues and Methods in Organizational Diagnosis. Report 1

    DTIC Science & Technology

    1977-11-01

    …multiple discriminant [analysis] with a decision-tree approach and found that: (a) on single assignments the decision-tree approach showed a greater degree of … The Single Case … Toward Relevant Research … A Stock-Taking and Some Implications … The Three Methods in Detail … to the next. Judges are unreliable, in the sense that the same case might not be judged the same way twice in succession. Judges have stereotypes …

  11. A Rawlsian Approach to Distribute Responsibilities in Networks

    PubMed Central

    2009-01-01

    Due to their non-hierarchical structure, socio-technical networks are prone to the occurrence of the problem of many hands. In the present paper an approach is introduced in which people’s opinions on responsibility are empirically traced. The approach is based on the Rawlsian concept of Wide Reflective Equilibrium (WRE) in which people’s considered judgments on a case are reflectively weighed against moral principles and background theories, ideally leading to a state of equilibrium. Application of the method to a hypothetical case with an artificially constructed network showed that it is possible to uncover the relevant data to assess a consensus amongst people in terms of their individual WRE. It appeared that the moral background theories people endorse are not predictive for their actual distribution of responsibilities but that they indicate ways of reasoning and justifying outcomes. Two ways of ascribing responsibilities were discerned, corresponding to two requirements of a desirable responsibility distribution: fairness and completeness. Applying the method triggered learning effects, both with regard to conceptual clarification and moral considerations, and in the sense that it led to some convergence of opinions. It is recommended to apply the method to a real engineering case in order to see whether this approach leads to an overlapping consensus on a responsibility distribution which is justifiable to all and in which no responsibilities are left unfulfilled, therewith trying to contribute to the solution of the problem of many hands. PMID:19626463

  12. A comparison of approaches for finding minimum identifying codes on graphs

    NASA Astrophysics Data System (ADS)

    Horan, Victoria; Adachi, Steve; Bak, Stanley

    2016-05-01

    In order to formulate mathematical conjectures likely to be true, a number of base cases must be determined. However, many combinatorial problems are NP-hard and their computational complexity makes this research approach difficult using standard brute force on a typical computer. One sample problem explored is that of finding a minimum identifying code. To work around the computational issues, a variety of methods are explored, consisting of a parallel computing approach using MATLAB, an adiabatic quantum optimization approach using a D-Wave quantum annealing processor, and lastly satisfiability modulo theories (SMT) and corresponding SMT solvers. Each of these methods requires the problem to be formulated in a unique manner. In this paper, we address the challenges of computing solutions to this NP-hard problem with respect to each of these methods.
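
    A sketch of the SMT route using the z3 solver (pip install z3-solver): a minimum identifying code must dominate every closed neighbourhood and separate every pair of vertices; the 5-cycle graph here is an arbitrary example, not one of the paper's instances.

```python
# Minimum identifying code as an SMT optimization problem with z3.
# N[v] is the closed neighbourhood of v; x[v] says "v is in the code".
from z3 import Bool, If, Optimize, Or, Sum, is_true, sat
import itertools

adj = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}   # 5-cycle
N = {v: set(adj[v]) | {v} for v in adj}                          # closed nbhds

x = {v: Bool(f"x{v}") for v in adj}
opt = Optimize()
for v in adj:                                  # domination: N[v] meets the code
    opt.add(Or([x[u] for u in N[v]]))
for u, v in itertools.combinations(adj, 2):    # separation: symmetric difference
    opt.add(Or([x[w] for w in N[u] ^ N[v]]))
opt.minimize(Sum([If(x[v], 1, 0) for v in adj]))

assert opt.check() == sat
model = opt.model()
print(sorted(v for v in adj if is_true(model[x[v]])))
```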

  13. The "Push-Pull" Approach to Fast-Track Management Development: A Case Study in Scientific Publishing

    ERIC Educational Resources Information Center

    Fojt, Martin; Parkinson, Stephen; Peters, John; Sandelands, Eric

    2008-01-01

    Purpose: The purpose of this paper is to explore how a medium sized business has addressed what it has termed a "push-pull" method of management and organization development, based around an action learning approach. Design/methodology/approach: The paper sets out a methodology that other SMEs might look to replicate in their management and…

  14. Chronic total occlusion in ostium of right coronary artery – retrograde approach as the first-choice method of revascularization?

    PubMed Central

    Kameczura, Tomasz; Surowiec, Sławomir; Januś, Bogdan; Derlaga, Bogusław; Dudek, Dariusz; Czarnecka, Danuta

    2013-01-01

    Recanalization of chronic total occlusion (CTO) located in the ostium may require the operator's ability to use the retrograde approach. We present a case of opening a chronically occluded right coronary artery (RCA) by the retrograde approach after an unsuccessful attempt of recanalization by classic antegrade technique. PMID:24570749

  15. Automated Surgical Approach Planning for Complex Skull Base Targets: Development and Validation of a Cost Function and Semantic Atlas.

    PubMed

    Aghdasi, Nava; Whipple, Mark; Humphreys, Ian M; Moe, Kris S; Hannaford, Blake; Bly, Randall A

    2018-06-01

    Successful multidisciplinary treatment of skull base pathology requires precise preoperative planning. Current surgical approach (pathway) selection for these complex procedures depends on an individual surgeon's experience and background training. Because of anatomical variation in both normal tissue and pathology (eg, tumor), a successful surgical pathway used on one patient is not necessarily the best approach for another patient. The question is how to define and obtain optimized patient-specific surgical approach pathways. In this article, we demonstrate that the surgeon's knowledge and decision making in preoperative planning can be modeled by a multiobjective cost function in a retrospective analysis of actual complex skull base cases. Two different approaches, the weighted-sum method and Pareto optimality, were used with a defined cost function to derive optimized surgical pathways based on preoperative computed tomography (CT) scans and manually designated pathology. With the first method, the surgeon's preferences were input as a set of weights for each objective before the search. In the second approach, the surgeon's preferences were used to select a surgical pathway from the computed Pareto-optimal set. Using preoperative CT and magnetic resonance imaging, the patient-specific surgical pathways derived by these methods were similar (85% agreement) to the actual approaches performed on patients. In the one case where the actual surgical approach differed, revision surgery was required and was performed utilizing the computationally derived approach pathway.
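
    The two selection strategies can be sketched on made-up candidate pathway costs (lower is better in every column); the objectives and weights below are placeholders, not the paper's cost terms.

```python
# Weighted-sum scalarization versus Pareto filtering over candidate pathways.
import numpy as np

# rows: candidate pathways; columns: e.g. tool distance, tissue risk, angle
costs = np.array([
    [0.2, 0.9, 0.3],
    [0.5, 0.4, 0.4],
    [0.8, 0.2, 0.6],
    [0.4, 0.5, 0.2],
])

# (a) Weighted sum: preferences enter as weights before the search
w = np.array([0.5, 0.3, 0.2])
best_ws = int(np.argmin(costs @ w))

# (b) Pareto optimality: keep non-dominated pathways, choose afterwards
def pareto_mask(C):
    n = len(C)
    dominated = [
        any(np.all(C[j] <= C[i]) and np.any(C[j] < C[i]) for j in range(n))
        for i in range(n)
    ]
    return ~np.array(dominated)

print("weighted-sum pick:", best_ws)
print("Pareto-optimal set:", np.flatnonzero(pareto_mask(costs)))
```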

  16. Application of the critical pathway and integrated case teaching method to nursing orientation.

    PubMed

    Goodman, D

    1997-01-01

    Nursing staff development programs must be responsive to current changes in healthcare. New nursing staff must be prepared to manage continuous change and to function competently in clinical practice. The orientation pathway, based on a case management model, is used as a structure for the orientation phase of staff development, and the integrated case is incorporated as a teaching strategy within it. The integrated case method is based on discussion and analysis of patient situations, with emphasis on role modeling and the integration of theory and skill. Educators, preceptors and orientees find the structure provided by the orientation pathway very useful. Orientation that is developed, implemented, and evaluated on this case management model provides a standardized structure for orienting new staff; the approach is designed for the adult learner, promotes conceptual reasoning, and encourages the social and contextual basis for continued learning.

  17. The preparedness level of final year medical students for an adequate medical approach to emergency cases: computer-based medical education in emergency medicine

    PubMed Central

    2014-01-01

    Background: We aimed to assess the preparedness of final-year medical students for approaching emergency cases using computer-based simulation training and to evaluate the efficacy of the program. Methods: A computer-based prototype simulation program (Lsim), designed by researchers from the medical education and computer science departments, was used to present virtual cases for medical learning. Fifty-four final-year medical students from Ondokuz Mayis University School of Medicine attended an education program on June 20, 2012 and were trained with Lsim. Volunteer attendees completed pre-test and post-test exams at the beginning and end of the course, respectively, on the same day. Results: Twenty-nine of the 54 students who attended the course agreed to take the pre-test and post-test exams; 58.6% (n = 17) were female. Across 10 emergency medical cases, an average of 3.9 correct medical approaches were performed in the pre-test and an average of 9.6 in the post-test (t = 17.18, P = 0.006). Conclusions: This study's results showed that the students' readiness for an adequate medical approach to emergency cases was very low. Computer-based training could help students approach various emergency cases adequately. PMID:24386919

  18. On the equivalence of case-crossover and time series methods in environmental epidemiology.

    PubMed

    Lu, Yun; Zeger, Scott L

    2007-04-01

    The case-crossover design was introduced in epidemiology 15 years ago as a method for studying the effects of a risk factor on a health event using only cases. The idea is to compare a case's exposure immediately prior to or during the case-defining event with that same person's exposure at otherwise similar "reference" times. An alternative approach to the analysis of daily exposure and case-only data is time series analysis. Here, log-linear regression models express the expected total number of events on each day as a function of the exposure level and potential confounding variables. In time series analyses of air pollution, smooth functions of time and weather are the main confounders. Time series and case-crossover methods are often viewed as competing methods. In this paper, we show that case-crossover using conditional logistic regression is a special case of time series analysis when there is a common exposure such as in air pollution studies. This equivalence provides computational convenience for case-crossover analyses and a better understanding of time series models. Time series log-linear regression accounts for overdispersion of the Poisson variance, while case-crossover analyses typically do not. This equivalence also permits model checking for case-crossover data using standard log-linear model diagnostics.
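
    A sketch of the common-exposure equivalence: with daily counts and a shared exposure series, the time-stratified case-crossover likelihood is the multinomial (conditional Poisson) likelihood below, maximized directly; the data are simulated with a true log relative risk of 0.3, and the 28-day strata are an illustrative choice.

```python
# Conditional (case-crossover) likelihood for a shared daily exposure:
# each case day is compared with reference days in the same stratum.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(5)
days = 365
exposure = rng.normal(size=days)                    # standardized daily exposure
strata = np.arange(days) // 28                      # ~monthly reference windows
counts = rng.poisson(np.exp(0.1 + 0.3 * exposure))  # daily event counts

def neg_cond_loglik(beta):
    ll = 0.0
    for s in np.unique(strata):
        m = strata == s
        x, n = exposure[m], counts[m]
        # each case contributes log[exp(b*x_case) / sum_ref exp(b*x_ref)]
        ll += beta * np.dot(n, x) - n.sum() * np.log(np.exp(beta * x).sum())
    return -ll

fit = minimize_scalar(neg_cond_loglik)
print("estimated log relative risk:", round(fit.x, 3))   # truth: 0.3
```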

  19. Adjusting case mix payment amounts for inaccurately reported comorbidity data.

    PubMed

    Sutherland, Jason M; Hamm, Jeremy; Hatcher, Jeff

    2010-03-01

    Case mix methods such as diagnosis related groups have become a basis of payment for inpatient hospitalizations in many countries. Specifying cost weight values for case mix system payment has important consequences; recent evidence suggests case mix cost weight inaccuracies influence the supply of some hospital-based services. To begin to address the question of case mix cost weight accuracy, this paper is motivated by the objective of improving the accuracy of cost weight values due to inaccurate or incomplete comorbidity data. The methods are suitable to case mix methods that incorporate disease severity or comorbidity adjustments. The methods are based on the availability of detailed clinical and cost information linked at the patient level and leverage recent results from clinical data audits. A Bayesian framework is used to synthesize clinical data audit information regarding misclassification probabilities into cost weight value calculations. The models are implemented through Markov chain Monte Carlo methods. An example used to demonstrate the methods finds that inaccurate comorbidity data affects cost weight values by biasing cost weight values (and payments) downward. The implications for hospital payments are discussed and the generalizability of the approach is explored.
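
    A minimal non-Bayesian sketch of the correction idea follows; the paper embeds audit-based misclassification probabilities in an MCMC framework, whereas here a simple Rogan-Gladen prevalence correction with invented numbers stands in.

```python
# Correcting a recorded comorbidity rate for misclassification, then
# recomputing a cost weight as a prevalence-weighted mean cost. Sensitivity,
# specificity, and costs are illustrative; a Bayesian version would place
# priors on them and sample via MCMC.
def rogan_gladen(p_obs, sens, spec):
    """True prevalence from observed prevalence (Rogan-Gladen estimator)."""
    return (p_obs + spec - 1.0) / (sens + spec - 1.0)

p_obs = 0.18                       # recorded comorbidity rate in claims data
sens, spec = 0.70, 0.98            # audit-based misclassification estimates
p_true = rogan_gladen(p_obs, sens, spec)

cost_with, cost_without = 14500.0, 9800.0          # mean costs per group
weight_obs = p_obs * cost_with + (1 - p_obs) * cost_without
weight_adj = p_true * cost_with + (1 - p_true) * cost_without
print(f"observed prevalence {p_obs:.2f} -> corrected {p_true:.2f}")
print(f"cost weight {weight_obs:.0f} -> adjusted {weight_adj:.0f}")
```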

  20. Unilateral subfrontal approach to anterior communicating artery aneurysms: A review of 28 patients

    PubMed Central

    Petraglia, Anthony L.; Srinivasan, Vasisht; Moravan, Michael J.; Coriddi, Michelle; Jahromi, Babak S.; Vates, G Edward; Maurer, Paul K.

    2011-01-01

    Background: The pterional approach is the most common for AComm aneurysms, but we present a unilateral approach to a midline region for addressing the AComm complex. The pure subfrontal approach eliminates the lateral anatomic dissection requirements without sacrificing exposure. The subfrontal approach is not favored in the US compared to Asia and Europe. We describe our experience with the subfrontal approach for AComm aneurysms treated at a single institution. Methods: We identified 28 patients treated for AComm aneurysms through the subfrontal approach. Patient records and imaging studies were reviewed. Demographics and case data, as well as clinical outcome at 6 weeks and 1 year were collected. Results: Mean patient age was 48 (range 21–75) years and 64% suffered subarachnoid hemorrhage (SAH). All aneurysms were successfully clipped. Gyrus rectus was resected in 57% of cases, more commonly in ruptured cases. Intraoperative rupture occurred in 11% of cases. The average operative time was 171 minutes. There were two patient deaths. Ninety-two percent of patients had a Glasgow Outcome Scale (GOS) of 5 at 6 weeks. All unruptured patients had a GOS of 5. At 12 months, 96% of all patients had a GOS of 5. Conclusions: The subfrontal approach provides an efficient avenue to the AComm region, which reduces opening and closing friction but still yields a comprehensive operative window for access to the anterior communicating region. PMID:22059119

  1. The impact of case specificity and generalisable skills on clinical performance: a correlated traits-correlated methods approach.

    PubMed

    Wimmers, Paul F; Fung, Cha-Chi

    2008-06-01

    The finding of case or content specificity in medical problem solving moved the focus of research away from generalisable skills towards the importance of content knowledge. However, controversy about the content dependency of clinical performance and the generalisability of skills remains. This study aimed to explore the relative impact of both perspectives (case specificity and generalisable skills) on different components (history taking, physical examination, communication) of clinical performance within and across cases. Data from a clinical performance examination (CPX) taken by 350 Year 3 students were used in a correlated traits-correlated methods (CTCM) approach using confirmatory factor analysis, whereby 'traits' refers to generalisable skills and 'methods' to individual cases. The baseline CTCM model was analysed and compared with four nested models using structural equation modelling techniques. The CPX consisted of three skills components and five cases. Comparison of the four different models with the least-restricted baseline CTCM model revealed that a model with uncorrelated generalisable skills factors and correlated case-specific knowledge factors represented the data best. The generalisable processes found in history taking, physical examination and communication were responsible for half the explained variance, in comparison with the variance related to case specificity. Conclusions: Pure knowledge-based and pure skill-based perspectives on clinical performance both seem too one-dimensional, and new evidence supports the idea that a substantial amount of variance contributes to both aspects of performance. It could be concluded that generalisable skills and specialised knowledge go hand in hand: both are essential aspects of clinical performance.

  2. [Microsurgical removal of olfactory groove meningiomas].

    PubMed

    Liang, Ri-Sheng; Zhou, Liang-Fu; Mao, Ying; Zhang, Rong; Yang, Wei-Zhong

    2011-01-01

    To explore an effective method for further improving the surgical results of treatment of olfactory groove meningiomas. Sixty-seven cases of olfactory groove meningioma were treated by microneurosurgery; fifty-seven were de novo cases, eight were recurrent tumors, and two were re-recurrent. A modified Derome approach was used in 12 cases, a bilateral subfrontal approach in 28, a modified pterional approach in 21, and a unilateral subfrontal approach in six. Tumors were resected microsurgically with radical removal of invaded dura, bone, and paranasal sinus mucosa. Reconstruction was performed in patients with a skull base defect. Simpson grade I removal was accomplished in 59 cases, grade II in seven, and grade IV in one. Among the 57 patients with de novo tumors, Simpson grade I resection was accomplished in 54. Postoperative rhinorrhea with intracranial infection occurred in one case and was cured after temporary lumbar CSF drainage and antibiotic therapy. Two patients (2.9%) died within one month after operation: one elderly patient of heart failure and the other of a severe hypothalamic complication. Forty-seven patients (72.3%) were followed up for one to ten years, with an average of five years and four months. Excluding the two deaths, only three of the 45 surviving patients had tumor recurrence, all of whom had undergone Simpson grade II or IV resection; no recurrence was found after Simpson grade I removal. Pre-existing blurred vision did not improve in three patients and hemiparesis in two; the other patients recovered well, resuming their previous jobs or being able to take care of themselves. Total tumor removal (Simpson grade I) should be the surgical goal in treatment of olfactory groove meningiomas, especially for de novo cases. An appropriate approach is fundamental to removing an OGM totally, and appropriate anterior skull base reconstruction with vascularized material is mandatory.

  3. Strategies for Research Development in Hospital Social Work: A Case Study

    ERIC Educational Resources Information Center

    McNeill, Ted; Nicholas, David Bruce

    2012-01-01

    Objectives: This article identifies salient components in the advancement of social work research leadership within health care. Method: Using tenets of a modified retrospective case study approach, processes and outcomes of social work research progression at a pediatric hospital are reviewed. Results: Capacity-building processes were…

  4. Routines in School Organizations: Creating Stability and Change

    ERIC Educational Resources Information Center

    Conley, Sharon; Enomoto, Ernestine K.

    2005-01-01

    Purpose: This paper presents routinized action theory as a way to examine the regular, habitual activities that occur in school organizations. Using this theoretical lens, school routines were analyzed in order to understand organizational stability and change. Design/methodology/approach: Using case study methods, three discrete cases are…

  5. Ethics: A Bridge for Studying the Social Contexts of Professional Communication.

    ERIC Educational Resources Information Center

    Speck, Bruce W.

    1989-01-01

    Describes a method for helping students evaluate ethical issues in a systematic way, based on Lawrence Kohlberg's stages of moral development. Recommends the case-study approach for creating social constructs in which students face ethical dilemmas, and outlines a case-study ethics unit using Kohlberg's model. (MM)

  6. Leyla and Mahmood--Emotions in Social Science Education

    ERIC Educational Resources Information Center

    Blennow, Katarina

    2018-01-01

    Purpose: The paper explores what emotions do in social science education through two specific cases and discusses the relation between emotion and politicization in the subject education. Method/approach: The cases are selected from an on-going dissertation project that uses interviews, video and observations in examining how social science…

  7. Attributing Responsibility for Child Maltreatment when Domestic Violence Is Present

    ERIC Educational Resources Information Center

    Landsman, Miriam J.; Hartley, Carolyn Copps

    2007-01-01

    Objective: The purpose of this study was to examine factors that influence how child welfare workers attribute responsibility for child maltreatment and child safety in cases involving domestic violence. Methods: The study used a factorial survey approach, combining elements of survey research with an experimental design. Case vignettes were…

  8. The Palatal Approach to Distraction Osteogenesis of the Anterior Maxillary Alveolus.

    PubMed

    Bell, Robert E

    2015-07-01

    This report describes the palatal approach to gain access for osteodistraction of the anterior maxilla to improve the vector of force during distraction. This case report illustrates a novel approach to anterior maxillary osteodistraction. The palatal approach allows the maxillary segment to be moved anteriorly and inferiorly. This is in contrast to the buccal approach, in which the palatal tissue creates a vector of force toward the palate. The vascular pedicle for the transport segment is the labial mucosa and musculature. In the present case, the alveolar segment was advanced 3.6 mm anteriorly and 12.2 mm inferiorly as measured by pre- and postoperative computed tomograms. This patient with a large vertical alveolar defect and high smile line had successful restoration with dental implants. The result has been stable for 14 months. In this case, the palatal approach to the anterior maxillary osteotomy was shown to be an effective method of reconstructing a large vertical anterior defect. Copyright © 2015 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  9. Implementation of a finite-amplitude method in a relativistic meson-exchange model

    NASA Astrophysics Data System (ADS)

    Sun, Xuwei; Lu, Dinghui

    2017-08-01

    The finite-amplitude method (FAM) is a feasible numerical approach to large-scale random phase approximation (RPA) calculations. It avoids the storage and calculation of residual interaction elements as well as the diagonalization of the RPA matrix, which become prohibitive when the configuration space is huge. In this work we implemented a finite-amplitude method in a relativistic meson-exchange mean-field model with axial symmetry. The direct variation approach makes our FAM scheme capable of being extended to the multipole excitation case.

  10. Pedagogical Approaches to Develop Critical Thinking and Crisis Leadership

    ERIC Educational Resources Information Center

    Powley, Edward H.; Taylor, Scott N.

    2014-01-01

    Management schools must be prepared to aid leaders and managers to succeed in uncertain environments. We offer two approaches, each designed for critical thinking skill development, to teach graduate management students about leading in and through potential disruption to organizational life. First, we present a personalized case method that…

  11. An abstraction layer for efficient memory management of tabulated chemistry and flamelet solutions

    NASA Astrophysics Data System (ADS)

    Weise, Steffen; Messig, Danny; Meyer, Bernd; Hasse, Christian

    2013-06-01

    A large number of methods for simulating reactive flows exist; some, for example, directly use detailed chemical kinetics, while others use precomputed and tabulated flame solutions. Both approaches couple the research fields of computational fluid dynamics and chemistry tightly together, using either an online or an offline approach to solve the chemistry domain. The offline approach usually involves a method of generating databases or so-called Lookup-Tables (LUTs). As these LUTs are extended to contain not only material properties but also interactions between chemistry and turbulent flow, the number of parameters and thus dimensions increases. Given a reasonable discretisation, file sizes can increase drastically. The main goal of this work is to provide methods that handle large database files efficiently. A Memory Abstraction Layer (MAL) has been developed that handles requested LUT entries efficiently by splitting the database file into several smaller blocks. It keeps the total memory usage at a minimum using thin allocation methods and compression to minimise filesystem operations. The MAL has been evaluated using three different test cases. The first, rather generic, one is a sequential reading operation on an LUT to evaluate the runtime behaviour as well as the memory consumption of the MAL. The second test case is a simulation of a non-premixed turbulent flame, the so-called HM1 flame, which is a well-known test case in the turbulent combustion community. The third test case is a simulation of a non-premixed laminar flame as described by McEnally in 1996 and Bennett in 2000. Using the previously developed solver 'flameletFoam' in conjunction with the MAL, the memory consumption and the performance penalty introduced were studied. The total memory used while running a parallel simulation was reduced significantly while the CPU time overhead associated with the MAL remained low.
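
    The block-splitting idea behind the MAL can be illustrated compactly. The sketch below is a hedged illustration, not the paper's implementation: a lookup table split into fixed-size blocks that are memory-mapped on demand and held in a small LRU cache. All class and parameter names are assumptions, and the thin-allocation and compression layers of the actual MAL are omitted.

```python
# Minimal sketch (not the paper's implementation) of a block-based LUT
# cache in the spirit of the Memory Abstraction Layer described above.
# Thin allocation and compression are omitted; names are illustrative.
from collections import OrderedDict
import numpy as np

class BlockedLUT:
    def __init__(self, path, n_cols, block_rows=1024, max_blocks=8):
        self.path, self.n_cols = path, n_cols
        self.block_rows, self.max_blocks = block_rows, max_blocks
        self.cache = OrderedDict()      # block index -> ndarray, LRU order

    def _load_block(self, b):
        # Memory-map only the requested slice of the file (float64 entries);
        # handling of a final partial block is omitted for brevity.
        mm = np.memmap(self.path, dtype=np.float64, mode='r',
                       shape=(self.block_rows, self.n_cols),
                       offset=b * self.block_rows * self.n_cols * 8)
        return np.array(mm)             # copy the block into RAM

    def row(self, i):
        b = i // self.block_rows
        if b not in self.cache:
            if len(self.cache) >= self.max_blocks:
                self.cache.popitem(last=False)   # evict least recently used
            self.cache[b] = self._load_block(b)
        self.cache.move_to_end(b)                # mark block as recently used
        return self.cache[b][i % self.block_rows]
```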

  12. Semiparametric time varying coefficient model for matched case-crossover studies.

    PubMed

    Ortega-Villa, Ana Maria; Kim, Inyoung; Kim, H

    2017-03-15

    In matched case-crossover studies, it is generally accepted that the covariates on which a case and associated controls are matched cannot exert a confounding effect on independent predictors included in the conditional logistic regression model, because any stratum effect is removed by conditioning on the fixed number of sets of the case and controls in the stratum. Hence, the conditional logistic regression model is not able to detect any effects associated with the matching covariates by stratum. However, some matching covariates, such as time, often play an important role as effect modifiers, and ignoring them leads to incorrect statistical estimation and prediction. Therefore, we propose three approaches to evaluate effect modification by time. The first is a parametric approach, the second is a semiparametric penalized approach, and the third is a semiparametric Bayesian approach. Our parametric approach is a two-stage method, which uses conditional logistic regression in the first stage and then estimates a polynomial regression in the second stage. Our semiparametric penalized and Bayesian approaches are one-stage approaches developed using regression splines. Our semiparametric one-stage approaches allow us not only to detect the parametric relationship between the predictor and binary outcomes, but also to evaluate nonparametric relationships between the predictor and time. We demonstrate the advantage of our semiparametric one-stage approaches using both a simulation study and an epidemiological example of a 1-4 bi-directional case-crossover study of childhood aseptic meningitis with drinking water turbidity. We also provide statistical inference for the semiparametric Bayesian approach using Bayes factors. Copyright © 2016 John Wiley & Sons, Ltd.

  13. Correcting for batch effects in case-control microbiome studies

    PubMed Central

    Gibbons, Sean M.; Duvallet, Claire

    2018-01-01

    High-throughput data generation platforms, such as mass spectrometry, microarrays, and second-generation sequencing, are susceptible to batch effects due to run-to-run variation in reagents, equipment, protocols, or personnel. Currently, batch correction methods are not commonly applied to microbiome sequencing datasets. In this paper, we compare different batch-correction methods applied to microbiome case-control studies. We introduce a model-free normalization procedure in which features (i.e., bacterial taxa) in case samples are converted to percentiles of the equivalent features in control samples within a study prior to pooling data across studies. We examine how this percentile-normalization method compares to traditional meta-analysis methods for combining independent p-values and to limma and ComBat, widely used batch-correction models developed for RNA microarray data. Overall, we show that percentile-normalization is a simple, non-parametric approach for correcting batch effects and improving sensitivity in case-control meta-analyses. PMID:29684016
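
    The percentile-normalization procedure is simple enough to sketch directly. The following Python fragment is a hedged illustration of the idea as described in the abstract (convert each case sample's taxon abundance to its percentile among the control samples of the same study before pooling); the function name and array layout are assumptions, not the authors' code.

```python
# Hedged sketch of percentile-normalization: each case sample's feature
# value is replaced by its percentile within the control samples of the
# same study before pooling across studies.
import numpy as np
from scipy.stats import percentileofscore

def percentile_normalize(cases, controls):
    """cases, controls: (n_samples, n_taxa) relative-abundance arrays."""
    out = np.empty_like(cases, dtype=float)
    for j in range(cases.shape[1]):           # one bacterial taxon at a time
        for i in range(cases.shape[0]):
            out[i, j] = percentileofscore(controls[:, j], cases[i, j])
    return out                                # percentiles in [0, 100]
```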

  14. Application of electrical geophysics to the release of water resources, case of Ain Leuh (Morocco)

    NASA Astrophysics Data System (ADS)

    Zitouni, A.; Boukdir, A.; El Fjiji, H.; Baite, W.; Ekouele Mbaki, V. R.; Ben Said, H.; Echakraoui, Z.; Elissami, A.; El Maslouhi, M. R.

    2018-05-01

    Given the increasing need for water in our country for domestic, industrial, and agricultural uses, prospecting for groundwater by classical geological and hydrogeological methods remains inapplicable in regions where boreholes or reconnaissance soundings are not available in sufficient number. In such cases, geophysical prospecting methods such as nuclear magnetic resonance (NMR) and ground-penetrating radar are the most commonly used, because they have shown very decisive results worldwide in projects for prospecting and evaluating groundwater resources. The present work, which concerns only the electrical resistivity methodology, presents the adopted methodological approach and a case study of its application in the Ajdir plateau of Ain Leuh.

  15. On the precision of quasi steady state assumptions in stochastic dynamics

    NASA Astrophysics Data System (ADS)

    Agarwal, Animesh; Adams, Rhys; Castellani, Gastone C.; Shouval, Harel Z.

    2012-07-01

    Many biochemical networks have complex multidimensional dynamics, and there is a long history of methods used for dimensionality reduction of such reaction networks. Usually a deterministic mass action approach is used; however, in small volumes there are significant fluctuations from the mean which the mass action approach cannot capture. In such cases stochastic simulation methods should be used. In this paper, we evaluate the applicability of one such dimensionality reduction method, the quasi-steady state approximation (QSSA) [L. Michaelis and M. Menten, "Die Kinetik der Invertinwirkung," Biochem. Z. 49, 333-369 (1913)], in the case of stochastic dynamics. First, the applicability of the QSSA approach is evaluated for a canonical system of enzyme reactions. Application of the QSSA to such a reaction system in a deterministic setting leads to Michaelis-Menten reduced kinetics, which can be used to derive the equilibrium concentrations of the reaction species. In the case of stochastic simulations, however, the steady state is characterized by fluctuations around the mean equilibrium concentration. Our analysis shows that a QSSA-based approach for dimensionality reduction captures well the mean of the distribution as obtained from a full-dimensional simulation but fails to accurately capture the distribution around that mean. Moreover, the QSSA approximation is not unique. We have then extended the analysis to a simple bistable biochemical network model proposed to account for the stability of synaptic efficacies, the substrate of learning and memory [J. E. Lisman, "A mechanism of memory storage insensitive to molecular turnover: A bistable autophosphorylating kinase," Proc. Natl. Acad. Sci. U.S.A. 82, 3055-3057 (1985); doi:10.1073/pnas.82.9.3055]. Our analysis shows that a QSSA-based dimensionality reduction method results in errors as big as two orders of magnitude in predicting the residence times in the two stable states.
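
    To make the comparison concrete, the sketch below simulates the canonical enzyme system E + S <-> ES -> E + P with Gillespie's stochastic simulation algorithm, the kind of full-dimensional stochastic run against which a QSSA (Michaelis-Menten) reduction is judged; all rate constants and copy numbers are arbitrary example values.

```python
# Illustrative Gillespie simulation of E + S <-> ES -> E + P, the full
# stochastic model that a QSSA reduction approximates. Parameters are
# arbitrary example values, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
k1, km1, k2 = 0.01, 0.1, 0.1      # association, dissociation, catalysis
E, S, ES, P = 10, 100, 0, 0
t, t_end = 0.0, 500.0

while t < t_end:
    rates = np.array([k1 * E * S, km1 * ES, k2 * ES])
    total = rates.sum()
    if total == 0:
        break
    t += rng.exponential(1.0 / total)         # waiting time to next reaction
    r = rng.choice(3, p=rates / total)        # which reaction fires
    if r == 0:   E, S, ES = E - 1, S - 1, ES + 1
    elif r == 1: E, S, ES = E + 1, S + 1, ES - 1
    else:        E, ES, P = E + 1, ES - 1, P + 1

# Under the QSSA, the mean production rate is approximated by
# v = k2 * E_tot * S / (Km + S) with Km = (km1 + k2) / k1; the stochastic
# run above also exhibits the fluctuations this reduction cannot capture.
print(f"final counts: S={S}, P={P}")
```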

  16. An empirical method that separates irreversible stem radial growth from bark water content changes in trees: theory and case studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mencuccini, Maurizio; Salmon, Yann; Mitchell, Patrick

    Substantial uncertainty surrounds our knowledge of tree stem growth, with some of the most basic questions, such as when stem radial growth occurs through the daily cycle, still unanswered. Here, we employed high-resolution point dendrometers, sap flow sensors, and developed theory and statistical approaches, to devise a novel method separating irreversible radial growth from elastic tension-driven and elastic osmotically driven changes in bark water content. We tested this method using data from five case study species. Experimental manipulations, namely a field irrigation experiment on Scots pine and a stem girdling experiment on red forest gum trees, were used to validate the theory. Time courses of stem radial growth following irrigation and stem girdling were consistent with a-priori predictions. Patterns of stem radial growth varied across case studies, with growth occurring during the day and/or night, consistent with the available literature. Importantly, our approach provides a valuable alternative to existing methods, as it can be approximated by a simple empirical interpolation routine that derives irreversible radial growth using standard regression techniques. In conclusion, our novel method provides an improved understanding of the relative source–sink carbon dynamics of tree stems at a sub-daily time scale.

  17. An empirical method that separates irreversible stem radial growth from bark water content changes in trees: theory and case studies.

    PubMed

    Mencuccini, Maurizio; Salmon, Yann; Mitchell, Patrick; Hölttä, Teemu; Choat, Brendan; Meir, Patrick; O'Grady, Anthony; Tissue, David; Zweifel, Roman; Sevanto, Sanna; Pfautsch, Sebastian

    2017-02-01

    Substantial uncertainty surrounds our knowledge of tree stem growth, with some of the most basic questions, such as when stem radial growth occurs through the daily cycle, still unanswered. We employed high-resolution point dendrometers, sap flow sensors, and developed theory and statistical approaches, to devise a novel method separating irreversible radial growth from elastic tension-driven and elastic osmotically driven changes in bark water content. We tested this method using data from five case study species. Experimental manipulations, namely a field irrigation experiment on Scots pine and a stem girdling experiment on red forest gum trees, were used to validate the theory. Time courses of stem radial growth following irrigation and stem girdling were consistent with a-priori predictions. Patterns of stem radial growth varied across case studies, with growth occurring during the day and/or night, consistent with the available literature. Importantly, our approach provides a valuable alternative to existing methods, as it can be approximated by a simple empirical interpolation routine that derives irreversible radial growth using standard regression techniques. Our novel method provides an improved understanding of the relative source-sink carbon dynamics of tree stems at a sub-daily time scale. © 2016 The Authors Plant, Cell & Environment Published by John Wiley & Sons Ltd.

  18. An empirical method that separates irreversible stem radial growth from bark water content changes in trees: theory and case studies

    DOE PAGES

    Mencuccini, Maurizio; Salmon, Yann; Mitchell, Patrick; ...

    2017-11-12

    Substantial uncertainty surrounds our knowledge of tree stem growth, with some of the most basic questions, such as when stem radial growth occurs through the daily cycle, still unanswered. Here, we employed high-resolution point dendrometers, sap flow sensors, and developed theory and statistical approaches, to devise a novel method separating irreversible radial growth from elastic tension-driven and elastic osmotically driven changes in bark water content. We tested this method using data from five case study species. Experimental manipulations, namely a field irrigation experiment on Scots pine and a stem girdling experiment on red forest gum trees, were used to validate the theory. Time courses of stem radial growth following irrigation and stem girdling were consistent with a-priori predictions. Patterns of stem radial growth varied across case studies, with growth occurring during the day and/or night, consistent with the available literature. Importantly, our approach provides a valuable alternative to existing methods, as it can be approximated by a simple empirical interpolation routine that derives irreversible radial growth using standard regression techniques. In conclusion, our novel method provides an improved understanding of the relative source–sink carbon dynamics of tree stems at a sub-daily time scale.

  19. Mismatch removal via coherent spatial relations

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Ma, Jiayi; Yang, Changcai; Tian, Jinwen

    2014-07-01

    We propose a method for removing mismatches from given putative point correspondences in image pairs based on "coherent spatial relations." Under the Bayesian framework, we formulate our approach as a maximum likelihood problem and solve for a coherent spatial relation between the putative point correspondences using an expectation-maximization (EM) algorithm. Our approach associates each point correspondence with a latent variable indicating it as being either an inlier or an outlier, and alternately estimates the inlier set and recovers the coherent spatial relation. It can handle not only the case of image pairs with rigid motions but also the case of image pairs with nonrigid motions. To parameterize the coherent spatial relation, we choose two-view geometry and thin-plate splines as models for the rigid and nonrigid cases, respectively. The mismatches can be successfully removed via the coherent spatial relations after the EM algorithm converges. Quantitative results on various experimental data demonstrate that our method outperforms many state-of-the-art methods; it is not affected by low initial correct-match percentages and is robust to most geometric transformations, including large viewing angles, image rotation, and affine transformation.
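
    A stripped-down version of such an EM scheme can be sketched as follows. For simplicity the coherent spatial relation is taken to be an affine map rather than two-view geometry or a thin-plate spline, and the outlier density and initial values are arbitrary assumptions; the sketch only mirrors the latent inlier/outlier structure described above.

```python
# Hedged EM sketch with a latent inlier/outlier label per correspondence:
# inliers follow a Gaussian residual model around a fitted affine map,
# outliers a uniform density. Not the paper's parameterization.
import numpy as np

def em_mismatch_removal(X, Y, n_iter=30, outlier_density=1e-4):
    """X, Y: (n, 2) arrays of putative point correspondences."""
    n = len(X)
    Xh = np.hstack([X, np.ones((n, 1))])       # homogeneous source points
    gamma = np.full(n, 0.9)                    # initial inlier responsibilities
    pi, sigma2 = 0.9, 1.0
    for _ in range(n_iter):
        # M-step: weighted least squares for the affine map, then noise/mixing.
        w = np.sqrt(gamma)[:, None]
        A, *_ = np.linalg.lstsq(Xh * w, Y * w, rcond=None)
        resid2 = ((Y - Xh @ A) ** 2).sum(axis=1)
        sigma2 = (gamma * resid2).sum() / (2 * gamma.sum())
        pi = gamma.mean()
        # E-step: posterior probability that each correspondence is an inlier.
        lik_in = pi * np.exp(-resid2 / (2 * sigma2)) / (2 * np.pi * sigma2)
        gamma = lik_in / (lik_in + (1 - pi) * outlier_density)
    return gamma > 0.5                         # boolean inlier mask
```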

  20. Genotype-Based Association Mapping of Complex Diseases: Gene-Environment Interactions with Multiple Genetic Markers and Measurement Error in Environmental Exposures

    PubMed Central

    Lobach, Iryna; Fan, Ruzong; Carroll, Raymond J.

    2011-01-01

    With the advent of dense single nucleotide polymorphism genotyping, population-based association studies have become the major tools for identifying human disease genes and for fine gene mapping of complex traits. We develop a genotype-based approach for association analysis of case-control studies of gene-environment interactions in the case when environmental factors are measured with error and genotype data are available on multiple genetic markers. To directly use the observed genotype data, we propose two genotype-based models: genotype effect and additive effect models. Our approach offers several advantages. First, the proposed risk functions can directly incorporate the observed genotype data while modeling the linkage disequilibrium information in the regression coefficients, thus eliminating the need to infer haplotype phase. Compared with the haplotype-based approach, an estimating procedure based on the proposed methods can be much simpler and significantly faster. In addition, there is no potential risk due to haplotype phase estimation. Further, by fitting the proposed models, it is possible to analyze the risk alleles/variants of complex diseases, including their dominant or additive effects. To model measurement error, we adopt the pseudo-likelihood method by Lobach et al. [2008]. Performance of the proposed method is examined using simulation experiments. An application of our method is illustrated using a population-based case-control study of association between calcium intake with the risk of colorectal adenoma development. PMID:21031455

  1. Robust stochastic optimization for reservoir operation

    NASA Astrophysics Data System (ADS)

    Pan, Limeng; Housh, Mashor; Liu, Pan; Cai, Ximing; Chen, Xin

    2015-01-01

    Optimal reservoir operation under uncertainty is a challenging engineering problem. Application of classic stochastic optimization methods to large-scale problems is limited due to computational difficulty. Moreover, classic stochastic methods assume that the estimated distribution function or the sample inflow data accurately represents the true probability distribution, which may be invalid and the performance of the algorithms may be undermined. In this study, we introduce a robust optimization (RO) approach, Iterative Linear Decision Rule (ILDR), so as to provide a tractable approximation for a multiperiod hydropower generation problem. The proposed approach extends the existing LDR method by accommodating nonlinear objective functions. It also provides users with the flexibility of choosing the accuracy of ILDR approximations by assigning a desired number of piecewise linear segments to each uncertainty. The performance of the ILDR is compared with benchmark policies including the sampling stochastic dynamic programming (SSDP) policy derived from historical data. The ILDR solves both the single and multireservoir systems efficiently. The single reservoir case study results show that the RO method is as good as SSDP when implemented on the original historical inflows and it outperforms SSDP policy when tested on generated inflows with the same mean and covariance matrix as those in history. For the multireservoir case study, which considers water supply in addition to power generation, numerical results show that the proposed approach performs as well as in the single reservoir case study in terms of optimal value and distributional robustness.

  2. 3D/2D model-to-image registration by imitation learning for cardiac procedures.

    PubMed

    Toth, Daniel; Miao, Shun; Kurzendorfer, Tanja; Rinaldi, Christopher A; Liao, Rui; Mansi, Tommaso; Rhode, Kawal; Mountney, Peter

    2018-05-12

    In cardiac interventions, such as cardiac resynchronization therapy (CRT), image guidance can be enhanced by involving preoperative models. Multimodality 3D/2D registration for image guidance, however, remains a significant research challenge for fundamentally different image data, i.e., MR to X-ray. Registration methods must account for differences in intensity, contrast levels, resolution, dimensionality, and field of view. Furthermore, the same anatomical structures may not be visible in both modalities. Current approaches have focused on developing modality-specific solutions for individual clinical use cases, by introducing constraints, or identifying cross-modality information manually. Machine learning approaches have the potential to create more general registration platforms. However, training image-to-image methods would require large multimodal datasets and ground truth for each target application. This paper proposes a model-to-image registration approach instead, because it is common in image-guided interventions to create anatomical models for diagnosis, planning or guidance prior to procedures. An imitation learning-based method, trained on 702 datasets, is used to register preoperative models to intraoperative X-ray images. Accuracy is demonstrated on cardiac models and artificial X-rays generated from CTs. The registration error was [Formula: see text] on 1000 test cases, superior to that of manual ([Formula: see text]) and gradient-based ([Formula: see text]) registration. High robustness is shown in 19 clinical CRT cases. Besides the proposed method's feasibility in a clinical environment, the evaluation has shown good accuracy and high robustness, indicating that it could be applied in image-guided interventions.

  3. Prosthetic rehabilitation of oral submucous fibrosis patients: A systematic review of published case reports and case series

    PubMed Central

    Patil, Shankargouda; Sarode, Gargi S.; Bhandi, Shilpa; Awan, Kamran Habib; Ferrari, Marco

    2017-01-01

    Background Oral submucous fibrosis (OSF) is an insidious chronic condition characterized by restricted mouth opening. Prosthetic rehabilitation is challenging for OSF patients, as obtaining a good impression requires adequate mouth opening. The aim of the present review is to systematically present the data from case reports published in the English-language literature. Method A comprehensive search of the literature databases (PubMed, Medline, SCOPUS, Web of Science and Google Scholar), along with the references of published articles on prosthetic rehabilitation in OSF patients published to date, was conducted. Keywords included a combination of ‘Oral submucous fibrosis’, ‘prosthesis’, ‘dentures’ and/or ‘restricted mouth opening’. Citations from selected references and bibliographic linkages taken from similar cases were included in this review. The inclusion criteria selected case reports on prosthetic rehabilitation in OSF patients; cases of restricted mouth opening due to causes other than OSF were excluded from the study. Results A total of 21 cases were identified and analysed from 17 papers published in the English-language literature. Of these, 9 cases employed the sectional denture technique, 4 cases emphasized a need-based treatment approach in which conventional methods were modified, and 4 cases used mouth-exercising devices. Finally, 1 case each involved a flexible denture, an oral screen prosthesis, an oral stent, and surgery in conjunction with dentures. Conclusion Prosthetic rehabilitation in OSF patients is a multifaceted approach and should be patient specific, although sectional dentures have achieved the best results. PMID:28877246

  4. Isolating DNA from sexual assault cases: a comparison of standard methods with a nuclease-based approach

    PubMed Central

    2012-01-01

    Background Profiling sperm DNA present on vaginal swabs taken from rape victims often contributes to identifying and incarcerating rapists. Large amounts of the victim’s epithelial cells contaminate the sperm present on swabs, however, and complicate this process. The standard method for obtaining relatively pure sperm DNA from a vaginal swab is to digest the epithelial cells with Proteinase K in order to solubilize the victim’s DNA, and to then physically separate the soluble DNA from the intact sperm by pelleting the sperm, removing the victim’s fraction, and repeatedly washing the sperm pellet. An alternative approach that does not require washing steps is to digest with Proteinase K, pellet the sperm, remove the victim’s fraction, and then digest the residual victim’s DNA with a nuclease. Methods The nuclease approach has been commercialized in a product, the Erase Sperm Isolation Kit (PTC Labs, Columbia, MO, USA), and five crime laboratories have tested it on semen-spiked female buccal swabs in a direct comparison with their standard methods. Comparisons have also been performed on timed post-coital vaginal swabs and evidence collected from sexual assault cases. Results For the semen-spiked buccal swabs, Erase outperformed the standard methods in all five laboratories and in most cases was able to provide a clean male profile from buccal swabs spiked with only 1,500 sperm. The vaginal swabs taken after consensual sex and the evidence collected from rape victims showed a similar pattern of Erase providing superior profiles. Conclusions In all samples tested, STR profiles of the male DNA fractions obtained with Erase were as good as or better than those obtained using the standard methods. PMID:23211019

  5. Learning Biological Networks via Bootstrapping with Optimized GO-based Gene Similarity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Ronald C.; Sanfilippo, Antonio P.; McDermott, Jason E.

    2010-08-02

    Microarray gene expression data provide a unique information resource for learning biological networks using "reverse engineering" methods. However, there are a variety of cases in which we know which genes are involved in a given pathology of interest, but we do not have enough experimental evidence to support the use of fully-supervised/reverse-engineering learning methods. In this paper, we explore a novel semi-supervised approach in which biological networks are learned from a reference list of genes and a partial set of links for these genes extracted automatically from PubMed abstracts, using a knowledge-driven bootstrapping algorithm. We show how new relevant links across genes can be iteratively derived using a gene similarity measure based on the Gene Ontology that is optimized on the input network at each iteration. We describe an application of this approach to the TGFB pathway as a case study and show how the ensuing results prove the feasibility of the approach as an alternate or complementary technique to fully supervised methods.

  6. A conservative, thermodynamically consistent numerical approach for low Mach number combustion. Part I: Single-level integration

    NASA Astrophysics Data System (ADS)

    Nonaka, Andrew; Day, Marcus S.; Bell, John B.

    2018-01-01

    We present a numerical approach for low Mach number combustion that conserves both mass and energy while remaining on the equation of state to a desired tolerance. We present both unconfined and confined cases, where in the latter the ambient pressure changes over time. Our overall scheme is a projection method for the velocity coupled to a multi-implicit spectral deferred corrections (SDC) approach to integrate the mass and energy equations. The iterative nature of SDC methods allows us to incorporate a series of pressure discrepancy corrections naturally that lead to additional mass and energy influx/outflux in each finite volume cell in order to satisfy the equation of state. The method is second order, and satisfies the equation of state to a desired tolerance with increasing iterations. Motivated by experimental results, we test our algorithm on hydrogen flames with detailed kinetics. We examine the morphology of thermodiffusively unstable cylindrical premixed flames in high-pressure environments for confined and unconfined cases. We also demonstrate that our algorithm maintains the equation of state for premixed methane flames and non-premixed dimethyl ether jet flames.

  7. Ultrasound sounding in air by fast-moving receiver

    NASA Astrophysics Data System (ADS)

    Sukhanov, D.; Erzakova, N.

    2018-05-01

    We present a method of ultrasound imaging in air for a fast-moving receiver, addressing the case when the speed of movement of the receiver cannot be neglected with respect to the speed of sound. In this case the Doppler effect is significant, making matched filtering of the backscattered signal difficult. The proposed method does not use a continuous repetitive noise-sounding signal. A generalized approach applies spatial matched filtering in the time domain to recover the ultrasonic tomographic images.

  8. An Approach to Assign Individual Marks from a Team Mark: The Case of Australian Grading System at Universities

    ERIC Educational Resources Information Center

    Nepal, Kali Prasad

    2012-01-01

    This study uses a new approach to assign individual marks from a team mark using individual contributions to a teamwork product. A team member's contribution to a teamwork product, in the form of an individual weighting factor, is calculated using team members' co-assessment. A comparison of the proposed approach with existing methods has been…
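
    A minimal illustration of the idea, under assumed conventions (the individual weighting factor is a member's mean peer rating divided by the team's mean rating, and marks are capped at 100; the paper's exact formula may differ):

```python
# Hedged sketch: derive individual marks from a team mark via peer
# co-assessment. The weighting formula is an assumed form for
# illustration, not necessarily the paper's.
import numpy as np

def individual_marks(team_mark, ratings):
    """ratings[i, j] = rating given by member i to member j (self included)."""
    member_scores = ratings.mean(axis=0)            # average rating received
    weights = member_scores / member_scores.mean()  # individual weighting factors
    return np.minimum(team_mark * weights, 100.0)   # cap marks at 100

ratings = np.array([[8, 6, 9],
                    [7, 7, 9],
                    [8, 5, 10]], dtype=float)
print(individual_marks(75.0, ratings))   # approx [75.0, 58.7, 91.3]
```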

  9. Adapting Surface Ground Motion Relations to Underground conditions: A case study for the Sudbury Neutrino Observatory in Sudbury, Ontario, Canada

    NASA Astrophysics Data System (ADS)

    Babaie Mahani, A.; Eaton, D. W.

    2013-12-01

    Ground Motion Prediction Equations (GMPEs) are widely used in Probabilistic Seismic Hazard Assessment (PSHA) to estimate ground-motion amplitudes at Earth's surface as a function of magnitude and distance. Certain applications, such as hazard assessment for caprock integrity in the case of underground storage of CO2, waste disposal sites, and underground pipelines, require subsurface estimates of ground motion; at present, such estimates depend upon theoretical modeling and simulations. The objective of this study is to derive correction factors for GMPEs to enable estimation of amplitudes in the subsurface. We use a semi-analytic approach along with finite-difference simulations of ground-motion amplitudes for surface and underground motions. Spectral ratios of underground to surface motions are used to calculate the correction factors. Two predictive methods are used. The first is a semi-analytic approach based on a quarter-wavelength method that is widely used for earthquake site-response investigations; the second is a numerical approach based on elastic finite-difference simulations of wave propagation. Both methods are evaluated using recordings of regional earthquakes by broadband seismometers installed at the surface and at depths of 1400 m and 2100 m in the Sudbury Neutrino Observatory, Canada. Overall, both methods provide a reasonable fit to the peaks and troughs observed in the ratios of real data. The finite-difference method, however, has the capability to simulate ground motion ratios more accurately than the semi-analytic approach.

  10. Analysis of Arterial Blood Gas Report in Chronic Kidney Diseases - Comparison between Bedside and Multistep Systematic Method.

    PubMed

    Ghatak, Ishita; Dhat, Vaishali; Tilak, Mona A; Roy, Indranath

    2016-08-01

    Acid-base disorders (ABDs) are commonly encountered in critically ill chronic kidney disease (CKD) patients. Timely and correct analysis of arterial blood gases (ABG) is critical for the diagnosis, treatment and prediction of outcome of these patients. The aim was to explore the type and prevalence of ABDs in 31 critically ill CKD patients from a tertiary care hospital in Maharashtra, to compare two methods of analysis, the bedside and systematic approaches, and to clinically correlate the nature of ABDs in these patients. The initial ABG reports of 31 consecutive CKD patients were analysed by both methods. A Medica EasyStat analyser was used, based on potentiometry with ion-selective electrodes for pH and pCO2 and amperometry for pO2. Serum albumin was also measured, by the bromocresol green dye-binding method using a Liquixx albumin kit in an Erba XL 300 autoanalyser. The chi-square test was used for statistical analysis, using Epi Info version 3.5.4 and SPSS 14.0 software. The systematic method showed a significantly higher prevalence of mixed disorders (50%) compared to the bedside method (12.9%). The most prevalent disorder by the bedside method was metabolic acidosis, in 15 cases (48.39%). By the systematic method, 3 reports were invalid. As single categories, the most prevalent types were simple respiratory alkalosis and mixed metabolic acidosis with respiratory alkalosis, with 6 of 31 cases each (19.36% each). As a whole, metabolic acidosis (including both High Anion Gap Metabolic Acidosis, or HAGMA, and Non Anion Gap Metabolic Acidosis, or NAGMA, with 4 cases of each) was most prevalent: 8 of 31 (25.8%). The systematic approach was more effective in diagnosing mixed acid-base disorders. By the systematic method, the findings in most cases could be correlated with the clinical condition and provisional diagnosis. Thus, interpretation of ABDs using a stepwise approach could help clinicians in early diagnosis and management of these patients.
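
    The multistep systematic approach lends itself to a simple algorithmic sketch. The fragment below is an illustrative, textbook-style simplification (identify the primary disorder from pH and bicarbonate, compute the anion gap, check expected respiratory compensation with Winter's formula); the thresholds are common conventions, not values from the study.

```python
# Simplified sketch of a stepwise (systematic) ABG analysis. Thresholds
# follow common textbook conventions and are illustrative only.
def analyze_abg(ph, pco2, hco3, na, cl):
    primary = ("metabolic acidosis" if ph < 7.35 and hco3 < 22 else
               "respiratory acidosis" if ph < 7.35 else
               "metabolic alkalosis" if ph > 7.45 and hco3 > 26 else
               "respiratory alkalosis" if ph > 7.45 else
               "normal pH (possible mixed disorder)")
    findings = [primary]
    anion_gap = na - (cl + hco3)                  # mmol/L
    if anion_gap > 12:
        findings.append(f"high anion gap ({anion_gap:.0f} mmol/L): HAGMA")
    if "metabolic acidosis" in primary:
        expected_pco2 = 1.5 * hco3 + 8            # Winter's formula, +/- 2
        if abs(pco2 - expected_pco2) > 2:
            findings.append("compensation outside expected range: mixed disorder")
    return findings

print(analyze_abg(ph=7.28, pco2=26, hco3=12, na=138, cl=100))
# -> ['metabolic acidosis', 'high anion gap (26 mmol/L): HAGMA']
```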

  11. Improving real-time efficiency of case-based reasoning for medical diagnosis.

    PubMed

    Park, Yoon-Joo

    2014-01-01

    Conventional case-based reasoning (CBR) does not perform efficiently for high-volume datasets because of case-retrieval time. Some previous studies overcame this problem by clustering a case base into several small groups and retrieving neighbors within the group corresponding to a target case. However, this approach generally produces less accurate predictive performance than the conventional CBR. This paper suggests a new case-based reasoning method called Clustering-Merging CBR (CM-CBR), which produces a similar level of predictive performance to the conventional CBR while incurring significantly less computational cost.
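
    The cluster-then-retrieve strategy that CM-CBR builds on can be sketched as follows; the clustering and merging rules specific to CM-CBR are not reproduced here, so this is only the conventional clustered-CBR baseline the paper starts from.

```python
# Sketch of cluster-then-retrieve CBR: cluster the case base offline,
# then retrieve neighbors only within the cluster nearest to the target.
# This is the baseline idea, not CM-CBR's specific clustering/merging rules.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

def build(case_base, n_clusters=10):
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(case_base)
    members = [case_base[km.labels_ == c] for c in range(n_clusters)]
    return km, members

def retrieve(km, members, target, k=5):
    c = km.predict(target.reshape(1, -1))[0]       # nearest cluster only
    nn = NearestNeighbors(n_neighbors=min(k, len(members[c]))).fit(members[c])
    _, idx = nn.kneighbors(target.reshape(1, -1))
    return members[c][idx[0]]                      # k nearest cases in cluster
```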

  12. Iterative approach as alternative to S-matrix in modal methods

    NASA Astrophysics Data System (ADS)

    Semenikhin, Igor; Zanuccoli, Mauro

    2014-12-01

    The continuously increasing complexity of opto-electronic devices and the rising demands on simulation accuracy lead to the need to solve very large systems of linear equations, making iterative methods promising and attractive from the computational point of view with respect to direct methods. In particular, an iterative approach potentially enables the reduction of the computational time required to solve Maxwell's equations by eigenmode expansion algorithms. Regardless of the particular eigenmode-finding method used, the expansion coefficients are computed, as a rule, by the scattering matrix (S-matrix) approach or similar techniques requiring on the order of M³ operations. In this work we consider alternatives to the S-matrix technique based on pure iterative or mixed direct-iterative approaches. The possibility of diminishing the impact of M³-order calculations on the overall time, and in some cases even of reducing the number of arithmetic operations to M² by applying iterative techniques, is discussed. Numerical results are presented to illustrate the validity and potential of the proposed approaches.

  13. Random Matrix Approach for Primal-Dual Portfolio Optimization Problems

    NASA Astrophysics Data System (ADS)

    Tada, Daichi; Yamamoto, Hisashi; Shinzato, Takashi

    2017-12-01

    In this paper, we revisit the portfolio optimization problems of the minimization/maximization of investment risk under constraints of budget and investment concentration (primal problem) and the maximization/minimization of investment concentration under constraints of budget and investment risk (dual problem) for the case that the variances of the return rates of the assets are identical. We analyze both optimization problems by the Lagrange multiplier method and the random matrix approach. Thereafter, we compare the results obtained from our proposed approach with the results obtained in previous work. Moreover, we use numerical experiments to validate the results obtained from the replica approach and the random matrix approach as methods for analyzing both the primal and dual portfolio optimization problems.

  14. Error measure comparison of currently employed dose-modulation schemes for e-beam proximity effect control

    NASA Astrophysics Data System (ADS)

    Peckerar, Martin C.; Marrian, Christie R.

    1995-05-01

    Standard matrix inversion methods of e-beam proximity correction are compared with a variety of pseudoinverse approaches based on gradient descent. It is shown that the gradient descent methods can be modified using 'regularizers' (terms added to the cost function minimized during gradient descent). This modification solves the 'negative dose' problem in a mathematically sound way. Different techniques are contrasted using a weighted error measure approach. It is shown that the regularization approach leads to the highest quality images. In some cases, ignoring negative doses yields results which are worse than employing an uncorrected dose file.
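
    As a hedged illustration of how a regularized descent scheme avoids the negative-dose problem, the sketch below minimizes a Tikhonov-regularized least-squares cost by projected gradient descent, so the dose vector stays nonnegative by construction; the kernel, step size, and regularizer weight are arbitrary, and the paper's specific regularizers may differ.

```python
# Illustrative projected-gradient solve for dose modulation: minimize
# ||K d - target||^2 + lam * ||d||^2 subject to d >= 0, so negative doses
# never arise. Step size should be small relative to K's spectral norm.
import numpy as np

def solve_dose(K, target, lam=0.1, step=1e-3, n_iter=2000):
    d = np.zeros(K.shape[1])
    for _ in range(n_iter):
        grad = 2 * K.T @ (K @ d - target) + 2 * lam * d
        d = np.maximum(d - step * grad, 0.0)   # project onto d >= 0
    return d
```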

  15. A Simple and Robust Method for Partially Matched Samples Using the P-Values Pooling Approach

    PubMed Central

    Kuan, Pei Fen; Huang, Bo

    2013-01-01

    This paper focuses on statistical analyses in scenarios where some samples from the matched pairs design are missing, resulting in partially matched samples. Motivated by the idea of meta-analysis, we recast the partially matched samples as coming from two experimental designs, and propose a simple yet robust approach based on the weighted Z-test to integrate the p-values computed from these two designs. We show that the proposed approach achieves better operating characteristics in simulations and a case study, compared to existing methods for partially matched samples. PMID:23417968
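
    The weighted Z-test combination is compact enough to sketch. Below, p-values from the paired and independent subdesigns are mapped to Z-scores and pooled with square-root-of-sample-size weights, a common but here assumed choice:

```python
# Hedged sketch of a weighted Z-test for partially matched samples:
# combine the p-values of the paired and independent subdesigns.
import numpy as np
from scipy.stats import norm

def weighted_z(p_paired, n_paired, p_indep, n_indep):
    z1, z2 = norm.isf(p_paired), norm.isf(p_indep)   # one-sided Z-scores
    w1, w2 = np.sqrt(n_paired), np.sqrt(n_indep)     # assumed weight choice
    z = (w1 * z1 + w2 * z2) / np.sqrt(w1**2 + w2**2)
    return norm.sf(z)                                # combined one-sided p

print(weighted_z(0.04, 30, 0.10, 20))
```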

  16. Qualitative research methods in renal medicine: an introduction.

    PubMed

    Bristowe, Katherine; Selman, Lucy; Murtagh, Fliss E M

    2015-09-01

    Qualitative methodologies are becoming increasingly widely used in health research. However, within some specialties, including renal medicine, qualitative approaches remain under-represented in the high-impact factor journals. Qualitative research can be undertaken: (i) as a stand-alone research method, addressing specific research questions; (ii) as part of a mixed methods approach alongside quantitative approaches or (iii) embedded in clinical trials, or during the development of complex interventions. The aim of this paper is to introduce qualitative research, including the rationale for choosing qualitative approaches, and guidance for ensuring quality when undertaking and reporting qualitative research. In addition, we introduce types of qualitative data (observation, interviews and focus groups) as well as some of the most commonly encountered methodological approaches (case studies, ethnography, phenomenology, grounded theory, thematic analysis, framework analysis and content analysis). © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  17. A Two-Step Bayesian Approach for Propensity Score Analysis: Simulations and Case Study.

    PubMed

    Kaplan, David; Chen, Jianshen

    2012-07-01

    A two-step Bayesian propensity score approach is introduced that incorporates prior information in the propensity score equation and outcome equation without the problems associated with simultaneous Bayesian propensity score approaches. The corresponding variance estimators are also provided. The two-step Bayesian propensity score is provided for three methods of implementation: propensity score stratification, weighting, and optimal full matching. Three simulation studies and one case study are presented to elaborate the proposed two-step Bayesian propensity score approach. Results of the simulation studies reveal that greater precision in the propensity score equation yields better recovery of the frequentist-based treatment effect. A slight advantage is shown for the Bayesian approach in small samples. Results also reveal that greater precision around the wrong treatment effect can lead to seriously distorted results. However, greater precision around the correct treatment effect parameter yields quite good results, with slight improvement seen with greater precision in the propensity score equation. A comparison of coverage rates for the conventional frequentist approach and proposed Bayesian approach is also provided. The case study reveals that credible intervals are wider than frequentist confidence intervals when priors are non-informative.

  18. Construction of integrated case environments.

    PubMed

    Losavio, Francisca; Matteo, Alfredo; Pérez, María

    2003-01-01

    The main goal of Computer-Aided Software Engineering (CASE) technology is to improve the entire software system development process. The CASE approach is not merely a technology; it involves a fundamental change in the process of software development. The tendency of the CASE approach, technically speaking, is the integration of tools that assist in the application of specific methods. In this sense, the environment architecture, which includes the platform and the system's hardware and software, constitutes the base of the CASE environment. The problem of tool integration has been studied for two decades. Current integration efforts emphasize the interoperability of tools, especially in distributed environments. In this work we use the Brown approach. The environment resulting from the application of this model is called a federative environment, reflecting the fact that this architecture pays special attention to the connections among the components of the environment. This approach is now being used in component-based design. This paper describes a concrete experience in the civil engineering and architecture fields in the construction of an integrated CASE environment. A generic architectural framework based on an intermediary architectural pattern is applied to achieve the integration of the different tools. This intermediary represents the control perspective of the PAC (Presentation-Abstraction-Control) style, which has been implemented as a Mediator pattern and has been used in the interactive systems domain. In addition, a process is given for constructing the integrated CASE environment.

  19. Audit method suited for DSS in clinical environment.

    PubMed

    Vicente, Javier

    2015-01-01

    This chapter presents a novel online method to audit predictive models from a Bayesian perspective. The auditing model has been specifically designed for decision support systems (DSSs) suited to clinical or research environments. Taking as a starting point the working diagnosis supplied by the clinician, this method compares and evaluates the predictive skills of those models able to answer that diagnosis. The approach consists of calculating the posterior odds of a model through the composition of a prior odds, a static odds, and a dynamic odds. To do so, the method estimates the posterior odds from the cases that the compared models had in common during the design stage and from the cases already seen by the DSS after deployment at the clinical site. In addition, if an ontology of the classes is available, this method can audit models answering related questions, which reinforces decisions the user has already taken and gives orientation on further diagnostic steps. The main technical novelty of this approach lies in the design of an audit model adapted to the decision workflow of a clinical environment. The audit model allows deciding which classifier best suits each particular case under evaluation and allows the detection of possible misbehaviour due to population differences or data shifts at the clinical site. We show the efficacy of our method for the problem of brain tumor diagnosis with magnetic resonance spectroscopy (MRS).
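
    Working in log-odds, the composition of prior, static (design-stage) and dynamic (post-deployment) components becomes a simple sum. The sketch below assumes each component is derived from a smoothed accuracy count; this is an illustrative reading of the abstract, not the chapter's exact formulation.

```python
# Hedged sketch: compose a model's posterior log-odds from prior, static
# (design-stage) and dynamic (post-deployment) components. The counts and
# Laplace smoothing are illustrative assumptions.
import math

def posterior_log_odds(prior_odds, static_correct, static_total,
                       dynamic_correct, dynamic_total):
    def log_odds(k, n):                      # smoothed accuracy -> log-odds
        p = (k + 1) / (n + 2)                # Laplace smoothing
        return math.log(p / (1 - p))
    return (math.log(prior_odds)
            + log_odds(static_correct, static_total)
            + log_odds(dynamic_correct, dynamic_total))

# Higher value -> model more likely to answer this diagnosis correctly.
print(posterior_log_odds(1.0, 80, 100, 12, 15))
```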

  20. Bayesian Reconstruction of Disease Outbreaks by Combining Epidemiologic and Genomic Data

    PubMed Central

    Jombart, Thibaut; Cori, Anne; Didelot, Xavier; Cauchemez, Simon; Fraser, Christophe; Ferguson, Neil

    2014-01-01

    Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees (“who infected whom”) from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments. PMID:24465202

  1. A Case Study of Computer Gaming for Math: Engaged Learning from Gameplay?

    ERIC Educational Resources Information Center

    Ke, Fengfeng

    2008-01-01

    Employing mixed-method approach, this case study examined the in situ use of educational computer games in a summer math program to facilitate 4th and 5th graders' cognitive math achievement, metacognitive awareness, and positive attitudes toward math learning. The results indicated that students developed more positive attitudes toward math…

  2. A Case Study of Knowledge Management in the "Back Office" of Two English Football Clubs

    ERIC Educational Resources Information Center

    Doloriert, Clair; Whitworth, Kieran

    2011-01-01

    Purpose: This study aims to explore knowledge management (KM) practice in the "back office" of two English football clubs. Design/methodology/approach: The paper takes the form of a comparative case study of two medium-sized businesses using multi-method data including unstructured interviews, structured questionnaires and document…

  3. The "Russian Doll" Approach: Developing Nested Case-Studies to Support International Comparative Research in Education

    ERIC Educational Resources Information Center

    Chong, Pei Wen; Graham, Linda J.

    2013-01-01

    International comparison is complicated by the use of different terms, classification methods, policy frameworks and system structures, not to mention different languages and terminology. Multi-case studies can assist in the understanding of the influence wielded by cultural, social, economic, historical and political forces upon educational…

  4. Ensuring Healthy Youth Development through Community Schools: A Case Study

    ERIC Educational Resources Information Center

    Anderson-Butcher, Dawn; Paluta, Lauren; Sterling, Karen; Anderson, Carol

    2018-01-01

    Using mixed methods, this case study explored outcomes associated with the adoption and implementation of a community schools approach in four Title I schools using the Community Collaboration Model for School Improvement. Trends in school data demonstrate academic achievement improvements in three of the four schools. Absenteeism and the number…

  5. Cognitive Dissonance, Supervision, and Administrative Team Conflict

    ERIC Educational Resources Information Center

    Zepeda, Sally J.

    2006-01-01

    Purpose: The purpose of this paper is to record and summarize the tensions and problems experienced by a high school administrative team as they attempted to change supervision alongside instruction in a transition to a new block schedule. Design/methodology/approach: A case study method was used. As a case study, the research is contextual in…

  6. Triggering Transformative Possibilities: A Case Study of Leaders' Quest in China

    ERIC Educational Resources Information Center

    Lau-Kwong, Kenzie

    2012-01-01

    This study explored the nature of transformative learning experiences among global executives who participated in Quest program, a learning journey program designed to facilitate shifting mind-sets and worldviews through 1-week intensives in countries such as China. A mixed methods, multiple case study approach was employed. First, a secondary…

  7. Adult Education in Development. Methods and Approaches from Changing Societies.

    ERIC Educational Resources Information Center

    McGivney, Veronica; Murray, Frances

    The case studies described in this book provide examples of initiatives illustrating the role of adult education in development and its contribution to the process of change in developing countries. The book is organized in five sections. Case studies in Part 1, "Health Education," illustrate the links between primary health care and…

  8. Cases on Research-Based Teaching Methods in Science Education

    ERIC Educational Resources Information Center

    de Silva, Eugene, Ed.

    2015-01-01

    While the great scientists of the past recognized a need for a multidisciplinary approach, today's schools often treat math and science as subjects separate from the rest. This not only creates a disinterest among students, but also a potential learning gap once students reach college and then graduate into the workforce. "Cases on…

  9. Knowledge Management Model: Practical Application for Competency Development

    ERIC Educational Resources Information Center

    Lustri, Denise; Miura, Irene; Takahashi, Sergio

    2007-01-01

    Purpose: This paper seeks to present a knowledge management (KM) conceptual model for competency development and a case study in a law service firm, which implemented the KM model in a competencies development program. Design/methodology/approach: The case study method was applied according to Yin (2003) concepts, focusing a six-professional group…

  10. On a new iterative method for solving linear systems and comparison results

    NASA Astrophysics Data System (ADS)

    Jing, Yan-Fei; Huang, Ting-Zhu

    2008-10-01

    In Ujevic [A new iterative method for solving linear systems, Appl. Math. Comput. 179 (2006) 725-730], the author obtained a new iterative method for solving linear systems, which can be considered a modification of the Gauss-Seidel method. In this paper, we show that this is a special case from the point of view of projection techniques. A different approach is established, which is both theoretically and numerically shown to be better than (or at least the same as) Ujevic's. As the presented numerical examples show, in most cases the convergence rate is more than one and a half times that of Ujevic's method.
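
    For context, a plain Gauss-Seidel sweep, of which the method above is a modification, is sketched below; in the projection-technique view, each inner update projects the current residual onto one coordinate direction.

```python
# Plain Gauss-Seidel iteration for Ax = b, shown for context; the methods
# discussed above modify this scheme. Example system is illustrative.
import numpy as np

def gauss_seidel(A, b, x0=None, n_iter=100, tol=1e-10):
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(n_iter):
        for i in range(n):                    # update = projection onto e_i
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
        if np.linalg.norm(b - A @ x) < tol:
            break
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
print(gauss_seidel(A, np.array([1.0, 2.0])))   # converges for this SPD system
```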

  11. Efficient Relaxation of Protein-Protein Interfaces by Discrete Molecular Dynamics Simulations.

    PubMed

    Emperador, Agusti; Solernou, Albert; Sfriso, Pedro; Pons, Carles; Gelpi, Josep Lluis; Fernandez-Recio, Juan; Orozco, Modesto

    2013-02-12

    Protein-protein interactions are responsible for the transfer of information inside the cell and represent one of the most interesting research fields in structural biology. Unfortunately, after decades of intense research, experimental approaches still have difficulties in providing 3D structures for the hundreds of thousands of interactions formed between the different proteins in a living organism. The use of theoretical approaches like docking aims to complement experimental efforts to represent the structure of the protein interactome. However, we cannot ignore that current methods have limitations due to problems of sampling of the protein-protein conformational space and the lack of accuracy of available force fields. Cases that are especially difficult for prediction are those in which complex formation implies a non-negligible change in the conformation of the interacting proteins, i.e., those cases where protein flexibility plays a key role in protein-protein docking. In this work, we present a new approach to treat flexibility in docking by global structural relaxation based on ultrafast discrete molecular dynamics. On a standard benchmark of protein complexes, the method provides a general improvement over the results obtained by rigid docking. The method is especially efficient in cases with large conformational changes upon binding, in which structure relaxation with discrete molecular dynamics leads to a predictive success rate double that obtained with state-of-the-art rigid-body docking.

  12. [Continuing importance of the clinical approach. Observations on a regional collaboration between general practitioners, internists and cardiologists].

    PubMed

    Jaussi, A

    1994-11-12

    The advent of high-tech diagnostic methods raises concerns nowadays about the value of the clinical approach and bedside diagnosis. This at least is the impression given by modern scientific literature, which rarely even mentions this part of examination of the patient. In order to define the actual role played by auscultation in the management of cardiological patients by the primary care physician, the records of 250 patients consecutively referred to a cardiologist are analyzed. The practitioner's initial clinical diagnosis is compared to the final cardiological diagnosis. Per referred patient, 1.76 specialized consultations were needed. In 64% of the cases only one such consultation took place. Initial diagnosis was correct in 80% of all cases, partly correct in 11% and incorrect in 9% of the cases. Out of the 64 cases of valvular diseases, 33 were initially correctly recognized by the physician. The cardiological investigation was also invasive in 6.5% of all cases, 4.5% of the patients eventually undergoing invasive or surgical treatment. Thus the great majority of the patients (93.5%) were managed by the primary physician with "first-line" cardiologist's support, which was often only occasional (only one specialized consultation in about two thirds of all cases). This highly independent and presumably cost-effective patient management by the primary care physician implies a high level of clinical skill. It stresses the outstanding importance of continuing teaching of the clinical approach and particularly of cardiac auscultation, which is still the best screening method for valvular heart disease.

  13. A modified artificial immune system based pattern recognition approach -- an application to clinic diagnostics

    PubMed Central

    Zhao, Weixiang; Davis, Cristina E.

    2011-01-01

    Objective This paper introduces a modified artificial immune system (AIS)-based pattern recognition method to enhance the recognition ability of the existing conventional AIS-based classification approach and demonstrates the superiority of the proposed AIS-based method via two case studies of breast cancer diagnosis. Methods and materials Conventionally, the AIS approach is coupled with the k nearest neighbor (k-NN) algorithm to form a classification method called AIS-kNN. In this paper we discuss the basic principle and possible problems of this conventional approach, and propose a new approach in which AIS is integrated with radial basis function – partial least squares regression (AIS-RBFPLS). Additionally, both AIS-based approaches are compared with two classical and powerful machine learning methods, the back-propagation neural network (BPNN) and the orthogonal radial basis function network (Ortho-RBF network). Results The diagnosis results show that: (1) both AIS-kNN and AIS-RBFPLS proved to be good machine learning methods for clinical diagnosis, but the proposed AIS-RBFPLS generated an even lower misclassification ratio, especially in the cases where the conventional AIS-kNN approach generated poor classification results because of possibly improper AIS parameters. For example, based upon the AIS memory cells of “replacement threshold = 0.3”, the average misclassification ratios of the two approaches for study 1 are 3.36% (AIS-RBFPLS) and 9.07% (AIS-kNN), and the misclassification ratios for study 2 are 19.18% (AIS-RBFPLS) and 28.36% (AIS-kNN); (2) the proposed AIS-RBFPLS proved robust with respect to the AIS-created memory cells, showing a smaller standard deviation across multiple trials than AIS-kNN. For example, using the result from the first set of AIS memory cells, the standard deviations of the misclassification ratios for study 1 are 0.45% (AIS-RBFPLS) and 8.71% (AIS-kNN), and those for study 2 are 0.49% (AIS-RBFPLS) and 6.61% (AIS-kNN); and (3) the proposed AIS-RBFPLS classification approach also yielded better diagnosis results than the two classical neural network approaches, BPNN and the Ortho-RBF network. Conclusion In summary, this paper proposes a new machine learning method for complex systems by integrating the AIS system with RBFPLS. The new method demonstrates a satisfactory effect on classification accuracy for clinical diagnosis and indicates wide potential application to other diagnosis and detection problems. PMID:21515033
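    For readers unfamiliar with the AIS-kNN pipeline, the following is a minimal sketch loosely modeled on AIRS-style artificial immune systems: memory cells are retained training samples that no existing same-class cell "stimulates" strongly enough, and classification is a k-NN vote over the memory cells. The affinity function, threshold value, and synthetic data are illustrative assumptions, not the paper's settings.

    ```python
    # Toy AIS-kNN classifier, loosely inspired by AIRS-style algorithms. The
    # affinity measure, replacement threshold, and synthetic data are assumed.
    import numpy as np

    def train_memory_cells(X, y, replacement_threshold=0.3):
        """Retain a sample as a new memory cell when no same-class cell is
        sufficiently stimulated by it (affinity below the threshold)."""
        cells, labels = [], []
        for xi, yi in zip(X, y):
            same = [c for c, l in zip(cells, labels) if l == yi]
            # Affinity in (0, 1]: closer cells are more strongly stimulated.
            affinity = max((np.exp(-np.linalg.norm(xi - c)) for c in same),
                           default=0.0)
            if affinity < replacement_threshold:
                cells.append(xi)
                labels.append(yi)
        return np.array(cells), np.array(labels)

    def knn_predict(x, cells, labels, k=3):
        """Majority vote among the k nearest memory cells."""
        nearest = np.argsort(np.linalg.norm(cells - x, axis=1))[:k]
        return np.bincount(labels[nearest]).argmax()

    # Synthetic two-class 'diagnosis' data standing in for the cancer sets.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(2, 1, (100, 5))])
    y = np.array([0] * 100 + [1] * 100)

    cells, labels = train_memory_cells(X, y)
    preds = np.array([knn_predict(x, cells, labels) for x in X])
    print(f"{len(cells)} memory cells, misclassification "
          f"{100 * np.mean(preds != y):.1f}%")
    ```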

  14. Dynamic Target Definition: a novel approach for PTV definition in ion beam therapy.

    PubMed

    Cabal, Gonzalo A; Jäkel, Oliver

    2013-05-01

    To present a beam-arrangement-specific approach for PTV definition in ion beam therapy. By means of a Monte Carlo error propagation analysis, a criterion is formulated to assess whether a voxel is safely treated. Based on this, a non-isotropic expansion rule is proposed, aiming to minimize the impact of uncertainties on the dose delivered. The method is exemplified in two cases: a head-and-neck case and a prostate case. In both cases the modality used is proton beam irradiation, and the sources of uncertainty taken into account are positioning (setup) errors and range uncertainties. It is shown how different beam arrangements affect plan robustness, leading to different target expansions being necessary to assure a predefined level of plan robustness. The relevance of appropriate beam angle arrangements as a way to minimize uncertainties is demonstrated. A novel method for PTV definition in ion beam therapy is presented. The method shows promising results, improving the probability of correct CTV dose coverage while reducing the size of the PTV volume. In a clinical scenario this translates into an enhanced tumor control probability while reducing the volume of healthy tissue being irradiated. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
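    A minimal sketch of the Monte Carlo error-propagation idea behind such a voxel criterion, with a toy one-dimensional dose profile and assumed uncertainty magnitudes: a voxel counts as safely treated if it receives at least 95% of the prescription dose in at least 95% of sampled error scenarios.

    ```python
    # Monte Carlo sketch of the "safely treated voxel" idea: sample setup and
    # range errors, then require the voxel dose to stay above a threshold in
    # at least 95% of samples. The dose model, sigmas, and thresholds are
    # illustrative assumptions, not the paper's clinical values.
    import numpy as np

    def dose_profile(z, range_cm=10.0):
        """Crude stand-in for a proton field: flat top, sigmoid distal edge."""
        return 1.0 / (1.0 + np.exp((z - range_cm) / 0.1))

    rng = np.random.default_rng(0)
    z = np.linspace(0, 12, 121)              # voxel depths [cm]
    n_samples = 2000
    sigma_setup, sigma_range = 0.3, 0.035    # 3 mm setup, 3.5% range (assumed)

    doses = np.empty((n_samples, z.size))
    for i in range(n_samples):
        shift = rng.normal(0, sigma_setup)           # setup error [cm]
        rscale = 1 + rng.normal(0, sigma_range)      # relative range error
        doses[i] = dose_profile(z - shift, range_cm=10.0 * rscale)

    coverage = np.mean(doses >= 0.95, axis=0)  # P(voxel gets >= 95% dose)
    safe = coverage >= 0.95
    print(f"safely treated depths: {z[safe].min():.1f}-{z[safe].max():.1f} cm")
    ```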

  15. Reflections on Practical Approaches to Involving Children and Young People in the Data Analysis Process

    ERIC Educational Resources Information Center

    Coad, Jane; Evans, Ruth

    2008-01-01

    This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of…

  16. The Effect of Suspension as a Deterrent to Student Misconduct

    ERIC Educational Resources Information Center

    Jackson, Angela Coleman

    2014-01-01

    The purpose of this study was to examine the effect of suspension as a deterrent to student misconduct. A mixed-methods approach using both qualitative data (interviews of administrators and teachers) and quantitative data (discipline records of identified sixth graders) was utilized. In this case study approach, one-on-one semi-structured interviews were…

  17. A new method for extracting near-surface mass-density anomalies from land-based gravity data, based on a special case of Poisson's PDE at the Earth's surface: A case study of salt diapirs in the south of Iran

    NASA Astrophysics Data System (ADS)

    AllahTavakoli, Y.; Safari, A.; Ardalan, A.; Bahroudi, A.

    2015-12-01

    The current research provides a method for tracking near-surface mass-density anomalies using only land-based gravity data, based on a special version of Poisson's Partial Differential Equation (PDE) of the gravitational field at the Earth's surface. The research demonstrates how Poisson's PDE provides the capability to extract near-surface mass-density anomalies from land-based gravity data. This version of Poisson's PDE is mathematically introduced at the Earth's surface and then used to develop the new method for approximating the mass-density via derivatives of the Earth's gravitational field (i.e., via the gradient tensor). The authors believe that the PDE yields new knowledge about the behavior of the Earth's gravitational field at the Earth's surface, which can be useful for developing new methods of Earth mass-density determination. In a case study, the proposed method is applied to a set of gravity stations located in the south of Iran. The results were numerically validated against established knowledge of the geological structures in the area of the case study. The method was also compared with two standard methods of mass-density determination. All the numerical experiments show that the proposed approach is well suited for tracking near-surface mass-density anomalies using only gravity data. Finally, the approach is applied to petroleum exploration studies of salt diapirs in the south of Iran.
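    The underlying relation is Poisson's equation, ∇·g = −4πGρ inside the masses, so the trace of the gravity-gradient tensor gives the local density directly. A minimal numerical sketch with synthetic tensor values (not the Iranian survey data):

    ```python
    # Poisson's-PDE density recovery: rho = -trace(gradient tensor)/(4*pi*G).
    # The gridded tensor components below are synthetic; a real survey would
    # derive them from land-based gravity measurements.
    import numpy as np

    G = 6.674e-11                    # gravitational constant [m^3 kg^-1 s^-2]
    rho_true = 2200.0                # assumed anomaly density [kg m^-3]
    rng = np.random.default_rng(0)

    # Diagonal gradient components on a 32x32 surface grid [1/s^2]; for an
    # observer embedded in uniform material each contributes -4*pi*G*rho/3,
    # here with measurement noise added.
    n = 32
    base = -4 * np.pi * G * rho_true / 3
    gxx, gyy, gzz = (base + rng.normal(0, 1e-8, (n, n)) for _ in range(3))

    rho_est = -(gxx + gyy + gzz) / (4 * np.pi * G)   # trace of the tensor
    print(f"recovered density: {rho_est.mean():.0f} kg/m^3 (true {rho_true:.0f})")
    ```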

  18. A modelling approach to assessing the timescale uncertainties in proxy series with chronological errors

    NASA Astrophysics Data System (ADS)

    Divine, D. V.; Godtliebsen, F.; Rue, H.

    2012-01-01

    The paper proposes an approach to assessing timescale errors in proxy-based series with chronological uncertainties. The method relies on approximating the physical process(es) forming a proxy archive by a random Gamma process, whose parameters are partly data-driven and partly determined from prior assumptions. For the particular case of a linear accumulation model and absolutely dated tie points, an analytical solution is found, yielding a Beta-distributed probability density on age estimates along the length of a proxy archive. In the general situation of uncertainties in the ages of the tie points, the proposed method employs MCMC simulations of age-depth profiles, yielding empirical confidence intervals on the constructed piecewise-linear best-guess timescale. It is suggested that the approach can be further extended to the more general case of a time-varying expected accumulation between the tie points. The approach is illustrated using two ice cores and two lake/marine sediment cores, typical examples of paleoproxy archives with age models based on tie points of mixed origin.
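    A minimal Monte Carlo sketch of the accumulation model between two absolutely dated tie points (the paper uses full MCMC when the tie-point ages are themselves uncertain); the shape parameter and core geometry are illustrative assumptions:

    ```python
    # Simulate many Gamma-process age-depth profiles between two dated tie
    # points, rescale each to pass through the tie-point ages, and read off
    # empirical confidence bands on the age at any depth.
    import numpy as np

    rng = np.random.default_rng(0)
    n_depth, n_sim = 100, 5000            # depth increments, simulated profiles
    age_top, age_bottom = 0.0, 10_000.0   # tie-point ages [years]
    shape = 2.0                           # Gamma shape per increment (assumed)

    increments = rng.gamma(shape, size=(n_depth, n_sim))
    ages = np.cumsum(increments, axis=0)
    ages = age_top + (age_bottom - age_top) * ages / ages[-1]

    mid = n_depth // 2                    # age distribution at mid-core depth
    lo, hi = np.percentile(ages[mid], [2.5, 97.5])
    print(f"mid-core age: best guess {0.5 * (age_top + age_bottom):.0f} yr, "
          f"95% interval [{lo:.0f}, {hi:.0f}] yr")
    ```

    Rescaling a Gamma process to fixed endpoint ages makes each intermediate age a Beta-distributed fraction of the interval, consistent with the analytical result quoted above.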

  19. Semantic divergence in clinical education: Student-centered or student democracy

    PubMed Central

    Khademolhosseini, Seyyed Mohammad; Vanaki, Zohreh; Memarian, Robabeh; Ebadi, Abass

    2012-01-01

    Aims: Although several studies have confirmed the validity and strength of the student-centered approach, and most training centres have placed it at the top of their agenda, there are still problems in its implementation, increasing the need for further research on how it is put into practice. In this regard, the present study was conducted to investigate students’ and educators’ perceptions of the manner of interaction in the clinical education process. Settings and Design: This study was performed with a qualitative approach, using the conventional content analysis method. Materials and Methods: Data were collected until saturation through individual semi-structured interviews. Twenty-one subjects, including undergraduate nursing students (8 cases), faculty member educators (9 cases), head nurses (3 cases), and an educational supervisor (1 case), participated in the study, and the data were analyzed using MAXQDA3 software. Results: “Student democracy” was extracted through data analysis as the main theme of the study. Participants’ experiences comprised five sub-themes: the instructor’s loss of dignity, negligence in the evaluation of students, poor discipline, lack of compliance with the educator, and lack of motivation. Conclusions: The instructor’s weaknesses in planning, guiding, and evaluating the students led to students’ interference in these affairs and challenged the effectiveness of the student-centered approach. Although excessive emphasis on students’ opinions in educational evaluation is apparently a sign of tribute to the students, it ultimately contributes to ignoring the process of learning in order to attract students’ interest, to occupational devaluation, and to a decrease in students’ motivation. PMID:23922594

  20. Paradigms for machine learning

    NASA Technical Reports Server (NTRS)

    Schlimmer, Jeffrey C.; Langley, Pat

    1991-01-01

    Five paradigms are described for machine learning: connectionist (neural network) methods, genetic algorithms and classifier systems, empirical methods for inducing rules and decision trees, analytic learning methods, and case-based approaches. The dimensions along which these paradigms vary in their approach to learning are considered, and the basic methods used within each framework are reviewed, together with open research issues. It is argued that the similarities among the paradigms are more important than their differences, and that future work should attempt to bridge the existing boundaries. Finally, some recent developments in the field of machine learning are discussed, and their impact on both research and applications is examined.

  1. An Iterative Approach for the Optimization of Pavement Maintenance Management at the Network Level

    PubMed Central

    Torres-Machí, Cristina; Chamorro, Alondra; Videla, Carlos; Yepes, Víctor

    2014-01-01

    Pavement maintenance is one of the major issues of public agencies. Insufficient investment or inefficient maintenance strategies lead to high economic expenses in the long term. Under budgetary restrictions, the optimal allocation of resources becomes a crucial aspect. Two traditional approaches (sequential and holistic) and four classes of optimization methods (selection based on ranking, mathematical optimization, near optimization, and other methods) have been applied to solve this problem. They vary in the number of alternatives considered and in how the selection process is performed. Therefore, a prior understanding of the problem is mandatory in order to identify the most suitable approach and method for a particular network. This study aims to assist highway agencies, researchers, and practitioners on when and how to apply the available methods, based on a comparative analysis of the current state of the practice. The holistic approach tackles the problem by considering the overall network condition, while the sequential approach is easier to implement and understand but may lead to solutions far from optimal. Scenarios in which each approach is suitable are defined. Finally, an iterative approach gathering the advantages of the traditional approaches is proposed and applied in a case study. The proposed approach considers the overall network condition in a simpler and more intuitive manner than the holistic approach. PMID:24741352

  2. An iterative approach for the optimization of pavement maintenance management at the network level.

    PubMed

    Torres-Machí, Cristina; Chamorro, Alondra; Videla, Carlos; Pellicer, Eugenio; Yepes, Víctor

    2014-01-01

    Pavement maintenance is one of the major issues of public agencies. Insufficient investment or inefficient maintenance strategies lead to high economic expenses in the long term. Under budgetary restrictions, the optimal allocation of resources becomes a crucial aspect. Two traditional approaches (sequential and holistic) and four classes of optimization methods (selection based on ranking, mathematical optimization, near optimization, and other methods) have been applied to solve this problem. They vary in the number of alternatives considered and in how the selection process is performed. Therefore, a prior understanding of the problem is mandatory in order to identify the most suitable approach and method for a particular network. This study aims to assist highway agencies, researchers, and practitioners on when and how to apply the available methods, based on a comparative analysis of the current state of the practice. The holistic approach tackles the problem by considering the overall network condition, while the sequential approach is easier to implement and understand but may lead to solutions far from optimal. Scenarios in which each approach is suitable are defined. Finally, an iterative approach gathering the advantages of the traditional approaches is proposed and applied in a case study. The proposed approach considers the overall network condition in a simpler and more intuitive manner than the holistic approach.
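    A toy contrast of the two traditional approaches on an invented six-section network under one budget, with the sequential strategy approximated by greedy ranking and the holistic strategy by exhaustive subset search:

    ```python
    # Sequential (ranking-based, greedy) vs holistic (network-level, exhaustive)
    # selection of sections to treat under a budget. Sections, costs, and
    # condition gains are invented for illustration.
    from itertools import combinations

    sections = {          # name: (treatment cost, condition gain if treated)
        "A": (30, 12), "B": (25, 14), "C": (40, 15),
        "D": (10, 6),  "E": (20, 11), "F": (15, 7),
    }
    budget = 60

    # Sequential: greedy by condition gain (a stand-in for worst-first ranking).
    spent, seq_gain, seq_pick = 0, 0, []
    for name, (cost, gain) in sorted(sections.items(), key=lambda kv: -kv[1][1]):
        if spent + cost <= budget:
            spent, seq_gain = spent + cost, seq_gain + gain
            seq_pick.append(name)

    # Holistic: exhaustive search over all feasible subsets (fine for 6 sections).
    best_gain, best_pick = 0, ()
    for r in range(len(sections) + 1):
        for subset in combinations(sections, r):
            cost = sum(sections[s][0] for s in subset)
            gain = sum(sections[s][1] for s in subset)
            if cost <= budget and gain > best_gain:
                best_gain, best_pick = gain, subset

    print(f"sequential: {seq_pick} gain={seq_gain}; "
          f"holistic: {list(best_pick)} gain={best_gain}")
    ```

    On this toy network the greedy ranking stops at a gain of 26 while the holistic search finds 32, which is exactly the gap the iterative approach aims to close at lower cost than full optimization.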

  3. Internet-Mediated Technologies and Mixed Methods Research: Problems and Prospects

    ERIC Educational Resources Information Center

    Hesse-Biber, Sharlene; Griffin, Amy J.

    2013-01-01

    This article provides an examination of a range of mixed methods research projects that employ Internet-mediated technologies (IMT) for data collection. Using a case study approach, this article allows for the uncovering of a process by which IMT are used as a data collection medium in mixed methods praxis. Under the theoretical position of medium…

  4. SU-F-T-192: Study of Robustness Analysis Method of Multiple Field Optimized IMPT Plans for Head & Neck Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y; Wang, X; Li, H

    Purpose: Proton therapy is more sensitive to uncertainties than photon treatments because the protons’ finite range depends on tissue density. The worst case scenario (WCS) method, originally proposed by Lomax, has been adopted in our institute for robustness analysis of IMPT plans. This work demonstrates that the WCS method sufficiently accounts for the uncertainties which could be encountered during daily clinical treatment. Methods: A fast, approximate dose calculation method was developed to calculate the dose for the IMPT plan under different setup and range uncertainties. The effects of two factors, the inverse-square factor and range uncertainty, are explored. The WCS robustness analysis method was evaluated using this fast dose calculation method. The worst-case dose distribution was generated by shifting the isocenter by 3 mm along the x, y, and z directions and modifying stopping power ratios by ±3.5%. One thousand randomly perturbed cases in proton range and the x, y, z directions were created and the corresponding dose distributions were calculated using this approximate method. DVHs and dosimetric indexes of all 1000 perturbed cases were calculated and compared with the results of the worst case scenario method. Results: The distributions of dosimetric indexes of the 1000 perturbed cases were generated and compared with the worst case scenario results. For D95 of the CTVs, at least 97% of the 1000 perturbed cases show higher values than the worst case scenario. For D5 of the CTVs, at least 98% of perturbed cases have lower values than the worst case scenario. Conclusion: By extensively calculating the dose distributions under random uncertainties, the WCS method was verified to be reliable in evaluating the robustness level of MFO IMPT plans for H&N patients. The extensive-sampling approach using the fast approximate method could be used in the future to evaluate the effects of different factors on the robustness level of IMPT plans.
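    A minimal sketch of the verification idea, with a toy one-dimensional dose model and assumed error magnitudes standing in for the clinical dose engine: draw random setup/range perturbations, recompute D95, and check how often it stays above the single worst-case value.

    ```python
    # Compare a worst-case-scenario D95 against the D95 distribution from 1000
    # random perturbations. The 1-D dose model, CTV span, and sigmas are toy
    # assumptions, not the clinical plan data used in the abstract.
    import numpy as np

    rng = np.random.default_rng(0)
    z = np.linspace(0, 12, 241)
    ctv = (z >= 4) & (z <= 9)                  # toy CTV span [cm]

    def dose(shift_cm=0.0, range_scale=1.0):
        """Flat-top field with a sigmoid distal edge, shifted/scaled by errors."""
        edge = 10.0 * range_scale + shift_cm
        return 1.0 / (1.0 + np.exp((z - edge) / 0.15))

    def d95(d):
        return np.percentile(d[ctv], 5)        # dose received by 95% of CTV

    # WCS: the scenario that most erodes distal coverage (undershoot + shift).
    wcs = d95(dose(shift_cm=-0.3, range_scale=0.965))

    perturbed = [d95(dose(rng.normal(0, 0.3), 1 + rng.normal(0, 0.035)))
                 for _ in range(1000)]
    frac = np.mean(np.array(perturbed) >= wcs)
    print(f"WCS D95 = {wcs:.3f}; {100 * frac:.1f}% of random cases exceed it")
    ```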

  5. Correction of high amounts of astigmatism through orthokeratology. A case report

    PubMed Central

    Baertschi, Michael; Wyss, Michael

    2011-01-01

    The purpose of this case report is to introduce a method for the successful treatment of high astigmatism with a new orthokeratology design called FOKX (Falco Kontaktlinsen, Switzerland). The novel toric orthokeratology contact lens design, the fitting approach, and the performance of FOKX lenses are illustrated in the form of a case report. Correcting astigmatism with orthokeratology offers a new perspective for patients suffering from astigmatism.

  6. Hybrid sentiment analysis utilizing multiple indicators to determine temporal shifts of opinion in OSNs

    NASA Astrophysics Data System (ADS)

    White, Joshua S.; Hall, Robert T.; Fields, Jeremy; White, Holly M.

    2016-05-01

    Utilization of traditional sentiment analysis for predicting the outcome of an event on a social network depends on a precise understanding of which topics relate to the event, selective elimination of trends that do not fit, and, in most cases, expert knowledge of the major players in the event. Sentiment analysis has traditionally taken one of two approaches to derive a quantitative value from qualitative text: the "bag of words" model, or the use of natural language processing (NLP) to attempt a genuine understanding of the text. The two methods yield very similar accuracy, with the exception of some special use cases, but both impose a large computational burden on the analytic system, as do newer approaches. No matter which approach is used, SA typically caps out around 80% accuracy, and that accuracy reflects only polarity and degree of polarity, nothing else. In this paper we present a method for hybridizing traditional SA methods to better determine shifts in opinion over time within social networks. This hybridization process augments traditional SA measurements with contextual understanding and knowledge of writers' demographics. Our goal is not only to improve accuracy, but to do so with minimal impact on computational requirements.
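    A minimal sketch of the cheap half of such a hybrid: a lexicon-based bag-of-words scorer aggregated over time windows to expose temporal opinion shifts. The lexicon and posts are invented; a full system would augment these scores with context and author demographics.

    ```python
    # Bag-of-words sentiment over daily windows; lexicon and posts are toy data.
    from collections import defaultdict

    LEXICON = {"good": 1, "great": 2, "love": 2, "bad": -1, "awful": -2, "hate": -2}

    posts = [  # (day, text) -- invented OSN posts
        (1, "love the new policy, great step"),
        (1, "this is good news"),
        (2, "bad rollout, awful communication"),
        (2, "starting to hate the policy"),
        (3, "still bad"),
    ]

    def score(text):
        """Sum lexicon weights over tokens (the 'bag of words' model)."""
        return sum(LEXICON.get(tok.strip(",.!?"), 0)
                   for tok in text.lower().split())

    by_day = defaultdict(list)
    for day, text in posts:
        by_day[day].append(score(text))

    for day in sorted(by_day):
        vals = by_day[day]
        print(f"day {day}: mean sentiment {sum(vals) / len(vals):+.2f}")
    ```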

  7. A Hypermedia Approach to the Design of an Intelligent Tutoring System

    DTIC Science & Technology

    1991-09-01

    [Only front-matter fragments were captured for this record: table-of-contents entries ("Artist and Exploration Method", "Research Method", "Limitations and Future Research", "A Case for Hypermedia Learning Environments") and the opening of the introduction, which notes that most of the prior research in the field of intelligent tutoring systems (ITS) has focused on...]

  8. Exact Green's function method of solar force-free magnetic-field computations with constant alpha. I - Theory and basic test cases

    NASA Technical Reports Server (NTRS)

    Chiu, Y. T.; Hilton, H. H.

    1977-01-01

    Exact closed-form solutions to the solar force-free magnetic-field boundary-value problem are obtained for constant alpha in Cartesian geometry by a Green's function approach. The uniqueness of the physical problem is discussed. Application of the exact results to practical solar magnetic-field calculations is free of series truncation errors and is at least as economical as the approximate methods currently in use. Results of some test cases are presented.
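    For context, the boundary-value problem being solved reduces, for constant alpha, to a vector Helmholtz equation; these are standard relations shown as background, not reproduced from the paper:

    ```latex
    % Constant-alpha force-free field (standard relations):
    \nabla \times \mathbf{B} = \alpha \mathbf{B}, \qquad \alpha = \text{const.}
    % Taking the curl and using \nabla \cdot \mathbf{B} = 0 gives a vector
    % Helmholtz equation, the type of boundary-value problem a Green's
    % function solves in closed form:
    \nabla^{2}\mathbf{B} + \alpha^{2}\mathbf{B} = \mathbf{0}.
    ```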

  9. Metastable electronic states in uranium tetrafluoride

    DOE PAGES

    Miskowiec, Andrew J.

    2018-04-03

    Here, the DFT+U approach, where U is the Hubbard-like on-site Coulomb interaction, has successfully been used to improve the description of transition metal oxides and other highly correlated systems, including actinides. The secret of the DFT+U approach is the breaking of d or f shell orbital degeneracy and the addition of an energetic penalty for non-integer occupation of orbitals. A prototypical test case, UO2, benefits from the +U approach, whereby the bare LDA method predicts UO2 to be a ferromagnetic metal, whereas LDA+U correctly predicts UO2 to be insulating. However, the concavity of the energetic penalty in the DFT+U approach can lead to a number of convergent “metastable” electronic configurations residing above the ground state. Uranium tetrafluoride (UF4) represents a more complex analogy to UO2 in that the crystal field has lower symmetry and the unit cell contains two symmetrically distinct U atoms. We explore the metastable states in UF4 using several different methods of selecting initial orbital occupations. Two methods are explored: a “pre-relaxation” method, wherein an initial set of orbital eigenvectors is selected via the self-consistency procedure, and a crystal rotation method, wherein the x, y, z axes are brought into alignment with the crystal field. We show that in the case of UF4, which has non-collinearity between its crystal axes and the U atoms' crystal field potentials, the orbital occupation matrices are much more complex and should be analyzed using a novel approach. In addition to demonstrating a complex landscape of metastable electronic states, UF4 also shows significant hybridization in U–F bonding, which involves non-trivial contributions from s, p, d, and f orbitals.
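    For context, a commonly used rotationally invariant form of the +U correction (due to Dudarev and co-workers; shown here as background, not quoted from the paper) makes the concavity explicit:

    ```latex
    % Rotationally invariant (Dudarev) DFT+U energy:
    E_{\mathrm{DFT}+U} = E_{\mathrm{DFT}}
      + \frac{U_{\mathrm{eff}}}{2} \sum_{\sigma}
        \mathrm{Tr}\!\left[ \hat{n}^{\sigma} - \hat{n}^{\sigma}\hat{n}^{\sigma} \right]
    % For a single occupation eigenvalue n in [0,1] the penalty
    % (U_eff/2) n(1-n) is concave, so several near-integer occupation
    % patterns can each converge as distinct local minima.
    ```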

  10. Metastable electronic states in uranium tetrafluoride

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miskowiec, Andrew J.

    Here, the DFT+U approach, where U is the Hubbard-like on-site Coulomb interaction, has successfully been used to improve the description of transition metal oxides and other highly correlated systems, including actinides. The secret of the DFT+U approach is the breaking of d or f shell orbital degeneracy and the addition of an energetic penalty for non-integer occupation of orbitals. A prototypical test case, UO2, benefits from the +U approach, whereby the bare LDA method predicts UO2 to be a ferromagnetic metal, whereas LDA+U correctly predicts UO2 to be insulating. However, the concavity of the energetic penalty in the DFT+U approach can lead to a number of convergent “metastable” electronic configurations residing above the ground state. Uranium tetrafluoride (UF4) represents a more complex analogy to UO2 in that the crystal field has lower symmetry and the unit cell contains two symmetrically distinct U atoms. We explore the metastable states in UF4 using several different methods of selecting initial orbital occupations. Two methods are explored: a “pre-relaxation” method, wherein an initial set of orbital eigenvectors is selected via the self-consistency procedure, and a crystal rotation method, wherein the x, y, z axes are brought into alignment with the crystal field. We show that in the case of UF4, which has non-collinearity between its crystal axes and the U atoms' crystal field potentials, the orbital occupation matrices are much more complex and should be analyzed using a novel approach. In addition to demonstrating a complex landscape of metastable electronic states, UF4 also shows significant hybridization in U–F bonding, which involves non-trivial contributions from s, p, d, and f orbitals.

  11. Integrative management of commercialized wild mushroom: a case study of Thelephora ganbajun in Yunnan, southwest China.

    PubMed

    He, Jun; Zhou, Zhimei; Yang, Huixian; Xu, Jianchu

    2011-07-01

    The management of wild mushroom is interdisciplinary in nature, whereby the biophysical considerations have to be incorporated into the context of a wide range of social, economic and political concerns. However, to date, little documentation exists illustrating an interdisciplinary approach to management of wild mushrooms. Moreover, the empirical case studies necessary for developing applicable and practical methods are even more rare. This paper adopted an interdisciplinary approach combining participatory methods to improve the habitat management of Thelephora ganbajun, an endemic and one of the most economically valuable mushroom species in Southwest China. The paper documents an empirical case of how an interdisciplinary approach facilitated the development of a scientific basis for policy and management practice, and built the local capacity to create, adopt and sustain the new rules and techniques of mushroom management. With this integrative perspective, a sustainable management strategy was developed, which was found not only technically feasible for farmers, but also acceptable to the government from an ecological and policy-related perspective. More importantly, this approach has greatly contributed to raising the income of farmers. The paper highlights how the integration of biophysical and socioeconomic factors and different knowledge systems provided a holistic perspective to problem diagnosis and resolution, which helped to cope with conventional scientific dilemmas. Finally, it concludes that the success of this interdisciplinary approach is significant in the context of policy decentralization and reform for incorporating indigenous knowledge and local participation in forest management.

  12. Integrated Approaches to Testing and Assessment: OECD Activities on the Development and Use of Adverse Outcome Pathways and Case Studies.

    PubMed

    Sakuratani, Yuki; Horie, Masashi; Leinala, Eeva

    2018-01-09

    The Organisation for Economic Co-operation and Development (OECD) works with member countries and other stakeholders to improve and harmonize chemical assessment methods. In 2012, the OECD Adverse Outcome Pathways (AOPs) Development Programme started. The Programme has published six AOPs thus far, and more than 60 AOPs are at various stages of development under the Programme. This article reviews recent OECD activities on the use of AOPs in developing Integrated Approaches to Testing and Assessments (IATAs). The guidance document for the use of AOPs in developing IATA, published in 2016, provides a framework for developing and using IATA and describes how IATA can be based on an AOP. The guidance document on the reporting of defined approaches to be used within IATA, also published in 2016, provides a set of principles for reporting defined approaches to testing and assessment to facilitate their evaluation. In the guidance documents, the AOP concept plays an important role for building IATA approaches in a science-based and transparent way. In 2015, the IATA Case Studies Project was launched to increase experience with the use of IATA and novel hazard methodologies by developing case studies, which constitute examples of predictions that are fit-for-regulatory use. This activity highlights the importance of international collaboration for harmonizing and improving chemical safety assessment methods. © 2018 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).

  13. Integrative Management of Commercialized Wild Mushroom: A Case Study of Thelephora ganbajun in Yunnan, Southwest China

    NASA Astrophysics Data System (ADS)

    He, Jun; Zhou, Zhimei; Yang, Huixian; Xu, Jianchu

    2011-07-01

    The management of wild mushroom is interdisciplinary in nature, whereby the biophysical considerations have to be incorporated into the context of a wide range of social, economic and political concerns. However, to date, little documentation exists illustrating an interdisciplinary approach to management of wild mushrooms. Moreover, the empirical case studies necessary for developing applicable and practical methods are even more rare. This paper adopted an interdisciplinary approach combining participatory methods to improve the habitat management of Thelephora ganbajun, an endemic and one of the most economically valuable mushroom species in Southwest China. The paper documents an empirical case of how an interdisciplinary approach facilitated the development of a scientific basis for policy and management practice, and built the local capacity to create, adopt and sustain the new rules and techniques of mushroom management. With this integrative perspective, a sustainable management strategy was developed, which was found not only technically feasible for farmers, but also acceptable to the government from an ecological and policy-related perspective. More importantly, this approach has greatly contributed to raising the income of farmers. The paper highlights how the integration of biophysical and socioeconomic factors and different knowledge systems provided a holistic perspective to problem diagnosis and resolution, which helped to cope with conventional scientific dilemmas. Finally, it concludes that the success of this interdisciplinary approach is significant in the context of policy decentralization and reform for incorporating indigenous knowledge and local participation in forest management.

  14. Tracing Technological Development Trajectories: A Genetic Knowledge Persistence-Based Main Path Approach.

    PubMed

    Park, Hyunseok; Magee, Christopher L

    2017-01-01

    The aim of this paper is to propose a new method to identify main paths in a technological domain using patent citations. Previous approaches to main path analysis have greatly improved our understanding of actual technological trajectories but nonetheless have some limitations: they have a high potential to miss dominant patents from the identified main paths, and the high network complexity of their main paths makes qualitative tracing of trajectories problematic. The proposed method searches backward and forward paths from high-persistence patents, which are identified using a standard genetic knowledge persistence algorithm. We tested the new method by applying it to the desalination and solar photovoltaic domains and compared the results to output from the same domains using a prior method. The empirical results show that the proposed method can dramatically reduce network complexity without missing any dominantly important patents. The main paths identified by our approach for the two test cases are almost 10x less complex than the main paths identified by the existing approach. The proposed approach identifies all dominantly important patents on the main paths, whereas the main paths identified by the existing approach miss about 20% of dominantly important patents.

  15. Tracing Technological Development Trajectories: A Genetic Knowledge Persistence-Based Main Path Approach

    PubMed Central

    2017-01-01

    The aim of this paper is to propose a new method to identify main paths in a technological domain using patent citations. Previous approaches to main path analysis have greatly improved our understanding of actual technological trajectories but nonetheless have some limitations: they have a high potential to miss dominant patents from the identified main paths, and the high network complexity of their main paths makes qualitative tracing of trajectories problematic. The proposed method searches backward and forward paths from high-persistence patents, which are identified using a standard genetic knowledge persistence algorithm. We tested the new method by applying it to the desalination and solar photovoltaic domains and compared the results to output from the same domains using a prior method. The empirical results show that the proposed method can dramatically reduce network complexity without missing any dominantly important patents. The main paths identified by our approach for the two test cases are almost 10x less complex than the main paths identified by the existing approach. The proposed approach identifies all dominantly important patents on the main paths, whereas the main paths identified by the existing approach miss about 20% of dominantly important patents. PMID:28135304
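    A toy sketch of the persistence-guided backward/forward tracing idea on a small citation DAG; the persistence proxy used here (ancestor count times descendant count) is an illustrative stand-in for the genetic knowledge persistence measure, not the authors' algorithm:

    ```python
    # Trace a main path by greedily following high-"persistence" neighbors
    # outward from the highest-persistence patent. Edges point from cited
    # patent to citing patent; the graph and persistence proxy are invented.
    import networkx as nx

    G = nx.DiGraph([
        ("p1", "p2"), ("p1", "p3"), ("p2", "p4"), ("p3", "p4"),
        ("p4", "p5"), ("p4", "p6"), ("p5", "p7"), ("p6", "p7"),
    ])

    def persistence(node):
        """Crude proxy: patents that inherit from many and pass on to many."""
        return (len(nx.ancestors(G, node)) + 1) * (len(nx.descendants(G, node)) + 1)

    hub = max(G, key=persistence)            # highest-persistence patent

    def greedy_path(node, neighbors):
        """Follow the highest-persistence neighbor until a source/sink."""
        path = [node]
        while neighbors(G, node):
            node = max(neighbors(G, node), key=persistence)
            path.append(node)
        return path

    backward = greedy_path(hub, lambda g, n: list(g.predecessors(n)))
    forward = greedy_path(hub, lambda g, n: list(g.successors(n)))
    print("main path:", backward[::-1][:-1] + forward)
    ```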

  16. Worst-Case Flutter Margins from F/A-18 Aircraft Aeroelastic Data

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty

    1997-01-01

    An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, μ, computes a stability margin which directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The μ margins are robust margins which indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 SRA using uncertainty sets generated by flight data analysis. The robust margins demonstrate that flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.

  17. Electronic Versus Manual Data Processing: Evaluating the Use of Electronic Health Records in Out-of-Hospital Clinical Research

    PubMed Central

    Newgard, Craig D.; Zive, Dana; Jui, Jonathan; Weathers, Cody; Daya, Mohamud

    2011-01-01

    Objectives To compare case ascertainment, agreement, validity, and missing values for clinical research data obtained, processed, and linked electronically from electronic health records (EHR), compared to “manual” data processing and record abstraction in a cohort of out-of-hospital trauma patients. Methods This was a secondary analysis of two sets of data collected for a prospective, population-based, out-of-hospital trauma cohort evaluated by 10 emergency medical services (EMS) agencies transporting to 16 hospitals, from January 1, 2006 through October 2, 2007. Eighteen clinical, operational, procedural, and outcome variables were collected and processed separately and independently using two parallel data processing strategies, by personnel blinded to patients in the other group. The electronic approach included electronic health record data exports from EMS agencies, reformatting, and probabilistic linkage to outcomes from local trauma registries and state discharge databases. The manual data processing approach included chart matching, data abstraction, and data entry by a trained abstractor. Descriptive statistics, measures of agreement, and validity were used to compare the two approaches to data processing. Results During the 21-month period, 418 patients underwent both data processing methods and formed the primary cohort. Agreement was good to excellent (kappa 0.76 to 0.97; intraclass correlation coefficient 0.49 to 0.97), with exact agreement in 67% to 99% of cases, and a median difference of zero for all continuous and ordinal variables. The proportions of missing out-of-hospital values were similar between the two approaches, although electronic processing generated more missing outcomes (87 out of 418, 21%, 95% CI = 17% to 25%) than the manual approach (11 out of 418, 3%, 95% CI = 1% to 5%). Case ascertainment of eligible injured patients was greater using electronic methods (n = 3,008) compared to manual methods (n = 629). Conclusions In this sample of out-of-hospital trauma patients, an all-electronic data processing strategy identified more patients and generated values with good agreement and validity compared to traditional data collection and processing methods. PMID:22320373
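    For reference, the headline agreement statistic can be reproduced with a few lines of standard tooling; the coded values below are invented stand-ins for one categorical variable coded under both processing strategies:

    ```python
    # Cohen's kappa between two data processing strategies; data are invented.
    from sklearn.metrics import cohen_kappa_score

    electronic = ["blunt", "blunt", "penetrating", "blunt", "penetrating", "blunt"]
    manual     = ["blunt", "blunt", "penetrating", "penetrating", "penetrating", "blunt"]

    kappa = cohen_kappa_score(electronic, manual)
    exact = sum(e == m for e, m in zip(electronic, manual)) / len(manual)
    print(f"kappa = {kappa:.2f}, exact agreement = {100 * exact:.0f}%")
    ```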

  18. Family members of older persons with multi-morbidity and their experiences of case managers in Sweden: an interpretive phenomenological approach

    PubMed Central

    Hjelm, Markus; Holmgren, Ann-Charlotte; Willman, Ania; Bohman, Doris; Holst, Göran

    2015-01-01

    Background Family members of older persons (75+) with multi-morbidity are likely to benefit from utilising case management services performed by case managers. However, research has not yet explored their experiences of case managers. Objectives The aim of the study was to deepen the understanding of the importance of case managers to family members of older persons (75+) with multi-morbidity. Design The study design was based on an interpretive phenomenological approach. Method Data were collected through individual interviews with 16 family members in Sweden. The interviews were analysed by means of an interpretive phenomenological approach. Results The findings revealed one overarching theme: “Helps to fulfil my unmet needs”, based on three sub-themes: (1) “Helps me feel secure – Experiencing a trusting relationship”, (2) “Confirms and strengthens me – Challenging my sense of being alone” and (3) “Being my personal guide – Increasing my competence”. Conclusion and discussion The findings indicate that case managers were able to fulfil unmet needs of family members. The latter recognised the importance of case managers providing them with professional services tailored to their individual needs. The findings can contribute to the improvement of case management models not only for older persons but also for their family members. PMID:25918497

  19. Linking biomedical engineering ethics case study approach and policy.

    PubMed

    Dibrell, William; Dobie, Elizabeth Ann

    2007-01-01

    In this paper we link bioengineering case study methods to the development of policy. The case study approach to ethics is an excellent way to show the complex nature of practical/moral reasoning. This approach can, however, lead to a kind of overwhelming complexity. The individual nature of each case makes it difficult to identify the most important information and difficult to see what moral considerations are most relevant. In order to make the overwhelming complexity less debilitating, we present a framework for moral decision making derived from suggestions made by W.D. Ross and Virginia Held. Ross articulates the multiple sources of morality and Held deepens the discussion by reminding us of the foundational importance of care and sympathy to our moral natures. We show how to use the notion of prima facie duty and discuss moral conflict. In doing this, we show how the framework, applied to cases, can be of assistance in helping us develop policies and codes of ethics with sufficient plasticity to be useful in the complex world of the bioengineer.

  20. A robust approach to using of the redundant information in the temperature calibration

    NASA Astrophysics Data System (ADS)

    Strnad, R.; Kňazovická, L.; Šindelář, M.; Kukal, J.

    2013-09-01

    Calibration laboratories use standard procedures for calculating calibration model coefficients, based on well-described standards (EN 60751, ITS-90, EN 60584, etc.). In practice, sensors are mostly calibrated at more points than the model requires, and the redundant information is used to validate the model. This paper presents the influence of including all measured points, weighted by their uncertainties, in the fitted models using standard weighted least-squares methods. A special case concerning different levels of uncertainty of the measured points under a robust approach is discussed; this leads to different minimization criteria and a different uncertainty-propagation methodology. The robust approach also eliminates the influence of outlier measurements on the calibration. In the practical part, three cases of this approach are presented: an industrial calibration according to EN 60751, an SPRT according to the ITS-90, and a thermocouple according to EN 60584.
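    A minimal sketch of the weighted least-squares step for the EN 60751 model R(t) = R0(1 + At + Bt²), valid above 0 °C, with invented readings and uncertainties; a robust variant would additionally down-weight outliers rather than trusting the stated uncertainties:

    ```python
    # Weighted least-squares fit of R(t) = R0 (1 + A t + B t^2) to redundant
    # calibration points, weighting each point by 1/u^2. Readings and
    # uncertainties are invented for illustration.
    import numpy as np

    t = np.array([0.01, 50.0, 100.0, 150.0, 200.0])          # points [degC]
    R = np.array([100.004, 119.40, 138.50, 157.32, 175.84])  # readings [ohm]
    u = np.array([0.002, 0.004, 0.004, 0.008, 0.010])        # std. unc. [ohm]

    # Linear in (R0, R0*A, R0*B): solve the weighted normal equations.
    X = np.column_stack([np.ones_like(t), t, t**2])
    W = np.diag(1.0 / u**2)
    beta, *_ = np.linalg.lstsq(np.sqrt(W) @ X, np.sqrt(W) @ R, rcond=None)

    R0 = beta[0]
    A, B = beta[1] / R0, beta[2] / R0
    print(f"R0 = {R0:.4f} ohm, A = {A:.3e} /degC, B = {B:.3e} /degC^2")
    ```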

  1. Patients who make terrible therapeutic choices.

    PubMed

    Curzer, Howard J

    2014-01-01

    The traditional approaches to dental ethics include appeals to principles, duties (deontology), and consequences (utilitarianism). These approaches are often inadequate when faced with the case of a patient who refuses reasonable treatment and does not share the same ethical framework the dentist is using. An approach based on virtue ethics may be helpful in this and other cases. Virtue ethics is a tradition going back to Plato and Aristotle. It depends on forming a holistic character supporting general appropriate behavior. By correctly diagnosing the real issues at stake in a patient's inappropriate oral health choices and working to build effective habits, dentists can sometimes respond to ethical challenges that remain intractable given rule-based methods.

  2. A Generalized Approach to Forensic Dye Identification: Development and Utility of Reference Libraries.

    PubMed

    Groves, Ethan; Palenik, Skip; Palenik, Christopher S

    2018-04-18

    While color is arguably the most important optical property of evidential fibers, the actual dyestuffs responsible for its expression in them are, in forensic trace evidence examinations, rarely analyzed and still less often identified. This is due, primarily, to the exceedingly small quantities of dye present in a single fiber as well as to the fact that dye identification is a challenging analytical problem, even when large quantities are available for analysis. Among the practical reasons for this are the wide range of dyestuffs available (and the even larger number of trade names), the low total concentration of dyes in the finished product, the limited amount of sample typically available for analysis in forensic cases, and the complexity of the dye mixtures that may exist within a single fiber. Literature on the topic of dye analysis is often limited to a specific method, subset of dyestuffs, or an approach that is not applicable given the constraints of a forensic analysis. Here, we present a generalized approach to dye identification that (1) combines several robust analytical methods, (2) is broadly applicable to a wide range of dye chemistries, application classes, and fiber types, and (3) can be scaled down to forensic casework-sized samples. The approach is based on the development of a reference collection of 300 commercially relevant textile dyes that have been characterized by a variety of microanalytical methods (HPTLC, Raman microspectroscopy, infrared microspectroscopy, UV-Vis spectroscopy, and visible microspectrophotometry). Although there is no single approach that is applicable to all dyes on every type of fiber, a combination of these analytical methods has been applied using a reproducible approach that permits the use of reference libraries to constrain the identity of and, in many cases, identify the dye (or dyes) present in a textile fiber sample.
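    A minimal sketch of the library-search step that such a reference collection enables: score a questioned spectrum against every entry and rank the matches. The spectra here are random vectors standing in for the Raman/IR/UV-Vis curves of the 300 reference dyes:

    ```python
    # Rank library entries against a questioned spectrum by cosine similarity.
    # Spectra are synthetic stand-ins for real reference measurements.
    import numpy as np

    rng = np.random.default_rng(0)
    library = {f"dye_{i:03d}": rng.random(50) for i in range(300)}

    query = library["dye_042"] + rng.normal(0, 0.05, 50)  # noisy re-measurement

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    ranked = sorted(library, key=lambda name: -cosine(query, library[name]))
    for name in ranked[:3]:
        print(f"{name}: similarity {cosine(query, library[name]):.3f}")
    ```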

  3. A Risk-Based Approach to Variable Load Configuration Validation in Steam Sterilization: Application of PDA Technical Report 1 Load Equivalence Topic.

    PubMed

    Pavell, Anthony; Hughes, Keith A

    2010-01-01

    This article describes a method for applying the load-equivalence model, described in Parenteral Drug Association Technical Report 1, using a mass-based approach. The item and load bracketing approach allows for mixed-equipment load-size variation for operational flexibility, along with decreased time to introduce new items into the operation. The article discusses approximately 67 items/components (Table IV) identified for routine sterilization, with varying quantities required weekly. The items were assessed for worst-case identification using four temperature-related criteria. The criteria were used to provide a data-based identification of worst-case items, and/or item equivalence, to carry forward into cycle validation using a variable load pattern. The mass-based approach to maximum load determination was used to bracket routine production use and allows for variable loading patterns. The result of the item mapping and load bracketing data is "a proven acceptable range" of sterilizing conditions, including loading configuration and location. The application of these approaches, while initially more time- and test-intensive than alternate approaches, provides a method of cycle validation with the long-term benefits of easier ongoing qualification, reduced time and requirements for qualifying new equipment for similar loads and uses, and rapid, rigorous assessment of new items for sterilization.

  4. Comparing and Contrasting Consensus versus Empirical Domains

    PubMed Central

    Jason, Leonard A.; Kot, Bobby; Sunnquist, Madison; Brown, Abigail; Reed, Jordan; Furst, Jacob; Newton, Julia L.; Strand, Elin Bolle; Vernon, Suzanne D.

    2015-01-01

    Background Since the publication of the CFS case definition [1], a number of other criteria have been proposed, including the Canadian Consensus Criteria [2] and the Myalgic Encephalomyelitis: International Consensus Criteria [3]. Purpose The current study compared domains developed through consensus methods to those obtained through more empirical approaches using factor analysis. Methods Using data mining, we compared and contrasted fundamental features of consensus-based criteria versus empirical latent factors. In general, these approaches identified the domain of Fatigue/Post-exertional malaise as best differentiating patients from controls. Results Findings indicated that the Fukuda et al. criteria had the worst sensitivity and specificity. Conclusions These outcomes might help both theorists and researchers better determine which fundamental domains should be used for the case definition. PMID:26977374

  5. Forecasting peaks of seasonal influenza epidemics.

    PubMed

    Nsoesie, Elaine; Mararthe, Madhav; Brownstein, John

    2013-06-21

    We present a framework for near real-time forecast of influenza epidemics using a simulation optimization approach. The method combines an individual-based model and a simple root finding optimization method for parameter estimation and forecasting. In this study, retrospective forecasts were generated for seasonal influenza epidemics using web-based estimates of influenza activity from Google Flu Trends for 2004-2005, 2007-2008 and 2012-2013 flu seasons. In some cases, the peak could be forecasted 5-6 weeks ahead. This study adds to existing resources for influenza forecasting and the proposed method can be used in conjunction with other approaches in an ensemble framework.
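    A minimal sketch of the calibrate-then-forecast loop with a cheap stand-in model: root finding on the transmission rate of a deterministic SIR model fitted to one early-season observation, then reading off the forecast peak week. The paper's individual-based model and Google Flu Trends data are replaced here by toy values.

    ```python
    # Calibrate a deterministic SIR model to an early-season incidence datum
    # via root finding, then forecast the peak week. All values are toy
    # stand-ins for the paper's individual-based model and GFT data.
    import numpy as np
    from scipy.optimize import brentq

    def sir_incidence(beta, gamma=1 / 3.0, n_weeks=40, i0=1e-4):
        """Weekly new-infection curve from a discrete-time SIR (7 daily steps)."""
        s, i, inc = 1.0 - i0, i0, []
        for _ in range(n_weeks):
            new = 0.0
            for _ in range(7):
                d = beta * s * i
                s, i = s - d, i + d - gamma * i
                new += d
            inc.append(new)
        return np.array(inc)

    observed_week3 = 0.002   # "observed" week-3 incidence (toy datum)
    beta_hat = brentq(lambda b: sir_incidence(b)[2] - observed_week3, 0.35, 0.9)

    peak_week = int(np.argmax(sir_incidence(beta_hat))) + 1
    print(f"fitted beta = {beta_hat:.3f}, forecast peak in week {peak_week}")
    ```

    With these toy numbers the peak lands many weeks after the fitting window, which is the kind of lead time the abstract reports for the real system.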

  6. Evaluation of the Grammar Teaching Process by Using the Methods Used in Turkish Language Teaching as a Foreign Language: A Case Study

    ERIC Educational Resources Information Center

    Öztürk, Basak Karakoç

    2018-01-01

    The preferred methods for the success of foreign language teaching and the reflection of these methods on the teaching process are very important. Since approaches and methods in language teaching enable the teacher to use different techniques in his/her lectures, they provide a more effective teaching process. The methodology in teaching the…

  7. An exchange format for use-cases of hospital information systems.

    PubMed

    Masuda, G; Sakamoto, N; Sakai, R; Yamamoto, R

    2001-01-01

    Object-oriented software development is a powerful methodology for the development of large hospital information systems. We believe the use-case-driven approach is particularly useful for such development. In this approach, use-cases are documented at the first stage of the software development process and are then used throughout all subsequent steps in a variety of ways. It is therefore important to exchange and share use-cases and to make effective use of them across the whole lifecycle of a development process. In this paper, we propose a method for sharing and exchanging use-case models between applications, developers, and projects. We design an XML-based exchange format for use-cases and then discuss the application of the exchange format to supporting several software development activities. We implemented a preliminary support system for object-oriented analysis based on the exchange format. The results show that using the structural and semantic information in the exchange format enables the support system to assist object-oriented analysis successfully.
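    A hypothetical sketch of what such an XML exchange format might look like and how a tool could consume it; the element and attribute names (use-case, actor, step) are invented for illustration, not the paper's schema:

    ```python
    # Parse a hypothetical use-case exchange document with the standard library.
    import xml.etree.ElementTree as ET

    DOC = """
    <use-case id="UC-07" name="Order laboratory test">
      <actor role="physician">Attending physician</actor>
      <precondition>Patient record is open</precondition>
      <flow>
        <step n="1">Physician selects the test panel</step>
        <step n="2">System checks for duplicate orders</step>
        <step n="3">System files the order and notifies the lab</step>
      </flow>
    </use-case>
    """

    uc = ET.fromstring(DOC)
    print(f"{uc.get('id')}: {uc.get('name')}")
    print("actor:", uc.find("actor").text)
    for step in uc.iter("step"):
        print(f"  step {step.get('n')}: {step.text}")
    ```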

  8. Integrating community-based participatory research and informatics approaches to improve the engagement and health of underserved populations

    PubMed Central

    Schaefbauer, Chris L; Campbell, Terrance R; Senteio, Charles; Siek, Katie A; Bakken, Suzanne; Veinot, Tiffany C

    2016-01-01

    Objective We compare 5 health informatics research projects that applied community-based participatory research (CBPR) approaches with the goal of extending existing CBPR principles to address issues specific to health informatics research. Materials and methods We conducted a cross-case analysis of 5 diverse case studies with 1 common element: integration of CBPR approaches into health informatics research. After reviewing publications and other case-related materials, all coauthors engaged in collaborative discussions focused on CBPR. Researchers mapped each case to an existing CBPR framework, examined each case individually for success factors and barriers, and identified common patterns across cases. Results Benefits of applying CBPR approaches to health informatics research across the cases included the following: developing more relevant research with wider impact, greater engagement with diverse populations, improved internal validity, more rapid translation of research into action, and the development of people. Challenges of applying CBPR to health informatics research included requirements to develop strong, sustainable academic-community partnerships and mismatches related to cultural and temporal factors. Several technology-related challenges, including needs to define ownership of technology outputs and to build technical capacity with community partners, also emerged from our analysis. Finally, we created several principles that extended an existing CBPR framework to specifically address health informatics research requirements. Conclusions Our cross-case analysis yielded valuable insights regarding CBPR implementation in health informatics research and identified valuable lessons useful for future CBPR-based research. The benefits of applying CBPR approaches can be significant, particularly in engaging populations that are typically underserved by health care and in designing patient-facing technology. PMID:26228766

  9. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.

    PubMed

    Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R

    2014-03-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.

  10. An efficient graph theory based method to identify every minimal reaction set in a metabolic network

    PubMed Central

    2014-01-01

    Background Development of cells with minimal metabolic functionality is gaining importance due to their efficiency in producing chemicals and fuels. Existing computational methods to identify minimal reaction sets in metabolic networks are computationally expensive. Further, they identify only one of the several possible minimal reaction sets. Results In this paper, we propose an efficient graph theory based recursive optimization approach to identify all minimal reaction sets. Graph theoretical insights offer systematic methods not only to reduce the number of variables in math programming and increase its computational efficiency, but also to find multiple optimal solutions efficiently. The efficacy of the proposed approach is demonstrated using case studies from Escherichia coli and Saccharomyces cerevisiae. In case study 1, the proposed method identified three minimal reaction sets, each containing 38 reactions, in the Escherichia coli central metabolic network with 77 reactions. Analysis of these three minimal reaction sets revealed that one of them is more suitable for developing a minimal-metabolism cell than the other two, due to a practically achievable internal flux distribution. In case study 2, the proposed method identified 256 minimal reaction sets from the Saccharomyces cerevisiae genome-scale metabolic network with 620 reactions. The proposed method required only 4.5 hours to identify all 256 minimal reaction sets and showed a significant reduction (approximately 80%) in solution time compared to existing methods for finding minimal reaction sets. Conclusions Identification of all minimal reaction sets in metabolic networks is essential, since different minimal reaction sets have different properties that affect bioprocess development. The proposed method correctly identified all minimal reaction sets in both case studies. It is computationally efficient compared to other methods for finding minimal reaction sets and is suitable for use with genome-scale metabolic networks. PMID:24594118
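    A naive sketch of the enumeration problem on an invented four-reaction network, using a simple producibility closure instead of flux balance; the paper's contribution is a graph-theoretic recursion that makes this tractable at genome scale:

    ```python
    # Enumerate all minimal reaction sets that produce a target metabolite,
    # checking producibility by fixed-point closure. The network is invented.
    from itertools import combinations

    REACTIONS = {  # name: (substrates, products)
        "r1": ({"glc"}, {"g6p"}),
        "r2": ({"g6p"}, {"pyr"}),
        "r3": ({"glc"}, {"pyr"}),        # bypass route
        "r4": ({"pyr"}, {"biomass"}),
    }
    MEDIUM, TARGET = {"glc"}, "biomass"

    def producible(active):
        """Grow the reachable metabolite set until nothing new appears."""
        mets = set(MEDIUM)
        changed = True
        while changed:
            changed = False
            for r in active:
                subs, prods = REACTIONS[r]
                if subs <= mets and not prods <= mets:
                    mets |= prods
                    changed = True
        return TARGET in mets

    minimal_sets = []
    for size in range(1, len(REACTIONS) + 1):
        for subset in combinations(REACTIONS, size):
            s = set(subset)
            # Increasing-size enumeration: skip supersets of known minima.
            if producible(s) and not any(m <= s for m in minimal_sets):
                minimal_sets.append(s)

    print("minimal reaction sets:", minimal_sets)
    ```

    On this toy network the enumeration finds two minimal sets ({r3, r4} and {r1, r2, r4}), illustrating why identifying all of them, not just one, matters for choosing a practically achievable design.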

  11. Automated structure solution, density modification and model building.

    PubMed

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful, and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  12. Experiences in methods to involve key players in planning protective actions in the case of a nuclear accident.

    PubMed

    Sinkko, K; Hämäläinen, R P; Hänninen, R

    2004-01-01

    A widely used method in the planning of protective actions is to establish a stakeholder network to generate a comprehensive set of generic protective actions. The aim is to increase competence and build links for communication and coordination. The approach of this work was to systematically evaluate protective action strategies in the case of a nuclear accident. This was done in a way that the concerns and issues of all key players could be transparently and equally included in the decision taken. An approach called Facilitated Decision Analysis Workshop has been developed and tested. The work builds on case studies in which it was assumed that a hypothetical accident had led to a release of considerable amounts of radionuclides and, therefore, various types of countermeasures had to be considered. Six workshops were organised in the Nordic countries where the key players were represented, i.e. authorities, expert organisations, industry and agricultural producers. Copyright 2004 Oxford University Press

  13. An Ecological Approach to the On-Line Assessment of Problem-Solving Paths: Principles and Applications.

    ERIC Educational Resources Information Center

    Shaw, Robert E.; And Others

    1997-01-01

    Proposes a theoretical framework for designing online-situated assessment tools for multimedia instructional systems. Uses a graphic method based on ecological psychology to monitor student performance through a learning activity. Explores the method's feasibility in case studies describing instructional systems teaching critical-thinking and…

  14. Presenting the Iterative Curriculum Discourse Analysis (ICDA) Approach

    ERIC Educational Resources Information Center

    Iversen, Lars Laird

    2014-01-01

    The article presents a method for analysing recurring curriculum documents using discourse theory inspired by Ernesto Laclau and Chantal Mouffe. The article includes a presentation of the method in seven practical steps, and is illustrated and discussed throughout using the author's recent case study on religion, identity and values in Norwegian…

  15. Nursing Admission Practices to Discern "Fit": A Case Study Exemplar

    ERIC Educational Resources Information Center

    Sinutko, Jaime M.

    2014-01-01

    Admission to a baccalaureate nursing school in the United States is currently a challenging proposition for a variety of reasons. This research explored a holistic nursing school admission process at a small, private, baccalaureate college using a retrospective, mixed-method, approach. The holistic method included multiple admission criteria, both…

  16. Selecting the Right Construction Delivery Method for a Specific Project.

    ERIC Educational Resources Information Center

    Klinger, Jeff; Booth, Scott

    2002-01-01

    Discusses the costs and benefits of various construction delivery methods for higher education facility projects, including the traditional lump sum general contracting approach (also known as design/bid/build); design-build; and, in the case of private institutions, guaranteed maximum pricing offered by those firms willing to perform construction…

  17. Research in Distance Education: A System Modeling Approach.

    ERIC Educational Resources Information Center

    Saba, Farhad; Twitchell, David

    This demonstration of the use of a computer simulation research method based on the System Dynamics modeling technique for studying distance education reviews research methods in distance education, including the broad categories of conceptual and case studies, and presents a rationale for the application of systems research in this area. The…

  18. The role of mixed methods in improved cookstove research.

    PubMed

    Stanistreet, Debbi; Hyseni, Lirije; Bashin, Michelle; Sadumah, Ibrahim; Pope, Daniel; Sage, Michael; Bruce, Nigel

    2015-01-01

    The challenge of promoting access to clean and efficient household energy for cooking and heating is a critical issue facing low- and middle-income countries today. Along with clean fuels, improved cookstoves (ICSs) continue to play an important part in efforts to reduce the 4 million annual premature deaths attributed to household air pollution. Although a range of ICSs are available, there is little empirical evidence on appropriate behavior change approaches to inform adoption and sustained use at scale. Specifically, evaluations using either quantitative or qualitative methods alone provide an incomplete picture of the challenges in facilitating ICS adoption. This article examines how studies that use the strengths of both these approaches can offer important insights into behavior change in relation to ICS uptake and scale-up. Epistemological approaches, study design frameworks, methods of data collection, analytical approaches, and issues of validity and reliability in the context of mixed-methods ICS research are examined, and the article presents an example study design from an evaluation study in Kenya incorporating a nested approach and a convergent, case-oriented design. The authors discuss the benefits and methodological challenges of mixed-methods approaches in the context of researching behavior change and ICS use, recognizing that such methods represent relatively uncharted territory. The authors propose that more published examples are needed to provide frameworks for other researchers seeking to apply mixed methods in this context and suggest that a comprehensive research agenda is required, incorporating integrated mixed-methods approaches, to provide the best evidence for future scale-up.

  19. Interprofessional practice and learning in a youth mental health service: A case study using network analysis.

    PubMed

    Barnett, Tony; Hoang, Ha; Cross, Merylin; Bridgman, Heather

    2015-01-01

    Few studies have examined interprofessional practice (IPP) from a mental health service perspective. This study applied a mixed-method approach to examine the IPP and learning occurring in a youth mental health service in Tasmania, Australia. The aims of the study were to investigate the extent to which staff were networked, how collaboratively they practiced and supported student learning, and to elicit the organisation's strengths and opportunities regarding IPP and learning. Six data sets were collected: pre- and post-test readiness for interprofessional learning surveys, Social Network survey, organisational readiness for IPP and learning checklist, "talking wall" role clarification activity, and observations of participants working through a clinical case study. Participants (n = 19) were well-networked and demonstrated a patient-centred approach. Results confirmed participants' positive attitudes to IPP and learning and identified ways to strengthen the organisation's interprofessional capability. This mixed-method approach could assist others to investigate IPP and learning.
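
    The "extent to which staff were networked" is the kind of question the social network survey addresses; a minimal sketch of such an analysis, with invented staff roles and ties, might look like this.

```python
# A hedged sketch of a basic social network analysis: network density (share
# of possible ties present) and degree centrality (who is most connected).
# The roles and ties below are hypothetical, not the study's data.
import networkx as nx

ties = [("GP", "Psychologist"), ("GP", "Social worker"),
        ("Psychologist", "Social worker"), ("Nurse", "Psychologist"),
        ("Nurse", "Youth worker")]
G = nx.Graph(ties)

print(f"density: {nx.density(G):.2f}")
print(nx.degree_centrality(G))
```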

  20. Intrinsic ethics regarding integrated assessment models for climate management.

    PubMed

    Schienke, Erich W; Baum, Seth D; Tuana, Nancy; Davis, Kenneth J; Keller, Klaus

    2011-09-01

    In this essay we develop and argue for the adoption of a more comprehensive model of research ethics than is included within current conceptions of responsible conduct of research (RCR). We argue that our model, which we label the ethical dimensions of scientific research (EDSR), is a more comprehensive approach to encouraging ethically responsible scientific research compared to the currently typically adopted approach in RCR training. This essay focuses on developing a pedagogical approach that enables scientists to better understand and appreciate one important component of this model, what we call intrinsic ethics. Intrinsic ethical issues arise when values and ethical assumptions are embedded within scientific findings and analytical methods. Through a close examination of a case study and its application in teaching, namely, evaluation of climate change integrated assessment models, this paper develops a method and case for including intrinsic ethics within research ethics training to provide scientists with a comprehensive understanding and appreciation of the critical role of values and ethical choices in the production of research outcomes.

  1. A case study for a psychographic-behavioral segmentation approach for targeted demand generation in voluntary medical male circumcision.

    PubMed

    Sgaier, Sema K; Eletskaya, Maria; Engl, Elisabeth; Mugurungi, Owen; Tambatamba, Bushimbwa; Ncube, Gertrude; Xaba, Sinokuthemba; Nanga, Alice; Gogolina, Svetlana; Odawo, Patrick; Gumede-Moyo, Sehlulekile; Kretschmer, Steve

    2017-09-13

    Public health programs are starting to recognize the need to move beyond a one-size-fits-all approach in demand generation and instead tailor interventions to the heterogeneity underlying human decision making. Currently, however, there is a lack of methods to enable such targeting. We describe a novel hybrid behavioral-psychographic segmentation approach to segment stakeholders on potential barriers to a target behavior. We then apply the method in a case study of demand generation for voluntary medical male circumcision (VMMC) among 15- to 29-year-old males in Zambia and Zimbabwe. Canonical correlations and hierarchical clustering techniques were applied to representative samples of men in each country, who were differentiated by their underlying reasons for their propensity to get circumcised. We characterized six distinct segments of men in Zimbabwe and seven segments in Zambia according to their needs, perceptions, attitudes and behaviors towards VMMC, thus highlighting distinct reasons for a failure to engage in the desired behavior.
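
    A hedged sketch of the segmentation step follows: hierarchical (Ward) clustering of respondents on standardized survey items, cut into a fixed number of segments. The study additionally used canonical correlations to relate the feature sets; the data here are random stand-ins for survey responses.

```python
# A minimal sketch of segment extraction via hierarchical clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # stand-in for survey responses
Xz = zscore(X, axis=0)                 # standardize each item

Z = linkage(Xz, method="ward")         # agglomerative clustering tree
segments = fcluster(Z, t=6, criterion="maxclust")  # cut into 6 segments
print(np.bincount(segments)[1:])       # respondents per segment
```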

  2. A case study for a psychographic-behavioral segmentation approach for targeted demand generation in voluntary medical male circumcision

    PubMed Central

    Eletskaya, Maria; Engl, Elisabeth; Mugurungi, Owen; Tambatamba, Bushimbwa; Ncube, Gertrude; Xaba, Sinokuthemba; Nanga, Alice; Gogolina, Svetlana; Odawo, Patrick; Gumede-Moyo, Sehlulekile; Kretschmer, Steve

    2017-01-01

    Public health programs are starting to recognize the need to move beyond a one-size-fits-all approach in demand generation and instead tailor interventions to the heterogeneity underlying human decision making. Currently, however, there is a lack of methods to enable such targeting. We describe a novel hybrid behavioral-psychographic segmentation approach to segment stakeholders on potential barriers to a target behavior. We then apply the method in a case study of demand generation for voluntary medical male circumcision (VMMC) among 15- to 29-year-old males in Zambia and Zimbabwe. Canonical correlations and hierarchical clustering techniques were applied to representative samples of men in each country, who were differentiated by their underlying reasons for their propensity to get circumcised. We characterized six distinct segments of men in Zimbabwe, and seven segments in Zambia, according to their needs, perceptions, attitudes and behaviors towards VMMC, thus highlighting distinct reasons for a failure to engage in the desired behavior. PMID:28901285

  3. Simulation of solute transport across low-permeability barrier walls

    USGS Publications Warehouse

    Harte, P.T.; Konikow, Leonard F.; Hornberger, G.Z.

    2006-01-01

    Low-permeability, non-reactive barrier walls are often used to contain contaminants in an aquifer. Rates of solute transport through such barriers are typically many orders of magnitude slower than rates through the aquifer. Nevertheless, the success of remedial actions may be sensitive to these low rates of transport. Two numerical simulation methods for representing low-permeability barriers in a finite-difference groundwater-flow and transport model were tested. In the first method, the hydraulic properties of the barrier were represented directly on grid cells; in the second method, the intercell hydraulic-conductance values were adjusted to approximate the reduction in horizontal flow, allowing use of a coarser and computationally efficient grid. The alternative methods were tested and evaluated on the basis of hypothetical test problems and a field case involving tetrachloroethylene (PCE) contamination at a Superfund site in New Hampshire. For all cases, advective transport across the barrier was negligible, but preexisting numerical approaches to calculate dispersion yielded dispersive fluxes that were greater than expected. A transport model (MODFLOW-GWT) was modified to (1) allow different dispersive and diffusive properties to be assigned to the barrier than the adjacent aquifer and (2) more accurately calculate dispersion from concentration gradients and solute fluxes near barriers. The new approach yields reasonable and accurate concentrations for the test cases. © 2006.
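
    The second method, adjusting intercell hydraulic conductance so a thin barrier can be represented on a coarse grid, amounts to combining the aquifer and barrier portions of a cell face as resistances in series. A minimal sketch of that combination follows, with illustrative values rather than the model's actual inputs.

```python
# A hedged sketch of an effective intercell conductance across a cell face
# containing a thin low-permeability barrier (series/harmonic combination).
def intercell_conductance(k_aquifer, k_barrier, cell_width, barrier_width,
                          cell_area):
    """Effective conductance [m^2/s] across a face containing a thin barrier."""
    aquifer_path = cell_width - barrier_width
    # resistances in series: aquifer portion + barrier portion
    resistance = aquifer_path / (k_aquifer * cell_area) \
               + barrier_width / (k_barrier * cell_area)
    return 1.0 / resistance

# barrier 0.9 m wide at 1e-9 m/s, inside a 10 m cell of 1e-4 m/s aquifer
print(intercell_conductance(1e-4, 1e-9, 10.0, 0.9, cell_area=100.0))
```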

  4. Detection of bifurcations in noisy coupled systems from multiple time series

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, Mark S., E-mail: m.s.williamson@exeter.ac.uk; Lenton, Timothy M.

    We generalize a method of detecting an approaching bifurcation in a time series of a noisy system from the special case of one dynamical variable to multiple dynamical variables. For a system described by a stochastic differential equation consisting of an autonomous deterministic part with one dynamical variable and an additive white noise term, small perturbations away from the system's fixed point will decay slower the closer the system is to a bifurcation. This phenomenon is known as critical slowing down and all such systems exhibit this decay-type behaviour. However, when the deterministic part has multiple coupled dynamical variables, the possible dynamics can be much richer, exhibiting oscillatory and chaotic behaviour. In our generalization to the multi-variable case, we find additional indicators to decay rate, such as frequency of oscillation. In the case of approaching a homoclinic bifurcation, there is no change in decay rate but there is a decrease in frequency of oscillations. The expanded method therefore adds extra tools to help detect and classify approaching bifurcations given multiple time series, where the underlying dynamics are not fully known. Our generalisation also allows bifurcation detection to be applied spatially if one treats each spatial location as a new dynamical variable. One may then determine the unstable spatial mode(s). This is also something that has not been possible with the single variable method. The method is applicable to any set of time series regardless of its origin, but may be particularly useful when anticipating abrupt changes in the multi-dimensional climate system.

  5. Detection of bifurcations in noisy coupled systems from multiple time series

    NASA Astrophysics Data System (ADS)

    Williamson, Mark S.; Lenton, Timothy M.

    2015-03-01

    We generalize a method of detecting an approaching bifurcation in a time series of a noisy system from the special case of one dynamical variable to multiple dynamical variables. For a system described by a stochastic differential equation consisting of an autonomous deterministic part with one dynamical variable and an additive white noise term, small perturbations away from the system's fixed point will decay slower the closer the system is to a bifurcation. This phenomenon is known as critical slowing down and all such systems exhibit this decay-type behaviour. However, when the deterministic part has multiple coupled dynamical variables, the possible dynamics can be much richer, exhibiting oscillatory and chaotic behaviour. In our generalization to the multi-variable case, we find additional indicators to decay rate, such as frequency of oscillation. In the case of approaching a homoclinic bifurcation, there is no change in decay rate but there is a decrease in frequency of oscillations. The expanded method therefore adds extra tools to help detect and classify approaching bifurcations given multiple time series, where the underlying dynamics are not fully known. Our generalisation also allows bifurcation detection to be applied spatially if one treats each spatial location as a new dynamical variable. One may then determine the unstable spatial mode(s). This is also something that has not been possible with the single variable method. The method is applicable to any set of time series regardless of its origin, but may be particularly useful when anticipating abrupt changes in the multi-dimensional climate system.
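
    A minimal sketch of the multivariate indicator described: fit a first-order vector autoregression x[t+1] ≈ A x[t] to the time series and track the eigenvalues of A. Eigenvalue moduli approaching 1 signal critical slowing down, while the complex argument gives the oscillation frequency whose decrease the authors associate with an approaching homoclinic bifurcation. The damped-oscillator data here are synthetic.

```python
# A hedged sketch: VAR(1) eigenvalues as multivariate bifurcation indicators.
import numpy as np

def var1_eigenvalues(x):
    """x: array of shape (T, d). Least-squares fit of x[t+1] = A x[t]."""
    past, future = x[:-1], x[1:]
    A, *_ = np.linalg.lstsq(past, future, rcond=None)  # maps past -> future
    return np.linalg.eigvals(A.T)

# toy damped-oscillator data with noise
rng = np.random.default_rng(1)
A_true = np.array([[0.9, -0.3], [0.3, 0.9]])
x = np.zeros((2000, 2))
for t in range(1999):
    x[t + 1] = A_true @ x[t] + 0.05 * rng.normal(size=2)

lam = var1_eigenvalues(x)
print("moduli (decay):", np.abs(lam))                 # -> approach 1 near bifurcation
print("frequencies:", np.angle(lam) / (2 * np.pi))    # oscillations per time step
```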

  6. Approaches for building community participation: A qualitative case study of Canadian food security programs.

    PubMed

    Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia

    2017-10-01

    There is increasing opportunity and support for occupational therapists to expand their scope of practice in community settings. However, evidence is needed to increase occupational therapists' knowledge, confidence, and capacity with building community participation and adopting community-centered practice roles. The purpose of this study is to improve occupational therapists' understanding of an approach to building community participation, through case study of a network of Canadian food security programs. Qualitative case study was utilized. Data were semistructured interviews, field observations, documents, and online social media. Thematic analysis was used to identify and describe four themes that relate to processes used to build community participation. The four themes were use of multiple methods, good leaders are fundamental, growing participation via social media, and leveraging outcomes. Occupational therapists can utilize an approach for building community participation that incorporates resource mobilization. Challenges of sustainability and social exclusion must be addressed.

  7. A comparison of email versus letter threat contacts toward members of the United States Congress.

    PubMed

    Schoeneman-Morris, Katherine A; Scalora, Mario J; Chang, Grace H; Zimmerman, William J; Garner, Yancey

    2007-09-01

    To better understand inappropriate correspondence sent to public officials, 301 letter cases and 99 email cases were randomly selected from the United States Capitol Police investigative case files and compared. Results indicate that letter writers were significantly more likely than emailers to exhibit indicators of serious mental illness (SMI), engage in target dispersion, use multiple methods of contact, and make a problematic approach toward their target. Emailers were significantly more likely than letter writers to focus on government concerns, use obscene language, and display disorganization in their writing. Also, letter writers tended to be significantly older, have more criminal history, and write longer communications. A multivariate model found that disorganization, SMI symptoms, problematic physical approach, and target dispersion significantly differentiated between the correspondence groups. The group differences illuminated by this study reveal that letter writers are engaging in behavior that is higher risk for problematic approach than are emailers.

  8. Angiomyolipoma of the Liver: A Rare Benign Tumor Treated with a Laparoscopic Approach for the First Time.

    PubMed Central

    DAMASKOS, CHRISTOS; GARMPIS, NIKOLAOS; ANNA, GARMPI; NONNI, AFRODITI; SAKELLARIOU, STRATIGOULA; MARGONIS, GEORGIOS-ANTONIOS; SPARTALIS, ELEFTHERIOS; SCHIZAS, DIMITRIOS; ANDREATOS, NIKOLAOS; MAGKOUTI, ELENI; GRIVAS, ALEXANDROS; KONTZOGLOU, KONSTANTINOS; ANTONIOU, A. EFSTATHIOS

    2017-01-01

    Background/Aim: Epithelioid angiomyolipoma of the liver is a rare benign mesenchymal tumor that usually presents in adult female patients. It most frequently occurs in the kidney, with the liver being the second most common site of involvement. Angiomyolipoma belongs to a family of tumors arising from perivascular epithelioid cells but in rare cases may also have cystic features. We report our experience via the first case of hepatic angiomyolipoma treated by a laparoscopic approach. Patients and Methods: We present the case of a 50-year-old female patient complaining of abdominal pain. Abdominal ultrasound (US) and Magnetic Resonance Imaging (MRI) revealed a 5 × 3 cm mass located in the left liver lobe. The tumor was resected with a laparoscopic approach. Microscopic examination of the tumor revealed hepatic angiomyolipoma. Results: Twenty-seven months postoperatively, the patient remains fit and healthy. Conclusion: Angiomyolipoma can be removed by laparoscopy. PMID:29102941

  9. Application of the Multicontextual Approach in Promoting Learning and Transfer of Strategy Use in an Individual with TBI and Executive Dysfunction.

    PubMed

    Toglia, Joan; Goverover, Yael; Johnston, Mark V; Dain, Barry

    2011-01-01

    The multicontext approach addresses strategy use and self-monitoring skills within activities and contexts that are systematically varied to facilitate transfer of learning. This article illustrates the application of the multicontext approach by presenting a case study of an adult who is 5 years post-traumatic brain injury with executive dysfunction and limited awareness. A single case study design with repeated pre-post measures was used. Methods to monitor strategy generation and specific awareness within intervention are described. Findings suggest improved functional performance and generalization of use of an external strategy despite absence of changes in general self-awareness of deficits. This case describes the multicontext intervention process and provides clinical suggestions for working with individuals with serious deficits in awareness and executive dysfunction following traumatic brain injury. Copyright 2011, SLACK Incorporated.

  10. An evaluation of the effectiveness of a case-specific approach to challenging behaviour associated with dementia.

    PubMed

    Bird, Michael; Llewellyn-Jones, Robert H; Korten, Ailsa

    2009-01-01

    Treatment of challenging behaviour in dementia using standardized psychopharmacological or psychosocial approaches remains problematical. A case-specific approach was trialled in this study, based on extensive evidence that each case is different in aetiology, the effects of the behaviour on others and what interventions are possible given the available resources. Forty-four consecutive referrals for challenging behaviour (two-thirds in residential care) were assessed across multiple causal domains. Both assessment and development of interventions were undertaken in collaboration with family carers and care staff. Measures of behaviour and associated carer distress, as well as medication and service use, were taken pre-intervention and at 2- and 5-month follow-ups. Psychotropic medication was used with a minority of participants but, overall, antipsychotic use was reduced. Psychosocial methods predominated, with 77% of cases judged as mainly or entirely psychosocial by an expert panel. There were significant mean improvements in behaviour and carer distress. Using conservative criteria there was a 65.9% clinical success rate. Results confirm those of other studies which have used multifaceted interventions tailored to the unique needs of each case. They compare favourably with results from trials of standardized psycho-pharmacological or psychosocial approaches. More trials are needed, necessarily involving further development of robust methodologies which reflect the case-specific nature of challenging behaviour associated with dementia.

  11. Population viability analysis with species occurrence data from museum collections.

    PubMed

    Skarpaas, Olav; Stabbetorp, Odd E

    2011-06-01

    The most comprehensive data on many species come from scientific collections. Thus, we developed a method of population viability analysis (PVA) in which this type of occurrence data can be used. In contrast to classical PVA, our approach accounts for the inherent observation error in occurrence data and allows the estimation of the population parameters needed for viability analysis. We tested the sensitivity of the approach to spatial resolution of the data, length of the time series, sampling effort, and detection probability with simulated data and conducted PVAs for common, rare, and threatened species. We compared the results of these PVAs with results of standard method PVAs in which observation error is ignored. Our method provided realistic estimates of population growth terms and quasi-extinction risk in cases in which the standard method without observation error could not. For low values of any of the sampling variables we tested, precision decreased, and in some cases biased estimates resulted. The results of our PVAs with the example species were consistent with information in the literature on these species. Our approach may facilitate PVA for a wide range of species of conservation concern for which demographic data are lacking but occurrence data are readily available. ©2011 Society for Conservation Biology.
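
    Once growth parameters have been estimated from occurrence data, the viability analysis itself reduces to simulating stochastic growth and counting threshold crossings. A sketch of that downstream step follows, with invented parameter values; it is not the paper's estimator, which additionally models observation error.

```python
# A hedged sketch of quasi-extinction risk from assumed growth parameters.
import numpy as np

def quasi_extinction_risk(n0, mu, sigma, threshold, years, n_sims=10_000, seed=0):
    rng = np.random.default_rng(seed)
    log_n = np.full(n_sims, np.log(n0), dtype=float)
    hit = np.zeros(n_sims, dtype=bool)
    for _ in range(years):
        log_n += rng.normal(mu, sigma, size=n_sims)  # stochastic log growth
        hit |= log_n <= np.log(threshold)            # crossed the threshold?
    return hit.mean()

# 100 individuals, mean log growth -0.02, sd 0.15, threshold 10, 50-year horizon
print(quasi_extinction_risk(100, -0.02, 0.15, 10, 50))
```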

  12. Case-based Long-term Professional Development of Science Teachers

    NASA Astrophysics Data System (ADS)

    Dori, Yehudit J.; Herscovitz, Orit

    2005-10-01

    Reform efforts are often unsuccessful because they fail to recognize that teachers play a key role in making educational reforms successful. This paper describes a long-term teacher professional development (PD) program aimed at educating and training teachers to teach interdisciplinary topics using the case-based method in science. The research objective was to identify, follow and document the processes that science teachers went through as they assimilated the interdisciplinary, case-based science teaching approach. The research accompanied the PD program throughout its 3-year period. About 50 teachers who took part in the PD program were exposed to an interdisciplinary case-based teaching method. The research instruments included teacher portfolios, which contained projects and reflection questionnaires, classroom observations, teacher interviews, and student feedback questionnaires. The portfolios contained the projects that the teachers had carried out during the PD program, which included case studies and accompanying student activities. We found that the teachers gradually moved from exposure to new teaching methods and subject matter, through active learning and preparing case-based team projects, to interdisciplinary, active classroom teaching using the case studies they developed.

  13. Combining the Cutting and Mulliken methods for primary repair of the bilateral cleft lip nose.

    PubMed

    Morovic, Carmen Gloria; Cutting, Court

    2005-11-01

    Since 1990, primary bilateral cleft nasal reconstruction has been focused on placing the lower lateral cartilages into normal anatomical position. Of the four major techniques in this class, the Cutting (i.e., retrograde) method and the Mulliken method have been most successful. The retrograde method makes no external nasal incisions, but requires either preoperative or postoperative nasal molding to achieve maximum benefit. Mulliken's technique does not require molding, but leaves the footplates of the medial crura in the depression above the projecting premaxilla associated with the diminutive anterior nasal spine. Leaving the footplates in place also prevents adequate approximation of the alar bases. In this article, the two methods are combined to achieve the benefits of both. We report our experience with the retrograde nasal approach associated with marginal rim incisions (Mulliken method) in a series of 25 consecutive bilateral cleft lip cases, performed simultaneously with lip repair. We performed a retrograde approach through membranous septum incisions, elevating a prolabial-columellar flap. To facilitate alar cartilage manipulation, we added bilateral marginal rim incisions. Nasal width, columella length and width, tip projection, and nasolabial angle were analyzed a minimum of 2 years after surgery. These were compared with a normal, age-matched control group. We also examined nostril symmetry and marginal nostril scars. Columellar length was not statistically significantly different from that of the control group (p = 0.122442). Nasal width, columellar width, tip projection, and nasolabial angle were all significantly greater in the cleft group than in the control group (p < 0.001). No hypertrophied scars were found associated with the marginal rim scar. Adding the Mulliken approach allows alar cartilage manipulation to be performed more easily than when using the retrograde approach alone. Tip projection and alar base narrowing are facilitated using the combined technique rather than the Mulliken approach alone. Prolabial flap manipulation is safe using this combined approach, even in cases with a severely projected premaxilla. We believe that the combined approach is safe and yields better long-term results than either technique alone.

  14. Ground target recognition using rectangle estimation.

    PubMed

    Grönwall, Christina; Gustafsson, Fredrik; Millnert, Mille

    2006-11-01

    We propose a ground target recognition method based on 3-D laser radar data. The method handles general 3-D scattered data. It is based on the fact that man-made objects of complex shape can be decomposed into a set of rectangles. The ground target recognition method consists of four steps: 3-D size and orientation estimation, target segmentation into parts of approximately rectangular shape, identification of segments that represent the target's functional/main parts, and target matching with CAD models. The core of this approach is rectangle estimation. The performance of the rectangle estimation method is evaluated statistically using Monte Carlo simulations. A case study on tank recognition is shown, where 3-D data from four fundamentally different types of laser radar systems are used. Although the approach is tested on rather few examples, we believe that it is promising.
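
    The core step, rectangle estimation, can be illustrated in two dimensions with a PCA-style oriented bounding rectangle; the paper's estimator for 3-D laser radar data, evaluated by Monte Carlo simulation, is more elaborate.

```python
# A hedged 2-D sketch of rectangle estimation via principal axes.
import numpy as np

def fit_rectangle(points):
    """points: (N, 2). Returns center, axis directions, and side lengths."""
    center = points.mean(axis=0)
    centered = points - center
    # principal axes give the rectangle orientation
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    proj = centered @ vt.T                     # coordinates in rectangle frame
    lengths = proj.max(axis=0) - proj.min(axis=0)
    return center, vt, lengths

rng = np.random.default_rng(2)
# noisy points filling a 4 x 2 rectangle rotated by 30 degrees
u = rng.uniform(-2, 2, 500); v = rng.uniform(-1, 1, 500)
theta = np.deg2rad(30)
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
pts = (np.c_[u, v] @ R.T) + 0.02 * rng.normal(size=(500, 2))

center, axes, lengths = fit_rectangle(pts)
print("estimated side lengths:", lengths)      # ~ [4, 2]
```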

  15. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  16. A new method for the automatic retrieval of medical cases based on the RadLex ontology.

    PubMed

    Spanier, A B; Cohen, D; Joskowicz, L

    2017-03-01

    The goal of medical case-based image retrieval (M-CBIR) is to assist radiologists in the clinical decision-making process by finding medical cases in large archives that most resemble a given case. Cases are described by radiology reports composed of radiological images and textual information on the anatomy and pathology findings. The textual information, when available in standardized terminology, e.g., the RadLex ontology, and used in conjunction with the radiological images, provides a substantial advantage for M-CBIR systems. We present a new method for incorporating textual radiological findings from medical case reports in M-CBIR. The input is a database of medical cases, a query case, and the number of desired relevant cases. The output is an ordered list of the most relevant cases in the database. The method is based on a new case formulation: the Augmented RadLex Graph and an Anatomy-Pathology List. It uses a new case relatedness metric [Formula: see text] that prioritizes more specific medical terms in the RadLex tree over less specific ones and that incorporates the length of the query case. An experimental study on 8 CT queries from the 2015 VISCERAL 3D Case Retrieval Challenge database consisting of 1497 volumetric CT scans shows that our method has accuracy rates of 82% and 70% on the first 10 and 30 most relevant cases, respectively, thereby outperforming six other methods. The increasing amount of medical imaging data acquired in clinical practice constitutes a vast database of untapped diagnostically relevant information. This paper presents a new hybrid approach to retrieving the most relevant medical cases based on textual and image information.
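
    The flavor of a relatedness metric that prioritizes more specific terms over less specific ones can be sketched with a depth-weighted term overlap, normalized by query length. The ontology fragment and weighting below are invented stand-ins for the paper's Augmented RadLex Graph and its actual metric.

```python
# A hedged sketch: deeper (more specific) shared terms contribute more.
PARENT = {  # hypothetical fragment of an ontology tree: term -> parent
    "lesion": None, "mass": "lesion", "cyst": "mass",
    "organ": None, "liver": "organ",
}

def depth(term):
    d = 0
    while PARENT[term] is not None:
        term, d = PARENT[term], d + 1
    return d

def relatedness(query_terms, case_terms):
    shared = set(query_terms) & set(case_terms)
    # weight shared terms by depth; normalize by query size to account
    # for the length of the query case
    return sum(1 + depth(t) for t in shared) / len(query_terms)

print(relatedness({"cyst", "liver"}, {"cyst", "liver", "mass"}))  # specific match
print(relatedness({"lesion", "organ"}, {"lesion", "organ"}))      # generic match
```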

  17. Challenges and Rewards on the Road to Translational Systems Biology in Acute Illness: Four Case Reports from Interdisciplinary Teams

    PubMed Central

    An, Gary; Hunt, C. Anthony; Clermont, Gilles; Neugebauer, Edmund; Vodovotz, Yoram

    2007-01-01

    Introduction Translational systems biology approaches can be distinguished from mainstream systems biology in that their goal is to drive novel therapies and streamline clinical trials in critical illness. One systems biology approach, dynamic mathematical modeling (DMM), is increasingly used in dealing with the complexity of the inflammatory response and organ dysfunction. The use of DMM often requires a broadening of research methods and a multidisciplinary team approach that includes bioscientists, mathematicians, engineers, and computer scientists. However, the development of these groups must overcome domain-specific barriers to communication and understanding. Methods We present four case studies of successful translational, interdisciplinary systems biology efforts, which differ by organizational level from an individual to an entire research community. Results Case 1 is a single investigator involved in DMM of the acute inflammatory response at Cook County Hospital, in which extensive translational progress was made using agent-based models of inflammation and organ damage. Case 2 is a community-level effort from the University of Witten-Herdecke in Cologne, whose efforts have led to the formation of the Society for Complexity in Acute Illness (SCAI). Case 3 is an institution-based group, the Biosystems Group at the University of California, San Francisco, whose work has included a focus on a common lexicon for DMM. Case 4 is an institution-based, trans-disciplinary research group (the Center for Inflammation and Regenerative Modeling at the University of Pittsburgh), whose modeling work has led to internal education efforts, grant support, and commercialization. Conclusion A transdisciplinary approach, which involves team interaction in an iterative fashion to address ambiguity and is supported by educational initiatives, is likely to be necessary for DMM in acute illness. Community-wide organizations such as SCAI must strive to facilitate the implementation of DMM in sepsis/trauma research across the research community as a whole. PMID:17548029

  18. Remote sensing for site characterization

    USGS Publications Warehouse

    Kuehn, Friedrich; King, Trude V.; Hoerig, Bernhard; Peters, Douglas C.; Kuehn, Friedrich; King, Trude V.; Hoerig, Bernhard; Peters, Douglas C.

    2000-01-01

    This volume, Remote Sensing for Site Characterization, describes the feasibility of aircraft- and satellite-based methods of revealing environmental-geological problems. It maintains a balance between explanation of the methodological and technical side and presentation of case studies. The comparison of case studies from North America and Germany shows how the respective territorial conditions lead to distinct methodological approaches.

  19. Implementing the Health Promoting School in Denmark: A Case Study

    ERIC Educational Resources Information Center

    Nordin, Lone Lindegaard

    2016-01-01

    Purpose: The purpose of this paper is to provide insight into teachers' practice in implementing school-based health promotion. Design/methodology/approach: This qualitative research was designed as a multiple case study. The study involved five schools, 233 pupils in the age 12-16 and 23 teachers. The primary data generation method were focus…

  20. Containing Pedagogical Complexity through the Assignment of Photography: Two Case Presentations

    ERIC Educational Resources Information Center

    Garrett, H. James; Matthews, Sara

    2014-01-01

    This article investigates the use of photography as a narrative approach to learning in the context of postsecondary education. Two cases are presented: a social studies methods course in a teacher education program in the South of the United States; and a senior undergraduate seminar on global violence at a university in southern Ontario, Canada.…

  1. An Evaluation of the Influence of Case Method Instruction on the Reflective Thinking of MSW Students

    ERIC Educational Resources Information Center

    Milner, Marleen

    2009-01-01

    Social work practice requires that graduates be prepared to deal with complex, multifaceted problems which cannot be defined completely, do not have absolute, correct answers and can be approached from multiple perspectives. This study evaluated the influence of case-based instruction on MSW students' reflective judgment, an aspect of critical…

  2. Project-Based Learning in Education: Integrating Business Needs and Student Learning

    ERIC Educational Resources Information Center

    Cho, Yonjoo; Brown, Catherine

    2013-01-01

    Purpose: The purpose of this case study was to investigate how project-based learning (PBL) is being practiced in Columbus Signature Academy (CSA), a high school located in Columbus, Indiana, USA. Design/methodology/approach: The authors used the case study method to provide qualitative details about CSA's use of PBL that is being practiced in a…

  3. Technology and Multiculturalism in the Classroom: Case Studies in Attitudes and Motivations.

    ERIC Educational Resources Information Center

    Chisholm, Ines Marquez; Wetzel, Keith

    2001-01-01

    Uses a case study approach in examining the attitudes and motivations of five teacher educators who used technology in their classroom. Identifies and discusses six common elements of multicultural technology integration and concludes with a general discussion on the need to combine pedagogical methods with a practical vision of technology use and…

  4. Urban Principal's Perception of Instructional Coaching as Job-Embedded Professional Development: A Case Study

    ERIC Educational Resources Information Center

    Marcks, Melissa A.

    2017-01-01

    Instructional coaching is a job-embedded professional development approach that provides teachers an opportunity to build teacher expertise, raise student achievement, and advance school reform. The problem addressed in this qualitative case study was that few principals understand the process of instructional coaching as…

  5. Universities: Can They Be Considered as Learning Organizations?: A Preliminary Micro-Level Perspective

    ERIC Educational Resources Information Center

    Bak, Ozlem

    2012-01-01

    Purpose: The purpose of this study is to explore a department in a UK higher education (HE) institute based on Senge's five characteristics of learning organizations. Design/methodology/approach: In this study, a case study method was utilized. The case study entailed two lines of enquiry: a questionnaire, which was distributed to a UK…

  6. Building a Dangerous Outpost in the Green Mountain State: A Case Study of Educator Preparation Policymaking

    ERIC Educational Resources Information Center

    McGough, David J.; Bedell, Claudine; Tinkler, Barri

    2018-01-01

    Poised at a bifurcation, the educator preparation community in Vermont faced either the adoption of a generic product for the assessment of initial educator licensure candidates or the comprehensive revision of a longstanding state-based assessment portfolio. Using a case study approach and narrative methods, specifically the Narrative Policy…

  7. A "Tale of Two Cities:" A Comparative Case Study of Community Engagement and Costs in Two Levy Campaigns

    ERIC Educational Resources Information Center

    Ingle, W. Kyle; Johnson, Paul A.; Petroff, Ruth Ann

    2011-01-01

    Using Anderson's (1998) framework for authentic community engagement and Levin and McEwan's (2001) "ingredients method," this comparative case study analyzed contrasting approaches to levy campaigns undertaken by two suburban school districts and the associated costs of the campaigns. We found that District A ran a campaign that…

  8. Teaching Law and Theory through Context: Contract Clauses in Legal Studies Education

    ERIC Educational Resources Information Center

    DiMatteo, Larry A.; Anenson, T. Leigh

    2007-01-01

    Business professors in the twenty-first century have been engaging in another form of problem-based pedagogy to unite business school and business practice. This teaching methodology, called "active learning," has become the new case method in college courses. Like the case-based approach, active learning bridges the gap between theory and…

  9. Anticipated educational outcomes: a case study of the outdoor recreation consortium experience

    Treesearch

    Yasong Wang; Alan Graefe

    2008-01-01

    This paper reports on a case study of an outdoor experiential learning program and examines its meaning for program participants. The research was conducted with 56 university students who participated in the Outdoor Recreation Consortium held at the Great Smoky Mountain Institute in Tremont, TN. A mixed-method comparative research approach, using both quantitative and...

  10. Contemplating a New Model for Air Force Aerospace Medical Technician Skills Sustainment Training

    DTIC Science & Technology

    2006-03-01

    qualitative research designs. The major designs described by these researchers included: grounded theory, narrative research... phenomenological research, ethnographies, content analysis, and case study. Because each of these designs can stand alone as an individual research ...exploratory, embedded, single case study. A mixed methods research approach will be applied in an effort to discover

  11. Extreme weather exposure identification for road networks - a comparative assessment of statistical methods

    NASA Astrophysics Data System (ADS)

    Schlögl, Matthias; Laaha, Gregor

    2017-04-01

    The assessment of road infrastructure exposure to extreme weather events is of major importance for scientists and practitioners alike. In this study, we compare the different extreme value approaches and fitting methods with respect to their value for assessing the exposure of transport networks to extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series (PDS) over the standardly used annual maxima series (AMS) in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, results question the general assumption of the threshold excess approach (employing PDS) being superior to the block maxima approach (employing AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas an opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may outperform the possible gain of information from including additional extreme events by far. This effect was visible from neither the square-root criterion nor standardly used graphical diagnosis (mean residual life plot) but rather from a direct comparison of AMS and PDS in combined quantile plots. We therefore recommend performing AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, not only in cases where threshold selection and dependency introduces biases to the PDS approach but also in cases where the AMS contains non-extreme events that may introduce similar biases. For assessing the performance of extreme events we recommend the use of conditional performance measures that focus on rare events only in addition to standardly used unconditional indicators. The findings of the study directly address road and traffic management but can be transferred to a range of other environmental variables including meteorological and hydrological quantities.
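
    The AMS/PDS comparison at the heart of the study can be sketched with scipy on synthetic data: fit a GEV to annual maxima and a generalized Pareto to threshold exceedances, then compare T-year return levels. Maximum likelihood fits stand in here for the L-moment estimation the study found more robust, and the daily series is invented.

```python
# A hedged sketch: block maxima (AMS/GEV) vs. peaks over threshold (PDS/GPD).
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(3)
daily = rng.gumbel(loc=20, scale=8, size=365 * 30)     # 30 years of daily precip

# --- AMS / block maxima: GEV fit, T-year return level via the quantile
ams = daily.reshape(30, 365).max(axis=1)
c, loc, scale = genextreme.fit(ams)
T = 100
rl_ams = genextreme.ppf(1 - 1 / T, c, loc, scale)

# --- PDS / peaks over threshold: GPD fit to exceedances above the 99th pct
u = np.quantile(daily, 0.99)
exc = daily[daily > u] - u
xi, _, sigma = genpareto.fit(exc, floc=0)
lam = len(exc) / 30                                    # exceedances per year
if abs(xi) > 1e-6:
    rl_pds = u + sigma / xi * ((lam * T) ** xi - 1)
else:                                                  # exponential-tail limit
    rl_pds = u + sigma * np.log(lam * T)

print(f"100-yr return level  AMS/GEV: {rl_ams:.1f}  PDS/GPD: {rl_pds:.1f}")
```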

  12. A simple method to equalize the workload when operating several small wastewater treatment plants: a case study.

    PubMed

    De Feo, G; De Gisi, S; Galasso, M

    2013-01-01

    The aim of the present study is to define a simple (and easy to use) method to equalize the workload of personnel operating several small wastewater treatment plants (SWWTPs). The approach is illustrated through a case study which is the result of collaboration between researchers and a water and wastewater management company operating in Southern Italy. The topic is important since personnel have a significant impact on the operating costs of SWWTPs, and the approach outlined results in the minimum number of staff being required to assure the management of the service. Four kinds of work units are considered: plant managers, assistant plant managers, laboratory technicians and executives. In order to develop a practical, feasible and easy to use method, the workload was evaluated considering only the population equivalent (PE) and the number of plants managed. The core of the method is the evaluation of the percentage of time that the personnel units devote to the operation of SWWTPs of the municipality considered. The proposed procedure offers a useful tool to equalize the workload, both in terms of PE and the number of plants managed, the procedure being easily modifiable to introduce other evaluation criteria. By using familiar concepts such as PE and number of plants managed, the approach of the method can easily be understood by management. It can also be readily adapted to other similar situations.
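
    The balancing idea, scoring each operator's load by a weighted share of population equivalent (PE) and of plant count, can be sketched with a greedy assignment; the weights and plant data below are illustrative, not the paper's.

```python
# A hedged sketch of workload equalization across plant managers.
PLANTS = {"A": 4000, "B": 1500, "C": 2500, "D": 800, "E": 3200}  # plant -> PE
N_MANAGERS = 2
W_PE, W_COUNT = 0.7, 0.3    # assumed relative weight of PE vs. number of plants

total_pe = sum(PLANTS.values())

def workload(assigned):
    """Weighted share of total PE plus share of total plant count."""
    pe = sum(PLANTS[p] for p in assigned)
    return W_PE * pe / total_pe + W_COUNT * len(assigned) / len(PLANTS)

managers = [[] for _ in range(N_MANAGERS)]
# largest plants first, always assigned to the currently least-loaded manager
for plant in sorted(PLANTS, key=PLANTS.get, reverse=True):
    min(managers, key=workload).append(plant)

for i, m in enumerate(managers):
    print(f"manager {i}: {m}, workload {workload(m):.2f}")
```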

  13. Microvascular Decompression for Classical Trigeminal Neuralgia Caused by Venous Compression: Novel Anatomic Classifications and Surgical Strategy.

    PubMed

    Wu, Min; Fu, Xianming; Ji, Ying; Ding, Wanhai; Deng, Dali; Wang, Yehan; Jiang, Xiaofeng; Niu, Chaoshi

    2018-05-01

    Microvascular decompression of the trigeminal nerve is the most effective treatment for trigeminal neuralgia. However, when encountering classical trigeminal neuralgia caused by venous compression, the procedure becomes much more difficult, and failure or recurrence because of incomplete decompression may become frequent. This study aimed to investigate the anatomic variation of the culprit veins and discuss the surgical strategy for different types. We performed a retrospective analysis of 64 consecutive cases in which veins were considered the responsible vessels, alone or combined with other adjacent arteries. The study classified culprit veins according to operative anatomy and designed personalized approaches and decompression management for the different forms of compressive veins. Curative effects were assessed by the Barrow Neurological Institute (BNI) pain intensity score and BNI facial numbness score. The most commonly encountered veins were the superior petrosal venous complex (SPVC), which was artificially divided into 4 types according to both venous tributary distribution and emptying-point site. We considered these factors together and selected an approach to expose the trigeminal root entry zone, either the suprafloccular transhorizontal fissure approach or the infratentorial supracerebellar approach. The methods of decompression consist of interposing and transposing with Teflon, sometimes with the aid of medical adhesive. Nerve combing (NC) of the trigeminal root was conducted in situations of extremely difficult neurovascular compression, instead of sacrificing veins. Pain completely disappeared in 51 patients, for an excellent outcome rate of 79.7%. Thirteen patients obtained pain relief after reoperation. Postoperative complications included 10 cases of facial numbness, 1 case of intracranial infection, and 1 case of high-frequency hearing loss. Accurate recognition of the anatomic variation of the SPVC is crucial for the management of classical trigeminal neuralgia caused by venous compression. Selecting an appropriate approach and using reasonable decompression methods can bring complete postoperative pain relief in most cases. NC can be an alternative for extremely difficult cases, but it leads to facial numbness more frequently. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. Bias correction of risk estimates in vaccine safety studies with rare adverse events using a self-controlled case series design.

    PubMed

    Zeng, Chan; Newcomer, Sophia R; Glanz, Jason M; Shoup, Jo Ann; Daley, Matthew F; Hambidge, Simon J; Xu, Stanley

    2013-12-15

    The self-controlled case series (SCCS) method is often used to examine the temporal association between vaccination and adverse events using only data from patients who experienced such events. Conditional Poisson regression models are used to estimate incidence rate ratios, and these models perform well with large or medium-sized case samples. However, in some vaccine safety studies, the adverse events studied are rare and the maximum likelihood estimates may be biased. Several bias correction methods have been examined in case-control studies using conditional logistic regression, but none of these methods have been evaluated in studies using the SCCS design. In this study, we used simulations to evaluate 2 bias correction approaches-the Firth penalized maximum likelihood method and Cordeiro and McCullagh's bias reduction after maximum likelihood estimation-with small sample sizes in studies using the SCCS design. The simulations showed that the bias under the SCCS design with a small number of cases can be large and is also sensitive to a short risk period. The Firth correction method provides finite and less biased estimates than the maximum likelihood method and Cordeiro and McCullagh's method. However, limitations still exist when the risk period in the SCCS design is short relative to the entire observation period.
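
    The Firth correction can be sketched on a plain (unconditional) Poisson log-linear model: for a canonical link, penalizing the likelihood by half the log-determinant of the information matrix is equivalent to adding half the hat-matrix leverages to the observed counts in the score equation. The paper applies the correction within the conditional Poisson model of the SCCS design; this simplified version, with invented data, is only meant to show the mechanics.

```python
# A hedged sketch of Firth-penalized Poisson regression via Newton iterations.
import numpy as np

def firth_poisson(X, y, n_iter=50):
    """Firth-adjusted score: X'(y + h/2 - mu) = 0, h = weighted leverages."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        W = mu                                   # Poisson variance = mean
        XtWX = X.T @ (X * W[:, None])
        # leverages of the weighted hat matrix: h_i = w_i x_i' (X'WX)^-1 x_i
        h = np.einsum("ij,jk,ik->i", X * W[:, None], np.linalg.inv(XtWX), X)
        score = X.T @ (y + 0.5 * h - mu)         # Firth-adjusted score
        beta += np.linalg.solve(XtWX, score)
    return beta

rng = np.random.default_rng(4)
X = np.c_[np.ones(40), rng.integers(0, 2, 40)]   # intercept + risk-period flag
y = rng.poisson(np.exp(0.5 + 1.0 * X[:, 1]))     # event counts
print(firth_poisson(X, y))                       # finite even with sparse data
```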

  15. Discover binding pathways using the sliding binding-box docking approach: application to binding pathways of oseltamivir to avian influenza H5N1 neuraminidase

    NASA Astrophysics Data System (ADS)

    Tran, Diem-Trang T.; Le, Ly T.; Truong, Thanh N.

    2013-08-01

    Drug binding and unbinding are transient processes that are hard to observe experimentally and difficult to analyze with computational techniques. In this paper, we employed a cost-effective method called "pathway docking," in which molecular docking is used to screen the ligand-receptor binding free energy surface to reveal possible paths by which a ligand approaches the protein binding pocket. As a case study, the method was applied to oseltamivir, the key drug against influenza A virus. The equilibrium pathways identified by this method are similar to those identified in prior studies that used far more expensive computational approaches.
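
    A toy of the sliding-box idea: move a scoring box across a precomputed energy surface and record the best pose inside each box, tracing a low-energy approach path toward the pocket. Real pathway docking scores ligand poses with a docking engine; the Gaussian "funnel" below is an invented stand-in surface.

```python
# A hedged toy of sliding-box screening over a synthetic energy grid.
import numpy as np

# synthetic 2-D free-energy surface with a binding funnel at (60, 40)
x, y = np.meshgrid(np.arange(100), np.arange(100), indexing="ij")
energy = -8 * np.exp(-((x - 60) ** 2 + (y - 40) ** 2) / 200.0)

box = 10                                   # box half-width (grid units)
path = []
for cx in range(10, 91, 5):                # slide the box along the x axis
    window = energy[cx - box:cx + box, :]
    iy = np.unravel_index(np.argmin(window), window.shape)[1]
    path.append((cx, iy, window.min()))    # best pose inside this box

for cx, cy, e in path:
    print(f"box at x={cx:2d}: best y={cy:2d}, E={e:6.2f}")
```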

  16. Data-driven approaches in the investigation of social perception

    PubMed Central

    Adolphs, Ralph; Nummenmaa, Lauri; Todorov, Alexander; Haxby, James V.

    2016-01-01

    The complexity of social perception poses a challenge to traditional approaches to understand its psychological and neurobiological underpinnings. Data-driven methods are particularly well suited to tackling the often high-dimensional nature of stimulus spaces and of neural representations that characterize social perception. Such methods are more exploratory, capitalize on rich and large datasets, and attempt to discover patterns often without strict hypothesis testing. We present four case studies here: behavioural studies on face judgements, two neuroimaging studies of movies, and eyetracking studies in autism. We conclude with suggestions for particular topics that seem ripe for data-driven approaches, as well as caveats and limitations. PMID:27069045

  17. Random Versus Nonrandom Peer Review: A Case for More Meaningful Peer Review.

    PubMed

    Itri, Jason N; Donithan, Adam; Patel, Sohil H

    2018-05-10

    Random peer review programs are not optimized to discover cases with diagnostic error and thus have inherent limitations with respect to educational and quality improvement value. Nonrandom peer review offers an alternative approach in which diagnostic error cases are targeted for collection during routine clinical practice. The objective of this study was to compare error cases identified through random and nonrandom peer review approaches at an academic center. During the 1-year study period, the number of discrepancy cases and the discrepancy scores were determined for each approach. The nonrandom peer review process collected 190 cases, of which 60 were scored as 2 (minor discrepancy), 94 as 3 (significant discrepancy), and 36 as 4 (major discrepancy). In the random peer review process, 1,690 cases were reviewed, of which 1,646 were scored as 1 (no discrepancy), 44 were scored as 2 (minor discrepancy), and none were scored as 3 or 4. Several teaching lessons and quality improvement measures were developed as a result of analysis of error cases collected through the nonrandom peer review process. Our experience supports the implementation of nonrandom peer review as a replacement for random peer review, with nonrandom peer review serving as a more effective method for collecting diagnostic error cases with educational and quality improvement value. Copyright © 2018 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  18. Proposing a New Framework and an Innovative Approach to Teaching Reengineering and ERP Implementation Concepts

    ERIC Educational Resources Information Center

    Pellerin, Robert; Hadaya, Pierre

    2008-01-01

    Recognizing the need to teach ERP implementation and business process reengineering (BPR) concepts simultaneously, as well as the pedagogical limitations of the case teaching method and simulation tools, the objective of this study is to propose a new framework and an innovative teaching approach to improve the ERP training experience for IS…

  19. The Application of a Resilience Assessment Approach to Promote Campus Environmental Management: A South African Case Study

    ERIC Educational Resources Information Center

    Muller, Irene; Tempelhoff, Johann

    2016-01-01

    Purpose: This paper aims to outline the benefits of using resilience assessment instead of command and control mechanisms to evaluate sustainable campus environments. Design/Methodology/Approach: An exploratory mixed-method design was followed for the purposes of the project. During the first qualitative phase, a historical timeline of the focal…

  20. Open Experimentation on Phenomena of Chemical Reactions via the Learning Company Approach in Early Secondary Chemistry Education

    ERIC Educational Resources Information Center

    Beck, Katharina; Witteck, Torsten; Eilks, Ingo

    2010-01-01

    Presented is a case study on the implementation of open and inquiry-type experimentation in early German secondary chemistry education. The teaching strategy discussed follows the learning company approach. Originally adopted from vocational education, the learning company method is used to redirect lab-oriented classroom practice towards a more…

  1. "Academic Strategy: The Management Revolution in American Higher Education," by George Keller (1983) Can Strategy Work in Higher Education?

    ERIC Educational Resources Information Center

    Temple, Paul

    2018-01-01

    Keller's book was one of the first works to suggest strategic approaches to the management of higher education institutions. His case study method proved popular with readers. However, the limitations of his approach to strategy grew more apparent over time, although many of his insights remain valid today.

  2. A Theoretical Framework for Integrating Creativity Development into Curriculum: The Case of a Korean Engineering School

    ERIC Educational Resources Information Center

    Lim, Cheolil; Lee, Jihyun; Lee, Sunhee

    2014-01-01

    Existing approaches to developing creativity rely on the sporadic teaching of creative thinking techniques or the engagement of learners in a creativity-promoting environment. Such methods cannot develop students' creativity as fully as a multilateral approach that integrates creativity throughout a curriculum. The purpose of this study was to…

  3. Improving ethical knowledge and sensemaking from cases through elaborative interrogation and outcome valence.

    PubMed

    Johnson, James F; Bagdasarov, Zhanna; MacDougall, Alexandra E; Steele, Logan; Connelly, Shane; Devenport, Lynn D; Mumford, Michael D

    2014-01-01

    The case-based approach to learning is popular among many applied fields. However, the results of case-based education vary widely depending on case content and case presentation. This study examined two aspects of case-based education, outcome valence and case elaboration methods, in a two-day case-based Responsible Conduct of Research (RCR) ethics education program. Results suggest that outcome information is an integral part of a quality case. Furthermore, valence-consistent outcomes may have certain advantages over mixed-valence outcome information. Finally, students enjoy and excel at working with case material, and the use of elaborative interrogation techniques can significantly improve internally focused ethical sensemaking strategies associated with personal biases, constraints, and emotions.

  4. A novel quality by design approach for developing an HPLC method to analyze herbal extracts: A case study of sugar content analysis.

    PubMed

    Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu

    2018-01-01

    The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. The HPLC-ELSD method for separation and quantification of sugars in Codonopsis Radix extract (CRE) samples and Astragali Radix extract (ARE) samples was developed as an example using the novel AQbD approach. Potential CMAs and potential CMPs were identified from the Analytical Target Profile. After the screening experiments, the retention time of the D-glucose peak of CRE samples, the signal-to-noise ratio of the D-glucose peak of CRE samples, and the retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, flow rate, and column temperature were found to be CMPs using a standard partial regression coefficient method. The probability-based design space was calculated using a Monte Carlo simulation method and verified by experiments. The optimized method was validated as accurate and precise, and it was then applied in the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for analytes with complex compositions.
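
    The probability-based design-space calculation lends itself to a compact illustration. Below is a minimal sketch of the Monte Carlo step, assuming hypothetical linear CMA models, coefficient uncertainties, and acceptance limits; in the study these would come from the screening experiments and regression analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Uncertainty in the fitted coefficients of two hypothetical linear CMA
# models (retention time and signal-to-noise as functions of the CMPs).
coef_mean = np.array([12.0, -4.0, -0.05, 25.0, 10.0, 0.2])
coef_sd = np.array([0.5, 0.3, 0.01, 2.0, 1.0, 0.05])

def acceptance_probability(flow, temp, n=5000):
    """Estimate P(all CMAs meet their limits) at one operating point."""
    c = rng.normal(coef_mean, coef_sd, size=(n, 6))
    rt = c[:, 0] + c[:, 1] * flow + c[:, 2] * temp    # retention time (min)
    sn = c[:, 3] + c[:, 4] * flow + c[:, 5] * temp    # signal-to-noise ratio
    ok = (rt > 8.0) & (rt < 15.0) & (sn > 10.0)       # hypothetical CMA limits
    return ok.mean()

# Scan a grid of CMP settings; the probability-based design space is the
# region where the acceptance probability exceeds a chosen target, e.g. 0.90.
for flow in (0.6, 0.8, 1.0):                          # mL/min
    for temp in (25.0, 30.0, 35.0):                   # deg C
        print(f"flow={flow:.1f}, temp={temp:.0f}: "
              f"P(pass)={acceptance_probability(flow, temp):.2f}")
```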

  5. Design-order, non-conformal low-Mach fluid algorithms using a hybrid CVFEM/DG approach

    NASA Astrophysics Data System (ADS)

    Domino, Stefan P.

    2018-04-01

    A hybrid, design-order sliding mesh algorithm, which uses a control volume finite element method (CVFEM), in conjunction with a discontinuous Galerkin (DG) approach at non-conformal interfaces, is outlined in the context of a low-Mach fluid dynamics equation set. This novel hybrid DG approach is also demonstrated to be compatible with a classic edge-based vertex centered (EBVC) scheme. For the CVFEM, element polynomial, P, promotion is used to extend the low-order P = 1 CVFEM method to higher-order, i.e., P = 2. An equal-order low-Mach pressure-stabilized methodology, with emphasis on the non-conformal interface boundary condition, is presented. A fully implicit matrix solver approach that accounts for the full stencil connectivity across the non-conformal interface is employed. A complete suite of formal verification studies using the method of manufactured solutions (MMS) is performed to verify the order of accuracy of the underlying methodology. The chosen suite of analytical verification cases range from a simple steady diffusion system to a traveling viscous vortex across mixed-order non-conformal interfaces. Results from all verification studies demonstrate either second- or third-order spatial accuracy and, for transient solutions, second-order temporal accuracy. Significant accuracy gains in manufactured solution error norms are noted even with modest promotion of the underlying polynomial order. The paper also demonstrates the CVFEM/DG methodology on two production-like simulation cases that include an inner block subjected to solid rotation, i.e., each of the simulations include a sliding mesh, non-conformal interface. The first production case presented is a turbulent flow past a high-rate-of-rotation cube (Re, 4000; RPM, 3600) on like and mixed-order polynomial interfaces. The final simulation case is a full-scale Vestas V27 225 kW wind turbine (tower and nacelle omitted) in which a hybrid topology, low-order mesh is used. Both production simulations provide confidence in the underlying capability and demonstrate the viability of this hybrid method for deployment towards high-fidelity wind energy validation and analysis.
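
    The verification workflow generalizes beyond this solver. Below is a minimal sketch of the method of manufactured solutions for a 1-D Poisson problem with a second-order finite-difference scheme; the manufactured solution, grids, and solver are illustrative stand-ins for the paper's CVFEM/DG machinery.

```python
import numpy as np

def solve_poisson(n):
    """Second-order finite differences for -u'' = f on (0, pi), u = 0 at ends."""
    x = np.linspace(0, np.pi, n + 2)
    h = x[1] - x[0]
    f = np.sin(x[1:-1])                    # manufactured source for u = sin(x)
    A = (np.diag(2 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(x[1:-1])))   # max-norm error vs exact

# Refine the grid and confirm the observed order of accuracy (~2).
e1, e2 = solve_poisson(50), solve_poisson(100)
print("observed order:", np.log2(e1 / e2))
```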

  6. "It Was My Understanding That There Would Be No Math": Using Thematic Cases to Teach Undergraduate Research Methods

    ERIC Educational Resources Information Center

    Oldmixon, Elizabeth A.

    2018-01-01

    Undergraduates frequently approach research methods classes with trepidation and skepticism, owing in part to math-phobia and confusion over how methodology is relevant to their interests. These self-defeating barriers to learning undermine the efficacy of methods classes. This essay discusses a strategy for overcoming these barriers--use of a…

  7. Modeling dust growth in protoplanetary disks: The breakthrough case

    NASA Astrophysics Data System (ADS)

    Drążkowska, J.; Windmark, F.; Dullemond, C. P.

    2014-07-01

    Context. Dust coagulation in protoplanetary disks is one of the initial steps toward planet formation. Simple toy models are often not sufficient to cover the complexity of the coagulation process, and a number of numerical approaches are therefore used, among which integration of the Smoluchowski equation and various versions of the Monte Carlo algorithm are the most popular. Aims: Recent progress in understanding the processes involved in dust coagulation has created a need for benchmarking and comparison of various physical aspects of the coagulation process. In this paper, we directly compare the Smoluchowski and Monte Carlo approaches to show their advantages and disadvantages. Methods: We focus on the mechanism of planetesimal formation via sweep-up growth, which is a new and important aspect of current planet formation theory. We use realistic test cases that implement a distribution in dust collision velocities. This allows a single collision between two grains to have a wide range of possible outcomes but also requires very high numerical accuracy. Results: For most coagulation problems, we find general agreement between the two approaches. However, for sweep-up growth driven by the "lucky" breakthrough mechanism, the methods exhibit very different resolution dependencies. With too few mass bins, the Smoluchowski algorithm tends to overestimate the growth rate and the probability of breakthrough. The Monte Carlo method is less dependent on the number of particles with respect to the growth timescale but tends to underestimate the breakthrough chance due to its limited dynamic mass range. Conclusions: We find that the Smoluchowski approach, which is generally better for breakthrough studies, is sensitive to low mass resolution in the high-mass, low-number tail that is important in this scenario. To study low-number-density features, a new modulation function has to be introduced into the interaction probabilities. As the minimum resolution needed for breakthrough studies depends strongly on the setup, verification has to be performed on a case-by-case basis.
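
    To make the comparison concrete, here is a minimal sketch of the Smoluchowski side of the benchmark for a constant collision kernel, where the analytic decay of the total number density is known. The kernel, bin count, and time step are illustrative; a Monte Carlo counterpart would instead track a swarm of representative particles.

```python
import numpy as np

K = 1.0          # constant coagulation kernel
N0 = 1.0         # initial number density of monomers
nbins = 200      # integer mass bins k = 1..nbins
dt, tmax = 1e-3, 1.0

# Smoluchowski equation on integer mass bins, explicit Euler in time:
# dn_k/dt = (1/2) * sum_{i+j=k} K n_i n_j  -  n_k * sum_j K n_j
n = np.zeros(nbins + 1)
n[1] = N0
t = 0.0
while t < tmax:
    gain = np.zeros_like(n)
    for k in range(2, nbins + 1):
        i = np.arange(1, k)
        gain[k] = 0.5 * K * np.sum(n[i] * n[k - i])
    loss = K * n * n.sum()
    n += dt * (gain - loss)
    t += dt

# Analytic check for the constant kernel: N(t) = N0 / (1 + K*N0*t/2).
print("Smoluchowski N(t):", n.sum(), " analytic:", N0 / (1 + K * N0 * tmax / 2))
```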

  8. A generalized least squares regression approach for computing effect sizes in single-case research: application examples.

    PubMed

    Maggin, Daniel M; Swaminathan, Hariharan; Rogers, Helen J; O'Keeffe, Breda V; Sugai, George; Horner, Robert H

    2011-06-01

    A new method for deriving effect sizes from single-case designs is proposed. The strategy is applicable to small-sample time-series data with autoregressive errors. The method uses Generalized Least Squares (GLS) to model the autocorrelation of the data and estimate regression parameters to produce an effect size that represents the magnitude of treatment effect from baseline to treatment phases in standard deviation units. In this paper, the method is applied to two published examples using common single case designs (i.e., withdrawal and multiple-baseline). The results from these studies are described, and the method is compared to ten desirable criteria for single-case effect sizes. Based on the results of this application, we conclude with observations about the use of GLS as a support to visual analysis, provide recommendations for future research, and describe implications for practice. Copyright © 2011 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
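
    A minimal sketch of the GLS computation follows, for a two-phase (AB) design with AR(1) errors and the effect size taken as the phase shift in residual standard deviation units; the simulated data, known autocorrelation, and effect-size scaling are illustrative simplifications of the published procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated AB single-case data: 10 baseline and 10 treatment observations.
nA, nB, rho, sigma = 10, 10, 0.3, 1.0
e = np.zeros(nA + nB)
for t in range(1, nA + nB):
    e[t] = rho * e[t - 1] + rng.normal(0, sigma)   # AR(1) errors
y = 2.0 + 1.5 * np.r_[np.zeros(nA), np.ones(nB)] + e

# Design matrix: intercept + phase indicator.
X = np.column_stack([np.ones(nA + nB), np.r_[np.zeros(nA), np.ones(nB)]])

# AR(1) covariance (up to scale): V_ij = rho^|i-j| / (1 - rho^2).
idx = np.arange(nA + nB)
V = rho ** np.abs(idx[:, None] - idx[None, :]) / (1 - rho ** 2)

# GLS estimate: beta = (X' V^-1 X)^-1 X' V^-1 y
Vinv = np.linalg.inv(V)
beta = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)

# Effect size: phase shift in residual standard deviation units.
resid = y - X @ beta
s = np.sqrt(resid @ resid / (len(y) - X.shape[1]))
print("GLS effect size (baseline -> treatment):", beta[1] / s)
```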

  9. Application of stochastic approach based on Monte Carlo (MC) simulation for life cycle inventory (LCI) of the rare earth elements (REEs) in beneficiation rare earth waste from the gold processing: case study

    NASA Astrophysics Data System (ADS)

    Bieda, Bogusław; Grzesik, Katarzyna

    2017-11-01

    The study proposes a stochastic approach based on Monte Carlo (MC) simulation for the life cycle inventory (LCI) phase of a life cycle assessment (LCA) of rare earth element (REE) recovery from secondary materials, applied to the New Krankberg Mine in Sweden. The MC method is recognized as an important tool in science and can be considered one of the most effective approaches for quantifying uncertainties. A stochastic approach characterizes uncertainties better than a deterministic one. The uncertainty of a datum can be expressed by assigning it a probability distribution (e.g., defined through a standard deviation or variance). The data used in this study are obtained from: (i) site-specific measured or calculated data, (ii) values based on the literature, (iii) the ecoinvent process "rare earth concentrate, 70% REO, from bastnäsite, at beneficiation". Environmental emissions (e.g., particulates, uranium-238, thorium-232), energy, and REEs (La, Ce, Nd, Pr, Sm, Dy, Eu, Tb, Y, Sc, Yb, Lu, Tm, Gd) have been inventoried. The study is based on a reference case for the year 2016. Combining MC analysis with sensitivity analysis is the best solution for quantifying uncertainty in LCI/LCA. LCA results are inherently uncertain to some degree, but the MC method makes this uncertainty explicit and quantifiable.
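
    The core of the stochastic LCI approach can be sketched compactly: assign each inventory flow a probability distribution, sample, and aggregate. All flows, units, and distribution parameters below are illustrative placeholders, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical per-tonne inventory flows (lognormal is a common LCI choice).
energy = rng.lognormal(mean=np.log(120.0), sigma=0.15, size=n)   # kWh/t
particulates = rng.lognormal(np.log(0.8), 0.30, n)               # kg/t
th232 = rng.lognormal(np.log(2.0e-4), 0.40, n)                   # kBq/t

throughput = rng.normal(5_000, 250, n)                           # t/year

for name, flow in [("energy kWh", energy), ("particulates kg", particulates),
                   ("Th-232 kBq", th232)]:
    total = flow * throughput                  # annual total per MC trial
    lo, med, hi = np.percentile(total, [2.5, 50, 97.5])
    print(f"{name:16s} median {med:10.3g}  95% interval [{lo:.3g}, {hi:.3g}]")
```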

  10. The case-only test for gene-environment interaction is not uniformly powerful: an empirical example

    PubMed Central

    Wu, Chen; Chang, Jiang; Ma, Baoshan; Miao, Xiaoping; Zhou, Yifeng; Liu, Yu; Li, Yun; Wu, Tangchun; Hu, Zhibin; Shen, Hongbing; Jia, Weihua; Zeng, Yixin; Lin, Dongxin; Kraft, Peter

    2016-01-01

    The case-only test has been proposed as a more powerful approach to detect gene-environment (G×E) interactions. This approach assumes that the genetic and environmental factors are independent. While it is well known that Type I error rate will increase if this assumption is violated, it is less widely appreciated that gene-environment correlation can also lead to power loss. We illustrate this phenomenon by comparing the performance of the case-only test to other approaches to detect G×E interactions in a genome-wide association study of esophageal squamous carcinoma (ESCC) in Chinese populations. Some of these approaches do not use information on the correlation between exposure and genotype (standard logistic regression), while others seek to use this information in a robust fashion to boost power without increasing Type I error (two-step, empirical Bayes and cocktail methods). G×E interactions were identified involving drinking status and two regions containing genes in the alcohol metabolism pathway, 4q23 and 12q24. Although the case-only test yielded the most significant tests of G×E interaction in the 4q23 region, the case-only test failed to identify significant interactions in the 12q24 region which were readily identified using other approaches. The low power of the case-only test in the 12q24 region is likely due to the strong inverse association between the SNPs in this region and drinking status. This example underscores the need to consider multiple approaches to detect gene-environment interactions, as different tests are more or less sensitive to different alternative hypotheses and violations of the gene-environment independence assumption. PMID:23595356
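
    A small simulation makes the bias mechanism visible. The sketch below computes the case-only estimate (the G-E odds ratio among cases) under gene-environment independence and under negative correlation; all frequencies and effect sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2_000_000

def case_only_log_or(ge_log_or):
    """Case-only interaction estimate under a given G-E association."""
    E = rng.random(n) < 0.4
    # Genotype prevalence depends on E when ge_log_or != 0 (G-E correlation).
    pG = 1 / (1 + np.exp(-(-0.85 + ge_log_or * E)))      # ~30% carriers
    G = rng.random(n) < pG
    # Rare-disease model with a true interaction log-OR of 0.5.
    logit = -4.0 + 0.2 * G + 0.3 * E + 0.5 * (G & E)
    D = rng.random(n) < 1 / (1 + np.exp(-logit))
    g, e = G[D], E[D]
    a, b = np.sum(g & e), np.sum(g & ~e)
    c, d = np.sum(~g & e), np.sum(~g & ~e)
    return np.log(a * d / (b * c))                       # G-E OR among cases

print("G-E independent:", round(case_only_log_or(0.0), 2))   # ~0.5, unbiased
print("G-E correlated :", round(case_only_log_or(-0.4), 2))  # biased toward null
```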

  11. Automatic detection of wheezes by evaluation of multiple acoustic feature extraction methods and C-weighted SVM

    NASA Astrophysics Data System (ADS)

    Sosa, Germán. D.; Cruz-Roa, Angel; González, Fabio A.

    2015-01-01

    This work addresses the problem of lung sound classification, in particular, distinguishing between wheeze and normal sounds. Wheezing sound detection is an important step in associating lung sounds with an abnormal state of the respiratory system, usually linked to tuberculosis or chronic obstructive pulmonary disease (COPD). The paper presents an approach for automatic lung sound classification which uses different state-of-the-art sound features in combination with a C-weighted support vector machine (SVM) classifier that works better for unbalanced data. The feature extraction methods used here are commonly applied in speech recognition and related problems because they capture the most informative spectral content of the original signals. The evaluated methods were: the Fourier transform (FT), wavelet decomposition using a Wavelet Packet Transform filter bank (WPT), and Mel Frequency Cepstral Coefficients (MFCC). For comparison, we evaluated and contrasted the proposed approach against previous works using different combinations of features and/or classifiers. The methods were evaluated on a set of lung sounds including normal and wheezing sounds. A leave-two-out per-case cross-validation approach was used which, in each fold, selects two cases as the validation set, one containing normal sounds and the other wheezing sounds. Experimental results are reported in terms of traditional classification performance measures: sensitivity, specificity, and balanced accuracy. Our best configuration, the C-weighted SVM with MFCC features, achieves 82.1% balanced accuracy, the best result reported for this problem to date. These results suggest that supervised classifiers based on kernel methods are able to learn better models for this challenging classification problem even using the same feature extraction methods.
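
    A minimal sketch of this kind of pipeline, MFCC features feeding a class-weighted SVM, is shown below. The corpus loader load_dataset_index is a hypothetical placeholder, librosa is assumed available for feature extraction, and scikit-learn's class_weight option stands in for the paper's C-weighted formulation.

```python
import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score

def mfcc_features(path, n_mfcc=13):
    """Mean MFCC vector over the whole recording (one feature row per file)."""
    y, sr = librosa.load(path, sr=None)
    m = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return m.mean(axis=1)

# Hypothetical corpus listing; label 1 = wheeze, 0 = normal.
paths, labels = load_dataset_index()
X = np.vstack([mfcc_features(p) for p in paths])
y = np.asarray(labels)

Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

# class_weight='balanced' reweights the penalty C per class, countering the
# class imbalance between scarce wheeze cases and abundant normal sounds.
clf = SVC(kernel="rbf", C=10.0, gamma="scale", class_weight="balanced")
clf.fit(Xtr, ytr)
print("balanced accuracy:", balanced_accuracy_score(yte, clf.predict(Xte)))
```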

  12. Experimental evaluation of the certification-trail method

    NASA Technical Reports Server (NTRS)

    Sullivan, Gregory F.; Wilson, Dwight S.; Masson, Gerald M.; Itoh, Mamoru; Smith, Warren W.; Kay, Jonathan S.

    1993-01-01

    Certification trails are a recently introduced and promising approach to fault detection and fault tolerance. A comprehensive attempt to assess experimentally the performance and overall value of the method is reported. The method is applied to algorithms for the following problems: Huffman tree, shortest path, minimum spanning tree, sorting, and convex hull. Our results reveal many cases in which an approach using certification trails allows for significantly faster overall program execution time than a basic time-redundancy approach. Algorithms for the answer-validation problem for abstract data types were also examined. This kind of problem provides a basis for applying the certification-trail method to wide classes of algorithms. Answer-validation solutions for two types of priority queues were implemented and analyzed. In both cases, the algorithm which performs answer-validation is substantially faster than the original algorithm for computing the answer. Next, a probabilistic model and analysis enabling comparison between the certification-trail method and the time-redundancy approach are presented. The analysis reveals some substantial and sometimes surprising advantages for the certification-trail method. Finally, the work our group performed on the design and implementation of fault-injection testbeds for experimental analysis of the certification-trail technique is discussed. This work employs two distinct methodologies: software fault injection (modification of instruction, data, and stack segments of programs on a Sun Sparcstation ELC and on an IBM 386 PC) and hardware fault injection (control, address, and data lines of a Motorola MC68000-based target system pulsed at logical zero/one values). Our results indicate the viability of the certification-trail technique. It is also believed that the tools developed provide a solid base for additional exploration.
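
    The answer-validation idea is easy to illustrate for sorting: the first execution emits a trail (the sorting permutation) that lets a second, much cheaper execution check the answer in linear time. The sketch below is a generic illustration, not the paper's implementation.

```python
def sort_with_trail(xs):
    """First execution: sort and emit a trail (the sorting permutation)."""
    trail = sorted(range(len(xs)), key=xs.__getitem__)
    return [xs[i] for i in trail], trail

def check_with_trail(xs, ys, trail):
    """Cheap second execution: O(n) validation using the trail."""
    n = len(xs)
    if len(ys) != n or len(trail) != n:
        return False
    seen = [False] * n
    for i in trail:
        if not 0 <= i < n or seen[i]:
            return False                      # trail is not a permutation
        seen[i] = True
    if any(ys[k] != xs[i] for k, i in enumerate(trail)):
        return False                          # trail does not reproduce ys
    return all(ys[k] <= ys[k + 1] for k in range(n - 1))

data = [5, 3, 8, 1, 3]
out, trail = sort_with_trail(data)
assert check_with_trail(data, out, trail)                  # accepts a good run
assert not check_with_trail(data, [1, 3, 3, 5, 9], trail)  # rejects a faulty one
```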

  13. Herbal hepatotoxicity: Challenges and pitfalls of causality assessment methods

    PubMed Central

    Teschke, Rolf; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2013-01-01

    The diagnosis of herbal hepatotoxicity or herb induced liver injury (HILI) represents a particular clinical and regulatory challenge with major pitfalls for the causality evaluation. On the day HILI is suspected in a patient, physicians should start assessing the quality of the used herbal product, optimizing the clinical data for completeness, and applying the Council for International Organizations of Medical Sciences (CIOMS) scale for initial causality assessment. This scale is structured, quantitative, liver specific, and validated for hepatotoxicity cases. Its items provide individual scores, which together yield causality levels of highly probable, probable, possible, unlikely, and excluded. Once completed with additional information, including raw data, the scale with all its items should be reported to regulatory agencies and manufacturers for further evaluation. The CIOMS scale is preferred as the tool for assessing causality in hepatotoxicity cases, compared to numerous other causality assessment methods, which are inferior on various grounds. Among these disputed methods are the Maria and Victorino scale, an insufficiently qualified, shortened version of the CIOMS scale, as well as various liver unspecific methods such as the ad hoc causality approach, the Naranjo scale, the World Health Organization (WHO) method, and the Karch and Lasagna method. An expert panel is required for the Drug Induced Liver Injury Network method, the WHO method, and other approaches based on expert opinion, which provide retrospective analyses with a long delay and thereby prevent a timely assessment of the illness in question by the physician. In conclusion, HILI causality assessment is challenging and is best achieved by the liver specific CIOMS scale, avoiding pitfalls commonly observed with other approaches. PMID:23704820

  14. Determination of lung segments in computed tomography images using the Euclidean distance to the pulmonary artery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoecker, Christina; Moltz, Jan H.; Lassen, Bianca

    Purpose: Computed tomography (CT) imaging is the modality of choice for lung cancer diagnostics. With the increasing number of lung interventions on the sublobar level in recent years, determining and visualizing pulmonary segments in CT images and, in oncological cases, providing reliable segment-related information about the location of tumors has become increasingly desirable. Computer-assisted identification of lung segments in CT images is the subject of this work. Methods: The authors present a new interactive approach for the segmentation of lung segments that uses the Euclidean distance of each point in the lung to the segmental branches of the pulmonary artery. The aim is to analyze the potential of the method. Detailed manual pulmonary artery segmentations are used to achieve the best possible segment approximation results. A detailed description of the method and its evaluation on 11 CT scans from clinical routine are given. Results: An accuracy of 2–3 mm is measured for the segment boundaries computed by the pulmonary artery-based method. On average, maximum deviations of 8 mm are observed. 135 intersegmental pulmonary veins detected in the 11 test CT scans serve as reference data. Furthermore, a comparison of the presented pulmonary artery-based approach to a similar approach that uses the Euclidean distance to the segmental branches of the bronchial tree is presented. It shows a significantly higher accuracy for the pulmonary artery-based approach in lung regions at least 30 mm distal to the lung hilum. Conclusions: A pulmonary artery-based determination of lung segments in CT images is promising. In the tests, the pulmonary artery-based determination has been shown to be superior to the bronchial tree-based determination. The suitability of the segment approximation method for application in the planning of segment resections in clinical practice has already been verified in experimental cases. However, automation of the method, accompanied by an evaluation on a larger number of test cases, is required before application in the daily clinical routine.
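
    The distance-based labeling at the heart of the method reduces to a nearest-neighbor query. Below is a minimal sketch using a k-d tree; the branch centerlines and voxel coordinates are hypothetical placeholders for data extracted from segmented CT scans.

```python
import numpy as np
from scipy.spatial import cKDTree

# Centerline points of each segmental artery branch: label -> (m, 3) in mm.
branch_points = {
    1: np.array([[10.0, 42.0, 88.0], [14.0, 45.0, 92.0]]),
    2: np.array([[30.0, 40.0, 80.0], [33.0, 44.0, 85.0]]),
}

labels = np.concatenate([np.full(len(p), k) for k, p in branch_points.items()])
tree = cKDTree(np.vstack(list(branch_points.values())))

# Lung voxel coordinates (placeholder array of shape (n_voxels, 3), in mm;
# physical coordinates matter when voxel spacing is anisotropic).
lung_voxels = np.array([[12.0, 43.0, 90.0], [31.0, 41.0, 82.0]])

_, nearest = tree.query(lung_voxels)      # index of nearest centerline point
segment_of_voxel = labels[nearest]        # segment label per lung voxel
print(segment_of_voxel)                   # -> [1 2]
```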

  15. Determining the Applicability of Threshold of Toxicological Concern Approaches to Substances Found in Foods

    PubMed Central

    Canady, Richard; Lane, Richard; Paoli, Greg; Wilson, Margaret; Bialk, Heidi; Hermansky, Steven; Kobielush, Brent; Lee, Ji-Eun; Llewellyn, Craig; Scimeca, Joseph

    2013-01-01

    Threshold of Toxicological Concern (TTC) decision-support methods present a pragmatic approach to using data from well-characterized chemicals and protective estimates of exposure in a stepwise fashion to inform decisions regarding low-level exposures to chemicals for which few data exist. The approach is based on structural and functional categorizations of chemicals derived from decades of animal testing with a wide variety of chemicals. Expertise is required to use the TTC methods, and there are situations in which their use is clearly inappropriate or not currently supported. To facilitate proper use of the TTC, this paper describes issues to be considered by risk managers when faced with an unexpected substance in food. Case studies are provided to illustrate the implementation of these considerations, demonstrating the steps taken in deciding whether it would be appropriate to apply the TTC approach in each case. By appropriately applying the methods, employing the appropriate scientific expertise, and combining use with the conservative assumptions embedded within the derivation of the thresholds, the TTC can realize its potential to protect public health and to contribute to efficient use of resources in food safety risk management. PMID:24090142

  16. Comparison of shade matching by visual observation and an intraoral dental colorimeter.

    PubMed

    Li, Q; Wang, Y N

    2007-11-01

    The purpose of this study was to compare the applicability of two shade-matching approaches: the Vintage Halo shade guide (visual method) and the Shofu ShadeEye NCC colorimeter (instrumental method). Twenty participants' maxillary left central incisors were evaluated. Corresponding metal ceramic crowns were fabricated with each shade-matching approach. The colour distributions (L*, a* and b*) of the middle third region of each tooth and the corresponding metal ceramic crowns were spectrophotometrically assessed. The colour difference (ΔE) and colour distribution differences (ΔL*, Δa* and Δb*) between the tooth and the corresponding crowns were calculated. We found that the colour differences of both groups fell within the clinically unacceptable range (ΔE > 2.75). Regarding ΔE and the three colour distributions, no significant difference was found, except for a* (P < 0.01). The degree of shade-matching difficulty was analysed through the agreement of visual shade selections. Within easy matching cases, the instrumental method achieved better results (P = 0.041). In conclusion, the reliability of shade matching can be ensured by neither the colorimeter nor the visual approach; however, the colorimeter can achieve better results in easy matching cases.

  17. Evaluation of uncertainty for regularized deconvolution: A case study in hydrophone measurements.

    PubMed

    Eichstädt, S; Wilkens, V

    2017-06-01

    An estimation of the measurand in dynamic metrology usually requires a deconvolution based on a dynamic calibration of the measuring system. Since deconvolution is, mathematically speaking, an ill-posed inverse problem, some kind of regularization is required to render the problem stable and obtain usable results. Many approaches to regularized deconvolution exist in the literature, but the corresponding evaluation of measurement uncertainties is, in general, an unsolved issue. In particular, the uncertainty contribution of the regularization itself is a topic of great importance, because it has a significant impact on the estimation result. Here, a versatile approach is proposed to express prior knowledge about the measurand based on a flexible, low-dimensional modeling of an upper bound on the magnitude spectrum of the measurand. This upper bound allows the derivation of an uncertainty associated with the regularization method in line with the guidelines in metrology. As a case study for the proposed method, hydrophone measurements in medical ultrasound with an acoustic working frequency of up to 7.5 MHz are considered, but the approach is applicable for all kinds of estimation methods in dynamic metrology, where regularization is required and which can be expressed as a multiplication in the frequency domain.
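
    The deconvolution core of such an analysis can be sketched in a few lines. The Tikhonov-style frequency-domain filter below is a generic stand-in for the regularization whose uncertainty contribution the paper evaluates; the signal, impulse response, and regularization parameter are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n, fs = 1024, 100e6                        # samples, sampling rate (Hz)
t = np.arange(n) / fs

# True measurand: a short Gaussian pulse; system: low-pass impulse response.
x = np.exp(-0.5 * ((t - 2e-6) / 2e-7) ** 2)
h = np.exp(-t / 1e-7)
h /= h.sum()
y = np.convolve(x, h)[:n] + 0.01 * rng.normal(size=n)   # measured signal

H, Y = np.fft.rfft(h), np.fft.rfft(y)

# Regularized inverse filter: X_hat = conj(H) Y / (|H|^2 + lambda).
lam = 1e-3
X_hat = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
x_hat = np.fft.irfft(X_hat, n)

print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```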

  18. Development and application of the modal space self-tuning regulator

    NASA Astrophysics Data System (ADS)

    Schultze, John Francis

    The control and reduction of vibration of flexible structures is currently an area of much research and concern in the aerospace and automotive industries. Often these systems are idealized as discrete systems with a finite number of degrees of freedom. Traditional active control approaches have attempted either to identify the complete system and design an appropriate controller, or to use an ad hoc set of single-degree-of-freedom controllers. Both methods have limitations. The former requires great computational and control design effort; it also attempts to reduce vibration across the complete spectrum rather than applying control effort only to the problematic mode(s). The latter method is often limited by its inability to address the structural coupling inherent in these systems. The Modal Space Self Tuning Regulator (MSSTR) method proposed in this research addresses both of these problems as well as changes in the structural properties of a system. The control problem is approached in a two-stage effort: decoupling and adaptive control. The structure's motion is decoupled through the Modified Reciprocal Modal Vector method. The control is then implemented in modal space as a new acceleration-feedback-based, single-degree-of-freedom form of the Self Tuning Regulator. The range of application of this controller in terms of maximum additive damping, actuator location sensitivity, and discrete and continuous system mass changes is investigated. Also, the behavior of the internal controller parameters is studied for the extension of this method to system monitoring and damage detection. Proof of the numeric stability of the controller in the ideal case is presented, as are its practical implementation issues. This control approach was shown to be effective for the cases of specified damping increases up to 10 dB, several actuator locations, three discrete mass perturbations, and several continuous mass change cases. There appears to be little dependence on the actuator position until the additive damping limit is reached. The discrete mass change tests investigate both increases and reductions in the effective moving mass of the system. The controller performed well in all cases investigated, achieving a minimum of 7 dB and up to 15 dB of attenuation. The continuous mass change cases, modeling tool wear, fuel consumption, or other time-varying phenomena, show good convergence behavior of the system model and the accompanying regulator law parameters. This validates the controller for implementation in a rapidly changing system. The MSSTR performed well in several varied test cases, showing both insensitivity to actuator location and resilience to changing system parameters. Extensions to multi-input, multi-mode control appear to be within ready grasp.
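
    At the heart of any self-tuning regulator sits a recursive parameter estimator. Below is a minimal recursive-least-squares sketch identifying a second-order modal model; the modal decoupling, acceleration feedback, and regulator synthesis of the MSSTR are beyond this illustration, and the model coefficients and noise levels are invented for the example.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One RLS step: theta = estimate, P = covariance, phi = regressor,
    y = new measurement, lam = forgetting factor (tracks slow drift)."""
    K = P @ phi / (lam + phi @ P @ phi)
    theta = theta + K * (y - phi @ theta)
    P = (P - np.outer(K, phi @ P)) / lam
    return theta, P

# Identify y[k] = a1*y[k-1] + a2*y[k-2] + b*u[k] for one structural mode.
rng = np.random.default_rng(5)
true = np.array([1.6, -0.8, 0.5])            # stable second-order dynamics
theta, P = np.zeros(3), 1e3 * np.eye(3)
y_hist = [0.0, 0.0]
for k in range(500):
    u = rng.normal()                          # excitation input
    phi = np.array([y_hist[-1], y_hist[-2], u])
    y = true @ phi + 0.01 * rng.normal()      # noisy plant response
    theta, P = rls_update(theta, P, phi, y)
    y_hist.append(y)
print("estimated [a1, a2, b]:", np.round(theta, 3))
```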

  19. A statistical approach to nuclear fuel design and performance

    NASA Astrophysics Data System (ADS)

    Cunning, Travis Andrew

    As CANDU fuel failures can have significant economic and operational consequences for the Canadian nuclear power industry, it is essential that factors impacting fuel performance are adequately understood. Current industrial practice relies on deterministic safety analysis and the highly conservative "limit of operating envelope" approach, where all parameters are assumed to be at their limits simultaneously. This results in a conservative prediction of event consequences with little consideration given to the high quality and precision of current manufacturing processes. This study employs a novel approach to the prediction of CANDU fuel reliability. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to form input for two industry-standard fuel performance codes: ELESTRES for the steady-state case and ELOCA for the transient case, a hypothesized 80% reactor outlet header break loss of coolant accident. Using a Monte Carlo technique for input generation, 10^5 independent trials are conducted and probability distributions are fitted to key model output quantities. Comparing model output against recognized industrial acceptance criteria, no fuel failures are predicted for either case. Output distributions are well removed from failure limit values, implying that margin exists in current fuel manufacturing and design. To validate the results and attempt to reduce the simulation burden of the methodology, two dimensional-reduction methods are assessed. Using just 36 trials, both methods are able to produce output distributions that agree strongly with those obtained via the brute-force Monte Carlo method, often to a relative discrepancy of less than 0.3% when predicting the first statistical moment, and a relative discrepancy of less than 5% when predicting the second statistical moment. In terms of global sensitivity, pellet density proves to have the greatest impact on fuel performance, with an average sensitivity index of 48.93% on key output quantities. Pellet grain size and dish depth are also significant contributors, at 31.53% and 13.46%, respectively. A traditional limit of operating envelope case is also evaluated. This case produces output values that exceed the maximum values observed during the 10^5 Monte Carlo trials for all output quantities of interest. In many cases the difference between the predictions of the two methods is very prominent, and the highly conservative nature of the deterministic approach is demonstrated. A reliability analysis of CANDU fuel manufacturing parametric data, specifically pertaining to the quantification of fuel performance margins, has not been conducted previously. Key Words: CANDU, nuclear fuel, Cameco, fuel manufacturing, fuel modelling, fuel performance, fuel reliability, ELESTRES, ELOCA, dimensional reduction methods, global sensitivity analysis, deterministic safety analysis, probabilistic safety analysis.
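
    The statistical workflow, fitting distributions to manufacturing data, sampling inputs, propagating them through a performance model, and comparing against the deterministic envelope, can be sketched as follows. The toy performance model and all numbers are illustrative placeholders, not ELESTRES/ELOCA physics or Cameco data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Hypothetical manufacturing measurements standing in for the real datasets.
density_data = rng.normal(10.6, 0.05, size=500)    # pellet density, g/cm^3
grain_data = rng.normal(8.0, 0.6, size=500)        # grain size, um

dens_fit = stats.norm.fit(density_data)            # (mean, std) per parameter
grain_fit = stats.norm.fit(grain_data)

def fuel_temperature(density, grain):
    """Toy performance model: centerline temperature (C), illustrative only."""
    return 1200 + 80 * (10.7 - density) + 15 * (grain - 8.0)

# Monte Carlo propagation with 10^5 sampled input sets.
n = 100_000
dens = rng.normal(*dens_fit, size=n)
grain = rng.normal(*grain_fit, size=n)
temps = fuel_temperature(dens, grain)

print("MC 99.9th percentile:", np.percentile(temps, 99.9))
# Deterministic bound: all parameters at their worst limits simultaneously.
print("limit-of-envelope   :", fuel_temperature(dens.min(), grain.max()))
```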

  20. Case-control geographic clustering for residential histories accounting for risk factors and covariates.

    PubMed

    Jacquez, Geoffrey M; Meliker, Jaymie R; Avruskin, Gillian A; Goovaerts, Pierre; Kaufmann, Andy; Wilson, Mark L; Nriagu, Jerome

    2006-08-03

    Methods for analyzing space-time variation in risk in case-control studies typically ignore residential mobility. We develop an approach for analyzing case-control data for mobile individuals and apply it to study bladder cancer in 11 counties in southeastern Michigan. At this time data collection is incomplete and no inferences should be drawn; we analyze these data to demonstrate the novel methods. Global, local, and focused clustering of residential histories for 219 cases and 437 controls is quantified using time-dependent nearest neighbor relationships. Business address histories for 268 industries that release known or suspected bladder cancer carcinogens are analyzed. A logistic model accounting for smoking, gender, age, race, and education specifies the probability of being a case, and is incorporated into the cluster randomization procedures. Sensitivity of clustering to the definition of the proximity metric is assessed for k = 1 to 75 nearest neighbors. Global clustering is partly explained by the covariates but remains statistically significant at 12 of the 14 levels of k considered. After accounting for the covariates, 26 local clusters are found in Lapeer, Ingham, Oakland, and Jackson counties, with the clusters in Ingham and Oakland counties appearing in 1950 and persisting to the present. Statistically significant focused clusters are found about the business address histories of 22 industries located in Oakland (19 clusters), Ingham (2), and Jackson (1) counties. Clusters in central and southeastern Oakland County appear in the 1930s and persist to the present day. These methods provide a systematic approach for evaluating a series of increasingly realistic alternative hypotheses regarding the sources of excess risk. So long as selection of cases and controls is population-based and not geographically biased, these tools can provide insights into geographic risk factors that were not specifically assessed in the case-control study design.

  1. Applying a weed risk assessment approach to GM crops.

    PubMed

    Keese, Paul K; Robold, Andrea V; Myers, Ruth C; Weisman, Sarah; Smith, Joe

    2014-12-01

    Current approaches to environmental risk assessment of genetically modified (GM) plants are modelled on chemical risk assessment methods, which have a strong focus on toxicity. There are additional types of harms posed by plants that have been extensively studied by weed scientists and incorporated into weed risk assessment methods. Weed risk assessment uses robust, validated methods that are widely applied to regulatory decision-making about potentially problematic plants. They are designed to encompass a broad variety of plant forms and traits in different environments, and can provide reliable conclusions even with limited data. The knowledge and experience that underpin weed risk assessment can be harnessed for environmental risk assessment of GM plants. A case study illustrates the application of the Australian post-border weed risk assessment approach to a representative GM plant. This approach is a valuable tool to identify potential risks from GM plants.

  2. Free and Forced Vibrations of Thick-Walled Anisotropic Cylindrical Shells

    NASA Astrophysics Data System (ADS)

    Marchuk, A. V.; Gnedash, S. V.; Levkovskii, S. A.

    2017-03-01

    Two approaches to studying the free and forced axisymmetric vibrations of a cylindrical shell are proposed. They are based on the three-dimensional theory of elasticity and division of the original cylindrical shell with concentric cross-sectional circles into several coaxial cylindrical shells. One approach uses linear polynomials to approximate functions defined in plan and across the thickness. The other approach also uses linear polynomials to approximate functions defined in plan, but their variation with thickness is described by the analytical solution of a system of differential equations. Both approaches have approximation and arithmetic errors. When determining the natural frequencies by the semi-analytical finite-element method in combination with the divide and conquer method, it is convenient to find the initial frequencies by the finite-element method. The behavior of the shell during free and forced vibrations is analyzed in the case where the loading area is half the shell thickness.

  3. An integrative multi-criteria decision making techniques for supplier evaluation problem with its application

    NASA Astrophysics Data System (ADS)

    Fatrias, D.; Kamil, I.; Meilani, D.

    2018-03-01

    Coordinating business operations with suppliers becomes increasingly important to survive and prosper in a dynamic business environment. A good partnership with suppliers not only increases efficiency, but also strengthens corporate competitiveness. Against this background, this study aims to develop a practical approach to multi-criteria supplier evaluation using the combined methods of the Taguchi loss function (TLF), the best-worst method (BWM), and VIse Kriterijumska Optimizacija kompromisno Resenje (VIKOR). A new framework integrating these methods is our main contribution to the supplier evaluation literature. In this integrated approach, a compromise supplier ranking based on the loss scores of suppliers is obtained using the efficient steps of a pairwise-comparison-based decision making process. Implementation on a case problem with real data from the crumb rubber industry shows the usefulness of the proposed approach. Finally, a suitable managerial implication is presented.
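
    The VIKOR ranking step of the integrated approach is compact enough to sketch. The criteria weights would come from the BWM and the criterion scores from Taguchi loss functions; the values below are illustrative placeholders.

```python
import numpy as np

# Rows = suppliers, columns = criteria (higher = better in this toy example).
F = np.array([[7.0, 0.80, 6.5],
              [9.0, 0.70, 8.0],
              [8.0, 0.90, 7.0]])
w = np.array([0.5, 0.3, 0.2])      # criteria weights (e.g., from BWM)

f_best, f_worst = F.max(axis=0), F.min(axis=0)

# Group utility S and individual regret R for each supplier.
norm = (f_best - F) / (f_best - f_worst)
S = (w * norm).sum(axis=1)
R = (w * norm).max(axis=1)

# Compromise index Q; v weighs "majority rule" against individual regret.
v = 0.5
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))

for i in np.argsort(Q):            # smaller Q = better compromise ranking
    print(f"supplier {i + 1}: Q={Q[i]:.3f} S={S[i]:.3f} R={R[i]:.3f}")
```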

  4. Ultrasonic inspection of carbon fiber reinforced plastic by means of sample-recognition methods

    NASA Technical Reports Server (NTRS)

    Bilgram, R.

    1985-01-01

    In the case of carbon fiber reinforced plastic (CFRP), it has not yet been possible to detect nonlocal defects and aging-related material degradation with the aid of nondestructive inspection methods. An approach for overcoming these difficulties involves an extension of the ultrasonic inspection procedure based on signal processing and sample-recognition methods. The basic concept of this approach is the realization that the ultrasonic signal contains information regarding the medium which is not utilized in conventional ultrasonic inspection. However, the analytical study of the physical processes involved is very complex. For this reason, an empirical approach is employed to make use of the previously unexploited information. This approach uses reference signals obtained from material specimens of different quality. The implementation of these concepts for the ultrasonic inspection of CFRP laminates is discussed.

  5. A subgradient approach for constrained binary optimization via quantum adiabatic evolution

    NASA Astrophysics Data System (ADS)

    Karimi, Sahar; Ronagh, Pooya

    2017-08-01

    An outer approximation method has been proposed in the literature for solving the Lagrangian dual of a constrained binary quadratic programming problem via quantum adiabatic evolution. This should be an efficient prescription for solving the Lagrangian dual problem in the presence of an ideally noise-free quantum adiabatic system. However, current implementations of quantum annealing systems demand methods that are efficient at handling possible sources of noise. In this paper, we consider a subgradient method for finding an optimal primal-dual pair for the Lagrangian dual of a constrained binary polynomial programming problem. We then study the quadratic stable set (QSS) problem as a case study. We see that this method applied to the QSS problem can be viewed as an instance-dependent penalty-term approach that avoids large penalty coefficients. Finally, we report our experimental results of using the D-Wave 2X quantum annealer and conclude that our approach helps this quantum processor succeed more often in solving these problems compared to the usual penalty-term approaches.
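
    A minimal sketch of projected subgradient ascent on a Lagrangian dual is shown below; exhaustive enumeration stands in for the quantum annealer that would solve each unconstrained binary subproblem, and the toy objective and constraint are illustrative.

```python
import itertools
import numpy as np

# Toy problem: minimize x'Qx subject to Ax <= b, x binary.
Q = np.array([[-2.0, 1.0, 0.0],
              [1.0, -3.0, 1.0],
              [0.0, 1.0, -1.0]])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([2.0])                 # at most two variables set to 1

lam = np.zeros(len(b))              # dual multipliers
for it in range(50):
    # Inner problem: argmin_x x'Qx + lam'(Ax - b) over binary x.
    # (This unconstrained subproblem is what an annealer would solve.)
    best, best_val = None, np.inf
    for bits in itertools.product([0, 1], repeat=3):
        x = np.array(bits, dtype=float)
        val = x @ Q @ x + lam @ (A @ x - b)
        if val < best_val:
            best, best_val = x, val
    g = A @ best - b                # subgradient of the dual at lam
    lam = np.maximum(0.0, lam + (1.0 / (it + 1)) * g)   # projected ascent

print("dual multipliers:", lam, " primal candidate:", best)
```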

  6. Bayesian-information-gap decision theory with an application to CO2 sequestration

    DOE PAGES

    O'Malley, D.; Vesselinov, V. V.

    2015-09-04

    Decisions related to subsurface engineering problems such as groundwater management, fossil fuel production, and geologic carbon sequestration are frequently challenging because of an overabundance of uncertainties (related to conceptualizations, parameters, observations, etc.). Because of the importance of these problems to agriculture, energy, and the climate (respectively), good decisions that are scientifically defensible must be made despite the uncertainties. We describe a general approach to making decisions for challenging problems such as these in the presence of severe uncertainties that combines probabilistic and non-probabilistic methods. The approach uses Bayesian sampling to assess parametric uncertainty and Information-Gap Decision Theory (IGDT) to address model inadequacy. The combined approach also resolves an issue that frequently arises when applying Bayesian methods to real-world engineering problems related to the enumeration of possible outcomes. In the case of zero non-probabilistic uncertainty, the method reduces to a Bayesian method. Lastly, to illustrate the approach, we apply it to a site-selection decision for geologic CO2 sequestration.
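
    The information-gap half of the combination can be sketched with a toy robustness calculation: how large can the horizon of uncertainty grow before a decision's worst-case performance violates the requirement? The leakage model and all numbers below are invented for illustration.

```python
import numpy as np

def worst_case_leakage(decision, alpha):
    """Worst outcome within uncertainty horizon alpha (toy linear model)."""
    nominal = {"site_A": 0.10, "site_B": 0.25}[decision]      # nominal leakage
    sensitivity = {"site_A": 0.08, "site_B": 0.02}[decision]  # model fragility
    return nominal + alpha * sensitivity

def robustness(decision, requirement, alphas=np.linspace(0, 10, 1001)):
    """Largest alpha whose worst case still satisfies the requirement."""
    ok = [a for a in alphas if worst_case_leakage(decision, a) <= requirement]
    return max(ok) if ok else 0.0

# site_A is nominally better but fragile; site_B is worse but more robust,
# illustrating how IGDT can reverse a nominal preference.
for site in ("site_A", "site_B"):
    print(site, "robustness:", round(robustness(site, requirement=0.4), 2))
```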

  7. A two-dimensional spectrum analysis for sedimentation velocity experiments of mixtures with heterogeneity in molecular weight and shape.

    PubMed

    Brookes, Emre; Cao, Weiming; Demeler, Borries

    2010-02-01

    We report a model-independent analysis approach for fitting sedimentation velocity data which permits simultaneous determination of shape and molecular weight distributions for mono- and polydisperse solutions of macromolecules. Our approach allows for heterogeneity in the frictional domain, providing a more faithful description of the experimental data for cases where frictional ratios are not identical for all components. Because of increased accuracy in the frictional properties of each component, our method also provides more reliable molecular weight distributions in the general case. The method is based on a fine-grained two-dimensional grid search over s and f/f0, where the grid is a linear combination of whole boundary models represented by finite element solutions of the Lamm equation with sedimentation and diffusion parameters corresponding to the grid points. A Monte Carlo approach is used to characterize confidence limits for the determined solutes. Computational algorithms addressing the very large memory needs for a fine-grained search are discussed. The method is suitable for globally fitting multi-speed experiments, and constraints based on prior knowledge about the experimental system can be imposed. Time- and radially invariant noise can be eliminated. Serial and parallel implementations of the method are presented. We demonstrate with simulated and experimental data of known composition that our method provides superior accuracy and lower-variance fits to experimental data compared to other methods in use today, and show that it can be used to identify modes of aggregation and slow polymerization.

  8. An Accurate and Generic Testing Approach to Vehicle Stability Parameters Based on GPS and INS.

    PubMed

    Miao, Zhibin; Zhang, Hongtian; Zhang, Jinzhu

    2015-12-04

    With the development of the vehicle industry, controlling stability has become more and more important, and techniques for evaluating vehicle stability are in high demand. As a common method, GPS and INS sensors are applied to measure vehicle stability parameters by fusing data from the two sensor systems. A Kalman filter is usually used to fuse the multi-sensor data, although it requires the prior model parameters to be known. In this paper, a robust, intelligent, and precise method for the measurement of vehicle stability is proposed. First, a fuzzy interpolation method is proposed, along with a four-wheel vehicle dynamic model. Second, a two-stage Kalman filter, which fuses the data from GPS and INS, is established. Next, this approach is applied to a case study vehicle to measure yaw rate and sideslip angle, and the results show the advantages of the approach. Finally, simulations and a real experiment are conducted to verify these advantages. The experimental results show the merits of this method for measuring vehicle stability, and the approach can meet the design requirements of a vehicle stability controller.
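
    A minimal one-dimensional sketch of GPS/INS fusion with a Kalman filter follows: the INS acceleration drives the prediction and the GPS position corrects it. The single position-velocity state and the noise levels are illustrative; the paper's two-stage filter over full vehicle dynamics is more involved.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, n = 0.01, 1000

F = np.array([[1.0, dt], [0.0, 1.0]])      # state: [position, velocity]
B = np.array([0.5 * dt**2, dt])            # control: INS acceleration
H = np.array([[1.0, 0.0]])                 # GPS measures position only
Q = 1e-4 * np.eye(2)                       # process noise covariance
R = np.array([[4.0]])                      # GPS noise variance (m^2)

x, P = np.zeros(2), np.eye(2)
true_x = np.zeros(2)
for k in range(n):
    a = np.sin(0.01 * k)                   # true acceleration
    true_x = F @ true_x + B * a
    a_ins = a + 0.05 * rng.normal()        # noisy INS reading
    z = true_x[0] + 2.0 * rng.normal()     # noisy GPS position fix

    # Predict with INS, then correct with GPS.
    x = F @ x + B * a_ins
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P

print("final position error (m):", abs(x[0] - true_x[0]))
```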

  9. Toward better public health reporting using existing off the shelf approaches: A comparison of alternative cancer detection approaches using plaintext medical data and non-dictionary based feature selection.

    PubMed

    Kasthurirathne, Suranga N; Dixon, Brian E; Gichoya, Judy; Xu, Huiping; Xia, Yuni; Mamlin, Burke; Grannis, Shaun J

    2016-04-01

    Increased adoption of electronic health records has resulted in increased availability of free text clinical data for secondary use. A variety of approaches to obtain actionable information from unstructured free text data exist. These approaches are resource intensive, inherently complex, and rely on structured clinical data and dictionary-based approaches. We sought to evaluate the potential to obtain actionable information from free text pathology reports using routinely available tools and approaches that do not depend on dictionary-based methods. We obtained pathology reports from a large health information exchange and evaluated the capacity to detect cancer cases from these reports using 3 non-dictionary feature selection approaches, 4 feature subset sizes, and 5 clinical decision models: simple logistic regression, naïve Bayes, k-nearest neighbor, random forest, and J48 decision tree. The performance of each decision model was evaluated using sensitivity, specificity, accuracy, positive predictive value, and area under the receiver operating characteristic (ROC) curve. Decision models parameterized using automated, informed, and manual feature selection approaches yielded similar results. Furthermore, non-dictionary classification approaches identified cancer cases present in free text reports with evaluation measures approaching and exceeding 80-90% for most metrics. Our methods are feasible and practical approaches for extracting substantial information value from free text medical data, and the results suggest that these methods can perform on par, if not better, than existing dictionary-based approaches. Given that public health agencies are often under-resourced and lack the technical capacity for more complex methodologies, these results represent potentially significant value to the public health field. Copyright © 2016 Elsevier Inc. All rights reserved.
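
    A pipeline in this spirit, bag-of-words features, generic statistical feature selection, and a simple classifier, can be assembled from off-the-shelf components, as sketched below. The corpus loader load_pathology_reports is a hypothetical placeholder, and the components are generic stand-ins rather than the exact configurations evaluated in the study.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

# Hypothetical loader: list of report strings and binary cancer labels.
reports, is_cancer = load_pathology_reports()

pipe = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=5)),
    ("select", SelectKBest(chi2, k=1000)),       # non-dictionary selection
    ("clf", LogisticRegression(max_iter=1000)),  # simple logistic regression
])

aucs = cross_val_score(pipe, reports, is_cancer, cv=5, scoring="roc_auc")
print("mean ROC AUC:", aucs.mean())
```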

  10. Locally adaptive decision in detection of clustered microcalcifications in mammograms.

    PubMed

    Sainz de Cea, María V; Nishikawa, Robert M; Yang, Yongyi

    2018-02-15

    In computer-aided detection or diagnosis of clustered microcalcifications (MCs) in mammograms, the performance often suffers from not only the presence of false positives (FPs) among the detected individual MCs but also large variability in detection accuracy among different cases. To address this issue, we investigate a locally adaptive decision scheme in MC detection by exploiting the noise characteristics in a lesion area. Instead of developing a new MC detector, we propose a decision scheme on how to best decide whether a detected object is an MC or not in the detector output. We formulate the individual MCs as statistical outliers compared to the many noisy detections in a lesion area so as to account for the local image characteristics. To identify the MCs, we first consider a parametric method for outlier detection, the Mahalanobis distance detector, which is based on a multi-dimensional Gaussian distribution on the noisy detections. We also consider a non-parametric method which is based on a stochastic neighbor graph model of the detected objects. We demonstrated the proposed decision approach with two existing MC detectors on a set of 188 full-field digital mammograms (95 cases). The results, evaluated using free response operating characteristic (FROC) analysis, showed a significant improvement in detection accuracy by the proposed outlier decision approach over traditional thresholding (the partial area under the FROC curve increased from 3.95 to 4.25, p-value < 10^-4). There was also a reduction in case-to-case variability in detected FPs at a given sensitivity level. The proposed adaptive decision approach could not only reduce the number of FPs in detected MCs but also improve case-to-case consistency in detection.
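
    The parametric branch of the decision scheme is straightforward to sketch: objects whose Mahalanobis distance from the local cloud of noisy detections exceeds a chi-squared cutoff are kept as MC candidates. The two features and the cutoff below are illustrative; real use would take features from the detector output.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(4)

# Feature vectors of detected objects in one lesion area (n x d),
# e.g. contrast and size; most detections are background noise.
noise = rng.normal(0.0, 1.0, size=(200, 2))           # noisy detections
mcs = rng.normal(4.0, 1.0, size=(8, 2))               # true MCs (outliers)
X = np.vstack([noise, mcs])

mu = X.mean(axis=0)
Sinv = np.linalg.inv(np.cov(X, rowvar=False))
d2 = np.einsum("ij,jk,ik->i", X - mu, Sinv, X - mu)   # squared Mahalanobis

cutoff = chi2.ppf(0.99, df=X.shape[1])                # chi-squared threshold
print("flagged as MC candidates:", np.nonzero(d2 > cutoff)[0])
```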

  11. Treatments of Missing Values in Large National Data Affect Conclusions: The Impact of Multiple Imputation on Arthroplasty Research.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Su, Edwin P; Grauer, Jonathan N

    2018-03-01

    Despite the advantages of large, national datasets, one continuing concern is missing data values. Complete case analysis, where only cases with complete data are analyzed, is commonly used rather than more statistically rigorous approaches such as multiple imputation. This study characterizes the potential selection bias introduced using complete case analysis and compares the results of common regressions using both techniques following unicompartmental knee arthroplasty. Patients undergoing unicompartmental knee arthroplasty were extracted from the 2005 to 2015 National Surgical Quality Improvement Program. As examples, the demographics of patients with and without missing preoperative albumin and hematocrit values were compared. Missing data were then treated with both complete case analysis and multiple imputation (an approach that reproduces the variation and associations that would have been present in a full dataset) and the conclusions of common regressions for adverse outcomes were compared. A total of 6117 patients were included, of which 56.7% were missing at least one value. Younger, female, and healthier patients were more likely to have missing preoperative albumin and hematocrit values. The use of complete case analysis removed 3467 patients from the study in comparison with multiple imputation which included all 6117 patients. The 2 methods of handling missing values led to differing associations of low preoperative laboratory values with commonly studied adverse outcomes. The use of complete case analysis can introduce selection bias and may lead to different conclusions in comparison with the statistically rigorous multiple imputation approach. Joint surgeons should consider the methods of handling missing values when interpreting arthroplasty research. Copyright © 2017 Elsevier Inc. All rights reserved.
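
    The contrast between complete case analysis and multiple imputation is easy to demonstrate on toy data, as sketched below with scikit-learn's IterativeImputer; the variables, missingness mechanism, and pooling are illustrative simplifications of the study's methodology.

```python
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(8)
n = 2000
age = rng.normal(65, 10, n)
albumin = 4.8 - 0.01 * age + rng.normal(0, 0.3, n)   # albumin depends on age
df = pd.DataFrame({"age": age, "albumin": albumin})

# Younger patients are more likely to be missing albumin (an MAR mechanism),
# mimicking the selection effect reported in the study.
p_miss = 1 / (1 + np.exp((age - 60) / 5))
df.loc[rng.random(n) < p_miss, "albumin"] = np.nan

print("true mean albumin   :", round(albumin.mean(), 3))
print("complete-case mean  :", round(df["albumin"].mean(), 3),
      f"({df['albumin'].notna().sum()} of {n} cases)")   # biased low

# Multiple imputation: pool the estimate over m imputed datasets.
ests = []
for m in range(10):
    imp = IterativeImputer(sample_posterior=True, random_state=m)
    ests.append(imp.fit_transform(df)[:, 1].mean())
print("multiple-imputation :", round(float(np.mean(ests)), 3))
```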

  12. Locally adaptive decision in detection of clustered microcalcifications in mammograms

    NASA Astrophysics Data System (ADS)

    Sainz de Cea, María V.; Nishikawa, Robert M.; Yang, Yongyi

    2018-02-01

    In computer-aided detection or diagnosis of clustered microcalcifications (MCs) in mammograms, the performance often suffers from not only the presence of false positives (FPs) among the detected individual MCs but also large variability in detection accuracy among different cases. To address this issue, we investigate a locally adaptive decision scheme in MC detection by exploiting the noise characteristics in a lesion area. Instead of developing a new MC detector, we propose a decision scheme on how to best decide whether a detected object is an MC or not in the detector output. We formulate the individual MCs as statistical outliers compared to the many noisy detections in a lesion area so as to account for the local image characteristics. To identify the MCs, we first consider a parametric method for outlier detection, the Mahalanobis distance detector, which is based on a multi-dimensional Gaussian distribution on the noisy detections. We also consider a non-parametric method which is based on a stochastic neighbor graph model of the detected objects. We demonstrated the proposed decision approach with two existing MC detectors on a set of 188 full-field digital mammograms (95 cases). The results, evaluated using free response operating characteristic (FROC) analysis, showed a significant improvement in detection accuracy by the proposed outlier decision approach over traditional thresholding (the partial area under the FROC curve increased from 3.95 to 4.25, p-value < 10^-4). There was also a reduction in case-to-case variability in detected FPs at a given sensitivity level. The proposed adaptive decision approach could not only reduce the number of FPs in detected MCs but also improve case-to-case consistency in detection.

  13. Supraorbital keyhole surgery for optic nerve decompression and dura repair.

    PubMed

    Chen, Yuan-Hao; Lin, Shinn-Zong; Chiang, Yung-Hsiao; Ju, Da-Tong; Liu, Ming-Ying; Chen, Guann-Juh

    2004-07-01

    Supraorbital keyhole surgery is a limited surgical procedure with reduced traumatic manipulation of tissue, entailing little time in the opening and closing of wounds. We utilized the approach to treat head injury patients complicated by optic nerve compression and cerebrospinal fluid (CSF) leakage. Eleven cases of basal skull fracture complicated by optic nerve compression and/or CSF leakage were surgically treated at our department from February 1995 to June 1999. Six cases had primary optic nerve compression, four had CSF leakage, and one case involved both injuries. Supraorbital craniotomy was carried out using a keyhole-sized burr hole plus a small craniotomy of approximately 2 x 3 cm². The optic nerve was decompressed via removal of the optic canal roof and anterior clinoid process with high-speed drills. The dural defect was repaired with two pieces of fascia lata attached on both sides of the torn dura with tissue glue. The seven cases with optic nerve injury included five cases of total blindness and two cases of light perception before operation; vision improved in four cases. The CSF leakage was stopped successfully in all four cases without complication. As optic nerve compression and CSF leakage are skull base lesions, supraorbital keyhole surgery constitutes a suitable approach: it allows for an anterior approach to the skull base and the treatment of both CSF leakage and optic nerve compression. Our results indicate that the supraorbital keyhole operation is a safe and effective method for preserving or improving vision and attenuating CSF leakage following injury.

  14. Screening of pollution control and clean-up materials for river chemical spills using the multiple case-based reasoning method with a difference-driven revision strategy.

    PubMed

    Liu, Rentao; Jiang, Jiping; Guo, Liang; Shi, Bin; Liu, Jie; Du, Zhaolin; Wang, Peng

    2016-06-01

    In-depth screening of emergency disposal technologies (EDT) and materials is required in the process of environmental pollution emergency response. An urgent problem that must be solved is how to quickly and accurately select, from the existing spill control and clean-up materials (SCCM), the materials most appropriate for treating a given pollution event. To meet this need, the following objectives were addressed in this study. First, a material base and a case base for environmental pollution emergency disposal were established to build a foundation and provide material for SCCM screening. Second, the multiple case-based reasoning method with a difference-driven revision strategy (DDRS-MCBR) was applied to improve on the original dual case-based reasoning system, and screening and decision making for SCCM were performed using this model. Third, an actual environmental pollution accident from 2012 was used as a case study to verify the material base, case base, and screening model. The results demonstrated that the DDRS-MCBR method is fast, efficient, and practical. The DDRS-MCBR method changes the passive situation in which SCCM selection depends only on the subjective experience of the decision maker, and offers a new approach to screening SCCM.
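
    The retrieval step of case-based reasoning can be sketched as a weighted nearest-neighbor search over normalized case attributes; the attributes, weights, and case base below are illustrative, and the difference-driven revision step is beyond this sketch.

```python
import numpy as np

# Case base: [pollutant solubility, spill volume (t), river flow (m^3/s)],
# each stored case linked to the disposal material that worked for it.
cases = np.array([
    [0.9, 12.0, 150.0],
    [0.1, 3.0, 40.0],
    [0.5, 25.0, 300.0],
])
materials = ["activated carbon", "oil boom + sorbent", "neutralizing agent"]
weights = np.array([0.5, 0.3, 0.2])          # attribute importance weights

def retrieve(query, k=1):
    """Return indices of the k most similar cases (min-max normalized,
    weighted Euclidean distance)."""
    lo, hi = cases.min(axis=0), cases.max(axis=0)
    norm = (cases - lo) / (hi - lo)
    q = (np.asarray(query) - lo) / (hi - lo)
    d = np.sqrt(((norm - q) ** 2 * weights).sum(axis=1))
    return np.argsort(d)[:k]

new_event = [0.8, 10.0, 120.0]               # attributes of a new spill
for i in retrieve(new_event, k=2):
    print("candidate material:", materials[i])
```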

  15. Robust Flutter Margin Analysis that Incorporates Flight Data

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Martin J.

    1998-01-01

    An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, mu, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The mu margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 Systems Research Aircraft using uncertainty sets generated by flight data analysis. The robust margins demonstrate that flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.

  16. Dynamical self-arrest in symmetric and asymmetric diblock copolymer melts using a replica approach within a local theory.

    PubMed

    Wu, Sangwook

    2009-03-01

    We investigate dynamical self-arrest in a diblock copolymer melt using a replica approach within a self-consistent local method based on dynamical mean-field theory (DMFT). The local replica approach effectively predicts (χN)_A for dynamical self-arrest in a block copolymer melt for symmetric and asymmetric cases. We discuss the competition of the cubic and quartic interactions in the Landau free energy for a block copolymer melt in stabilizing a glassy state depending on the chain length. Our local replica theory provides a universal value for the dynamical self-arrest in block copolymer melts with (χN)_A ≈ 10.5 + 64 N^{-3/10} for the symmetric case.

  17. Percutaneous treatment of complex biliary stone disease using endourological technique and literature review

    PubMed Central

    Korkes, Fernando; Carneiro, Ariê; Nasser, Felipe; Affonso, Breno Boueri; Galastri, Francisco Leonardo; de Oliveira, Marcos Belotto; Macedo, Antônio Luiz de Vasconcellos

    2015-01-01

    Most biliary stone diseases need to be treated surgically. However, in special cases in which traditional endoscopic access to the biliary tract is not possible, a multidisciplinary approach using a hybrid technique with urologic instruments constitutes a treatment option. We report a case of a patient with complex intrahepatic stones who previously underwent unsuccessful conventional approaches, and whose symptoms resolved after treatment with a hybrid technique using endourologic technology. We conducted an extensive literature review, up to October 2012, of manuscripts indexed in PubMed on the treatment of complex gallstones with hybrid techniques. The multidisciplinary approach with a hybrid technique using endourologic instruments represents a safe and effective treatment option for patients with complex biliary stones who cannot be treated with conventional methods. PMID:26061073

  18. Comparing multiple imputation methods for systematically missing subject-level data.

    PubMed

    Kline, David; Andridge, Rebecca; Kaizar, Eloise

    2017-06-01

    When conducting research synthesis, the studies to be combined often do not measure the same set of variables, which creates missing data. When the studies to combine are longitudinal, missing data can occur at the observation level (time-varying) or the subject level (non-time-varying). Traditionally, the focus of missing data methods for longitudinal data has been on missing observation-level variables. In this paper, we focus on missing subject-level variables and compare two multiple imputation approaches: a joint modeling approach and a sequential conditional modeling approach. We find the joint modeling approach to be preferable to the sequential conditional approach, except when the covariance structure of the repeated outcome for each individual has homogeneous variance and exchangeable correlation. Specifically, the regression coefficient estimates from an analysis incorporating imputed values based on the sequential conditional method are attenuated and less efficient than those from the joint method. Remarkably, the estimates from the sequential conditional method are often less efficient than a complete case analysis, which, in the context of research synthesis, implies that we lose efficiency by combining studies. Copyright © 2015 John Wiley & Sons, Ltd.

  19. A risk assessment methodology using intuitionistic fuzzy set in FMEA

    NASA Astrophysics Data System (ADS)

    Chang, Kuei-Hu; Cheng, Ching-Hsue

    2010-12-01

    Most current risk assessment methods use the risk priority number (RPN) value to evaluate the risk of failure. However, the conventional RPN methodology has been criticised for five main shortcomings: (1) the assumption that the RPN elements are equally weighted leads to oversimplification; (2) the RPN scale itself has some non-intuitive statistical properties; (3) the RPN elements produce many duplicate numbers; (4) the RPN is derived from only three factors, mainly in terms of safety; and (5) the conventional RPN method does not consider indirect relations between components. To address these issues, an efficient and comprehensive algorithm to evaluate the risk of failure is needed. This article proposes an innovative approach that integrates the intuitionistic fuzzy set (IFS) and the decision-making trial and evaluation laboratory (DEMATEL) approach for risk assessment. The proposed approach resolves some of the shortcomings of the conventional RPN method. A case study, which assesses the risk of a 0.15 µm DRAM etching process, is used to demonstrate the effectiveness of the proposed approach. Finally, the result of the proposed method is compared with those of existing risk assessment approaches.
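
    For context, the conventional RPN criticised above is the product of severity (S), occurrence (O), and detection (D) ratings, each on a 1-10 ordinal scale: RPN = S x O x D. The short Python sketch below illustrates shortcoming (3), the duplicate-number problem.

    ```python
    from itertools import product

    def rpn(severity, occurrence, detection):
        # Conventional FMEA risk priority number: equal weighting of the
        # three factors, each rated on a 1-10 ordinal scale.
        return severity * occurrence * detection

    # Shortcoming (3): many distinct rating combinations collapse to the
    # same RPN, so ranking by RPN alone cannot separate them.
    print(rpn(2, 5, 6), rpn(3, 4, 5), rpn(6, 10, 1))  # 60 60 60
    distinct = {rpn(s, o, d) for s, o, d in product(range(1, 11), repeat=3)}
    print(len(distinct))  # only 120 unique values out of 1000 combinations
    ```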

  20. A High-Order Immersed Boundary Method for Acoustic Wave Scattering and Low-Mach Number Flow-Induced Sound in Complex Geometries

    PubMed Central

    Seo, Jung Hee; Mittal, Rajat

    2010-01-01

    A new approach based on a sharp-interface immersed boundary method for the computation of low-Mach number flow-induced sound around complex geometries is described. The underlying approach is based on a hydrodynamic/acoustic splitting technique in which the incompressible flow is first computed using a second-order accurate immersed boundary solver. This is followed by the computation of sound using the linearized perturbed compressible equations (LPCE). The primary contribution of the current work is the development of a versatile, high-order accurate immersed boundary method for solving the LPCE in complex domains. This new method enforces the boundary condition on the immersed boundary to high order by combining the ghost-cell approach with a weighted least-squares error method based on a high-order approximating polynomial. The method is validated for canonical acoustic wave scattering and flow-induced noise problems. Applications of this technique to relatively complex cases of practical interest are also presented. PMID:21318129

  1. Phase shifts in I = 2 ππ-scattering from two lattice approaches

    NASA Astrophysics Data System (ADS)

    Kurth, T.; Ishii, N.; Doi, T.; Aoki, S.; Hatsuda, T.

    2013-12-01

    We present a lattice QCD study of the phase shift of I = 2 ππ scattering on the basis of two different approaches: the standard finite-volume approach by Lüscher and the recently introduced HAL QCD potential method. Quenched QCD simulations are performed on lattices with extents N_s = 16, 24, 32, 48 and N_t = 128, a lattice spacing a ≈ 0.115 fm, and a pion mass of m_π ≈ 940 MeV. The phase shift and the scattering length are calculated with both methods. In the potential method, the error is dominated by the systematic uncertainty associated with the violation of rotational symmetry due to the finite lattice spacing. In Lüscher's approach, such systematic uncertainty is difficult to evaluate and is therefore not included in this work. A systematic uncertainty attributed to the quenched approximation is not evaluated in either method. In the case of the potential method, the phase shift can be calculated for arbitrary energies below the inelastic threshold. The energy dependence of the phase shift is also obtained from Lüscher's method using different volumes and/or a non-rest-frame extension of it. The results are found to agree well with the potential method.

  2. An Ontology to Improve Transparency in Case Definition and Increase Case Finding of Infectious Intestinal Disease: Database Study in English General Practice

    PubMed Central

    2017-01-01

    Background Infectious intestinal disease (IID) has considerable health impact; there are 2 billion cases worldwide resulting in 1 million deaths and 78.7 million disability-adjusted life years lost. Reported IID incidence rates vary and this is partly because terms such as “diarrheal disease” and “acute infectious gastroenteritis” are used interchangeably. Ontologies provide a method of transparently comparing case definitions and disease incidence rates. Objective This study sought to show how differences in case definition in part account for variation in incidence estimates for IID and how an ontological approach provides greater transparency to IID case finding. Methods We compared three IID case definitions: (1) Royal College of General Practitioners Research and Surveillance Centre (RCGP RSC) definition based on mapping to the Ninth International Classification of Disease (ICD-9), (2) newer ICD-10 definition, and (3) ontological case definition. We calculated incidence rates and examined the contribution of four supporting concepts related to IID: symptoms, investigations, process of care (eg, notification to public health authorities), and therapies. We created a formal ontology using ontology Web language. Results The ontological approach identified 5712 more cases of IID than the ICD-10 definition and 4482 more than the RCGP RSC definition from an initial cohort of 1,120,490. Weekly incidence using the ontological definition was 17.93/100,000 (95% CI 15.63-20.41), whereas for the ICD-10 definition the rate was 8.13/100,000 (95% CI 6.70-9.87), and for the RSC definition the rate was 10.24/100,000 (95% CI 8.55-12.12). Codes from the four supporting concepts were generally consistent across our three IID case definitions: 37.38% (3905/10,448) (95% CI 36.16-38.5) for the ontological definition, 38.33% (2287/5966) (95% CI 36.79-39.93) for the RSC definition, and 40.82% (1933/4736) (95% CI 39.03-42.66) for the ICD-10 definition. The proportion of laboratory results associated with a positive test result was 19.68% (546/2775). Conclusions The standard RCGP RSC definition of IID, and its mapping to ICD-10, underestimates disease incidence. The ontological approach identified a larger proportion of new IID cases; the ontology divides contributory elements and enables transparency and comparison of rates. Results illustrate how improved diagnostic coding of IID combined with an ontological approach to case definition would provide a clearer picture of IID in the community, better inform GPs and public health services about circulating disease, and empower them to respond. We need to improve the Pathology Bounded Code List (PBCL) currently used by laboratories to electronically report results. Given advances in stool microbiology testing with a move to nonculture, PCR-based methods, the way microbiology results are reported and coded via PBCL needs to be reviewed and modernized. PMID:28958989

  3. Transnasal endoscopic approach with powered instrumentation for treating squamous papilloma in the nasopharyngeal surface of the soft palate.

    PubMed

    Lee, J-H; Lee, Y-O; Lee, C-H; Cho, K-S

    2013-05-01

    To demonstrate a safe and effective method for complete resection of squamous papilloma in the nasopharyngeal surface of the soft palate. This technique was used on a patient in whom the papilloma had twice recurred following uvulopalatopharyngoplasty. Case report and review of the relevant literature. The patient reported in this paper had recurrent squamous papilloma in the nasopharyngeal surface of the soft palate following uvulopalatopharyngoplasty. He also suffered from nasal regurgitation when drinking water. This lesion, which was difficult to access, was successfully treated via a transnasal endoscopic approach using powered instrumentation. This case report highlights a novel approach for the complete removal of a recurrent papilloma in a relatively inaccessible location. Compared with a transoral approach such as uvulopalatopharyngoplasty, the transnasal endoscopic approach using powered instrumentation could provide a safer, faster, easier and less invasive means of treating squamous papilloma in the nasopharyngeal surface of the soft palate, especially for a lesion that recurs following a transoral approach.

  4. Direct sampling for stand density index

    Treesearch

    Mark J. Ducey; Harry T. Valentine

    2008-01-01

    A direct method of estimating stand density index in the field, without complex calculations, would be useful in a variety of silvicultural situations. We present just such a method. The approach uses an ordinary prism or other angle gauge, but it involves deliberately "pushing the point" or, in some cases, "pulling the point." This adjusts the...

  5. Pluralistic Inquiry for the History of Community Psychology

    ERIC Educational Resources Information Center

    Kelly, James G.; Chang, Janet

    2008-01-01

    The authors present the case not only for studying the history of community psychology but also of adopting a pluralistic approach to historical inquiry, using multiple methods and access to resources from other disciplines (e.g., historians of science and social historians). Examples of substantive topics and methods, including social network and…

  6. Research Knowledge Transfer through Business-Driven Student Assignment

    ERIC Educational Resources Information Center

    Sas, Corina

    2009-01-01

    Purpose: The purpose of this paper is to present a knowledge transfer method that capitalizes on both research and teaching dimensions of academic work. It also aims to propose a framework for evaluating the impact of such a method on the involved stakeholders. Design/methodology/approach: The case study outlines and evaluates the six-stage…

  7. Flexible Delivery as a "Whole-Organisation": What Does This Mean in Practice?

    ERIC Educational Resources Information Center

    Henry, John; Wakefield, Lyn

    A research project called Support Services for Flexible Delivery was commissioned by the Australian organization TAFE (technical and further education) Frontiers. Since 1995, the project has been conducted by using a research approach called the Generalizations from Case Studies (GCS) research method. The GCS method was developed, tested, and…

  8. Assessing and Evaluating Multidisciplinary Translational Teams: A Mixed Methods Approach

    PubMed Central

    Wooten, Kevin C.; Rose, Robert M.; Ostir, Glenn V.; Calhoun, William J.; Ameredes, Bill T.; Brasier, Allan R.

    2014-01-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team type taxonomy. Based on team maturation and scientific progress, teams were designated as: a) early in development, b) traditional, c) process focused, or d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored. PMID:24064432

  9. Autocorrelated process control: Geometric Brownian Motion approach versus Box-Jenkins approach

    NASA Astrophysics Data System (ADS)

    Salleh, R. M.; Zawawi, N. I.; Gan, Z. F.; Nor, M. E.

    2018-04-01

    The existence of autocorrelation has a significant effect on the performance and accuracy of process control if it is not handled carefully. When dealing with an autocorrelated process, the Box-Jenkins method is often preferred because of its popularity. However, the computation involved in the Box-Jenkins method is complicated and challenging, which makes it time-consuming. Therefore, an alternative method known as Geometric Brownian Motion (GBM) is introduced to monitor the autocorrelated process. A real case study of furnace temperature data is conducted to compare the performance of the Box-Jenkins and GBM methods in monitoring an autocorrelated process. Both methods give the same results in terms of model accuracy and monitoring of process control. Yet GBM is superior to the Box-Jenkins method due to its simplicity and practicality, with shorter computational time.
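
    A brief sketch may help make the GBM monitoring idea concrete. Assuming the monitored series is positive and approximately lognormal, drift and volatility can be estimated from log-returns and used to set one-step-ahead control limits; the simulated furnace-like series and the 3-sigma limit convention below are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    def gbm_fit(x, dt=1.0):
        """Estimate GBM drift mu and volatility sigma from a positive series x,
        using the lognormal distribution of its log-returns."""
        r = np.diff(np.log(x))                  # log-returns
        sigma = r.std(ddof=1) / np.sqrt(dt)
        mu = r.mean() / dt + 0.5 * sigma**2
        return mu, sigma

    def gbm_limits(x_t, mu, sigma, dt=1.0, z=3.0):
        """Approximate z-sigma control limits for the next observation,
        from the lognormal one-step-ahead distribution of a GBM."""
        m = np.log(x_t) + (mu - 0.5 * sigma**2) * dt
        s = sigma * np.sqrt(dt)
        return np.exp(m - z * s), np.exp(m + z * s)

    # Hypothetical furnace-temperature-like series
    rng = np.random.default_rng(1)
    x = 900 * np.exp(np.cumsum(rng.normal(0.0005, 0.002, 500)))
    mu, sigma = gbm_fit(x)
    lo, hi = gbm_limits(x[-1], mu, sigma)
    print(f"next-step control limits: [{lo:.1f}, {hi:.1f}]")
    ```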

  10. On determining firing delay time of transitions for Petri net based signaling pathways by introducing stochastic decision rules.

    PubMed

    Miwa, Yoshimasa; Li, Chen; Ge, Qi-Wei; Matsuno, Hiroshi; Miyano, Satoru

    2010-01-01

    Parameter determination is important in modeling and simulating biological pathways, including signaling pathways. Parameters are determined according to biological facts obtained from biological experiments and scientific publications. However, such reliable data describing detailed reactions are not reported in most cases. This prompted us to develop a general methodology for determining the parameters of a model in the case that no information about the underlying biological facts is available. In this study, we use the Petri net approach for modeling signaling pathways, and propose a method to determine firing delay times of transitions for Petri net models of signaling pathways by introducing stochastic decision rules. Petri net technology provides a powerful approach to modeling and simulating various concurrent systems, and has recently been widely accepted as a description method for biological pathways. Our method enables us to determine the range of firing delay times that realizes smooth token flows in the Petri net model of a signaling pathway. The applicability of this method has been confirmed by the results of an application to the interleukin-1 induced signaling pathway.

  11. On determining firing delay time of transitions for petri net based signaling pathways by introducing stochastic decision rules.

    PubMed

    Miwa, Yoshimasa; Li, Chen; Ge, Qi-Wei; Matsuno, Hiroshi; Miyano, Satoru

    2011-01-01

    Parameter determination is important in modeling and simulating biological pathways, including signaling pathways. Parameters are determined according to biological facts obtained from biological experiments and scientific publications. However, such reliable data describing detailed reactions are not reported in most cases. This prompted us to develop a general methodology for determining the parameters of a model in the case that no information about the underlying biological facts is available. In this study, we use the Petri net approach for modeling signaling pathways, and propose a method to determine firing delay times of transitions for Petri net models of signaling pathways by introducing stochastic decision rules. Petri net technology provides a powerful approach to modeling and simulating various concurrent systems, and has recently been widely accepted as a description method for biological pathways. Our method enables us to determine the range of firing delay times that realizes smooth token flows in the Petri net model of a signaling pathway. The applicability of this method has been confirmed by the results of an application to the interleukin-1 induced signaling pathway.

  12. A reflective lens: applying critical systems thinking and visual methods to ecohealth research.

    PubMed

    Cleland, Deborah; Wyborn, Carina

    2010-12-01

    Critical systems methodology has been advocated as an effective and ethical way to engage with the uncertainty and conflicting values common to ecohealth problems. We use two contrasting case studies, coral reef management in the Philippines and national park management in Australia, to illustrate the value of critical systems approaches in exploring how people respond to environmental threats to their physical and spiritual well-being. In both cases, we used visual methods--participatory modeling and rich picturing, respectively. The critical systems methodology, with its emphasis on reflection, guided an appraisal of the research process. A discussion of these two case studies suggests that visual methods can be usefully applied within a critical systems framework to offer new insights into ecohealth issues across a diverse range of socio-political contexts. With this article, we hope to open up a conversation with other practitioners to expand the use of visual methods in integrated research.

  13. Implementing a flipped classroom approach in a university numerical methods mathematics course

    NASA Astrophysics Data System (ADS)

    Johnston, Barbara M.

    2017-05-01

    This paper describes and analyses the implementation of a 'flipped classroom' approach, in an undergraduate mathematics course on numerical methods. The approach replaced all the lecture contents by instructor-made videos and was implemented in the consecutive years 2014 and 2015. The sequential case study presented here begins with an examination of the attitudes of the 2014 cohort to the approach in general as well as analysing their use of the videos. Based on these responses, the instructor makes a number of changes (for example, the use of 'cloze' summary notes and the introduction of an extra, optional tutorial class) before repeating the 'flipped classroom' approach the following year. The attitudes to the approach and the video usage of the 2015 cohort are then compared with the 2014 cohort and further changes that could be implemented for the next cohort are suggested.

  14. An extension of the directed search domain algorithm to bilevel optimization

    NASA Astrophysics Data System (ADS)

    Wang, Kaiqiang; Utyuzhnikov, Sergey V.

    2017-08-01

    A method is developed for generating a well-distributed Pareto set for the upper level in bilevel multiobjective optimization. The approach is based on the Directed Search Domain (DSD) algorithm, which is a classical approach for generation of a quasi-evenly distributed Pareto set in multiobjective optimization. The approach contains a double-layer optimizer designed in a specific way under the framework of the DSD method. The double-layer optimizer is based on bilevel single-objective optimization and aims to find a unique optimal Pareto solution rather than generate the whole Pareto frontier on the lower level in order to improve the optimization efficiency. The proposed bilevel DSD approach is verified on several test cases, and a relevant comparison against another classical approach is made. It is shown that the approach can generate a quasi-evenly distributed Pareto set for the upper level with relatively low time consumption.

  15. Using the social entrepreneurship approach to generate innovative and sustainable malaria diagnosis interventions in Tanzania: a case study

    PubMed Central

    2010-01-01

    Background There have been a number of interventions to date aimed at improving malaria diagnostic accuracy in sub-Saharan Africa. Yet, limited success is often reported for a number of reasons, especially in rural settings. This paper seeks to provide a framework for applied research aimed at improving malaria diagnosis using a combination of two established methods, participatory action research and social entrepreneurship. Methods This case study introduces the idea of using the social entrepreneurship approach (SEA) to create innovative and sustainable applied health research outcomes. The following key elements define the SEA: (1) identifying a locally relevant research topic and plan, (2) recognizing the importance of international multi-disciplinary teams and the incorporation of local knowledge, (3) engaging in a process of continuous innovation, adaptation and learning, (4) remaining motivated and determined to achieve sustainable long-term research outcomes and, (5) sharing and transferring ownership of the project with the international and local partner. Evaluation The SEA approach has a strong emphasis on innovation led by local stakeholders. In this case, innovation resulted in a unique holistic research program aimed at understanding patient, laboratory and physician influences on accurate diagnosis of malaria. An evaluation of milestones for each SEA element revealed that the success of one element is intricately related to the success of other elements. Conclusions The SEA will provide an additional framework for researchers and local stakeholders that promotes innovation and adaptability. This approach will facilitate the development of new ideas, strategies and approaches to understand how health issues, such as malaria, affect vulnerable communities. PMID:20128922

  16. Integrated rare variant-based risk gene prioritization in disease case-control sequencing studies.

    PubMed

    Lin, Jhih-Rong; Zhang, Quanwei; Cai, Ying; Morrow, Bernice E; Zhang, Zhengdong D

    2017-12-01

    Rare variants of major effect play an important role in human complex diseases and can be discovered by sequencing-based genome-wide association studies. Here, we introduce an integrated approach that combines the rare variant association test with gene network and phenotype information to identify risk genes implicated by rare variants for human complex diseases. Our data integration method follows a 'discovery-driven' strategy without relying on prior knowledge about the disease and thus maintains the unbiased character of genome-wide association studies. Simulations reveal that our method can outperform a widely-used rare variant association test method by 2 to 3 times. In a case study of a small disease cohort, we uncovered putative risk genes and the corresponding rare variants that may act as genetic modifiers of congenital heart disease in 22q11.2 deletion syndrome patients. These variants were missed by a conventional approach that relied on the rare variant association test alone.

  17. Filling gaps in notification data: a model-based approach applied to travel related campylobacteriosis cases in New Zealand.

    PubMed

    Amene, E; Horn, B; Pirie, R; Lake, R; Döpfer, D

    2016-09-06

    Data containing notified cases of disease are often compromised by incomplete or partial information related to individual cases. In an effort to enhance the value of information from enteric disease notifications in New Zealand, this study explored the use of Bayesian and Multiple Imputation (MI) models to fill risk factor data gaps. As a test case, overseas travel as a risk factor for infection with campylobacteriosis was examined. Two methods, namely Bayesian Specification (BAS) and Multiple Imputation (MI), were compared regarding predictive performance for various levels of artificially induced missingness of overseas travel status in campylobacteriosis notification data. Predictive performance of the models was assessed through the Brier score, the area under the ROC curve, and the percent bias of regression coefficients. Finally, the best model was selected and applied to predict the missing overseas travel status of campylobacteriosis notifications. No difference was observed in the predictive performance of the BAS and MI methods at a lower rate of missingness (<10 %), but the BAS approach performed better than MI at higher rates of missingness (50 %, 65 %, 80 %). The estimated proportion (95 % credibility intervals) of travel-related cases was greatest in the highly urban District Health Boards (DHBs) of Counties Manukau, Auckland and Waitemata, at 0.37 (0.12, 0.57), 0.33 (0.13, 0.55) and 0.28 (0.10, 0.49), whereas the lowest proportions were estimated for the more rural West Coast, Northland and Tairawhiti DHBs at 0.02 (0.01, 0.05), 0.03 (0.01, 0.08) and 0.04 (0.01, 0.06), respectively. The national rate of travel-related campylobacteriosis cases was estimated at 0.16 (0.02, 0.48). The use of BAS offers a flexible approach to data augmentation, particularly when the missing rate is very high and when the Missing At Random (MAR) assumption holds. The high rates of travel-associated cases in urban regions of New Zealand predicted by this approach are plausible given the high rate of travel in these regions, including to destinations with a higher risk of infection. An added advantage of the Bayesian approach is that the model's predictions can be improved whenever new information becomes available.
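
    Of the performance measures above, the Brier score is the simplest to state. The sketch below computes it for imputed binary travel indicators against held-out true values; the example data are hypothetical, not from the New Zealand notification database.

    ```python
    import numpy as np

    def brier_score(y_true, p_pred):
        """Mean squared error between binary outcomes and predicted
        probabilities; lower is better (0 = perfect, 0.25 = chance at p=0.5)."""
        y_true = np.asarray(y_true, dtype=float)
        p_pred = np.asarray(p_pred, dtype=float)
        return np.mean((p_pred - y_true) ** 2)

    # Hypothetical check on artificially masked overseas-travel indicators:
    # y_true are held-out true values, p_pred the model's P(travel = 1).
    y_true = [1, 0, 0, 1, 0, 1, 0, 0]
    p_pred = [0.8, 0.1, 0.3, 0.6, 0.2, 0.9, 0.4, 0.1]
    print(round(brier_score(y_true, p_pred), 3))  # 0.065
    ```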

  18. The effects of particulate air pollution on daily deaths: a multi-city case crossover analysis

    PubMed Central

    Schwartz, J

    2004-01-01

    Background: Numerous studies have reported that day-to-day changes in particulate air pollution are associated with day-to-day changes in deaths. Recently, several reports have indicated that the software used to control for season and weather in some of these studies had deficiencies. Aims: To investigate the use of the case-crossover design as an alternative. Methods: This approach compares the exposure of each case to their exposure on a nearby day, when they did not die. Hence it controls for seasonal patterns and for all slowly varying covariates (age, smoking, etc) by matching rather than complex modelling. A key feature is that temperature can also be controlled by matching. This approach was applied to a study of 14 US cities. Weather and day of the week were controlled for in the regression. Results: A 10 µg/m3 increase in PM10 was associated with a 0.36% increase in daily deaths from internal causes (95% CI 0.22% to 0.50%). Results were little changed if, instead of symmetrical sampling of control days, the time-stratified method was applied, when control days were matched on temperature, or when more lags of winter-time temperatures were used. Similar results were found using a Poisson regression, but the case-crossover method has the advantages of simplicity in modelling and of combining matched strata across multiple locations in a single-stage analysis. Conclusions: Despite the considerable differences in analytical design, the previously reported associations of particles with mortality persisted in this study. The association appeared quite linear. Case-crossover designs represent an attractive method to control for season and weather by matching. PMID:15550600
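
    The time-stratified referent selection mentioned in the results can be sketched directly: control days are the other days in the same calendar month that share the case day's day of week. A minimal Python illustration follows; the month-level strata match the usual time-stratified convention, not necessarily every detail of this study.

    ```python
    from datetime import date, timedelta

    def time_stratified_controls(case_day: date):
        """Return control days for a case day under the time-stratified
        design: same day of week, same calendar month and year."""
        controls = []
        d = date(case_day.year, case_day.month, 1)
        while d.month == case_day.month:
            if d.weekday() == case_day.weekday() and d != case_day:
                controls.append(d)
            d += timedelta(days=1)
        return controls

    # Each case's exposure (e.g. PM10) on the case day is then compared with
    # exposure on these control days in a conditional logistic regression.
    print(time_stratified_controls(date(2004, 1, 14)))
    # -> 2004-01-07, 2004-01-21, 2004-01-28
    ```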

  19. Green's function calculations for semi-infinite carbon nanotubes

    NASA Astrophysics Data System (ADS)

    John, D. L.; Pulfrey, D. L.

    2006-02-01

    In the modeling of nanoscale electronic devices, the non-equilibrium Green's function technique is gaining increasing popularity. One complication in this method is the need for computation of the self-energy functions that account for the interactions between the active portion of a device and its leads. In the one-dimensional case, these functions may be computed analytically. In higher dimensions, a numerical approach is required. In this work, we generalize earlier methods that were developed for tight-binding Hamiltonians, and present results for the case of a carbon nanotube.
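
    As a concrete instance of the analytic one-dimensional case, the sketch below computes the retarded lead self-energy of a semi-infinite nearest-neighbor tight-binding chain from the closed-form surface Green's function. This is the textbook 1D result, not the authors' nanotube code; the on-site energy, hopping, and broadening parameters are illustrative.

    ```python
    import numpy as np

    def lead_self_energy(E, eps=0.0, t=-1.0, eta=1e-9):
        """Retarded self-energy of a semi-infinite 1D tight-binding chain
        (on-site energy eps, hopping t) attached to one device site:
        Sigma = t^2 * g_surface, where g_surface solves g = 1/(z - t^2 g)."""
        z = E + 1j * eta - eps
        g = (z - np.sqrt(z**2 - 4 * t**2 + 0j)) / (2 * t**2)  # surface GF
        # choose the root with Im(g) <= 0 so the self-energy is retarded
        if g.imag > 0:
            g = (z + np.sqrt(z**2 - 4 * t**2 + 0j)) / (2 * t**2)
        return t**2 * g

    # Inside the band (|E - eps| < 2|t|) the self-energy acquires an imaginary
    # part, i.e. the lead provides a finite escape rate (level broadening).
    print(lead_self_energy(0.5))   # complex, Im < 0 (inside the band)
    print(lead_self_energy(3.0))   # essentially real (outside the band)
    ```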

  20. A comparison of two central difference schemes for solving the Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Maksymiuk, C. M.; Swanson, R. C.; Pulliam, T. H.

    1990-01-01

    Five viscous transonic airfoil cases were computed by two significantly different computational fluid dynamics codes: an explicit finite-volume algorithm with multigrid, and an implicit finite-difference approximate-factorization method with eigenvector diagonalization. Both methods are described in detail, and their performance on the test cases is compared. The codes utilized the same grids, turbulence model, and computer to provide the truest test of the algorithms. The two approaches produce very similar results, which, for attached flows, also agree well with experimental results; however, the explicit code is considerably faster.

  1. Bayesian approach for counting experiment statistics applied to a neutrino point source analysis

    NASA Astrophysics Data System (ADS)

    Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.

    2013-12-01

    In this paper we present a model-independent analysis method following Bayesian statistics to analyse data from a generic counting experiment, and we apply it to the search for neutrinos from point sources. We discuss a test statistic, defined within a Bayesian framework, that will be used in the search for a signal. In case no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate, so we have direct access to any signal upper limit. The upper limit derivation compares directly with a frequentist approach and is robust in the case of low-counting observations. Furthermore, it also allows one to account for previous upper limits obtained by other analyses via the concept of prior information, without the need for the ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we have applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and we have obtained a flux upper limit, which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube, using the same data set.
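
    A minimal numerical sketch of the counting-experiment posterior may be useful. Assuming a Poisson likelihood with known background b and a flat prior on the signal rate s, the upper limit is read off the posterior CDF; the observed count and background below are hypothetical.

    ```python
    import numpy as np
    from scipy.stats import poisson

    def signal_upper_limit(n_obs, b, cl=0.90, s_max=50.0, n_grid=100001):
        """Posterior upper limit on the signal rate s for an observed count
        n_obs with known background b, assuming a flat prior on s >= 0."""
        s = np.linspace(0.0, s_max, n_grid)
        post = poisson.pmf(n_obs, s + b)          # likelihood x flat prior
        post /= np.trapz(post, s)                 # normalize on the grid
        cdf = np.cumsum(post) * (s[1] - s[0])
        return s[np.searchsorted(cdf, cl)]

    # e.g. 5 events observed on an expected background of 3.2
    print(round(signal_upper_limit(5, 3.2), 2))   # 90% credible upper limit on s
    ```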

  2. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  3. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Joseph; Polly, Ben; Collis, Jon

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  4. Similarity based false-positive reduction for breast cancer using radiographic and pathologic imaging features

    NASA Astrophysics Data System (ADS)

    Pai, Akshay; Samala, Ravi K.; Zhang, Jianying; Qian, Wei

    2010-03-01

    Mammography reading by radiologists and breast tissue image interpretation by pathologists often lead to high false-positive (FP) rates. Similarly, current computer-aided diagnosis (CADx) methods tend to concentrate more on sensitivity, thus increasing the FP rates. A novel method is introduced here that employs a similarity-based approach to decrease the FP rate in the diagnosis of microcalcifications. The method employs Principal Component Analysis (PCA) and similarity metrics to achieve the proposed goal. The training and testing sets are divided into generalized (normal and abnormal) and more specific (abnormal, normal, benign) classes. The performance of this method as a standalone classification system is evaluated in both cases (general and specific). In another approach, the probability of each case belonging to a particular class is calculated. If the probabilities are too close to allow a confident classification, the augmented CADx system can be instructed to perform a detailed analysis of such cases. For normal cases with high probability, no further processing is necessary, thus reducing the computation time. Hence, this novel method can be employed in cascade with CADx to reduce the FP rate while avoiding unnecessary computational time. Using this methodology, false positive rates of 8% and 11% are achieved for mammography and cellular images, respectively.
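
    One plausible reading of the PCA-plus-similarity idea is to fit a PCA subspace per class and classify a new case by which subspace reconstructs it best. The sketch below implements that reading with generic feature vectors; it is an assumption-laden illustration, not the authors' exact pipeline.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    class PCASimilarityClassifier:
        """Fit one PCA subspace per class; classify a new case by which
        class subspace reconstructs it best (a similarity measure)."""
        def __init__(self, n_components=5):
            self.n_components = n_components
            self.models = {}

        def fit(self, X, y):
            for label in np.unique(y):
                self.models[label] = PCA(self.n_components).fit(X[y == label])
            return self

        def predict(self, X):
            # reconstruction error of each sample under each class subspace
            errors = {label: np.linalg.norm(
                          X - m.inverse_transform(m.transform(X)), axis=1)
                      for label, m in self.models.items()}
            labels = list(errors)
            return np.array(labels)[np.argmin([errors[l] for l in labels], axis=0)]

    # Hypothetical usage with generic feature vectors per region of interest
    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0, 1, (50, 20)), rng.normal(2, 1, (50, 20))])
    y = np.array([0] * 50 + [1] * 50)
    clf = PCASimilarityClassifier().fit(X, y)
    print((clf.predict(X) == y).mean())   # training accuracy of the sketch
    ```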

  5. The Utility of Case-Control Methods for Health Policy and Planning Analysis: An Illustration from Kinshasa, Zaire.

    ERIC Educational Resources Information Center

    Mock, Nancy B.; And Others

    1993-01-01

    The use of case-control methodology as an applied policy/planning research tool in assessing the potential effectiveness of behavioral interventions is studied in connection with diarrhea control in Zaire. Results with 107 matched pairs of children demonstrate the importance of hygiene-related knowledge and the utility of the research approach.…

  6. Fostering Interdisciplinary Research in Universities: A Case Study of Leadership, Alignment and Support

    ERIC Educational Resources Information Center

    Townsend, Tony; Pisapia, John; Razzaq, Jamila

    2015-01-01

    The aim of this paper is to describe actions designed to foster interdisciplinary research efforts at a major university in the UK. The study employed a descriptive mixed method case study approach to collecting and analysing the data used to draw its conclusions. One hundred and twenty-seven academic staff responded to the survey. The results of…

  7. Symmetry considerations in the scattering of identical composite bodies

    NASA Technical Reports Server (NTRS)

    Norbury, J. W.; Townsend, L. W.; Deutchman, P. A.

    1986-01-01

    Previous studies of the interactions between composite particles were extended to the case in which the composites are identical. The form of the total interaction potential matrix elements was obtained, and guidelines for their explicit evaluation were given. For the case of elastic scattering of identical composites, the matrix element approach was shown to be equivalent to the scattering amplitude method.

  8. The Development of Case Studies as a Method within a Longitudinal Study of Special Educational Needs Provision in the Republic of Ireland

    ERIC Educational Resources Information Center

    Rose, Richard; Shevlin, Michael

    2016-01-01

    When developing case studies within a longitudinal study of special educational needs provision within the Republic of Ireland, the authors were conscious of the critiques of the use of this approach within educational research. The difficulties associated with generalisation, challenges of ensuring trustworthiness and the possibilities of…

  9. The Economic Burden of Child Maltreatment in the United States and Implications for Prevention

    ERIC Educational Resources Information Center

    Fang, Xiangming; Brown, Derek S.; Florence, Curtis S.; Mercy, James A.

    2012-01-01

    Objectives: To present new estimates of the average lifetime costs per child maltreatment victim and aggregate lifetime costs for all new child maltreatment cases incurred in 2008 using an incidence-based approach. Methods: This study used the best available secondary data to develop cost per case estimates. For each cost category, the paper used…

  10. Young Workers and Their Dispositions towards Mathematics: Tensions of a Mathematical Habitus in the Retail Industry

    ERIC Educational Resources Information Center

    Jorgensen Zebenbergen, Robyn

    2011-01-01

    This paper presents a case study of contemporary retail industry and the ways in which young workers participate in that field. Public perceptions of low numeracy among young people provided the catalyst for the study. Drawing on a mixed-method approach involving survey, case studies, stimulated recall, observations, and interviews, it was found…

  11. Comparison of Classical and Lazy Approach in SCG Compiler

    NASA Astrophysics Data System (ADS)

    Jirák, Ota; Kolář, Dušan

    2011-09-01

    The existing parsing methods for scattered context grammars usually expand nonterminals deep in the pushdown. This expansion is implemented using either a linked list or some kind of auxiliary pushdown. This paper describes a parsing algorithm for an LL(1) scattered context grammar. The given algorithm merges two principles. The first is a table-driven parsing method commonly used for parsing context-free grammars. The second is delayed execution, as used in functional programming. The main part of this paper is a proof of equivalence between the common principle (the whole rule is applied at once) and our approach (execution of the rules is delayed). As a consequence, this approach works with the pushdown top only. In most cases, the second approach is faster than the first one. Finally, future work is discussed.

  12. An Integrated Approach for Gear Health Prognostics

    NASA Technical Reports Server (NTRS)

    He, David; Bechhoefer, Eric; Dempsey, Paula; Ma, Jinghua

    2012-01-01

    In this paper, an integrated approach for gear health prognostics using particle filters is presented. The presented method effectively addresses the issues in applying particle filters to gear health prognostics by integrating several new components into a particle filter: (1) data-mining-based techniques to effectively define the degradation state transition and measurement functions using a one-dimensional health index obtained by a whitening transform; (2) an unbiased l-step-ahead remaining useful life (RUL) estimator updated with measurement errors. The feasibility of the presented prognostics method is validated using data from a spiral bevel gear case study.
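
    The bootstrap particle-filter core of such a prognostics scheme can be sketched generically. In the example below, the state-transition and measurement functions are simple hypothetical stand-ins for the paper's data-mining-derived models, and the health-index measurements are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def particle_filter_step(particles, z, f, h, q, r):
        """One bootstrap particle-filter update.
        particles: current state samples; z: new health-index measurement;
        f/h: state-transition and measurement functions; q/r: noise scales."""
        particles = f(particles) + rng.normal(0, q, particles.shape)  # propagate
        w = np.exp(-0.5 * ((z - h(particles)) / r) ** 2)              # weight
        w /= w.sum()
        idx = rng.choice(len(particles), len(particles), p=w)         # resample
        return particles[idx]

    # Hypothetical linear degradation with noisy health-index observations
    f = lambda x: x + 0.05        # slow monotonic damage growth
    h = lambda x: x               # health index measures the state directly
    particles = rng.normal(0.0, 0.1, 1000)
    for z in [0.1, 0.17, 0.22, 0.31]:
        particles = particle_filter_step(particles, z, f, h, q=0.02, r=0.05)
    print(particles.mean())       # tracked degradation state
    ```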

  13. Fundamentals of electrochemical detection techniques for CE and MCE.

    PubMed

    Kubán, Pavel; Hauser, Peter C

    2009-10-01

    The electroanalytical techniques of amperometry, conductometry and potentiometry match well with the instrumental simplicity of CE. Indeed, all three detection approaches have been reported for electrophoretic separations. However, the characteristics of the three methods are quite distinct and are not shared by the more commonly employed optical methods. A detailed discussion of the underlying principles of each is given. The issue of possible effects of the separation voltage on the electrochemical detection techniques is considered in depth, and approaches to the elimination of such interferences are also discussed for each case.

  14. Path integral approach to closed-form option pricing formulas with applications to stochastic volatility and interest rate models

    NASA Astrophysics Data System (ADS)

    Lemmens, D.; Wouters, M.; Tempere, J.; Foulon, S.

    2008-07-01

    We present a path integral method to derive closed-form solutions for option prices in a stochastic volatility model. The method is explained in detail for the pricing of a plain vanilla option. The flexibility of our approach is demonstrated by extending the realm of closed-form option price formulas to the case where both the volatility and interest rates are stochastic. This flexibility is promising for the treatment of exotic options. Our analytical formulas are tested with numerical Monte Carlo simulations.
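
    The constant-volatility limit of such models is the plain vanilla Black-Scholes case, where the closed form and a Monte Carlo check are both easy to write down. The sketch below mirrors the paper's validation strategy at that simplest level; the parameter values are arbitrary.

    ```python
    import numpy as np
    from scipy.stats import norm

    def bs_call(S0, K, T, r, sigma):
        """Closed-form Black-Scholes price of a European call."""
        d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
        d2 = d1 - sigma * np.sqrt(T)
        return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

    def mc_call(S0, K, T, r, sigma, n=1_000_000, seed=4):
        """Monte Carlo check: simulate terminal prices under the risk-neutral
        measure and discount the average payoff."""
        z = np.random.default_rng(seed).standard_normal(n)
        ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
        return np.exp(-r * T) * np.maximum(ST - K, 0).mean()

    print(bs_call(100, 105, 1.0, 0.03, 0.2))   # ~7.13
    print(mc_call(100, 105, 1.0, 0.03, 0.2))   # agrees to Monte Carlo error
    ```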

  15. On the method of least squares. II. [for calculation of covariance matrices and optimization algorithms

    NASA Technical Reports Server (NTRS)

    Jefferys, W. H.

    1981-01-01

    A least squares method proposed previously for solving a general class of problems is expanded in two ways. First, covariance matrices related to the solution are calculated and their interpretation is given. Second, improved methods of solving the normal equations related to those of Marquardt (1963) and Fletcher and Powell (1963) are developed for this approach. These methods may converge in cases where Newton's method diverges or converges slowly.
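
    The damping idea behind Marquardt's modification of the normal equations is easy to sketch: add a scaled diagonal term so that steps interpolate between Gauss-Newton and gradient descent. The example below applies it to a hypothetical exponential fit; it illustrates the general technique, not the paper's specific formulation.

    ```python
    import numpy as np

    def lm_step(J, r, lam):
        """Damped normal-equation step (Levenberg-Marquardt): solve
        (J^T J + lam * diag(J^T J)) dx = -J^T r. Large lam gives small,
        gradient-like steps (robust); small lam gives Gauss-Newton steps."""
        A = J.T @ J
        g = J.T @ r
        return np.linalg.solve(A + lam * np.diag(np.diag(A)), -g)

    # Hypothetical fit of y = a * exp(b * t) to noise-free data
    t = np.linspace(0, 1, 20)
    y = 2.0 * np.exp(1.5 * t)
    a, b, lam = 1.0, 1.0, 1e-2
    for _ in range(50):
        model = a * np.exp(b * t)
        r = model - y
        J = np.column_stack([np.exp(b * t), a * t * np.exp(b * t)])  # d(model)/d(a,b)
        da, db = lm_step(J, r, lam)
        a, b = a + da, b + db
    print(round(a, 3), round(b, 3))  # converges to 2.0, 1.5
    ```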

  16. Deep learning approach to bacterial colony classification.

    PubMed

    Zieliński, Bartosz; Plichta, Anna; Misztal, Krzysztof; Spurek, Przemysław; Brzychczy-Włoch, Monika; Ochońska, Dorota

    2017-01-01

    In microbiology it is diagnostically useful to recognize various genera and species of bacteria. This can be achieved using computer-aided methods, which make the recognition process more automatic and thus significantly reduce the time necessary for classification. Moreover, in cases of diagnostic uncertainty (misleading similarity in the shape or structure of bacterial cells), such methods can minimize the risk of incorrect recognition. In this article, we apply a state-of-the-art texture analysis method to classify genera and species of bacteria. This method uses deep convolutional neural networks to obtain image descriptors, which are then encoded and classified with a Support Vector Machine or Random Forest. To evaluate this approach and make it comparable with other approaches, we provide a new dataset of images: the DIBaS dataset (Digital Image of Bacterial Species) contains 660 images of 33 different genera and species of bacteria.
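
    The generic pipeline described, a pretrained CNN as a fixed feature extractor feeding a classical classifier, can be sketched as follows. This sketch skips the paper's descriptor-encoding step and uses a ResNet-18 backbone and RBF-kernel SVM as assumptions; the image paths and labels are placeholders.

    ```python
    import numpy as np
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from PIL import Image
    from sklearn.svm import SVC

    # Pretrained CNN as a fixed feature extractor (the paper additionally
    # encodes such deep descriptors; this sketch feeds pooled CNN features
    # straight to the SVM).
    cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    cnn.fc = torch.nn.Identity()          # drop the ImageNet classifier head
    cnn.eval()

    preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor(),
                            T.Normalize([0.485, 0.456, 0.406],
                                        [0.229, 0.224, 0.225])])

    def descriptor(path):
        with torch.no_grad():
            x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
            return cnn(x).squeeze(0).numpy()   # 512-dim image descriptor

    # Hypothetical file lists; labels are bacterial species indices
    train_paths, train_labels = ["colony_0.png", "colony_1.png"], [0, 1]
    X = np.stack([descriptor(p) for p in train_paths])
    clf = SVC(kernel="rbf").fit(X, train_labels)
    print(clf.predict(X))
    ```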

  17. A novel approach to generating CER hypotheses based on mining clinical data.

    PubMed

    Zhang, Shuo; Li, Lin; Yu, Yiqin; Sun, Xingzhi; Xu, Linhao; Zhao, Wei; Teng, Xiaofei; Pan, Yue

    2013-01-01

    Comparative effectiveness research (CER) is a scientific method for investigating the effectiveness of alternative intervention methods. In a CER study, clinical researchers typically start with a CER hypothesis and aim to evaluate it by applying a series of medical statistical methods. Traditionally, CER hypotheses are defined manually by clinical researchers. This makes the task of hypothesis generation very time-consuming, and the quality of a hypothesis heavily dependent on the researchers' skills. Recently, with more electronic medical data being collected, it has become highly promising to apply computerized methods for discovering CER hypotheses from clinical data sets. In this poster, we propose a novel approach to automatically generating CER hypotheses based on mining clinical data, and present a case study showing that the approach can help clinical researchers identify potentially valuable hypotheses and eventually define high-quality CER studies.

  18. Particle swarm optimization method for small retinal vessels detection on multiresolution fundus images.

    PubMed

    Khomri, Bilal; Christodoulidis, Argyrios; Djerou, Leila; Babahenini, Mohamed Chaouki; Cheriet, Farida

    2018-05-01

    Retinal vessel segmentation plays an important role in the diagnosis of eye diseases and is considered one of the most challenging tasks in computer-aided diagnosis (CAD) systems. The main goal of this study was to propose a method for blood-vessel segmentation that could deal with the problem of detecting vessels of varying diameters in high- and low-resolution fundus images. We proposed to use the particle swarm optimization (PSO) algorithm to improve the multiscale line detection (MSLD) method. The PSO algorithm was applied to find the best arrangement of scales in the MSLD method and to handle the problem of multiscale response recombination. The performance of the proposed method was evaluated on two low-resolution (DRIVE and STARE) and one high-resolution (HRF) fundus image datasets. The data include healthy (H) and diabetic retinopathy (DR) cases. The proposed approach improved the sensitivity rate over the MSLD by 4.7% for the DRIVE dataset and by 1.8% for the STARE dataset. For the high-resolution dataset, the proposed approach achieved an 87.09% sensitivity rate, whereas the MSLD method achieved an 82.58% sensitivity rate at the same specificity level. When only the smallest vessels were considered, the proposed approach improved the sensitivity rate by 11.02% and by 4.42% for the healthy and the diabetic cases, respectively. Integrating the proposed method into a comprehensive CAD system for DR screening would allow the reduction of false positives due to missed small vessels, misclassified as red lesions. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
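
    A minimal particle swarm optimizer is easy to state and shows the mechanics used here. In the sketch below, each particle would encode a candidate scale arrangement and the objective would return a detection-error criterion; the quadratic objective used is a hypothetical stand-in.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def pso_minimize(f, dim, n_particles=30, iters=100, w=0.7, c1=1.4, c2=1.4):
        """Minimal particle swarm optimizer over the unit hypercube [0, 1]^dim."""
        x = rng.random((n_particles, dim))              # positions
        v = np.zeros_like(x)                            # velocities
        pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
        gbest = pbest[pbest_f.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, 0.0, 1.0)
            fx = np.array([f(p) for p in x])
            better = fx < pbest_f
            pbest[better], pbest_f[better] = x[better], fx[better]
            gbest = pbest[pbest_f.argmin()].copy()
        return gbest, pbest_f.min()

    # Hypothetical stand-in objective: in the paper, a particle encodes the
    # line-detector scale arrangement and f would return (1 - sensitivity).
    f = lambda p: np.sum((p - 0.3) ** 2)
    best, val = pso_minimize(f, dim=4)
    print(best.round(2), round(val, 4))
    ```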

  19. A fuzzy case based reasoning tool for model based approach to rocket engine health monitoring

    NASA Technical Reports Server (NTRS)

    Krovvidy, Srinivas; Nolan, Adam; Hu, Yong-Lin; Wee, William G.

    1992-01-01

    In this system, we develop a fuzzy case-based reasoner that can build case representations for several previously detected anomalies, together with case retrieval methods that use fuzzy sets to index a relevant case when a new problem (case) is presented. The choice of fuzzy sets is justified by the uncertainty in the data. The new problem can be solved using knowledge of the model along with the old cases. The system can then be used to generalize the knowledge from previous cases and use this generalization to refine the existing model definition. This in turn can help to detect failures using the model-based algorithms.

  20. Verifying Hybrid Systems Modeled as Timed Automata: A Case Study

    DTIC Science & Technology

    1997-03-01

    Introduction Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and...customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods...specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools

  1. Controlled-Root Approach To Digital Phase-Locked Loops

    NASA Technical Reports Server (NTRS)

    Stephens, Scott A.; Thomas, J. Brooks

    1995-01-01

    The controlled-root approach is an improved method for the analysis and design of digital phase-locked loops (DPLLs) that allows performance to be tailored more flexibly and directly to satisfy design requirements. It is developed rigorously from first principles for fully digital loops, making DPLL theory and design simpler and more straightforward (particularly for third- or fourth-order DPLLs) and controlling performance more accurately in the case of high gain.

  2. The Guided Reading Approach: A Practical Method to Address Diverse Needs in the Classroom

    ERIC Educational Resources Information Center

    Schaffer, Laura M.; Schirmer, Barbara R.

    2010-01-01

    Many deaf students struggle with learning to read. This is the case nationally as well as at the Michigan School for the Deaf (MSD). In 2006, the elementary teaching staff began working together to implement a change in their reading instruction so their approach would be systematic and consistent across grade levels. With the diverse backgrounds…

  3. Combining Event- and Variable-Centred Approaches to Institution-Facing Learning Analytics at the Unit of Study Level

    ERIC Educational Resources Information Center

    Kelly, Nick; Montenegro, Maximiliano; Gonzalez, Carlos; Clasing, Paula; Sandoval, Augusto; Jara, Magdalena; Saurina, Elvira; Alarcón, Rosa

    2017-01-01

    Purpose: The purpose of this paper is to demonstrate the utility of combining event-centred and variable-centred approaches when analysing big data for higher education institutions. It uses a large, university-wide data set to demonstrate the methodology for this analysis by using the case study method. It presents empirical findings about…

  4. A Scientific Approach to Monitoring Public Perceptions of Scientific Issues

    ERIC Educational Resources Information Center

    Fisher, N. I.; Lee, A. J.; Cribb, J. H. J.

    2013-01-01

    This article reports on a three-year study to evaluate a new approach to increasing the impact and adoption of new scientific findings and technologies. The purpose of the case study was to monitor the public's perception of the severity of problems posed by invasive animal species and of possible methods of managing them. A real-time "moving…

  5. Endoscopic endonasal control of the paraclival internal carotid artery by Fogarty balloon catheter inflation: an anatomical study.

    PubMed

    Ruggeri, Andrea; Enseñat, Joaquim; Prats-Galino, Alberto; Lopez-Rueda, Antonio; Berenguer, Joan; Cappelletti, Martina; De Notaris, Matteo; d'Avella, Elena

    2017-03-01

    OBJECTIVE Neurosurgical management of many vascular and neoplastic lesions necessitates control of the internal carotid artery (ICA). The aim of this study was to investigate the feasibility of achieving control of the ICA through the endoscopic endonasal approach by temporary occlusion with a Fogarty balloon catheter. METHODS Ten endoscopic endonasal paraseptal approaches were performed on cadaveric specimens. A Fogarty balloon catheter was inserted through a sellar bony opening and pushed laterally and posteriorly extraarterially along the paraclival carotid artery. The balloon was then inflated, thus achieving temporary occlusion of the vessel. The position of the catheter was confirmed with CT scans, and occlusion of the ICA was demonstrated with angiography. The technique was performed in 2 surgical cases of pituitary macroadenoma with cavernous sinus invasion. RESULTS Positioning the Fogarty balloon catheter at the level of the paraclival ICA was achieved in all cadaveric dissections and surgical cases through a minimally invasive, quick, and safe approach. Inflation of the Fogarty balloon caused interruption of blood flow in 100% of cases. CONCLUSIONS Temporary occlusion of the paraclival ICA performed through the endoscopic endonasal route with the aid of a Fogarty balloon catheter may be another maneuver for dealing with intraoperative ICA control. Further clinical studies are required to prove the efficacy of this method.

  6. Variational formulation of macroparticle models for electromagnetic plasma simulations

    DOE PAGES

    Stamm, Alexander B.; Shadwick, Bradley A.; Evstatiev, Evstati G.

    2014-06-01

    A variational method is used to derive a self-consistent macroparticle model for relativistic electromagnetic kinetic plasma simulations. Extending earlier work, discretization of the electromagnetic Low Lagrangian is performed via a reduction of the phase-space distribution function onto a collection of finite-sized macroparticles of arbitrary shape and discretization of field quantities onto a spatial grid. This approach may be used with lab frame coordinates or moving window coordinates; the latter can greatly improve computational efficiency for studying some types of laser-plasma interactions. The primary advantage of the variational approach is the preservation of Lagrangian symmetries, which in our case leads to energy conservation and thus avoids difficulties with grid heating. In addition, this approach decouples particle size from grid spacing and relaxes restrictions on particle shape, leading to low numerical noise. The variational approach also guarantees consistent approximations in the equations of motion and is amenable to higher order methods in both space and time. We restrict our attention to the 1.5-D case (one coordinate and two momenta). Lastly, simulations are performed with the new models and demonstrate energy conservation and low noise.

  7. 3D morphometry of red blood cells by digital holography.

    PubMed

    Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Gennari, Oriella; Netti, Paolo Antonio; Ferraro, Pietro

    2014-12-01

    Three-dimensional (3D) morphometric analysis of flowing, non-adherent cells is important for diagnostic purposes. However, diagnostic tools need to be quantitative, label-free and, as much as possible, accurate. Recently, a simple holographic approach, based on a shape-from-silhouette algorithm, was demonstrated for accurate calculation of cell biovolume and for displaying 3D cell shapes. This approach has been adopted in combination with holographic optical tweezers and successfully applied to cells with convex shapes. Unfortunately, the method fails for specimens with concave surfaces. Here, we propose an effective approach to achieve correct 3D shape measurement that can be extended to cells with concave surfaces, thus overcoming the limit of the previous technique. We demonstrate the new procedure on healthy red blood cells (RBCs) (i.e., discocytes), which have a concave surface in their central region. Comparative analysis of the experimental results against a theoretical 3D geometrical model of the RBC is discussed in order to evaluate the accuracy of the proposed approach. Finally, we show that the method can also be useful for classifying, in terms of morphology, different varieties of RBCs. © 2014 International Society for Advancement of Cytometry.

  8. ‘Patient-Centered Care’ for Complex Patients with Type 2 Diabetes Mellitus—Analysis of Two Cases

    PubMed Central

    Hackel, Jennifer M.

    2013-01-01

    Purpose This paper serves to apply and compare aspects of person centered care and recent consensus guidelines to two cases of older adults with poorly controlled diabetes in the context of relatively similar multimorbidity. Methods After review of the literature regarding the shift from guidelines promoting tight control in diabetes management to individualized person centered care, as well as newer treatment approaches emerging in diabetes care, the newer guidelines and potential treatment approaches are applied to the cases. Results By delving into the clinical, behavioral, social, cultural and economic aspects of the two cases in applying the new guidelines, divergent care goals are reached for the cases. Conclusions Primary care practitioners must be vigilant in providing individualized diabetes treatment where multiple chronic illnesses increase the complexity of care. While two older adults with multimorbidity may appear at first to have similar care goals, their unique preferences and support systems, as well as their risks and benefits from tight control, must be carefully weighed in formulating the best approach. Newer pharmaceutical agents hold promise for improving the possibilities for better glycemic control with less self-care burden and risk of hypoglycemia. PMID:24250240

  9. Lingual Thyroid Carcinoma: A Case Report and Review of Surgical Approaches in the Literature.

    PubMed

    Stokes, William; Interval, Eric; Patel, Rusha

    2018-07-01

    Lingual thyroid cancer is a rare entity with a paucity of literature guiding methods of surgical treatment. Its location presents anatomic challenges for access and excision. We present a case of T4aN1b classical variant papillary thyroid carcinoma of the lingual thyroid that was removed without pharyngeal entry. We also review the literature on this rare entity and propose a treatment algorithm to provide safe and oncologically sound outcomes. Our review of the literature found 28 case reports of lingual thyroid carcinoma that met search criteria. The trans-cervical/trans-hyoid approach was the most frequently used and provides safe oncologic outcomes, followed by the transoral approach and then lateral pharyngotomy. Complications reported across the series include 1 case of pharyngocutaneous fistula associated with mandibulotomy and postoperative respiratory distress requiring reintubation or emergent tracheostomy in 2 patients. The location of lingual thyroid carcinoma can be variable, and surgical management requires knowledge of adjacent involved structures to decrease the risk of dysphagia and airway compromise. In particular, for cases with extensive loss of swallowing mechanisms, laryngeal suspension can allow the patient to resume a normal diet after treatment.

  10. ANOTHER LOOK AT THE FAST ITERATIVE SHRINKAGE/THRESHOLDING ALGORITHM (FISTA)*

    PubMed Central

    Kim, Donghwan; Fessler, Jeffrey A.

    2017-01-01

    This paper provides a new way of developing the “Fast Iterative Shrinkage/Thresholding Algorithm (FISTA)” [3] that is widely used for minimizing composite convex functions with a nonsmooth term such as the ℓ1 regularizer. In particular, this paper shows that FISTA corresponds to an optimized approach to accelerating the proximal gradient method with respect to a worst-case bound of the cost function. This paper then proposes a new algorithm that is derived by instead optimizing the step coefficients of the proximal gradient method with respect to a worst-case bound of the composite gradient mapping. The proof is based on the worst-case analysis called Performance Estimation Problem in [11]. PMID:29805242
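
    The FISTA iteration the paper revisits is compact enough to sketch. Below is a minimal NumPy version for the standard ℓ1-regularized least-squares problem, with illustrative test data of our own; it shows the proximal gradient step and the momentum extrapolation whose worst-case behavior the paper analyzes, not the new algorithm the paper derives.

      import numpy as np

      def soft_threshold(v, tau):
          """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
          return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

      def fista(A, b, lam, n_iter=200):
          """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with the classical FISTA updates."""
          L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
          x = np.zeros(A.shape[1])
          y, t = x.copy(), 1.0
          for _ in range(n_iter):
              grad = A.T @ (A @ y - b)
              x_new = soft_threshold(y - grad / L, lam / L)   # proximal gradient step
              t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2        # momentum coefficient
              y = x_new + ((t - 1) / t_new) * (x_new - x)     # extrapolation step
              x, t = x_new, t_new
          return x

      rng = np.random.default_rng(0)
      A = rng.standard_normal((50, 100))
      x_true = np.zeros(100); x_true[:5] = 3.0
      b = A @ x_true + 0.01 * rng.standard_normal(50)
      print(fista(A, b, lam=1.0)[:8])   # recovers the leading sparse coefficients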

  11. [Recalibration via a postero-lateral approach in recent traumatic stenosis of the dorsal and lumbar spine. Modalities and results apropos of 31 cases].

    PubMed

    Richaud, J; Bousquet, P; Ealet, G; Clamens, J; Beltchika, K; Lazorthes, Y

    1990-01-01

    The authors present 31 cases of spinal trauma at the thoraco-lumbar level with severe spinal canal stenosis secondary to compressive trauma of the anterior disco-corporeal region. Associated neurological disorders were of varying severity. 23 cases were investigated by computed tomography. In all cases, the surgical procedure involved rectification of spinal deformities, with initially a unilateral postero-lateral approach permitting anterior spinal canal recalibration, either by impaction of protrusive fragments or ablation of free disc fragments. Stabilization was usually achieved by complementary bilateral plates using Roy-Camille or Privat material in 22 cases, associated with postero-lateral arthrodesis by grafting with reconstruction of the articulo-pedicular structure in 19 cases. Emergency operation was done in 14 cases; in 5 cases operation was done on the 2nd or 3rd day, and in 11 cases after the 3rd day. The functional spinal result was excellent, and recalibration was verified by tomography in all cases. In cases showing neurological deficit, good and early recovery was attributable to the suppression of spinal canal stenosis, and consequent neurological improvement was always obtained, even for the most serious lesions, except those at the thoracic level superior to T10. For recent lesions, this postero-lateral approach to severe spinal trauma seems to represent an alternative to the anterior or combined methods. We do not share the opinion that delay in decompression has no influence on neurological prognosis, and emergency operation is advisable.

  12. Investigating an approach to the alliance based on interpersonal defense theory.

    PubMed

    Westerman, Michael A; Muran, J Christopher

    2017-09-01

    Notwithstanding consistent findings of significant relationships between the alliance and outcome, questions remain to be answered about the relatively small magnitude of those correlations, the mechanisms underlying the association, and how to conceptualize the alliance construct. We conducted a preliminary study of an approach to the alliance based on interpersonal defense theory, which is an interpersonal reconceptualization of defense processes, to investigate the promise of this alternative approach as a way to address the outstanding issues. We employed qualitative, theory-building case study methodology, closely examining alliance processes at four time points in the treatment of a case in terms of a case formulation based on interpersonal defense theory. The results suggested that our approach made it possible to recognize key processes in the alliance and that it helps explain how the alliance influences outcome. Our analyses also provided a rich set of concrete illustrations of the alliance phenomena identified by the theory. The findings suggest that an approach to the alliance based on interpersonal defense theory holds promise. However, although the qualitative method we employed has advantages, it also has limitations. We offer suggestions about how future qualitative and quantitative investigations could build on this study.

  13. Approach for computing 1D fracture density: application to fracture corridor characterization

    NASA Astrophysics Data System (ADS)

    Viseur, Sophie; Chatelée, Sebastien; Akriche, Clement; Lamarche, Juliette

    2016-04-01

    Fracture density is an important parameter for characterizing fractured reservoirs. Many stochastic simulation algorithms that generate fracture networks indeed rely on the determination of a volumetric fracture density (P30) to populate the reservoir zones with individual fracture surfaces. However, only 1D fracture densities (P10) are available from subsurface data, and it is therefore important to be able to estimate this quantity accurately. In this paper, a novel approach is proposed to estimate fracture density from scan-line or well data. This method relies on regression, hypothesis testing and clustering techniques. The objective of the proposed approach is to highlight zones where fracture densities are statistically very different or similar. The technique has been applied to both synthetic and real case studies. These studies concern fracture corridors, which are particular tectonic features that are generally difficult to characterize from subsurface data. These tectonic features are still not well understood, and studies must be conducted to better characterize their internal spatial organization and variability. The synthetic cases aim at showing the ability of the approach to extract known features. The real case study illustrates how this approach allows the internal spatial organization of fracture corridors to be characterized.
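
    The abstract does not spell out the regression and clustering pipeline, so the sketch below only illustrates the basic quantity involved: P10 as the count of fracture intersections per unit scan-line length, with a Poisson-rate comparison standing in for the hypothesis-testing step (it uses scipy.stats.poisson_means_test, available in SciPy 1.10+; the positions and zone boundaries are hypothetical).

      import numpy as np
      from scipy import stats

      def p10(positions, start, end):
          """1D fracture density: fracture intersections per unit scan-line length."""
          n = np.sum((positions >= start) & (positions < end))
          return n / (end - start)

      # Hypothetical scan line: fracture intersection positions in metres.
      rng = np.random.default_rng(1)
      zone_a = rng.uniform(0, 50, 40)      # denser zone (a candidate corridor)
      zone_b = rng.uniform(50, 150, 30)    # sparser background
      positions = np.concatenate([zone_a, zone_b])

      print("P10 zone A:", p10(positions, 0, 50))     # fractures per metre
      print("P10 zone B:", p10(positions, 50, 150))

      # Stand-in hypothesis test: are the two Poisson rates statistically different?
      res = stats.poisson_means_test(40, 50.0, 30, 100.0)
      print("p-value:", res.pvalue)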

  14. Effective Practices in the Delivery of Research Ethics Education: A Qualitative Review of Instructional Methods.

    PubMed

    Todd, E Michelle; Torrence, Brett S; Watts, Logan L; Mulhearn, Tyler J; Connelly, Shane; Mumford, Michael D

    2017-01-01

    In order to delineate best practices for courses on research ethics, the goal of the present effort was to identify themes related to instructional methods reflected in effective research ethics and responsible conduct of research (RCR) courses. By utilizing a qualitative review, four themes relevant to instructional methods were identified in effective research ethics courses: active participation, case-based activities, a combination of individual and group approaches, and a small number of instructional methods. Three instructional method themes associated with less effective courses were also identified: passive learning, a group-based approach, and a large number of instructional methods. Key characteristics of each theme, along with example courses relative to each theme, are described. Additionally, implications regarding these instructional method themes and recommendations for best practices in research ethics courses are discussed.

  15. Covariate Measurement Error Correction Methods in Mediation Analysis with Failure Time Data

    PubMed Central

    Zhao, Shanshan

    2014-01-01

    Mediation analysis is important for understanding the mechanisms whereby one variable causes changes in another. Measurement error could obscure the ability of the potential mediator to explain such changes. This paper focuses on developing correction methods for measurement error in the mediator with failure time outcomes. We consider a broad definition of measurement error, including technical error and error associated with temporal variation. The underlying model with the ‘true’ mediator is assumed to be of the Cox proportional hazards model form. The induced hazard ratio for the observed mediator no longer has a simple form independent of the baseline hazard function, due to the conditioning event. We propose a mean-variance regression calibration approach and a follow-up time regression calibration approach, to approximate the partial likelihood for the induced hazard function. Both methods demonstrate value in assessing mediation effects in simulation studies. These methods are generalized to multiple biomarkers and to both case-cohort and nested case-control sampling design. We apply these correction methods to the Women's Health Initiative hormone therapy trials to understand the mediation effect of several serum sex hormone measures on the relationship between postmenopausal hormone therapy and breast cancer risk. PMID:25139469

  16. Covariate measurement error correction methods in mediation analysis with failure time data.

    PubMed

    Zhao, Shanshan; Prentice, Ross L

    2014-12-01

    Mediation analysis is important for understanding the mechanisms whereby one variable causes changes in another. Measurement error could obscure the ability of the potential mediator to explain such changes. This article focuses on developing correction methods for measurement error in the mediator with failure time outcomes. We consider a broad definition of measurement error, including technical error, and error associated with temporal variation. The underlying model with the "true" mediator is assumed to be of the Cox proportional hazards model form. The induced hazard ratio for the observed mediator no longer has a simple form independent of the baseline hazard function, due to the conditioning event. We propose a mean-variance regression calibration approach and a follow-up time regression calibration approach, to approximate the partial likelihood for the induced hazard function. Both methods demonstrate value in assessing mediation effects in simulation studies. These methods are generalized to multiple biomarkers and to both case-cohort and nested case-control sampling designs. We apply these correction methods to the Women's Health Initiative hormone therapy trials to understand the mediation effect of several serum sex hormone measures on the relationship between postmenopausal hormone therapy and breast cancer risk. © 2014, The International Biometric Society.
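
    As a rough illustration of the regression calibration idea behind both proposed approaches, the sketch below applies generic replicate-based regression calibration to an error-prone mediator and then fits a Cox model with the calibrated value (using the lifelines package). It is not the paper's mean-variance or follow-up time variant; the simulated data, the no-censoring assumption, and the omission of covariates from the calibration model are simplifications of our own.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(2)
      n = 2000
      z = rng.binomial(1, 0.5, n)                  # exposure/treatment
      x = 0.5 * z + rng.standard_normal(n)         # "true" mediator (unobserved)
      w1 = x + 0.8 * rng.standard_normal(n)        # two error-prone replicates
      w2 = x + 0.8 * rng.standard_normal(n)

      # Step 1: classical regression calibration E[X | W] from replicate data
      # (for simplicity the calibration model here ignores z).
      wbar = (w1 + w2) / 2
      sigma2_me = np.var(w1 - w2) / 2                        # measurement-error variance
      lam = (np.var(wbar) - sigma2_me / 2) / np.var(wbar)    # attenuation factor
      x_hat = wbar.mean() + lam * (wbar - wbar.mean())       # calibrated mediator

      # Step 2: fit the Cox model with the calibrated mediator in place of the true one
      # (all events observed, i.e. no censoring, to keep the sketch short).
      t = rng.exponential(1.0 / np.exp(0.7 * z + 0.5 * x))
      df = pd.DataFrame({"T": t, "E": 1, "z": z, "x_hat": x_hat})
      CoxPHFitter().fit(df, duration_col="T", event_col="E").print_summary()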

  17. Proposing a sequential comparative analysis for assessing multilateral health agency transformation and sustainable capacity: exploring the advantages of institutional theory.

    PubMed

    Gómez, Eduardo J

    2014-05-20

    This article proposes an approach to comparing and assessing the adaptive capacity of multilateral health agencies in meeting country and individual healthcare needs. Most studies comparing multilateral health agencies have failed to clearly propose a method for conducting agency comparisons. This study used a qualitative case study methodological approach, in which secondary and primary case study literature was used to compare multilateral health agencies. Through the proposed Sequential Comparative Analysis (SCA), the author found a more effective way to justify the selection of cases, compare and assess organizational transformative capacity, and learn from agency success in policy sustainability processes. To more effectively understand and explain why some multilateral health agencies are more capable of adapting to country and individual healthcare needs, SCA provides a methodological approach that may help to better understand why these agencies are so different and what we can learn from successful reform processes. As funding challenges continue to hamper these agencies' adaptive capacity, learning from each other will become increasingly important.

  18. Using Videoconferencing to Deliver Individual Therapy and Pediatric Psychology Interventions with Children and Adolescents

    PubMed Central

    Patton, Susana

    2016-01-01

    Background: Because of the widening gap between the need for individual and pediatric psychology services and child specialist availability, secure videoconferencing options are more needed than ever to address access challenges across underserved settings. Methods: The authors summarize the real-time videoconferencing evidence to date for individual therapy with children and for pediatric psychology interventions, along with emerging guidelines that inform best practices for individual child therapy over videoconferencing. Results: The authors present three case examples to illustrate best practices. The first behavioral pediatrics case summarizes evidence-based approaches in treating a rural young adolescent with attention-deficit/hyperactivity disorder (ADHD), oppositional defiant disorder (ODD), and hearing impairment. The second pediatric psychology case describes similarities and differences between on-site and videoconferencing services in treating a rural child with toileting concerns. The third adolescent case describes treatment of an urban honors student with depression. Conclusions: Videoconferencing is an effective approach to improving access to individual and pediatric psychology interventions for children and adolescents. Videoconferencing approaches are well accepted by families and show promise for disseminating evidence-based treatments to underserved communities. PMID:26745607

  19. The role of ethics in information technology decisions: a case-based approach to biomedical informatics education.

    PubMed

    Anderson, James G

    2004-03-18

    The purpose of this paper is to propose a case-based approach to instruction regarding ethical issues raised by the use of information technology (IT) in healthcare. These issues are rarely addressed in graduate degree and continuing professional education programs in health informatics. There are important reasons why ethical issues need to be addressed in informatics training. Ethical issues raised by the introduction of information technology affect practice and are ubiquitous. These issues are frequently among the most challenging to young practitioners who are ill prepared to deal with them in practice. First, the paper provides an overview of methods of moral reasoning that can be used to identify and analyze ethical problems in health informatics. Second, we provide a framework for defining cases that involve ethical issues and outline major issues raised by the use of information technology. Specific cases are used as examples of new dilemmas that are posed by the introduction of information technology in healthcare. These cases are used to illustrate how ethics can be integrated with the other elements of informatics training. The cases discussed here reflect day-to-day situations that arise in health settings that require decisions. Third, an approach that can be used to teach ethics in health informatics programs is outlined and illustrated.

  20. Masked Visual Analysis: Minimizing Type I Error in Visually Guided Single-Case Design for Communication Disorders.

    PubMed

    Byun, Tara McAllister; Hitchcock, Elaine R; Ferron, John

    2017-06-10

    Single-case experimental designs are widely used to study interventions for communication disorders. Traditionally, single-case experiments follow a response-guided approach, where design decisions during the study are based on participants' observed patterns of behavior. However, this approach has been criticized for its high rate of Type I error. In masked visual analysis (MVA), response-guided decisions are made by a researcher who is blinded to participants' identities and treatment assignments. MVA also makes it possible to conduct a hypothesis test assessing the significance of treatment effects. This tutorial describes the principles of MVA, including both how experiments can be set up and how results can be used for hypothesis testing. We then report a case study showing how MVA was deployed in a multiple-baseline across-subjects study investigating treatment for residual errors affecting rhotics. Strengths and weaknesses of MVA are discussed. Given their important role in the evidence base that informs clinical decision making, it is critical for single-case experimental studies to be conducted in a way that allows researchers to draw valid inferences. As a method that can increase the rigor of single-case studies while preserving the benefits of a response-guided approach, MVA warrants expanded attention from researchers in communication disorders.

  1. Mining Available Data from the United States Environmental Protection Agency to Support Rapid Life Cycle Inventory Modeling of Chemical Manufacturing.

    PubMed

    Cashman, Sarah A; Meyer, David E; Edelen, Ashley N; Ingwersen, Wesley W; Abraham, John P; Barrett, William M; Gonzalez, Michael A; Randall, Paul M; Ruiz-Mercado, Gerardo; Smith, Raymond L

    2016-09-06

    Demands for quick and accurate life cycle assessments create a need for methods to rapidly generate reliable life cycle inventories (LCI). Data mining is a suitable tool for this purpose, especially given the large amount of available governmental data. These data are typically applied to LCIs on a case-by-case basis. As linked open data becomes more prevalent, it may be possible to automate LCI using data mining by establishing a reproducible approach for identifying, extracting, and processing the data. This work proposes a method for standardizing and eventually automating the discovery and use of publicly available data at the United States Environmental Protection Agency for chemical-manufacturing LCI. The method is developed using a case study of acetic acid. The data quality and gap analyses for the generated inventory found that the selected data sources can provide information with equal or better reliability and representativeness for air emissions, water discharges, hazardous waste, on-site energy usage, and production volumes, but with key data gaps including material inputs, water usage, purchased electricity, and transportation requirements. A comparison of the generated LCI with existing data revealed that the data mining inventory is in reasonable agreement with existing data and may provide a more comprehensive inventory of air emissions and water discharges. The case study highlighted challenges for current data management practices that must be overcome to successfully automate the method using semantic technology. Benefits of the method are that the openly available data can be compiled in a standardized and transparent approach that supports potential automation, with flexibility to incorporate new data sources as needed.

  2. Vulnerability curves vs. vulnerability indicators: application of an indicator-based methodology for debris-flow hazards

    NASA Astrophysics Data System (ADS)

    Papathoma-Köhle, Maria

    2016-08-01

    The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable number of studies argue that vulnerability assessment should focus on the identification of those variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where a vulnerability curve has been developed in the past. The relatively "new" indicator-based method is scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since they deal with vulnerability in different ways. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability, and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.

  3. A fuzzy Bayesian approach to flood frequency estimation with imprecise historical information

    PubMed Central

    Kiss, Andrea; Viglione, Alberto; Viertl, Reinhard; Blöschl, Günter

    2016-01-01

    This paper presents a novel framework that links imprecision (through a fuzzy approach) and stochastic uncertainty (through a Bayesian approach) in estimating flood probabilities from historical flood information and systematic flood discharge data. The method exploits the linguistic characteristics of historical source material to construct membership functions, which may be wider or narrower, depending on the vagueness of the statements. The membership functions are either included in the prior distribution or the likelihood function to obtain a fuzzy version of the flood frequency curve. The viability of the approach is demonstrated by three case studies that differ in terms of their hydromorphological conditions (from an Alpine river with bedrock profile to a flat lowland river with extensive flood plains) and historical source material (including narratives, town and county meeting protocols, flood marks and damage accounts). The case studies are presented in order of increasing fuzziness (the Rhine at Basel, Switzerland; the Werra at Meiningen, Germany; and the Tisza at Szeged, Hungary). Incorporating imprecise historical information is found to reduce the range between the 5% and 95% Bayesian credibility bounds of the 100 year floods by 45% and 61% for the Rhine and Werra case studies, respectively. The strengths and limitations of the framework are discussed relative to alternative (non-fuzzy) methods. The fuzzy Bayesian inference framework provides a flexible methodology that fits the imprecise nature of linguistic information on historical floods as available in historical written documentation. PMID:27840456

  4. A fuzzy Bayesian approach to flood frequency estimation with imprecise historical information

    NASA Astrophysics Data System (ADS)

    Salinas, José Luis; Kiss, Andrea; Viglione, Alberto; Viertl, Reinhard; Blöschl, Günter

    2016-09-01

    This paper presents a novel framework that links imprecision (through a fuzzy approach) and stochastic uncertainty (through a Bayesian approach) in estimating flood probabilities from historical flood information and systematic flood discharge data. The method exploits the linguistic characteristics of historical source material to construct membership functions, which may be wider or narrower, depending on the vagueness of the statements. The membership functions are either included in the prior distribution or the likelihood function to obtain a fuzzy version of the flood frequency curve. The viability of the approach is demonstrated by three case studies that differ in terms of their hydromorphological conditions (from an Alpine river with bedrock profile to a flat lowland river with extensive flood plains) and historical source material (including narratives, town and county meeting protocols, flood marks and damage accounts). The case studies are presented in order of increasing fuzziness (the Rhine at Basel, Switzerland; the Werra at Meiningen, Germany; and the Tisza at Szeged, Hungary). Incorporating imprecise historical information is found to reduce the range between the 5% and 95% Bayesian credibility bounds of the 100 year floods by 45% and 61% for the Rhine and Werra case studies, respectively. The strengths and limitations of the framework are discussed relative to alternative (non-fuzzy) methods. The fuzzy Bayesian inference framework provides a flexible methodology that fits the imprecise nature of linguistic information on historical floods as available in historical written documentation.

  5. A fuzzy Bayesian approach to flood frequency estimation with imprecise historical information.

    PubMed

    Salinas, José Luis; Kiss, Andrea; Viglione, Alberto; Viertl, Reinhard; Blöschl, Günter

    2016-09-01

    This paper presents a novel framework that links imprecision (through a fuzzy approach) and stochastic uncertainty (through a Bayesian approach) in estimating flood probabilities from historical flood information and systematic flood discharge data. The method exploits the linguistic characteristics of historical source material to construct membership functions, which may be wider or narrower, depending on the vagueness of the statements. The membership functions are either included in the prior distribution or the likelihood function to obtain a fuzzy version of the flood frequency curve. The viability of the approach is demonstrated by three case studies that differ in terms of their hydromorphological conditions (from an Alpine river with bedrock profile to a flat lowland river with extensive flood plains) and historical source material (including narratives, town and county meeting protocols, flood marks and damage accounts). The case studies are presented in order of increasing fuzziness (the Rhine at Basel, Switzerland; the Werra at Meiningen, Germany; and the Tisza at Szeged, Hungary). Incorporating imprecise historical information is found to reduce the range between the 5% and 95% Bayesian credibility bounds of the 100 year floods by 45% and 61% for the Rhine and Werra case studies, respectively. The strengths and limitations of the framework are discussed relative to alternative (non-fuzzy) methods. The fuzzy Bayesian inference framework provides a flexible methodology that fits the imprecise nature of linguistic information on historical floods as available in historical written documentation.
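
    To make the membership-function idea concrete, here is a minimal sketch (our own, not from the papers above) in which two hypothetical historical statements about a flood peak are encoded as trapezoidal membership functions, with the vaguer statement given the wider support.

      import numpy as np

      def trapezoid(x, a, b, c, d):
          """Trapezoidal membership function: 0 below a, 1 on [b, c], 0 above d."""
          return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

      # Hypothetical encoding of historical statements about a flood peak (m^3/s).
      # Vaguer wording gets a wider support; all values are illustrative only.
      discharge = np.linspace(0, 6000, 601)
      mu_precise = trapezoid(discharge, 2800, 3000, 3200, 3400)   # "about 3000 m3/s"
      mu_vague = trapezoid(discharge, 1500, 2500, 4000, 5500)     # "an extraordinary flood"

      for label, mu in [("precise", mu_precise), ("vague", mu_vague)]:
          support = discharge[mu > 0]
          core = discharge[mu >= 1.0]
          print(f"{label}: support [{support.min():.0f}, {support.max():.0f}],"
                f" core [{core.min():.0f}, {core.max():.0f}]")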

  6. Comparison and combination of "direct" and fragment based local correlation methods: Cluster in molecules and domain based local pair natural orbital perturbation and coupled cluster theories

    NASA Astrophysics Data System (ADS)

    Guo, Yang; Becker, Ute; Neese, Frank

    2018-03-01

    Local correlation theories have been developed in two main flavors: (1) "direct" local correlation methods apply local approximations to the canonical equations and (2) fragment based methods reconstruct the correlation energy from a series of smaller calculations on subsystems. The present work serves two purposes. First, we investigate the relative efficiencies of the two approaches using the domain-based local pair natural orbital (DLPNO) approach as the "direct" method and the cluster in molecules (CIM) approach as the fragment based method. Both approaches are applied in conjunction with second-order many-body perturbation theory (MP2) as well as coupled-cluster theory with single-, double- and perturbative triple excitations [CCSD(T)]. Second, we have investigated the possible merits of combining the two approaches by performing CIM calculations with DLPNO methods serving as the method of choice for the subsystem calculations. Our cluster-in-molecules approach is closely related to but slightly deviates from approaches in the literature, since we have avoided real space cutoffs. Moreover, the distant pair correlations neglected in the previous CIM approach are considered approximately. Six very large molecules (503-2380 atoms) were studied. At both the MP2 and CCSD(T) levels of theory, the CIM and DLPNO methods show similar efficiency. However, DLPNO methods are more accurate for 3-dimensional systems. While we have found only little incentive for the combination of CIM with DLPNO-MP2, the situation is different for CIM-DLPNO-CCSD(T). This combination is attractive because of (1) the better parallelization opportunities offered by CIM; (2) the lower memory demands relative to the genuine DLPNO-CCSD(T) method, which allow large calculations on more modest hardware; and (3) its applicability and efficiency in the frequently met cases where the largest subsystem calculation is too large for the canonical CCSD(T) method.

  7. [Application and case analysis on the problem-based teaching of Jingluo Shuxue Xue (Science of Meridian and Acupoint) in reference to the team oriented learning method].

    PubMed

    Ma, Ruijie; Lin, Xianming

    2015-12-01

    Problem-based teaching (PBT) has become a main approach to training in universities around the world. Combined with the team-oriented learning method, PBT can become a method well suited to education in medical universities. In this paper, based on common questions in teaching Jingluo Shuxue Xue (Science of Meridian and Acupoint), the concepts and characteristics of PBT and the team-oriented learning method are analyzed. The implementation steps of PBT were set up in reference to the team-oriented learning method. By quoting the original text of Beiji Qianjin Yaofang (Essential Recipes for Emergent Use Worth a Thousand Gold), a case analysis of "the thirteen devil points" was established with PBT.

  8. Modeling and E-M estimation of haplotype-specific relative risks from genotype data for a case-control study of unrelated individuals.

    PubMed

    Stram, Daniel O; Leigh Pearce, Celeste; Bretsky, Phillip; Freedman, Matthew; Hirschhorn, Joel N; Altshuler, David; Kolonel, Laurence N; Henderson, Brian E; Thomas, Duncan C

    2003-01-01

    The US National Cancer Institute has recently sponsored the formation of a Cohort Consortium (http://2002.cancer.gov/scpgenes.htm) to facilitate the pooling of data on very large numbers of people, concerning the effects of genes and environment on cancer incidence. One likely goal of these efforts will be to generate a large population-based case-control series for which a number of candidate genes will be investigated using SNP haplotype as well as genotype analysis. The goal of this paper is to outline the issues involved in choosing a method of estimating haplotype-specific risks for such data that is technically appropriate and yet attractive to epidemiologists who are already comfortable with odds ratios and logistic regression. Our interest is to develop and evaluate extensions of recently described methods based on haplotype imputation (Schaid et al., Am J Hum Genet, 2002, and Zaykin et al., Hum Hered, 2002), which provide score tests of the null hypothesis of no effect of SNP haplotypes upon risk, so that they may be used for more complex tasks, such as providing confidence intervals and tests of equivalence of haplotype-specific risks in two or more separate populations. In order to do so we (1) develop a cohort approach towards odds ratio analysis by expanding the E-M algorithm to provide maximum likelihood estimates of haplotype-specific odds ratios as well as genotype frequencies; (2) show how to correct the cohort approach, to give essentially unbiased estimates for population-based or nested case-control studies, by incorporating the probability of selection as a case or control into the likelihood, based on a simplified model of case and control selection; and (3) finally, in an example data set (CYP17 and breast cancer, from the Multiethnic Cohort Study), compare likelihood-based confidence interval estimates from the two methods with each other, and with the use of the single-imputation approach of Zaykin et al. applied under both null and alternative hypotheses. We conclude that so long as haplotypes are well predicted by SNP genotypes (we use the Rh2 criteria of Stram et al. [1]) the differences between the three methods are very small, and in particular that the single-imputation method may be expected to work extremely well. Copyright 2003 S. Karger AG, Basel
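
    The E-M step referred to above builds on the classical haplotype-frequency EM, which for two biallelic SNPs only has to resolve the phase of the double heterozygote. A minimal sketch of that core step follows, with hypothetical counts; the paper's extensions to odds ratios and case-control ascertainment are not shown.

      import numpy as np

      # Haplotypes over two biallelic SNPs: 0 = AB, 1 = Ab, 2 = aB, 3 = ab.
      # Only the double heterozygote (Aa, Bb) is phase-ambiguous: AB/ab or Ab/aB.

      def em_haplotype_freqs(n_dh, counts, n_iter=100):
          """counts: haplotype counts contributed by phase-unambiguous genotypes;
          n_dh: number of double-heterozygote individuals."""
          p = np.full(4, 0.25)
          total = counts.sum() + 2 * n_dh
          for _ in range(n_iter):
              # E-step: split double heterozygotes between the two phase
              # resolutions in proportion to the current pair probabilities.
              w = p[0] * p[3] / (p[0] * p[3] + p[1] * p[2])
              exp = counts.astype(float).copy()
              exp[[0, 3]] += n_dh * w          # AB/ab resolution
              exp[[1, 2]] += n_dh * (1 - w)    # Ab/aB resolution
              # M-step: update haplotype frequencies from expected counts.
              p = exp / total
          return p

      # Hypothetical sample: phase-known genotypes contribute fixed counts.
      counts = np.array([180, 40, 60, 120])   # AB, Ab, aB, ab
      print(em_haplotype_freqs(n_dh=50, counts=counts))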

  9. Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework

    NASA Astrophysics Data System (ADS)

    Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.

    2016-03-01

    A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast of Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on linguistic label assessment, able to address uncertainty and deal with different levels of precision. This method is based on qualitative reasoning, an artificial intelligence technique for assessing and ranking multi-attribute alternatives with linguistic labels in order to handle uncertainty. The method is suitable for problems in the social domain, such as energy planning, which require the construction of a dialogue process among many social actors with a high level of complexity and uncertainty. The method is compared with an existing approach that has previously been applied to the wind farm location problem. This approach, an outranking method, is based on Condorcet's original method. The results obtained by both approaches are analysed, and their performance in the selection of the wind farm location is compared, including their aggregation procedures. Although the results show that both methods lead to similar alternative rankings, the study highlights both their advantages and drawbacks.

  10. An intelligent case-adjustment algorithm for the automated design of population-based quality auditing protocols.

    PubMed

    Advani, Aneel; Jones, Neil; Shahar, Yuval; Goldstein, Mary K; Musen, Mark A

    2004-01-01

    We develop a method and algorithm for deciding the optimal approach to creating quality-auditing protocols for guideline-based clinical performance measures. An important element of the audit protocol design problem is deciding which guideline elements to audit. Specifically, the problem is how and when to aggregate individual patient case-specific guideline elements into population-based quality measures. The key statistical issue involved is the trade-off between increased reliability with more general population-based quality measures versus increased validity from individually case-adjusted but more restricted measures done at a greater audit cost. Our intelligent algorithm for auditing protocol design is based on hierarchically modeling incrementally case-adjusted quality constraints. We select quality constraints to measure using an optimization criterion based on statistical generalizability coefficients. We present results of the approach from a deployed decision support system for a hypertension guideline.

  11. Mexican-American Males Providing Personal Care for their Mothers

    PubMed Central

    Evans, Bronwynne C.; Belyea, Michael J.; Ume, Ebere

    2011-01-01

    We know little about Mexican-American (MA) family adaptation to critical events in the informal caregiving experience but, in these days of economic and social turmoil, sons must sometimes step up to provide personal care for their aging mothers. This article compares two empirically real cases of MA males who provided such care, in lieu of a female relative. The cases are selected from a federally-funded, descriptive, longitudinal, mixed methods study of 110 MA caregivers and their care recipients. In case-oriented research, investigators can generate propositions (connected sets of statements) that reflect their findings and conclusions, and can be tested against subsequent cases: Caregiving strain and burden in MA males may have more to do with physical and emotional costs than financial ones; MA males providing personal care for their mothers adopt a matter-of-fact approach as they act “against taboo”; and this approach is a new way to fulfill family obligations. PMID:21643486

  12. Application of the unsteady vortex-lattice method to the nonlinear two-degree-of-freedom aeroelastic equations

    NASA Technical Reports Server (NTRS)

    Strganac, T. W.; Mook, D. T.

    1986-01-01

    A means of numerically simulating flutter is established by implementing a predictor-corrector algorithm to solve the equations of motion. Aerodynamic loads are provided by the unsteady vortex lattice method (UVLM). The method is illustrated by obtaining stable and unstable responses to initial disturbances in the case of two-degree-of-freedom motion. It was found that for some angles of attack and dynamic pressures the initial disturbance decays; for others it grows (flutter). When flutter occurs, the solution yields the amplitude and period of the resulting limit cycle. The preliminary results attest to the feasibility of this method for studying flutter in cases that would be difficult to treat using a classical approach.
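
    As a toy illustration of the time-marching scheme described (not of the UVLM itself), the sketch below applies a one-step predictor-corrector (Euler predictor, trapezoidal corrector) to an invented linear two-degree-of-freedom system with a nonconservative coupling term scaled by a dynamic-pressure-like parameter q, so that the initial disturbance decays for small q and grows for larger q.

      import numpy as np

      def f(u, q):
          """Toy 2-DOF aeroelastic system u = [h, a, h', a'] with a circulatory
          (nonconservative) coupling term scaled by the parameter q."""
          h, a, hd, ad = u
          hdd = -0.02 * hd - 1.0 * h + q * a
          add = -0.02 * ad - 4.0 * a - q * h
          return np.array([hd, ad, hdd, add])

      def pc_step(u, dt, q):
          """One predictor-corrector step: Euler predictor, trapezoidal corrector."""
          up = u + dt * f(u, q)                        # predictor
          return u + 0.5 * dt * (f(u, q) + f(up, q))   # corrector

      for q in (1.0, 1.6):                             # below vs above the flutter boundary
          u, dt = np.array([0.01, 0.0, 0.0, 0.0]), 0.01
          for _ in range(5000):
              u = pc_step(u, dt, q)
          print(f"q = {q}: |u| after 50 s = {np.linalg.norm(u):.2e}")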

  13. Graph reconstruction using covariance-based methods.

    PubMed

    Sulaimanov, Nurgazy; Koeppl, Heinz

    2016-12-01

    Methods based on correlation and partial correlation are today employed in the reconstruction of a statistical interaction graph from high-throughput omics data. These dedicated methods work well even for the case when the number of variables exceeds the number of samples. In this study, we investigate how the graphs extracted from covariance and concentration matrix estimates are related by using Neumann series and transitive closure and through discussing concrete small examples. Considering the ideal case where the true graph is available, we also compare correlation and partial correlation methods for large realistic graphs. In particular, we perform the comparisons with optimally selected parameters based on the true underlying graph and with data-driven approaches where the parameters are directly estimated from the data.
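
    A minimal sketch of the partial-correlation reconstruction discussed above: partial correlations are read off the inverse covariance (concentration) matrix and thresholded into a graph. The chain example is our own; note that the plain matrix inverse requires more samples than variables, whereas the dedicated methods the paper considers also handle the p > n case (e.g., via regularization).

      import numpy as np

      def partial_correlation_graph(X, threshold=0.1):
          """Estimate a graph from the partial-correlation matrix. Entry (i, j) of
          the scaled inverse covariance is the correlation of variables i and j
          conditional on all remaining variables."""
          cov = np.cov(X, rowvar=False)
          prec = np.linalg.inv(cov)              # concentration (precision) matrix
          d = np.sqrt(np.diag(prec))
          pcor = -prec / np.outer(d, d)          # standardize the off-diagonals
          np.fill_diagonal(pcor, 1.0)
          A = np.abs(pcor) > threshold
          np.fill_diagonal(A, False)
          return pcor, A

      # Hypothetical chain dependence x0 -> x1 -> x2: the marginal correlation links
      # x0 and x2, but the partial correlation (approximately) removes that edge.
      rng = np.random.default_rng(3)
      x0 = rng.standard_normal(5000)
      x1 = x0 + 0.5 * rng.standard_normal(5000)
      x2 = x1 + 0.5 * rng.standard_normal(5000)
      pcor, A = partial_correlation_graph(np.column_stack([x0, x1, x2]))
      print(np.round(pcor, 2)); print(A)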

  14. Toward the detection of gravitational waves under non-Gaussian noises I. Locally optimal statistic.

    PubMed

    Yokoyama, Jun'ichi

    2014-01-01

    After reviewing the standard hypothesis test and the matched filter technique to identify gravitational waves under Gaussian noises, we introduce two methods to deal with non-Gaussian stationary noises. We formulate the likelihood ratio function under weakly non-Gaussian noises through the Edgeworth expansion, and under strongly non-Gaussian noises in terms of a new method we call Gaussian mapping, where the observed marginal distribution and the two-body correlation function are fully taken into account. We then apply these two approaches to Student's t-distribution, which has larger tails than a Gaussian. It is shown that while both methods work well when the non-Gaussianity is small, only the latter method works well in the highly non-Gaussian case.
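
    For intuition (this toy is ours, not the paper's formulation), the sketch below contrasts the Gaussian matched filter with a locally optimal statistic for iid unit-scale Student's t noise, obtained by passing the data through the score function -(d/dx) log f(x) = (nu+1)x/(nu+x^2) before correlating with the template; the paper's Gaussian-mapping treatment of correlated noise is not modeled here.

      import numpy as np

      def matched_filter(data, template):
          """Standard matched-filter statistic, optimal for Gaussian noise."""
          return np.dot(template, data)

      def t_locally_optimal(data, template, nu=3.0):
          """Locally optimal statistic for iid Student's t noise: correlate the
          template with the score-transformed data (nu + 1) x / (nu + x^2)."""
          return np.dot(template, (nu + 1) * data / (nu + data * data))

      rng = np.random.default_rng(4)
      n = 4096
      template = np.sin(2 * np.pi * 30 * np.arange(n) / n)
      noise = rng.standard_t(df=3.0, size=n)       # heavy-tailed noise
      data = 0.5 * template + noise
      print("Gaussian matched filter:", matched_filter(data, template))
      print("t locally optimal:      ", t_locally_optimal(data, template))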

  15. Overview of psychiatric ethics IV: the method of casuistry.

    PubMed

    Robertson, Michael; Ryan, Christopher; Walter, Garry

    2007-08-01

    The aim of this paper is to describe the method of ethical analysis known as casuistry and consider its merits as a basis of ethical deliberation in psychiatry. Casuistry approximates the legal arguments of common law. It examines ethical dilemmas by adopting a taxonomic approach to 'paradigm' cases, using a technique akin to that of normative analogical reasoning. Casuistry offers a useful method in ethical reasoning through providing a practical means of evaluating the merits of a particular course of action in a particular clinical situation. As a method of ethical reasoning in psychiatry, casuistry suffers from a paucity of paradigm cases and from its failure to fully contextualize ethical dilemmas by relying on common morality theory as its basis.

  16. Towards a precise test for malaria diagnosis in the Brazilian Amazon: comparison among field microscopy, a rapid diagnostic test, nested PCR, and a computational expert system based on artificial neural networks

    PubMed Central

    2010-01-01

    Background Accurate malaria diagnosis is mandatory for the treatment and management of severe cases. Moreover, individuals with asymptomatic malaria are not usually screened by health care facilities, which further complicates disease control efforts. The present study compared the performances of a malaria rapid diagnostic test (RDT), the thick blood smear method and nested PCR for the diagnosis of symptomatic malaria in the Brazilian Amazon. In addition, an innovative computational approach was tested for the diagnosis of asymptomatic malaria. Methods The study was divided in two parts. For the first part, passive case detection was performed in 311 individuals with malaria-related symptoms from a recently urbanized community in the Brazilian Amazon. A cross-sectional investigation compared the diagnostic performance of the RDT Optimal-IT, nested PCR and light microscopy. The second part of the study involved active case detection of asymptomatic malaria in 380 individuals from riverine communities in Rondônia, Brazil. The performances of microscopy, nested PCR and an expert computational system based on artificial neural networks (MalDANN) using epidemiological data were compared. Results Nested PCR was shown to be the gold standard for diagnosis of both symptomatic and asymptomatic malaria because it detected the largest number of cases and presented the maximum specificity. Surprisingly, the RDT was superior to microscopy in the diagnosis of cases with low parasitaemia. Nevertheless, the RDT could not discriminate the Plasmodium species in 12 cases of mixed infections (Plasmodium vivax + Plasmodium falciparum). Moreover, microscopy showed low performance in the detection of asymptomatic cases (61.25% of correct diagnoses). The MalDANN system using epidemiological data alone performed worse than light microscopy (56% of correct diagnoses). However, when information regarding plasma levels of interleukin-10 and interferon-gamma was input, the MalDANN performance increased appreciably (80% of correct diagnoses). Conclusions An RDT for malaria diagnosis may find a promising use in the Brazilian Amazon, integrated into a rational diagnostic approach. Despite the low performance of the MalDANN test using solely epidemiological data, an approach based on neural networks may be feasible in cases where simpler methods for discriminating individuals below and above threshold cytokine levels are available. PMID:20459613

  17. A robust multifactor dimensionality reduction method for detecting gene-gene interactions with application to the genetic analysis of bladder cancer susceptibility

    PubMed Central

    Gui, Jiang; Andrew, Angeline S.; Andrews, Peter; Nelson, Heather M.; Kelsey, Karl T.; Karagas, Margaret R.; Moore, Jason H.

    2010-01-01

    A central goal of human genetics is to identify and characterize susceptibility genes for common complex human diseases. An important challenge in this endeavor is the modeling of gene-gene interaction or epistasis that can result in non-additivity of genetic effects. The multifactor dimensionality reduction (MDR) method was developed as a machine learning alternative to parametric logistic regression for detecting interactions in the absence of significant marginal effects. The goal of MDR is to reduce the dimensionality inherent in modeling combinations of polymorphisms using a computational approach called constructive induction. Here, we propose a Robust Multifactor Dimensionality Reduction (RMDR) method that performs constructive induction using Fisher's exact test rather than a predetermined threshold. The advantage of this approach is that only those genotype combinations that are determined to be statistically significant are considered in the MDR analysis. We use two simulation studies to demonstrate that this approach increases the success rate of MDR when only a few genotype combinations are significantly associated with case-control status, and we show that there is no loss of success rate when this is not the case. We then apply the RMDR method to the detection of gene-gene interactions in genotype data from a population-based study of bladder cancer in New Hampshire. PMID:21091664
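
    The constructive-induction step is easy to sketch: each multilocus genotype cell is retained only if its case/control split passes Fisher's exact test, and retained cells are labeled high- or low-risk against the overall case/control ratio. The counts below are hypothetical, and details such as the labeling rule are our simplification rather than the published RMDR specification.

      import numpy as np
      from scipy.stats import fisher_exact

      def significant_genotype_cells(cases, controls, alpha=0.05):
          """RMDR-style filter: keep genotype combinations whose case/control
          split is significant by Fisher's exact test; label kept cells high-
          or low-risk by comparison with the overall case/control ratio."""
          total_cases, total_controls = cases.sum(), controls.sum()
          labels = {}
          for cell in np.ndindex(cases.shape):
              table = [[cases[cell], controls[cell]],
                       [total_cases - cases[cell], total_controls - controls[cell]]]
              _, p = fisher_exact(table)
              if p < alpha:
                  ratio = cases[cell] / max(controls[cell], 1)
                  labels[cell] = "high" if ratio > total_cases / total_controls else "low"
          return labels   # unlabeled cells are excluded from the MDR model

      # Hypothetical 3x3 table of counts for two SNPs (genotype dosages 0/1/2 each).
      cases = np.array([[30, 12, 5], [14, 40, 9], [6, 11, 8]])
      controls = np.array([[28, 13, 6], [15, 20, 10], [5, 12, 7]])
      print(significant_genotype_cells(cases, controls))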

  18. Augmented design and analysis of computer experiments: a novel tolerance embedded global optimization approach applied to SWIR hyperspectral illumination design.

    PubMed

    Keresztes, Janos C; John Koshel, R; D'huys, Karlien; De Ketelaere, Bart; Audenaert, Jan; Goos, Peter; Saeys, Wouter

    2016-12-26

    A novel meta-heuristic approach for minimizing nonlinear constrained problems is proposed, which offers tolerance information during the search for the global optimum. The method is based on the concept of design and analysis of computer experiments combined with a novel two-phase design augmentation (DACEDA), which models the entire merit space using a Gaussian process, with iteratively increased resolution around the optimum. The algorithm is introduced through a series of case studies with increasing complexity for optimizing the uniformity of a short-wave infrared (SWIR) hyperspectral imaging (HSI) illumination system (IS). The method is first demonstrated for a two-dimensional problem consisting of the positioning of analytical isotropic point sources. The method is further applied to two-dimensional (2D) and five-dimensional (5D) SWIR HSI IS versions using close- and far-field measured source models applied within the non-sequential ray-tracing software FRED, including inherent stochastic noise. The proposed method is compared to other heuristic approaches such as simplex and simulated annealing (SA). It is shown that DACEDA converges towards a minimum with 1% improvement compared to simplex and SA, and more importantly requires only half the number of simulations. Finally, a concurrent tolerance analysis is done within DACEDA for the five-dimensional case such that further simulations are not required.
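
    As a rough sketch of the design-and-analysis-of-computer-experiments idea (not the published two-phase DACEDA scheme), the code below fits a Gaussian process to an illustrative noisy merit function and iteratively augments the design with candidates concentrated around the current best point, picking by a lower-confidence-bound rule; the merit function and all settings are invented for illustration.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(5)

      def merit(x):
          """Stand-in noisy merit function (e.g., an illumination non-uniformity score)."""
          return (x[:, 0] - 0.3) ** 2 + (x[:, 1] + 0.2) ** 2 \
              + 0.01 * rng.standard_normal(len(x))

      X = rng.uniform(-1, 1, (20, 2))           # initial space-filling design
      y = merit(X)

      for _ in range(5):                        # iterative design augmentation
          gp = GaussianProcessRegressor(kernel=RBF(0.5), alpha=1e-4).fit(X, y)
          # Densify candidates around the current best point (increasing resolution).
          best = X[np.argmin(y)]
          cand = np.clip(best + 0.2 * rng.standard_normal((200, 2)), -1, 1)
          mu, sd = gp.predict(cand, return_std=True)
          x_new = cand[np.argmin(mu - sd)]      # explore/exploit: lower confidence bound
          X = np.vstack([X, x_new]); y = np.append(y, merit(x_new[None, :]))

      print("best point:", X[np.argmin(y)], "merit:", y.min())
      # The GP posterior sd around the optimum gives tolerance-like information
      # about how flat the merit space is near the chosen design.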

  19. Genome-wide association tests of inversions with application to psoriasis

    PubMed Central

    Ma, Jianzhong; Xiong, Momiao; You, Ming; Lozano, Guillermina; Amos, Christopher I.

    2014-01-01

    Although inversions have occasionally been found to be associated with disease susceptibility through interrupting a gene or its regulatory region, or by increasing the risk for deleterious secondary rearrangements, no association study has been specifically conducted for risks associated with inversions, mainly because existing approaches to detecting and genotyping inversions do not readily scale to a large number of samples. Based on our recently proposed approach to identifying and genotyping inversions using principal components analysis (PCA), we herein develop a method of detecting association between inversions and disease in a genome-wide fashion. Our method uses genotype data for single nucleotide polymorphisms (SNPs), and is thus cost-efficient and computationally fast. For an inversion polymorphism, local PCA around the inversion region is performed to infer the inversion genotypes of all samples. For many inversions, we found that some of the SNPs inside an inversion region are fixed in the two lineages of different orientations and thus can serve as surrogate markers. Our method can be applied to case-control and quantitative trait association studies to identify inversions that may interrupt a gene or the connection between a gene and its regulatory agents. Our method also offers a new avenue to identify inversions that are responsible for disease-causing secondary rearrangements. We illustrated our proposed approach on case-control data for psoriasis and identified novel associations with a few inversion polymorphisms. PMID:24623382
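
    A minimal sketch of the genotyping idea as described: project the SNP genotypes inside a putative inversion onto the first principal component and cluster samples into three groups (the two homozygous orientations plus heterozygotes). The simulated data, with SNPs near-fixed between the two lineages, is our own construction.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      def inversion_genotypes(G):
          """Infer inversion genotypes from SNPs inside the inverted region:
          cluster samples along the first principal component into three groups."""
          pc1 = PCA(n_components=1).fit_transform(G)
          km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pc1)
          # Order cluster labels along PC1 so they read as genotype dosages 0/1/2.
          order = np.argsort(km.cluster_centers_.ravel())
          relabel = np.empty(3, dtype=int); relabel[order] = [0, 1, 2]
          return relabel[km.labels_]

      # Hypothetical SNP matrix: 300 samples x 50 SNPs, with near-fixed allele
      # frequencies between the two orientation lineages (0.1 vs 0.9).
      rng = np.random.default_rng(6)
      dosage = rng.choice([0, 1, 2], size=300, p=[0.49, 0.42, 0.09])
      p = (dosage[:, None] * 0.9 + (2 - dosage[:, None]) * 0.1) / 2
      G = rng.binomial(2, p * np.ones((300, 50))).astype(float)

      geno = inversion_genotypes(G)
      # The sign of PC1 is arbitrary, so check both label orientations.
      acc = max(np.mean(geno == dosage), np.mean((2 - geno) == dosage))
      print("agreement with simulated dosage:", acc)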

  20. Finite-element solution to multidimensional multisource electromagnetic problems in the frequency domain using non-conforming meshes

    NASA Astrophysics Data System (ADS)

    Soloveichik, Yury G.; Persova, Marina G.; Domnikov, Petr A.; Koshkina, Yulia I.; Vagin, Denis V.

    2018-03-01

    We propose an approach to solving multisource induction logging problems in multidimensional media. According to the type of induction logging tool, the measurements are performed in the frequency range of 10 kHz to 14 MHz, transmitter-receiver offsets vary in the range of 0.5-8 m or more, and the trajectory length is up to 1 km. For calculating the total field, the primary-secondary field approach is used. The secondary field is calculated with the use of the finite-element method (FEM), irregular non-conforming meshes with local refinements and a direct solver. An approach to constructing basis functions with continuous tangential components (from H(curl, Ω)) on the non-conforming meshes from the standard shape vector functions is developed. On the basis of this method, an algorithm for generating the global matrices and the right-hand-side vector of the finite-element equation system is proposed. We also propose a method of grouping the logging tool positions, which makes it possible to significantly increase the computational effectiveness. This is achieved through the compromise between the possibility of using a 1-D background medium that is very similar to the investigated multidimensional medium for a small group, and the decrease in the number of finite-element matrix factorizations as the number of tool positions in one group increases. For calculating the primary field, we propose a FEM-based method that is highly effective when the 1-D field must be calculated at a great number of points; its use significantly increases the effectiveness of the primary-secondary field approach. The proposed approach makes it possible to perform modelling both in the 2.5-D case (i.e. without taking into account a borehole and/or invasion zone effect) and the 3-D case (i.e. for models with a borehole and invasion zone). The accuracy of numerical results obtained with the use of the proposed approach is compared with that obtained by other codes for 1-D and 3-D anisotropic models. The results of this comparison lend support to the validity of our code. We also present numerical results demonstrating the greater effectiveness of the proposed finite-element approach for calculating the 1-D field, in comparison with known codes implementing semi-analytical methods, for the case in which the field is calculated at a large number of points. Additionally, we present numerical results which confirm the accuracy advantages of the automatic choice of a background medium for calculating the 1-D field, as well as the results of 2.5-D modelling for a geoelectrical model with anisotropic layers, a fault and a long tool-movement trajectory with varying dip angle.
