Sample records for general methodological framework

  1. Notes on a General Framework for Observed Score Equating. Research Report. ETS RR-08-59

    ERIC Educational Resources Information Center

    Moses, Tim; Holland, Paul

    2008-01-01

    The purpose of this paper is to extend von Davier, Holland, and Thayer's (2004b) framework of kernel equating so that it can incorporate raw data and traditional equipercentile equating methods. One result of this more general framework is that previous equating methodology research can be viewed more comprehensively. Another result is that the…

  2. Quantifying biopsychosocial aspects in everyday contexts: an integrative methodological approach from the behavioral sciences

    PubMed Central

    Portell, Mariona; Anguera, M Teresa; Hernández-Mendo, Antonio; Jonsson, Gudberg K

    2015-01-01

    Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (e.g., variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects. PMID:26089708

  3. Integrated corridor management analysis, modeling and simulation (AMS) methodology.

    DOT National Transportation Integrated Search

    2008-03-01

    This AMS Methodologies Document provides a discussion of potential ICM analytical approaches for the assessment of generic corridor operations. The AMS framework described in this report identifies strategies and procedures for tailoring AMS general ...

  4. A Comparison between Linear IRT Observed-Score Equating and Levine Observed-Score Equating under the Generalized Kernel Equating Framework

    ERIC Educational Resources Information Center

    Chen, Haiwen

    2012-01-01

    In this article, linear item response theory (IRT) observed-score equating is compared under a generalized kernel equating framework with Levine observed-score equating for nonequivalent groups with anchor test design. Interestingly, these two equating methods are closely related despite being based on different methodologies. Specifically, when…

  5. Towards A Topological Framework for Integrating Semantic Information Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Hogan, Emilie A.; Robinson, Michael

    2014-09-07

    In this position paper we argue for the role that topological modeling principles can play in providing a framework for sensor integration. While this approach has been used successfully with standard (quantitative) sensors, we are developing this methodology in new directions to make it appropriate specifically for semantic information sources, including keyterms, ontology terms, and other general Boolean, categorical, ordinal, and partially-ordered data types. We illustrate the basics of the methodology in an extended use case/example, and discuss the path forward.

  6. Quasi interpolation with Voronoi splines.

    PubMed

    Mirzargar, Mahsa; Entezari, Alireza

    2011-12-01

    We present a quasi interpolation framework that attains the optimal approximation-order of Voronoi splines for reconstruction of volumetric data sampled on general lattices. The quasi interpolation framework of Voronoi splines provides an unbiased reconstruction method across various lattices. Therefore, this framework allows us to analyze and contrast the sampling-theoretic performance of general lattices, using signal reconstruction, in an unbiased manner. Our quasi interpolation methodology is implemented as an efficient FIR filter that can be applied online or as a preprocessing step. We present visual and numerical experiments that demonstrate the improved accuracy of reconstruction across lattices, using the quasi interpolation framework. © 2011 IEEE
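    The FIR prefiltering idea in this abstract can be illustrated in one dimension. The sketch below is a hedged stand-in, not the paper's method: it uses a Cartesian lattice and a cubic B-spline with the classical 3-tap quasi-interpolation prefilter [-1/6, 8/6, -1/6], whereas the paper works with Voronoi splines on general lattices.

```python
# Hypothetical 1-D analogue of FIR quasi-interpolation prefiltering.
# Cartesian lattice + cubic B-spline; parameter choices are illustrative.

def bspline3(t):
    """Centered cubic B-spline."""
    t = abs(t)
    if t < 1.0:
        return 2.0 / 3.0 - t * t + t * t * t / 2.0
    if t < 2.0:
        return (2.0 - t) ** 3 / 6.0
    return 0.0

def quasi_coeffs(samples):
    """FIR prefilter: c[k] = (-f[k-1] + 8 f[k] - f[k+1]) / 6.
    This 3-tap filter matches 1/B(omega) to fourth order, so the
    reconstruction reproduces cubic polynomials exactly (order-4 scheme)."""
    c = list(samples)
    for k in range(1, len(samples) - 1):
        c[k] = (-samples[k - 1] + 8.0 * samples[k] - samples[k + 1]) / 6.0
    return c

def reconstruct(c, x):
    """Evaluate s(x) = sum_k c[k] * B3(x - k)."""
    return sum(c[k] * bspline3(x - k) for k in range(len(c)))

f = lambda x: x ** 3 - 2.0 * x ** 2 + x   # any cubic is reproduced exactly
c = quasi_coeffs([f(k) for k in range(10)])
```

Unlike the exact interpolation prefilter for cubic B-splines (which is IIR), this FIR filter is a fixed-cost preprocessing step, mirroring the efficiency argument in the abstract.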

  7. Framework for the Parametric System Modeling of Space Exploration Architectures

    NASA Technical Reports Server (NTRS)

    Komar, David R.; Hoffman, Jim; Olds, Aaron D.; Seal, Mike D., II

    2008-01-01

    This paper presents a methodology for performing architecture definition and assessment prior to, or during, program formulation that utilizes a centralized, integrated architecture modeling framework operated by a small, core team of general space architects. This framework, known as the Exploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), enables: 1) a significantly larger fraction of an architecture trade space to be assessed in a given study timeframe; and 2) the complex element-to-element and element-to-system relationships to be quantitatively explored earlier in the design process. Discussion of the methodology's advantages and disadvantages with respect to the distributed study team approach typically used within NASA to perform architecture studies is presented along with an overview of EXAMINE's functional components and tools. An example Mars transportation system architecture model is used to demonstrate EXAMINE's capabilities in this paper. However, the framework is generally applicable for exploration architecture modeling with destinations to any celestial body in the solar system.

  8. Methodological and Epistemological Considerations in Utilizing Qualitative Inquiry to Develop Interventions.

    PubMed

    Duggleby, Wendy; Williams, Allison

    2016-01-01

    The purpose of this article is to discuss methodological and epistemological considerations involved in using qualitative inquiry to develop interventions. These considerations included (a) using diverse methodological approaches and (b) epistemological considerations such as generalization, de-contextualization, and subjective reality. Diverse methodological approaches have the potential to inform different stages of intervention development. Using the development of a psychosocial hope intervention for advanced cancer patients as an example, the authors utilized a thematic study to assess current theories/frameworks and interventions. However, to understand the processes that the intervention needed to target to affect change, grounded theory was used. Epistemological considerations provided a framework to understand and, further, critique the intervention. Using diverse qualitative methodological approaches and examining epistemological considerations were useful in developing an intervention that appears to foster hope in patients with advanced cancer. © The Author(s) 2015.

  9. Meta-Analysis of Coefficient Alpha

    ERIC Educational Resources Information Center

    Rodriguez, Michael C.; Maeda, Yukiko

    2006-01-01

    The meta-analysis of coefficient alpha across many studies is becoming more common in psychology by a methodology labeled reliability generalization. Existing reliability generalization studies have not used the sampling distribution of coefficient alpha for precision weighting and other common meta-analytic procedures. A framework is provided for…
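    The precision weighting this abstract calls for is ordinary inverse-variance (fixed-effect) pooling. A minimal sketch follows; the alpha values and sampling variances are illustrative inputs invented for the example, not data from the paper.

```python
import math

def pool_fixed_effect(estimates, variances):
    """Combine study estimates with precision weights w_i = 1 / var_i.
    Returns the pooled estimate and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# hypothetical coefficient-alpha estimates from four studies,
# with hypothetical sampling variances
alphas = [0.82, 0.78, 0.88, 0.75]
variances = [0.0004, 0.0009, 0.0002, 0.0016]
pooled, se = pool_fixed_effect(alphas, variances)
```

The point of the abstract is that the variances fed into such weights should come from the sampling distribution of coefficient alpha itself, rather than being ignored as in earlier reliability generalization studies.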

  10. A Conceptual Framework to Help Evaluate the Quality of Institutional Performance

    ERIC Educational Resources Information Center

    Kettunen, Juha

    2008-01-01

    Purpose: This study aims to present a general conceptual framework which can be used to evaluate quality and institutional performance in higher education. Design/methodology/approach: The quality of higher education is at the heart of the setting up of the European Higher Education Area. Strategic management is widely used in higher education…

  11. General Methodology for Designing Spacecraft Trajectories

    NASA Technical Reports Server (NTRS)

    Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.

    2012-01-01

    A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
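    The "system of nonlinear equations" formulation mentioned above can be sketched generically. This is a hedged illustration only: a plain Newton iteration with a finite-difference Jacobian on a toy 2x2 system standing in for trajectory constraint equations; the actual mission equations are far more complex.

```python
def newton_solve(F, x, tol=1e-10, max_iter=50, h=1e-7):
    """Solve a 2-dimensional system F(x) = 0 by Newton's method with a
    forward-difference Jacobian."""
    for _ in range(max_iter):
        f0 = F(x)
        if max(abs(v) for v in f0) < tol:
            return x
        # finite-difference Jacobian, column by column
        J = [[0.0, 0.0], [0.0, 0.0]]
        for j in range(2):
            xp = list(x)
            xp[j] += h
            fj = F(xp)
            for i in range(2):
                J[i][j] = (fj[i] - f0[i]) / h
        # solve J dx = -f0 by Cramer's rule (2x2 only)
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        dx0 = (-f0[0] * J[1][1] + f0[1] * J[0][1]) / det
        dx1 = (-f0[1] * J[0][0] + f0[0] * J[1][0]) / det
        x = [x[0] + dx0, x[1] + dx1]
    return x

# toy stand-in for trajectory constraints: intersect a circle with a line
F = lambda x: [x[0] ** 2 + x[1] ** 2 - 4.0, x[1] - x[0]]
root = newton_solve(F, [1.0, 0.5])
```

In the consolidated framework described above, the unknowns would be quantities like burn times and velocity increments, and F would encode boundary and matching conditions; the solution machinery is the same shape.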

  12. Methodological convergence of program evaluation designs.

    PubMed

    Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa

    2014-01-01

    Nowadays, the dichotomous confrontation between experimental/quasi-experimental and non-experimental/ethnographic studies still exists but, despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed based on experimental and quasi-experimental studies. This hinders evaluators and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is changing continually and is less clear. Based on the classical validity framework of experimental/quasi-experimental studies, we carry out a review of the literature in order to analyze the convergence of design elements in methodological quality in primary studies in systematic reviews and ethnographic research. We specify the relevant design elements that should be taken into account in order to improve validity and generalization in program evaluation practice in different methodologies from a practical and complementary methodological view. We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.

  13. A dynamic multiarmed bandit-gene expression programming hyper-heuristic for combinatorial optimization problems.

    PubMed

    Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong

    2015-02-01

    Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high level strategy (heuristic selection mechanism and the acceptance criterion) and low level heuristics (a set of problem specific heuristics). Due to the different landscape structures of different problem instances, the high level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit-extreme value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to be applied at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing) are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework is able to generalize well across both domains. We obtain competitive, if not better results, when compared to the best known results obtained from other methods that have been presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite. We again demonstrate the generality of our approach when we compare against other methods that have utilized the same six benchmark datasets from this test suite.
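    The two-level split described above (high-level selection strategy over low-level heuristics) can be sketched in miniature. This is a hedged toy, not the paper's method: it uses a plain UCB1 bandit with an improvement-based reward and a greedy acceptance criterion, whereas the paper uses a dynamic multiarmed bandit with an extreme-value reward and a learned acceptance criterion. The two low-level heuristics and the objective are hypothetical.

```python
import math, random

def ucb_select(counts, rewards, t):
    """UCB1: pick the arm maximizing mean reward + exploration bonus."""
    for a in range(len(counts)):
        if counts[a] == 0:
            return a                      # play each arm once first
    return max(range(len(counts)),
               key=lambda a: rewards[a] / counts[a]
               + math.sqrt(2.0 * math.log(t) / counts[a]))

def hyper_heuristic(objective, x0, iters=200, seed=1):
    rng = random.Random(seed)
    # two hypothetical low-level heuristics: a small and a large perturbation
    heuristics = [lambda x: x + rng.uniform(-0.1, 0.1),
                  lambda x: x + rng.uniform(-2.0, 2.0)]
    counts = [0, 0]
    rewards = [0.0, 0.0]
    x, best = x0, objective(x0)
    for t in range(1, iters + 1):
        a = ucb_select(counts, rewards, t)   # high-level strategy
        candidate = heuristics[a](x)         # low-level heuristic
        value = objective(candidate)
        counts[a] += 1
        rewards[a] += max(0.0, best - value)  # reward = improvement found
        if value < best:                      # greedy acceptance criterion
            x, best = candidate, value
    return x, best

x, best = hyper_heuristic(lambda x: (x - 3.0) ** 2, x0=10.0)
```

The bandit learns online which perturbation size pays off at the current stage of the search, which is the role the high-level strategy plays in the framework above.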

  14. A Mixed-Methods Research Framework for Healthcare Process Improvement.

    PubMed

    Bastian, Nathaniel D; Munoz, David; Ventura, Marta

    2016-01-01

    The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.

  15. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission—With an application to the 2014-2015 West Africa Ebola outbreak

    PubMed Central

    McClelland, Amanda; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D.; Grenfell, Bryan T.

    2017-01-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging. PMID:29084216

  16. A mechanistic spatio-temporal framework for modelling individual-to-individual transmission-With an application to the 2014-2015 West Africa Ebola outbreak.

    PubMed

    Lau, Max S Y; Gibson, Gavin J; Adrakey, Hola; McClelland, Amanda; Riley, Steven; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D; Grenfell, Bryan T

    2017-10-01

    In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging.
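    The kind of mechanistic individual-to-individual process the two records above perform inference for can be sketched with a minimal forward simulation. This is an illustrative toy only: a discrete-time spatial SIR model with an exponential distance kernel on a hypothetical landscape; all parameter values are invented, and the papers' contribution is the statistical inference machinery, not this simulator.

```python
import math, random

def simulate(people, beta=2.0, kernel_scale=0.5, gamma=1.0, steps=30, seed=7):
    """people: list of (x, y) positions. Returns final S/I/R labels."""
    rng = random.Random(seed)
    state = ["S"] * len(people)
    state[0] = "I"                          # index case
    dt = 0.1
    for _ in range(steps):
        new_state = list(state)
        for j, (xj, yj) in enumerate(people):
            if state[j] != "S":
                continue
            # force of infection on j: distance-kernel sum over infectives
            lam = sum(beta * math.exp(-math.hypot(xj - xi, yj - yi) / kernel_scale)
                      for i, (xi, yi) in enumerate(people) if state[i] == "I")
            if rng.random() < 1.0 - math.exp(-lam * dt):
                new_state[j] = "I"
        for i in range(len(people)):
            if state[i] == "I" and rng.random() < 1.0 - math.exp(-gamma * dt):
                new_state[i] = "R"          # recovery at rate gamma
        state = new_state
    return state

# hypothetical landscape: a dense cluster near the index case, two far away
positions = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3), (0.3, 0.2), (5.0, 5.0), (6.0, 5.0)]
final = simulate(positions)
```

Heterogeneous population density enters through the positions alone: individuals in the dense cluster experience a much larger force of infection than the distant pair, which is the dependence the inferential framework exploits.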

  17. Modeling and Reduction With Applications to Semiconductor Processing

    DTIC Science & Technology

    1999-01-01

    1.1 General state-space model reduction methodology ... The reduction problem, then, is one of finding a systematic methodology within a given mathematical framework to produce an efficient or optimal trade-off of ...

  18. IMSF: Infinite Methodology Set Framework

    NASA Astrophysics Data System (ADS)

    Ota, Martin; Jelínek, Ivan

    Software development is usually an integration task in an enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is a lack of resources, a popular result being outsourcing, ‘body shopping’, and indirectly team and team member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face the problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, it causes problems that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.

  19. Making a Map of Science: General Systems Theory as a Conceptual Framework for Tertiary Science Education.

    ERIC Educational Resources Information Center

    Gulyaev, Sergei A.; Stonyer, Heather R.

    2002-01-01

    Develops an integrated approach based on the use of general systems theory (GST) and the concept of 'mapping' scientific knowledge to provide students with tools for a more holistic understanding of science. Uses GST as the core methodology for understanding science and its complexity. Discusses the role of scientific community in producing…

  20. Off-Road Mobility Research

    DTIC Science & Technology

    1967-09-01

    ... 2. The establishing of the general framework for the Phenomenological Model. 3. A preliminary methodology study using the ... of current practice in mathematical modeling of vehicle-terrain systems. 2) The establishing of the framework for a vehicle-terrain dynamics model as ...

  1. Revealing, Reinterpreting, Rewriting Mujeres

    ERIC Educational Resources Information Center

    Preuss, Cara Lynne; Saavedra, Cinthya M.

    2014-01-01

    This paper reanalyzed research previously conducted with Spanish-speaking childcare providers who participated in an educational literacy program. The women in the program were generally framed as the deficient other--illiterate, immigrant women. The authors used a critical framework and Chicana/Latina feminist methodologies, namely "pláticas…

  2. How equity is addressed in clinical practice guidelines: a content analysis

    PubMed Central

    Shi, Chunhu; Tian, Jinhui; Wang, Quan; Petkovic, Jennifer; Ren, Dan; Yang, Kehu; Yang, Yang

    2014-01-01

    Objectives Incorporating equity into guidelines presents methodological challenges. This study aims to qualitatively synthesise the methods for incorporating equity in clinical practice guidelines (CPGs). Setting Content analysis of methodological publications. Eligibility criteria for selecting studies Methodological publications were included if they provided checklists/frameworks on when, how and to what extent equity should be incorporated in CPGs. Data sources We electronically searched MEDLINE, retrieved references, and browsed guideline development organisation websites from inception to January 2013. After study selection by two authors, general characteristics and checklist items/framework components from included studies were extracted. Based on the questions or items from checklists/frameworks (unit of analysis), content analysis was conducted to identify themes, and questions/items were grouped into these themes. Primary outcomes The primary outcomes were methodological themes and processes on how to address equity issues in guideline development. Results 8 studies with 10 publications were included from 3405 citations. In total, a list of 87 questions/items was generated from 17 checklists/frameworks. After content analysis, questions were grouped into eight themes (‘scoping questions’, ‘searching relevant evidence’, ‘appraising evidence and recommendations’, ‘formulating recommendations’, ‘monitoring implementation’, ‘providing a flow chart to include equity in CPGs’, and ‘others: reporting of guidelines and comments from stakeholders’ for CPG developers and ‘assessing the quality of CPGs’ for CPG users). Four included studies covered more than five of these themes. We also summarised the process of guideline development based on the themes mentioned above.
Conclusions For disadvantaged population-specific CPGs, eight important methodological issues identified in this review should be considered when including equity in CPGs under the guidance of a scientific guideline development manual. PMID:25479795

  3. Nuclear power plant digital system PRA pilot study with the dynamic flow-graph methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yau, M.; Motamed, M.; Guarro, S.

    2006-07-01

    Current Probabilistic Risk Assessment (PRA) methodology is well established in analyzing hardware and some of the key human interactions. However, processes for analyzing the software functions of digital systems within a plant PRA framework, and accounting for the digital system contribution to the overall risk, are not generally available, nor are they well understood and established. A recent study reviewed a number of methodologies that have potential applicability to modeling and analyzing digital systems within a PRA framework. This study identified the Dynamic Flow-graph Methodology (DFM) and the Markov Methodology as the most promising tools. As a result of this study, a task was defined under the framework of a collaborative agreement between the U.S. Nuclear Regulatory Commission (NRC) and the Ohio State Univ. (OSU). The objective of this task is to set up benchmark systems representative of digital systems used in nuclear power plants and to evaluate DFM and the Markov methodology with these benchmark systems. The first benchmark system is a typical Pressurized Water Reactor (PWR) Steam Generator (SG) Feedwater System (FWS) level control system based on an earlier ASCA work with the U.S. NRC, upgraded with modern control laws. ASCA, Inc. is currently under contract to OSU to apply DFM to this benchmark system. The goal is to investigate the feasibility of using DFM to analyze and quantify digital system risk, and to integrate the DFM analytical results back into the plant event tree/fault tree PRA model. (authors)

  4. Pricing foreign equity option with stochastic volatility

    NASA Astrophysics Data System (ADS)

    Sun, Qi; Xu, Weidong

    2015-11-01

    In this paper we propose a general foreign equity option pricing framework that unifies the vast foreign equity option pricing literature and incorporates stochastic volatility into foreign equity option pricing. Under our framework, time-changed Lévy processes are used to model the underlying asset price of the foreign equity option, and the closed-form pricing formula is obtained through the use of the characteristic function methodology. Numerical tests indicate that stochastic volatility has a dramatic effect on the foreign equity option prices.

  5. Clarifying the Conceptualization, Dimensionality, and Structure of Emotion: Response to Barrett and Colleagues

    PubMed Central

    Cowen, Alan S.; Keltner, Dacher

    2018-01-01

    We present a mathematically based framework distinguishing the dimensionality, structure, and conceptualization of emotion-related responses. Our recent findings indicate that reported emotional experience is high-dimensional, involves gradients between categories traditionally thought of as discrete (e.g., ‘fear’, ‘disgust’), and cannot be reduced to widely used domain-general scales (valence, arousal, etc.). In light of our conceptual framework and findings, we address potential methodological and conceptual confusions in Barrett and colleagues’ commentary on our work. PMID:29477775

  6. PSYCHOLOGICAL CONCEPTIONS OF TEACHING.

    ERIC Educational Resources Information Center

    GAGE, N.L.

    A conceptual framework was proposed for an educational psychology course in the general methodology of teaching. This course would transcend the special requirements of any given subject matter or grade level and serve as the basis for deriving the special methods of teaching that would apply to any particular grade level or subject matter.…

  7. Impact of reconstruction strategies on system performance measures : maximizing safety and mobility while minimizing life-cycle costs : final report, December 8, 2008.

    DOT National Transportation Integrated Search

    2008-12-08

    The objective of this research is to develop a general methodological framework for planning and : evaluating the effectiveness of highway reconstruction strategies on the systems performance : measures, in particular safety, mobility, and the tot...

  8. Kindergarten Curriculum for Children with Hearing Impairments: Jordanian Teachers' Perspectives

    ERIC Educational Resources Information Center

    Al-Zboon, Eman

    2016-01-01

    This study describes a kindergarten curriculum for children with hearing impairments, from their teachers' perspectives. Qualitative research data from interviews with 20 teachers were analysed using content analysis methodology. The results pinpoint a collection of proposed curriculum components (i.e. a general framework and outcomes document;…

  9. Role Management in a Privacy-Enhanced Collaborative Environment

    ERIC Educational Resources Information Center

    Lorenz, Anja; Borcea-Pfitzmann, Katrin

    2010-01-01

    Purpose: Facing the dilemma between collaboration and privacy is a continual challenge for users. In this setting, the purpose of this paper is to discuss issues of a highly flexible role management integrated in a privacy-enhanced collaborative environment (PECE). Design/methodology/approach: The general framework was provided by former findings…

  10. Researching "With", Not "On": Engaging Marginalised Learners in the Research Process

    ERIC Educational Resources Information Center

    Atkins, Liz

    2013-01-01

    This paper discusses practical and methodological issues arising from a case study exploring the hopes, aspirations and learning identities of three groups of students undertaking low-level broad vocational programmes in two English general further education colleges. Working within a social justice theoretical framework the paper outlines the…

  11. Integrated Systems Health Management (ISHM) Toolkit

    NASA Technical Reports Server (NTRS)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  12. Microgravity isolation system design: A modern control synthesis framework

    NASA Technical Reports Server (NTRS)

    Hampton, R. D.; Knospe, C. R.; Allaire, P. E.; Grodsinsky, C. M.

    1994-01-01

    Manned orbiters will require active vibration isolation for acceleration-sensitive microgravity science experiments. Since umbilicals are highly desirable or even indispensable for many experiments, and since their presence greatly affects the complexity of the isolation problem, they should be considered in control synthesis. In this paper a general framework is presented for applying extended H2 synthesis methods to the three-dimensional microgravity isolation problem. The methodology integrates control and state frequency weighting and input and output disturbance accommodation techniques into the basic H2 synthesis approach. The various system models needed for design and analysis are also presented. The paper concludes with a discussion of a general design philosophy for the microgravity vibration isolation problem.

  14. Discrete Adjoint-Based Design for Unsteady Turbulent Flows On Dynamic Overset Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Diskin, Boris

    2012-01-01

    A discrete adjoint-based design methodology for unsteady turbulent flows on three-dimensional dynamic overset unstructured grids is formulated, implemented, and verified. The methodology supports both compressible and incompressible flows and is amenable to massively parallel computing environments. The approach provides a general framework for performing highly efficient and discretely consistent sensitivity analysis for problems involving arbitrary combinations of overset unstructured grids which may be static, undergoing rigid or deforming motions, or any combination thereof. General parent-child motions are also accommodated, and the accuracy of the implementation is established using an independent verification based on a complex-variable approach. The methodology is used to demonstrate aerodynamic optimizations of a wind turbine geometry, a biologically-inspired flapping wing, and a complex helicopter configuration subject to trimming constraints. The objective function for each problem is successfully reduced and all specified constraints are satisfied.
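The complex-variable verification mentioned in the abstract has a compact general form, the complex-step derivative estimate; the sketch below applies it to an arbitrary smooth test function rather than the flow solver itself:

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-30):
    """Complex-step estimate of df/dx: accurate to machine precision and,
    unlike finite differences, free of subtractive cancellation."""
    return np.imag(f(x + 1j * h)) / h

# Verify a sensitivity the way an adjoint implementation might be checked:
# compare the complex-step value against a known analytic derivative.
f = lambda x: np.exp(x) * np.sin(x)
x0 = 0.7
analytic = np.exp(x0) * (np.sin(x0) + np.cos(x0))
estimate = complex_step_derivative(f, x0)
print(abs(estimate - analytic))
```

In adjoint verification the same idea is applied to the full discretized residual, which is why agreement to near machine precision is a strong consistency check.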

  15. A movement ecology paradigm for unifying organismal movement research

    PubMed Central

    Nathan, Ran; Getz, Wayne M.; Revilla, Eloy; Holyoak, Marcel; Kadmon, Ronen; Saltz, David; Smouse, Peter E.

    2008-01-01

    Movement of individual organisms is fundamental to life, quilting our planet in a rich tapestry of phenomena with diverse implications for ecosystems and humans. Movement research is both plentiful and insightful, and recent methodological advances facilitate obtaining a detailed view of individual movement. Yet, we lack a general unifying paradigm, derived from first principles, which can place movement studies within a common context and advance the development of a mature scientific discipline. This introductory article to the Movement Ecology Special Feature proposes a paradigm that integrates conceptual, theoretical, methodological, and empirical frameworks for studying movement of all organisms, from microbes to trees to elephants. We introduce a conceptual framework depicting the interplay among four basic mechanistic components of organismal movement: the internal state (why move?), motion (how to move?), and navigation (when and where to move?) capacities of the individual and the external factors affecting movement. We demonstrate how the proposed framework aids the study of various taxa and movement types; promotes the formulation of hypotheses about movement; and complements existing biomechanical, cognitive, random, and optimality paradigms of movement. The proposed framework integrates eclectic research on movement into a structured paradigm and aims at providing a basis for hypothesis generation and a vehicle facilitating the understanding of the causes, mechanisms, and spatiotemporal patterns of movement and their role in various ecological and evolutionary processes. "Now we must consider in general the common reason for moving with any movement whatever." (Aristotle, De Motu Animalium, 4th century B.C.) PMID:19060196

  16. General Framework for Effect Sizes in Cluster Randomized Experiments

    ERIC Educational Resources Information Center

    VanHoudnos, Nathan

    2016-01-01

    Cluster randomized experiments are ubiquitous in modern education research. Although a variety of modeling approaches are used to analyze these data, perhaps the most common methodology is a normal mixed effects model where some effects, such as the treatment effect, are regarded as fixed, and others, such as the effect of group random assignment…
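The fixed treatment effect in such a mixed model is often reported as a standardized effect size whose denominator pools between-cluster and within-cluster variance; a minimal sketch of one common definition, with invented variance components, is:

```python
import numpy as np

def cluster_effect_size(mean_t, mean_c, var_between, var_within):
    """Standardized mean difference for cluster-randomized data:
    the denominator is the total SD (between- plus within-cluster
    variance). Also returns the intraclass correlation (ICC)."""
    total_sd = np.sqrt(var_between + var_within)
    d = (mean_t - mean_c) / total_sd
    icc = var_between / (var_between + var_within)
    return d, icc

# Hypothetical values for illustration only.
d, icc = cluster_effect_size(mean_t=0.6, mean_c=0.2,
                             var_between=0.05, var_within=0.95)
print(round(d, 3), round(icc, 3))
```

The ICC matters because it determines how much the clustering inflates the sampling variance of the estimated effect.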

  17. Education and Modernization of Micronesia: A Case Study in Development and Development Planning.

    ERIC Educational Resources Information Center

    Pearse, Richard; Bezanson, Keith A.

    The case study examined the development of an overall education plan for the Trust Territory of the Pacific Islands. The methodology of multidisciplinary education planning through the use of general comparative analysis models of political, economic, and social development is explained: Almond and Powell's framework for the analysis of political…

  18. Leading though Language Learning and Teaching: The Case of Gandhi

    ERIC Educational Resources Information Center

    Eaton, Sarah Elaine

    2010-01-01

    Purpose: The purpose of this paper is to link the notions of language learning and leadership, using Gandhi as a case study. Theoretical Framework: This work is studied through a constructivist lens, and is further influenced by Educational Leadership thinkers such as Michael Fullan (2006). Methodology: Choosing a broad general theme of interest…

  19. Controversies in the Evaluation of Compensatory Education

    ERIC Educational Resources Information Center

    McLaughlin, Donald H.; And Others

    This document answers the question, "What has been learned about evaluation methodology from the decade of compensatory education?" Ten issues dealing with the evaluation of Title I were identified within a general theoretical framework of evaluation. For each issue it was the aim of this document to do the following: 1) to clarify the…

  20. Educational Approaches to Entrepreneurship in Higher Education: A View from the Swedish Horizon

    ERIC Educational Resources Information Center

    Hoppe, Magnus; Westerberg, Mats; Leffler, Eva

    2017-01-01

    Purpose: The purpose of this paper is to present and develop models of educational approaches to entrepreneurship that can provide complementary analytical structures to better study, enact and reflect upon the role of entrepreneurship in higher education. Design/methodology/approach: A general framework for entrepreneurship education is developed…

  1. Mathematics Lectures as Narratives: Insights from Network Graph Methodology

    ERIC Educational Resources Information Center

    Weinberg, Aaron; Wiesner, Emilie; Fukawa-Connelly, Tim

    2016-01-01

    Although lecture is the traditional method of university mathematics instruction, there has been little empirical research that describes the general structure of lectures. In this paper, we adapt ideas from narrative analysis and apply them to an upper-level mathematics lecture. We develop a framework that enables us to conceptualize the lecture…

  2. Language Interdependence between American Sign Language and English: A Review of Empirical Studies

    ERIC Educational Resources Information Center

    Rusher, Melissa Ausbrooks

    2012-01-01

    This study provides a contemporary definition of American Sign Language/English bilingual education (AEBE) and outlines an essential theoretical framework. Included is a history and evolution of the methodology. The author also summarizes the general findings of twenty-six (26) empirical studies conducted in the United States that directly or…

  3. A neural network based methodology to predict site-specific spectral acceleration values

    NASA Astrophysics Data System (ADS)

    Kamatchi, P.; Rajasankar, J.; Ramana, G. V.; Nagpal, A. K.

    2010-12-01

    A general neural network based methodology that has the potential to replace the computationally intensive site-specific seismic analysis of structures is proposed in this paper. The basic framework of the methodology consists of a feed-forward back-propagation neural network algorithm with one hidden layer to represent the seismic potential of a region and soil amplification effects. The methodology is implemented and verified with parameters corresponding to Delhi city in India. For this purpose, strong ground motions are generated at bedrock level for a chosen site in Delhi due to earthquakes considered to originate from the central seismic gap of the Himalayan belt using necessary geological as well as geotechnical data. Surface level ground motions and corresponding site-specific response spectra are obtained by using a one-dimensional equivalent linear wave propagation model. Spectral acceleration values are considered as a target parameter to verify the performance of the methodology. Numerical studies carried out to validate the proposed methodology show that the errors in predicted spectral acceleration values are within acceptable limits for design purposes. The methodology is general in the sense that it can be applied to other seismically vulnerable regions and also can be updated by including more parameters depending on the state-of-the-art in the subject.
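The basic architecture described, a feed-forward network with one hidden layer trained by back-propagation, can be sketched as follows; the synthetic attenuation-style data below are an invented stand-in for the Delhi-specific ground motions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic inputs (magnitude, distance in km) and a smooth
# attenuation-like target -- illustrative only.
X = rng.uniform([5.0, 10.0], [8.0, 200.0], size=(200, 2))
y = (1.2 * X[:, 0] - 1.5 * np.log(X[:, 1]))[:, None]

# Normalize inputs and target for stable training.
Xn = (X - X.mean(0)) / X.std(0)
yn = (y - y.mean()) / y.std()

# One hidden layer (tanh), trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    H = np.tanh(Xn @ W1 + b1)              # forward pass
    pred = H @ W2 + b2
    err = pred - yn
    gW2 = H.T @ err / len(Xn)              # back-propagated gradients
    gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)       # tanh derivative
    gW1 = Xn.T @ dH / len(Xn); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(Xn @ W1 + b1) @ W2 + b2 - yn) ** 2))
print(mse)
```

The study's network replaces the synthetic target with spectral acceleration values computed from simulated ground motions.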

  4. Exploring consensus in 21st century projections of climatically suitable areas for African vertebrates

    PubMed Central

    Garcia, Raquel A; Burgess, Neil D; Cabeza, Mar; Rahbek, Carsten; Araújo, Miguel B

    2012-01-01

    Africa is predicted to be highly vulnerable to 21st century climatic changes. Assessing the impacts of these changes on Africa's biodiversity is, however, plagued by uncertainties, and markedly different results can be obtained from alternative bioclimatic envelope models or future climate projections. Using an ensemble forecasting framework, we examine projections of future shifts in climatic suitability, and their methodological uncertainties, for over 2500 species of mammals, birds, amphibians and snakes in sub-Saharan Africa. To summarize a priori the variability in the ensemble of 17 general circulation models, we introduce a consensus methodology that combines co-varying models. Thus, we quantify and map the relative contribution to uncertainty of seven bioclimatic envelope models, three multi-model climate projections and three emissions scenarios, and explore the resulting variability in species turnover estimates. We show that bioclimatic envelope models contribute most to variability, particularly in projected novel climatic conditions over Sahelian and southern Saharan Africa. To summarize agreements among projections from the bioclimatic envelope models we compare five consensus methodologies, which generally increase or retain projection accuracy and provide consistent estimates of species turnover. Variability from emissions scenarios increases towards late-century and affects southern regions of high species turnover centred in arid Namibia. Twofold differences in median species turnover across the study area emerge among alternative climate projections and emissions scenarios. Our ensemble of projections underscores the potential bias when using a single algorithm or climate projection for Africa, and provides a cautious first approximation of the potential exposure of sub-Saharan African vertebrates to climatic changes. 
The future use and further development of bioclimatic envelope modelling will hinge on the interpretation of results in the light of methodological as well as biological uncertainties. Here, we provide a framework to address methodological uncertainties and contextualize results.
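The consensus idea described here, combining an ensemble of projections and mapping where they disagree, can be sketched with invented numbers (the real study uses fitted bioclimatic envelope models, not random values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ensemble: 7 hypothetical models each project a presence
# probability for 5 grid cells -- illustrative data only.
projections = rng.uniform(0.0, 1.0, size=(7, 5))

# Consensus 1: unweighted committee average across models.
mean_consensus = projections.mean(axis=0)

# Consensus 2: per-cell median, more robust to one outlier model.
median_consensus = np.median(projections, axis=0)

# Per-cell spread flags where the choice of model contributes
# most to projection uncertainty.
spread = projections.std(axis=0)

# Agreement: fraction of models projecting presence (p > 0.5).
agreement = (projections > 0.5).mean(axis=0)
print(mean_consensus.shape, float(agreement.min()), float(agreement.max()))
```

Variance partitioning across models, climate projections, and scenarios follows the same pattern with extra ensemble axes.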

  5. GAMES II Project: a general architecture for medical knowledge-based systems.

    PubMed

    Bruno, F; Kindler, H; Leaning, M; Moustakis, V; Scherrer, J R; Schreiber, G; Stefanelli, M

    1994-10-01

    GAMES II aims at developing a comprehensive and commercially viable methodology to avoid problems ordinarily occurring in KBS development. The GAMES II methodology proposes to design a KBS starting from an epistemological model of medical reasoning (the Select and Test Model). The design is viewed as a process of adding symbol-level information to the epistemological model. The architectural framework provided by GAMES II integrates the use of different formalisms and techniques providing a large set of tools. The user can select the most suitable one for representing a piece of knowledge after a careful analysis of its epistemological characteristics. Special attention is devoted to the tools dealing with knowledge acquisition (both manual and automatic). A panel of practicing physicians is assessing the medical value of such a framework and its related tools by using it in a practical application.

  6. e-IQ and IQ knowledge mining for generalized LDA

    NASA Astrophysics Data System (ADS)

    Jenkins, Jeffrey; van Bergem, Rutger; Sweet, Charles; Vietsch, Eveline; Szu, Harold

    2015-05-01

    How can the human brain uncover patterns, associations and features in real-time, real-world data? There must be a general strategy used to transform raw signals into useful features, but representing this generalization in the context of our information extraction tool set is lacking. In contrast to Big Data (BD), Large Data Analysis (LDA) has become a reachable multi-disciplinary goal in recent years due in part to high performance computers and algorithm development, as well as the availability of large data sets. However, the experience of Machine Learning (ML) and information communities has not been generalized into an intuitive framework that is useful to researchers across disciplines. The data exploration phase of data mining is a prime example of this unspoken, ad hoc nature of ML: the Computer Scientist works with a Subject Matter Expert (SME) to understand the data, and then builds tools (i.e. classifiers, etc.) which can benefit the SME and the rest of the researchers in that field. We ask: why is there no tool to represent information in a meaningful way to the researcher asking the question? Meaning is subjective and contextual across disciplines, so to ensure robustness, we draw examples from several disciplines and propose a generalized LDA framework for independent data understanding of heterogeneous sources which contribute to Knowledge Discovery in Databases (KDD). Then, we explore the concept of adaptive information resolution through a 6W unsupervised learning methodology feedback system. In this paper, we will describe the general process of man-machine interaction in terms of an asymmetric directed graph theory (digging for embedded knowledge), and model the inverse machine-man feedback (digging for tacit knowledge) as an ANN unsupervised learning methodology. 
Finally, we propose a collective learning framework which utilizes a 6W semantic topology to organize heterogeneous knowledge and diffuse information to entities within a society in a personalized way.

  7. Clarifying the Conceptualization, Dimensionality, and Structure of Emotion: Response to Barrett and Colleagues.

    PubMed

    Cowen, Alan S; Keltner, Dacher

    2018-04-01

    We present a mathematically based framework distinguishing the dimensionality, structure, and conceptualization of emotion-related responses. Our recent findings indicate that reported emotional experience is high-dimensional, involves gradients between categories traditionally thought of as discrete (e.g., 'fear', 'disgust'), and cannot be reduced to widely used domain-general scales (valence, arousal, etc.). In light of our conceptual framework and findings, we address potential methodological and conceptual confusions in Barrett and colleagues' commentary on our work. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    PubMed Central

    Pollard, Beth; Johnston, Marie; Dixon, Diane

    2007-01-01

    Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its i) theoretical framework (was there a definition of what was being assessed and was it part of a theoretical model?) and ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest) and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of a theoretical framework for both clinician report and patient self-report measures. It also drew attention to the wide variation in the methodological development of commonly used measures in OA. While the patient self-report measures generally had good methodological development, the clinician report measures appeared less well developed. It would be of value if new measures defined the construct of interest and grounded that construct in a theoretical model. By ensuring measures are both theoretically and empirically valid, improvements in subjective health outcome measures should be possible. PMID:17343739

  9. Argumentation in Science Education: A Model-based Framework

    NASA Astrophysics Data System (ADS)

    Böttcher, Florian; Meisert, Anke

    2011-02-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons for the appropriateness of a theoretical model which explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model, if necessary in relation to alternative models. Second, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach in comparison with other analytical models is presented to demonstrate the explicatory power and depth of the model-based perspective. Primarily, the framework of Toulmin to structurally analyse arguments is contrasted with the approach presented here. It is demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second, more complex argumentative sequence is analysed according to the analytical scheme developed here to give a broader impression of its potential in practical use.

  10. A Function-Behavior-State Approach to Designing Human Machine Interface for Nuclear Power Plant Operators

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Zhang, W. J.

    2005-02-01

    This paper presents an approach to human-machine interface design for control room operators of nuclear power plants. The first step in designing an interface for a particular application is to determine the information content that needs to be displayed. The design methodology for this step is called the interface design framework (hereafter, the framework). Several frameworks have been proposed for applications at varying levels, including process plants. However, none is based on the design and manufacture of the plant system for which the interface is designed. This paper presents an interface design framework which originates from design theory and methodology for general technical systems. Specifically, the framework is based on a set of core concepts of a function-behavior-state model originally proposed by the artificial intelligence research community and widely applied in the design research community. Benefits of this new framework include the provision of a model-based fault diagnosis facility, and the seamless integration of the design (manufacture, maintenance) of plants and the design of human-machine interfaces. The missing linkage between design and operation of a plant was one of the causes of the Three Mile Island nuclear reactor incident. A simulated plant system is presented to explain how to apply this framework in designing an interface. The resulting human-machine interface is discussed; specifically, several fault diagnosis examples are elaborated to demonstrate how this interface could support operators' fault diagnosis in an unanticipated situation.

  11. GED® Completers' Perceptions of College Readiness and Social Capital: Linking Adult Literacy to a Greater Quality of Life

    ERIC Educational Resources Information Center

    Lott, Donalyn; O'Dell, Jade

    2014-01-01

    This study examined the efficacy of general education development (GED®) acquisition and GED® completers' perceptions of college readiness and social capital using a quantitative methodology. Also, the study used a descriptive, cross-sectional research design framed by the social capital theoretical perspective. The conceptual framework developed…

  12. Implementing Response to Intervention in Title I Elementary Schools: A Quantitative Study of Teacher Response Relationships

    ERIC Educational Resources Information Center

    Webster, Katina F.

    2012-01-01

    General educators and special educators in Title I elementary schools perceive the relationships between principles of RTI and their state RTI framework, the implementation of RTI, and professional development received in RTI differently. A quantitative survey-based research methodology was employed including the use of Cronbach's alpha to…

  13. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
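A chance-constraint formulation of this kind can be sketched as follows; the toy "simulation" below is an invented stand-in for a real terminating discrete-event model, not the paper's launch-vehicle model:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_makespan(n_servers, n_reps=500):
    """Toy terminating simulation: mean processing time shrinks with
    added resources, plus random variation (illustrative only)."""
    base = 100.0 / n_servers
    return base + rng.exponential(scale=5.0, size=n_reps)

def meets_chance_constraint(n_servers, limit=30.0, alpha=0.95):
    """Chance constraint P(makespan <= limit) >= alpha, estimated from
    independent replications of the stochastic simulation."""
    samples = simulate_makespan(n_servers)
    return np.mean(samples <= limit) >= alpha

# Minimize resources subject to the chance constraint: scan upward
# until the smallest feasible resource level is found.
best = next(n for n in range(1, 50) if meets_chance_constraint(n))
print(best)
```

In practice the estimated probability would carry a confidence interval, so the constraint is checked statistically rather than on a point estimate alone.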

  14. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1998-01-01

    This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis; and (3) optimization. The product and process definitions are part of the input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.

  15. Towards a general object-oriented software development methodology

    NASA Technical Reports Server (NTRS)

    Seidewitz, ED; Stark, Mike

    1986-01-01

    Object diagrams were used to design a 5000-statement team training exercise and to design the entire dynamics simulator. The object diagrams are also being used to design another 50,000-statement Ada system and a personal computer based system that will be written in Modula II. The design methodology evolves out of these experiences as well as the limitations of other methods that were studied. Object diagrams, abstraction analysis, and associated principles provide a unified framework which encompasses concepts from Yourdon, Booch, and Cherry. This general object-oriented approach handles high-level system design, possibly with concurrency, through object-oriented decomposition down to a completely functional level. How object-oriented concepts can be used in other phases of the software life cycle, such as specification and testing, is being studied concurrently.

  16. Stakeholder management for conservation projects: a case study of Ream National Park, Cambodia.

    PubMed

    De Lopez, T T

    2001-07-01

    The paper gives an account of the development and implementation of a stakeholder management framework at Ream National Park, Cambodia. Firstly, the concept of stakeholder is reviewed in management and in conservation literatures. Secondly, the context in which the stakeholder framework was implemented is described. Thirdly, a five-step methodological framework is suggested: (1) stakeholder analysis, (2) stakeholder mapping, (3) development of generic strategies and workplan, (4) presentation of the workplan to stakeholders, and (5) implementation of the workplan. This framework classifies stakeholders according to their level of influence on the project and their potential for the conservation of natural resources. In a situation characterized by conflicting claims on natural resources, park authorities were able to successfully develop specific strategies for the management of stakeholders. The conclusion discusses the implications of the Ream experience and the generalization of the framework to other protected areas.
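Step 2 of the framework, stakeholder mapping by influence and conservation potential, is essentially a two-by-two classification; the quadrant labels, names, and scores below are invented for illustration and are not the paper's exact scheme:

```python
def map_stakeholder(influence, potential, threshold=0.5):
    """Place a stakeholder in one of four management quadrants based on
    influence on the project and potential contribution to conservation."""
    if influence >= threshold and potential >= threshold:
        return "key player: collaborate closely"
    if influence >= threshold:
        return "high influence, low potential: negotiate and monitor"
    if potential >= threshold:
        return "high potential, low influence: empower and involve"
    return "low priority: keep informed"

# Hypothetical stakeholders with scores in [0, 1], illustration only.
stakeholders = {"park rangers": (0.8, 0.9), "charcoal traders": (0.7, 0.2),
                "local fishers": (0.3, 0.8), "tourists": (0.2, 0.3)}
for name, (inf, pot) in stakeholders.items():
    print(name, "->", map_stakeholder(inf, pot))
```

Steps 3 to 5 then attach a generic strategy and workplan item to each quadrant.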

  17. Assessing the impact of modeling limits on intelligent systems

    NASA Technical Reports Server (NTRS)

    Rouse, William B.; Hammer, John M.

    1990-01-01

    The knowledge bases underlying intelligent systems are validated. A general conceptual framework is provided for considering the roles in intelligent systems of models of physical, behavioral, and operational phenomena. A methodology is described for identifying limits in particular intelligent systems, and the use of the methodology is illustrated via an experimental evaluation of the pilot-vehicle interface within the Pilot's Associate. The requirements and functionality are outlined for a computer based knowledge engineering environment which would embody the approach advocated and illustrated in earlier discussions. Issues considered include the specific benefits of this functionality, the potential breadth of applicability, and technical feasibility.

  18. Validation Methods Research for Fault-Tolerant Avionics and Control Systems: Working Group Meeting, 2

    NASA Technical Reports Server (NTRS)

    Gault, J. W. (Editor); Trivedi, K. S. (Editor); Clary, J. B. (Editor)

    1980-01-01

    The validation process comprises the activities required to ensure the agreement of system realization with system specification. A preliminary validation methodology for fault-tolerant systems is documented. A general framework for a validation methodology is presented, along with a set of specific tasks intended for the validation of two specimen systems, SIFT and FTMP. Two major areas of research are identified: first, those activities required to support the ongoing development of the validation process itself, and second, those activities required to support the design, development, and understanding of fault-tolerant systems.

  19. An Open IMS-Based User Modelling Approach for Developing Adaptive Learning Management Systems

    ERIC Educational Resources Information Center

    Boticario, Jesus G.; Santos, Olga C.

    2007-01-01

    Adaptive LMS have not yet reached the eLearning marketplace due to methodological, technological and management open issues. At aDeNu group, we have been working on two key challenges for the last five years in related research projects. Firstly, develop the general framework and a running architecture to support the adaptive life cycle (i.e.,…

  20. Computational Approaches for Analyzing Tradeoffs between Training and Aiding. Final Technical Paper for Period February-December 1989.

    ERIC Educational Resources Information Center

    Rouse, William B.; Johnson, William B.

    A methodological framework is presented for representing tradeoffs among alternative combinations of training and aiding for personnel in complex situations. In general, more highly trained people need less aid, and those with less training need more aid. Balancing training and aiding to accomplish the objectives of the system in a cost effective…

  1. Is There a European View on Health Economic Evaluations? Results from a Synopsis of Methodological Guidelines Used in the EUnetHTA Partner Countries.

    PubMed

    Heintz, Emelie; Gerber-Grote, Andreas; Ghabri, Salah; Hamers, Francoise F; Rupel, Valentina Prevolnik; Slabe-Erker, Renata; Davidson, Thomas

    2016-01-01

    The objectives of this study were to review current methodological guidelines for economic evaluations of all types of technologies in the 33 countries with organizations involved in the European Network for Health Technology Assessment (EUnetHTA), and to provide a general framework for economic evaluation at a European level. Methodological guidelines for health economic evaluations used by EUnetHTA partners were collected through a survey. Information from each guideline was extracted using a pre-tested extraction template. On the basis of the extracted information, a summary describing the methods used by the EUnetHTA countries was written for each methodological item. General recommendations were formulated for methodological issues where the guidelines of the EUnetHTA partners were in agreement or where the usefulness of economic evaluations may be increased by presenting the results in a specific way. At least one contact person from all 33 EUnetHTA countries (100 %) responded to the survey. In total, the review included 51 guidelines, representing 25 countries (eight countries had no methodological guideline for health economic evaluations). On the basis of the results of the extracted information from all 51 guidelines, EUnetHTA issued ten main recommendations for health economic evaluations. The presented review of methodological guidelines for health economic evaluations and the consequent recommendations will hopefully improve the comparability, transferability and overall usefulness of economic evaluations performed within EUnetHTA. Nevertheless, there are still methodological issues that need to be investigated further.

  2. A generalized plate method for estimating total aerobic microbial count.

    PubMed

    Ho, Kai Fai

    2004-01-01

    The plate method outlined in Chapter 61: Microbial Limit Tests of the U.S. Pharmacopeia (USP 61) provides very specific guidance for assessing total aerobic bioburden in pharmaceutical articles. This methodology, while comprehensive, lacks the flexibility to be useful in all situations. By studying the plate method as a special case within a more general family of assays, the effects of each parameter in the guidance can be understood. Using a mathematical model to describe the plate counting procedure, a statistical framework for making more definitive statements about total aerobic bioburden is developed. Such a framework allows the laboratory scientist to adjust the USP 61 methods to satisfy specific practical constraints. In particular, it is shown that the plate method can be conducted, albeit with stricter acceptance criteria, using a test specimen quantity that is smaller than the 10 g or 10 mL prescribed in the guidance. Finally, the interpretation of results proffered by the guidance is re-examined within this statistical framework and shown to be overly aggressive.
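A minimal numerical sketch of the kind of statistical reasoning this abstract describes, assuming colony counts follow a Poisson model; the parameter values, limits, and function names here are illustrative and are not taken from USP 61:

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam), summed in log space for stability."""
    logs = [-lam + i * math.log(lam) - math.lgamma(i + 1) for i in range(k + 1)]
    m = max(logs)
    return math.exp(m) * sum(math.exp(v - m) for v in logs)

def pass_probability(true_cfu_per_g, grams_tested, limit_cfu_per_g):
    """Chance a lot passes: observed plate count <= limit scaled to tested mass."""
    lam = true_cfu_per_g * grams_tested        # expected colonies on the plate
    k = int(limit_cfu_per_g * grams_tested)    # acceptance count for this mass
    return poisson_cdf(k, lam)

# A lot whose true bioburden (120 CFU/g) exceeds a 100 CFU/g limit:
# the full 10 g test almost never passes it, but a reduced 1 g specimen
# passes a few percent of the time -- which is why a smaller test quantity
# calls for stricter acceptance criteria.
p10 = pass_probability(true_cfu_per_g=120, grams_tested=10, limit_cfu_per_g=100)
p1 = pass_probability(true_cfu_per_g=120, grams_tested=1, limit_cfu_per_g=100)
print(p10, p1)
```

The gap between the two probabilities is the false-pass risk introduced by testing less material at an unchanged nominal limit.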

  3. Steps in creating a methodology for interpreting a geodiversity element -integrating a geodiversity element in the popular knowledge

    NASA Astrophysics Data System (ADS)

    Toma, Cristina; Andrasanu, Alexandru

    2017-04-01

    Conserving geodiversity, and especially geological heritage, is not as well integrated into general knowledge as biodiversity is. With that in mind, through this research we try to find a better way of conveying a geological process to the general public. The means of integrating a geodiversity element into popular knowledge is interpretation. Interpretation "translates" scientific information into a common language built on facts well known to the general public. The purpose of this paper is to create a framework for a methodology for interpreting a geodiversity element - salt - in Buzau Land Geopark. We approach the salt subject through a scheme, in order to have a general view of the process and to better understand and explain it to the general public. We look at the subject from three scientific points of view: GEODIVERSITY, ANTHROPOLOGY, and the SOCIO-ECONOMIC aspect. Each of these points of view, or domains, is divided into themes. GEODIVERSITY will have the following themes: Formation, Accumulation, Diapirism process, Chemical formula, Landscape (here we also include the specific biodiversity, with the halophile plants), Landforms, and Hazard. ANTHROPOLOGY will contain themes of tangible and intangible heritage: Salt symbolism, Stories and ritual usage, Recipes, and How the knowledge is transmitted. The SOCIO-ECONOMIC aspect will be reflected through themes such as Extractive methods, Usage, Interdictions, Taxes, and Commercial exchanges. Each theme will have a set of keywords that will be described, and each will be at the base of the elements that together will form the interpretation of the geodiversity element - the salt. The next step will be to clearly set the scope of the interpretation, that is, the field to which the interpretation process is addressed: Education (undergraduate or postgraduate students), Science, Geotourism, or Entrepreneurship.
After putting together the elements derived from the keywords and establishing the purpose of the interpretation, the following step will be finding the message to be sent through interpretation. The last step of the framework will be finding the proper means to transmit the interpretive message: panels, installations, geo-routes, visitor centers, land art, and virtual/augmented reality. This framework would represent a methodology to be followed when interpreting scientific knowledge about a geological process. Thus, this approach - the geodiversity reflected through the anthropological and socio-economic aspects - would be a successful method for showing the general public how a geological element has influenced their lives, drawing them closer to Earth Sciences.

  4. A general science-based framework for dynamical spatio-temporal models

    USGS Publications Warehouse

    Wikle, C.K.; Hooten, M.B.

    2010-01-01

    Spatio-temporal statistical models are increasingly being used across a wide variety of scientific disciplines to describe and predict spatially-explicit processes that evolve over time. Correspondingly, in recent years there has been a significant amount of research on new statistical methodology for such models. Although descriptive models that approach the problem from the second-order (covariance) perspective are important, and innovative work is being done in this regard, many real-world processes are dynamic, and it can be more efficient in some cases to characterize the associated spatio-temporal dependence by the use of dynamical models. The chief challenge with the specification of such dynamical models has been related to the curse of dimensionality. Even in fairly simple linear, first-order Markovian, Gaussian error settings, statistical models are often overparameterized. Hierarchical models have proven invaluable in their ability to deal to some extent with this issue by allowing dependency among groups of parameters. In addition, this framework has allowed for the specification of science-based parameterizations (and associated prior distributions) in which classes of deterministic dynamical models (e.g., partial differential equations (PDEs), integro-difference equations (IDEs), matrix models, and agent-based models) are used to guide specific parameterizations. Most of the focus for the application of such models in statistics has been in the linear case. The problems mentioned above with linear dynamic models are compounded in the case of nonlinear models. In this sense, the need for coherent and sensible model parameterizations is not only helpful, it is essential. Here, we present an overview of a framework for incorporating scientific information to motivate dynamical spatio-temporal models. First, we illustrate the methodology with the linear case. We then develop a general nonlinear spatio-temporal framework that we call general quadratic nonlinearity and demonstrate that it accommodates many different classes of science-based parameterizations as special cases. The model is presented in a hierarchical Bayesian framework and is illustrated with examples from ecology and oceanography. © 2010 Sociedad de Estadística e Investigación Operativa.
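The linear, first-order Markovian case the authors start from can be illustrated with a toy simulation in which the propagator matrix is parameterized by a discretized diffusion PDE; the grid size, diffusion weight, and noise level below are illustrative assumptions, not the authors' model:

```python
import random

random.seed(0)

# Y_t = M Y_{t-1} + eta_t on a 1-D grid, where the rows of M implement a
# discretized diffusion PDE: each site keeps weight (1 - 2*alpha) and takes
# weight alpha from each neighbour -- a science-based parameterization that
# avoids estimating all n*n entries of M freely.
n, alpha, noise_sd = 20, 0.3, 0.01
y = [0.0] * n
y[n // 2] = 1.0                        # initial point disturbance

def step(y):
    out = []
    for i in range(len(y)):
        left = y[i - 1] if i > 0 else y[i]
        right = y[i + 1] if i < len(y) - 1 else y[i]
        mean = (1 - 2 * alpha) * y[i] + alpha * (left + right)
        out.append(mean + random.gauss(0.0, noise_sd))   # additive Gaussian error
    return out

for _ in range(10):
    y = step(y)
print(max(y))   # the disturbance has spread out and its peak has decayed
```

In the hierarchical setting, `alpha` would get a prior informed by the physics rather than being fixed, which is the point of the science-based specification.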

  5. The relationship between symbolic interactionism and interpretive description.

    PubMed

    Oliver, Carolyn

    2012-03-01

    In this article I explore the relationship between symbolic interactionist theory and interpretive description methodology. The two are highly compatible, making symbolic interactionism an excellent theoretical framework for interpretive description studies. The pragmatism underlying interpretive description supports locating the methodology within this cross-disciplinary theory to make it more attractive to nonnursing researchers and expand its potential to address practice problems across the applied disciplines. The theory and method are so compatible that symbolic interactionism appears to be part of interpretive description's epistemological foundations. Interpretive description's theoretical roots have, to date, been identified only very generally in interpretivism and the philosophy of nursing. A more detailed examination of its symbolic interactionist heritage furthers the contextualization or forestructuring of the methodology to meet one of its own requirements for credibility.

  6. High-Order Hyperbolic Residual-Distribution Schemes on Arbitrary Triangular Grids

    DTIC Science & Technology

    2015-06-22

    We construct these schemes based on the Low-Diffusion-A and the Streamwise-Upwind-Petrov-Galerkin methodology formulated in the framework of the residual-distribution method. For both second- and third-order schemes, we construct a fully implicit

  7. A methodology to model causal relationships on offshore safety assessment focusing on human and organizational factors.

    PubMed

    Ren, J; Jenkinson, I; Wang, J; Xu, D L; Yang, J B

    2008-01-01

    Focusing on people and organizations, this paper aims to contribute to offshore safety assessment by proposing a methodology to model causal relationships. The methodology is proposed in a general sense: it is capable of accommodating the modeling of multiple risk factors considered in offshore operations and can deal with different types of data that may come from different sources. Reason's "Swiss cheese" model is used to form a generic offshore safety assessment framework, and Bayesian Networks (BNs) are tailored to fit into the framework to construct a causal relationship model. The proposed framework uses a five-level-structure model to address latent failures within the causal sequence of events. The five levels are the Root causes level, Trigger events level, Incidents level, Accidents level, and Consequences level. To analyze and model the safety of a specified offshore installation, a BN model was established following the guideline of the proposed five-level framework. A range of events was specified, and the related prior and conditional probabilities of the BN model were assigned based on the inherent characteristics of each event. This paper shows that Reason's "Swiss cheese" model and BNs can be jointly used in offshore safety assessment. On the one hand, the five-level conceptual model is enhanced by BNs, which are capable of providing a graphical demonstration of inter-relationships as well as calculating numerical values of occurrence likelihood for each failure event. The Bayesian inference mechanism also makes it possible to monitor how a safety situation changes as information flows forward and backward within the networks. On the other hand, BN modeling relies heavily on experts' personal experiences and is therefore highly domain specific.
"Swiss cheese" model is such a theoretic framework that it is based on solid behavioral theory and therefore can be used to provide industry with a roadmap for BN modeling and implications. A case study of the collision risk between a Floating Production, Storage and Offloading (FPSO) unit and authorized vessels caused by human and organizational factors (HOFs) during operations is used to illustrate an industrial application of the proposed methodology.

  8. Diagnostic radiograph based 3D bone reconstruction framework: application to the femur.

    PubMed

    Gamage, P; Xie, S Q; Delmas, P; Xu, W L

    2011-09-01

    Three dimensional (3D) visualization of anatomy plays an important role in image guided orthopedic surgery and ultimately motivates minimally invasive procedures. However, direct 3D imaging modalities such as Computed Tomography (CT) are restricted to a minority of complex orthopedic procedures. Thus the diagnostics and planning of many interventions still rely on two dimensional (2D) radiographic images, where the surgeon has to mentally visualize the anatomy of interest. The purpose of this paper is to apply and validate a bi-planar 3D reconstruction methodology driven by prominent bony anatomy edges and contours identified on orthogonal radiographs. The results obtained through the proposed methodology are benchmarked against 3D CT scan data to assess the accuracy of reconstruction. The human femur has been used as the anatomy of interest throughout the paper. The novelty of this methodology is that it not only involves the outer contours of the bony anatomy in the reconstruction but also several key interior edges identifiable on radiographic images. Hence, this framework is not simply limited to long bones, but is generally applicable to a multitude of other bony anatomies as illustrated in the results section. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Mapping Chemical Selection Pathways for Designing Multicomponent Alloys: an informatics framework for materials design.

    PubMed

    Srinivasan, Srikant; Broderick, Scott R; Zhang, Ruifeng; Mishra, Amrita; Sinnott, Susan B; Saxena, Surendra K; LeBeau, James M; Rajan, Krishna

    2015-12-18

    A data-driven methodology is developed for tracking the collective influence of the multiple attributes of alloying elements on both thermodynamic and mechanical properties of metal alloys. Cobalt-based superalloys are used as a template to demonstrate the approach. By mapping the high-dimensional nature of the systematics of elemental data embedded in the periodic table into the form of a network graph, one can guide targeted first-principles calculations that identify the influence of specific elements on phase stability, crystal structure, and elastic properties. This provides a fundamentally new means to rapidly identify new stable alloy chemistries with enhanced high-temperature properties. The resulting visualization scheme exhibits the grouping and proximity of elements based on their impact on the properties of intermetallic alloys. Unlike the periodic table, however, the distance between neighboring elements uncovers relationships in a complex high-dimensional information space that would not have been easily seen otherwise. The predictions of the methodology are found to be consistent with reported experimental and theoretical studies. The informatics-based methodology presented in this study can be generalized to a framework for data analysis and knowledge discovery that can be applied to many material systems and recreated for different design objectives.

  10. Fully Associative, Nonisothermal, Potential-Based Unified Viscoplastic Model for Titanium-Based Matrices

    NASA Technical Reports Server (NTRS)

    2005-01-01

    A number of titanium matrix composite (TMC) systems are currently being investigated for high-temperature airframe and propulsion system applications. As a result, numerous computational methodologies for predicting both deformation and life for this class of materials are under development. An integral part of these methodologies is an accurate and computationally efficient constitutive model for the metallic matrix constituent. Furthermore, because these systems are designed to operate at elevated temperatures, the required constitutive models must account for both time-dependent and time-independent deformations. To accomplish this, the NASA Lewis Research Center is employing a recently developed, complete, potential-based framework. This framework, which utilizes internal state variables, was put forth for the derivation of reversible and irreversible constitutive equations. The framework, and consequently the resulting constitutive model, is termed complete because the existence of the total (integrated) forms of the Gibbs complementary free energy and complementary dissipation potentials is assumed a priori. The specific forms selected here for both the Gibbs and complementary dissipation potentials result in a fully associative, multiaxial, nonisothermal, unified viscoplastic model with nonlinear kinematic hardening. This model constitutes one of many models in the Generalized Viscoplasticity with Potential Structure (GVIPS) class of inelastic constitutive equations.

  11. A Patient-Centered Framework for Evaluating Digital Maturity of Health Services: A Systematic Review

    PubMed Central

    Callahan, Ryan; Darzi, Ara; Mayer, Erik

    2016-01-01

    Background Digital maturity is the extent to which digital technologies are used as enablers to deliver a high-quality health service. Extensive literature exists about how to assess the components of digital maturity, but it has not been used to design a comprehensive framework for evaluation. Consequently, the measurement systems that do exist are limited to evaluating digital programs within one service or care setting, meaning that digital maturity evaluation is not accounting for the needs of patients across their care pathways. Objective The objective of our study was to identify the best methods and metrics for evaluating digital maturity and to create a novel, evidence-based tool for evaluating digital maturity across patient care pathways. Methods We systematically reviewed the literature to find the best methods and metrics for evaluating digital maturity. We searched the PubMed database for all papers relevant to digital maturity evaluation. Papers were selected if they provided insight into how to appraise digital systems within the health service and if they indicated the factors that constitute or facilitate digital maturity. Papers were analyzed to identify methodology for evaluating digital maturity and indicators of digitally mature systems. We then used the resulting information about methodology to design an evaluation framework. Following that, the indicators of digital maturity were extracted and grouped into increasing levels of maturity and operationalized as metrics within the evaluation framework. Results We identified 28 papers as relevant to evaluating digital maturity, from which we derived 5 themes. The first theme concerned general evaluation methodology for constructing the framework (7 papers). The following 4 themes were the increasing levels of digital maturity: resources and ability (6 papers), usage (7 papers), interoperability (3 papers), and impact (5 papers). 
The framework includes metrics for each of these levels at each stage of the typical patient care pathway. Conclusions The framework uses a patient-centric model that departs from traditional service-specific measurements and allows for novel insights into how digital programs benefit patients across the health system. Trial Registration N/A PMID:27080852

  12. Spatial operator algebra framework for multibody system dynamics

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Jain, Abhinandan; Kreutz, K.

    1989-01-01

    The Spatial Operator Algebra framework for the dynamics of general multibody systems is described. The use of a spatial operator-based methodology permits the formulation of the dynamical equations of motion of multibody systems in a concise and systematic way. The dynamical equations of progressively more complex rigid multibody systems are developed in an evolutionary manner, beginning with a serial chain system, followed by a tree topology system, and finally systems with arbitrary closed loops. Operator factorizations and identities are used to develop novel recursive algorithms for the forward dynamics of systems with closed loops. Extensions required to deal with flexible elements are also discussed.

  13. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    NASA Astrophysics Data System (ADS)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.
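As a crude illustration of how such a tool turns a degradation rate, PoF, and CoF into an inspection interval: the linear PoF proxy, corrosion rate, and cost figures below are invented for the sketch and are far simpler than the DNV/API models the software implements:

```python
def prob_of_failure(years, rate_mm_per_yr, allowance_mm):
    """Crude PoF proxy: fraction of the corrosion allowance consumed, capped at 1."""
    return min(1.0, rate_mm_per_yr * years / allowance_mm)

def inspection_interval(rate_mm_per_yr, allowance_mm, cof_usd, risk_target_usd):
    """First time (in quarter-year steps) at which risk = PoF * CoF reaches the target."""
    years = 0.0
    while prob_of_failure(years, rate_mm_per_yr, allowance_mm) * cof_usd < risk_target_usd:
        years += 0.25
    return years

# Tank floor, soil-side corrosion: 0.25 mm/yr against a 4 mm allowance,
# a $4M consequence of failure, and a $500k risk target.
due = inspection_interval(rate_mm_per_yr=0.25, allowance_mm=4.0,
                          cof_usd=4_000_000, risk_target_usd=500_000)
print(f"next inspection due in {due} years")
```

A real assessment would replace the linear proxy with degradation-mechanism-specific PoF curves per damage mechanism and sum environmental and financial CoF, but the interval logic has this shape.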

  14. Hermeneutics as a Methodological Resource for Understanding Empathy in On-Line Learning Environments

    ERIC Educational Resources Information Center

    Walshaw, Margaret; Duncan, Wayne

    2015-01-01

    Hermeneutics is both a philosophical tradition and a methodological resource. In this qualitative study, hermeneutics provided, simultaneously, a framework and a methodology for understanding empathy in synchronous multimedia conferencing. As a framework for the design of the study, hermeneutics supported the overriding objective to understand the…

  15. Measuring the impact of methodological research: a framework and methods to identify evidence of impact.

    PubMed

    Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F

    2014-11-27

    Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; and formation of new collaborations and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally.
We have used the information gathered in this study to adapt an existing framework for impact of clinical research for use in methodological research. Gathering evidence on research impact of methodological research from a variety of sources has enabled us to obtain multiple indicators and thus to demonstrate broad impacts of methodological research. The adapted framework developed can be applied to future methodological research and thus provides a tool for methodologists to better assess and report research impacts.

  16. Alternatives Assessment Frameworks: Research Needs for the Informed Substitution of Hazardous Chemicals

    PubMed Central

    Jacobs, Molly M.; Malloy, Timothy F.; Tickner, Joel A.; Edwards, Sally

    2015-01-01

    Background Given increasing pressures for hazardous chemical replacement, there is growing interest in alternatives assessment to avoid substituting a toxic chemical with another of equal or greater concern. Alternatives assessment is a process for identifying, comparing, and selecting safer alternatives to chemicals of concern (including those used in materials, processes, or technologies) on the basis of their hazards, performance, and economic viability. Objectives The purposes of this substantive review of alternatives assessment frameworks are to identify consistencies and differences in methods and to outline needs for research and collaboration to advance science policy practice. Methods This review compares methods used in six core components of these frameworks: hazard assessment, exposure characterization, life-cycle impacts, technical feasibility evaluation, economic feasibility assessment, and decision making. Alternatives assessment frameworks published from 1990 to 2014 were included. Results Twenty frameworks were reviewed. The frameworks were consistent in terms of general process steps, but some differences were identified in the end points addressed. Methodological gaps were identified in the exposure characterization, life-cycle assessment, and decision–analysis components. Methods for addressing data gaps remain an issue. Discussion Greater consistency in methods and evaluation metrics is needed but with sufficient flexibility to allow the process to be adapted to different decision contexts. Conclusion Although alternatives assessment is becoming an important science policy field, there is a need for increased cross-disciplinary collaboration to refine methodologies in support of the informed substitution and design of safer chemicals, materials, and products. Case studies can provide concrete lessons to improve alternatives assessment. Citation Jacobs MM, Malloy TF, Tickner JA, Edwards S. 2016. 
Alternatives assessment frameworks: research needs for the informed substitution of hazardous chemicals. Environ Health Perspect 124:265–280; http://dx.doi.org/10.1289/ehp.1409581 PMID:26339778

  17. Towards a Model of Technology Adoption: A Conceptual Model Proposition

    NASA Astrophysics Data System (ADS)

    Costello, Pat; Moreton, Rob

    A conceptual model for Information Communication Technology (ICT) adoption by Small Medium Enterprises (SMEs) is proposed. The research uses several ICT adoption models as its basis, with theoretical underpinning provided by the Diffusion of Innovation theory and the Technology Acceptance Model (TAM). Taking an exploratory research approach, the model was investigated amongst 200 SMEs whose core business is ICT. Evidence from this study demonstrates that these SMEs face the same issues as all other industry sectors. This work points out weaknesses in SME environments regarding ICT adoption and suggests what they may need to do to increase the success rate of any proposed adoption. The methodology for development of the framework is described, and recommendations are made for improved Government-led ICT adoption initiatives. Application of the general methodology has resulted in new opportunities to embed the ethos and culture surrounding the issues into the framework of new projects developed as a result of Government intervention. A conceptual model is proposed that may lead to a deeper understanding of the issues under consideration.

  18. Evaluating multiple determinants of the structure of plant-animal mutualistic networks.

    PubMed

    Vázquez, Diego P; Chacoff, Natacha P; Cagnolo, Luciano

    2009-08-01

    The structure of mutualistic networks is likely to result from the simultaneous influence of neutrality and the constraints imposed by complementarity in species phenotypes, phenologies, spatial distributions, phylogenetic relationships, and sampling artifacts. We develop a conceptual and methodological framework to evaluate the relative contributions of these potential determinants. Applying this approach to the analysis of a plant-pollinator network, we show that information on relative abundance and phenology suffices to predict several aggregate network properties (connectance, nestedness, interaction evenness, and interaction asymmetry). However, such information falls short of predicting the detailed network structure (the frequency of pairwise interactions), leaving a large amount of variation unexplained. Taken together, our results suggest that both relative species abundance and complementarity in spatiotemporal distribution contribute substantially to generating the observed network patterns, but that this information is by no means sufficient to predict the occurrence and frequency of pairwise interactions. Future studies could use our methodological framework to evaluate the generality of our findings in a representative sample of study systems with contrasting ecological conditions.
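Two of the aggregate properties mentioned, connectance and interaction evenness, are simple to compute from an interaction matrix; the toy visit counts below are invented and are not the study's plant-pollinator data:

```python
import math

# Rows = plant species, columns = pollinator species, entries = visit counts.
web = [
    [4, 1, 0],
    [0, 2, 1],
    [3, 0, 0],
]

plants, animals = len(web), len(web[0])
links = sum(1 for row in web for v in row if v > 0)
connectance = links / (plants * animals)   # realized fraction of possible links

# Interaction evenness: Shannon entropy of interaction frequencies,
# normalized by its maximum over the realized links.
total = sum(v for row in web for v in row)
freqs = [v / total for row in web for v in row if v > 0]
shannon = -sum(p * math.log(p) for p in freqs)
evenness = shannon / math.log(links)
print(f"connectance = {connectance:.3f}, interaction evenness = {evenness:.3f}")
```

Nestedness and interaction asymmetry require more machinery, but they follow the same pattern of being functions of this matrix.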

  19. A dynamic modelling framework towards the solution of reduction in smoking prevalence

    NASA Astrophysics Data System (ADS)

    Halim, Tisya Farida Abdul; Sapiri, Hasimah; Abidin, Norhaslinda Zainal

    2016-10-01

    This paper presents a hypothetical framework towards a solution for reducing smoking prevalence in Malaysia. The framework is designed to assist decision-making related to reducing smoking prevalence using SD and OCT. In general, the framework is developed using the SD approach, with OCT embedded in the policy evaluation process. Smoking prevalence is one of the determinants that play an important role in measuring the successful implementation of anti-smoking strategies. It is therefore critical to determine the optimal value of smoking prevalence in order to trim down the hazardous effects of smoking on society. However, the smoking problem is increasingly complex, since many issues ranging from the behavioral to the economic need to be considered simultaneously. Thus, a hypothetical framework of the control model embedded in the SD methodology is expected to obtain the minimum value of smoking prevalence, the output of which will in turn provide a guideline for tobacco researchers as well as decision makers in policy design and evaluation.

  20. Data mining in soft computing framework: a survey.

    PubMed

    Mitra, S; Pal, S K; Mitra, P

    2002-01-01

    The present article provides a survey of the available literature on data mining using soft computing. A categorization has been provided based on the different soft computing tools and their hybridizations used, the data mining function implemented, and the preference criterion selected by the model. The utility of the different soft computing methodologies is highlighted. Generally, fuzzy sets are suitable for handling issues related to the understandability of patterns, incomplete/noisy data, mixed media information, and human interaction, and can provide approximate solutions faster. Neural networks are nonparametric, robust, and exhibit good learning and generalization capabilities in data-rich environments. Genetic algorithms provide efficient search algorithms to select a model, from mixed media data, based on some preference criterion/objective function. Rough sets are suitable for handling different types of uncertainty in data. Some challenges to data mining and the application of soft computing methodologies are indicated. An extensive bibliography is also included.

  1. A Unified Methodology for Computing Accurate Quaternion Color Moments and Moment Invariants.

    PubMed

    Karakasis, Evangelos G; Papakostas, George A; Koulouriotis, Dimitrios E; Tourassis, Vassilios D

    2014-02-01

    In this paper, a general framework for computing accurate quaternion color moments and their corresponding invariants is proposed. The proposed unified scheme arose from studying the characteristics of different orthogonal polynomials. These polynomials are used as kernels in order to form moments, the invariants of which can easily be derived. The resulting scheme permits the use of any polynomial-like kernel in a unified and consistent way. The resulting moments and moment invariants demonstrate robustness to noisy conditions and high discriminative power. Additionally, in the case of continuous moments, accurate computations take place to avoid approximation errors. Based on this general methodology, the quaternion Tchebichef, Krawtchouk, Dual Hahn, Legendre, orthogonal Fourier-Mellin, pseudo-Zernike and Zernike color moments, and their corresponding invariants, are introduced. A selected paradigm presents the reconstruction capability of each moment family, whereas proper classification scenarios evaluate the performance of the color moment invariants.

  2. Statistical Inference for Data Adaptive Target Parameters.

    PubMed

    Hubbard, Alan E; Kherad-Pajouh, Sara; van der Laan, Mark J

    2016-05-01

    Suppose one observes n i.i.d. copies of a random variable with a probability distribution known to be an element of a particular statistical model. In order to define our statistical target, we partition the sample into V equal-size sub-samples and use this partitioning to define V splits, each consisting of an estimation sample (one of the V sub-samples) and the corresponding complementary parameter-generating sample. For each of the V parameter-generating samples, we apply an algorithm that maps the sample to a statistical target parameter. We define our sample-split data adaptive statistical target parameter as the average of these V sample-specific target parameters. We present an estimator (and corresponding central limit theorem) for this type of data adaptive target parameter. This general methodology for generating data adaptive target parameters is demonstrated with a number of practical examples that highlight new opportunities for statistical learning from data. This new framework provides a rigorous statistical methodology for both exploratory and confirmatory analysis within the same data. Given that more research is becoming "data-driven", the theory developed in this paper provides a new impetus for greater involvement of statistical inference in problems that are increasingly addressed by clever, yet ad hoc, pattern-finding methods. To suggest such potential, and to verify the predictions of the theory, extensive simulation studies, along with a data analysis based on adaptively determined intervention rules, are shown and give insight into how to structure such an approach. The results show that the data adaptive target parameter approach provides a general framework and resulting methodology for data-driven science.
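
    The V-fold construction can be illustrated with a toy simulation. Everything below (the data-generating process, the seed, and the "pick the feature most correlated with the outcome" algorithm) is an assumption made for illustration: each parameter-generating sample defines the target data-adaptively, and the complementary estimation sample estimates it:

```python
import numpy as np

rng = np.random.default_rng(0)
n, V = 500, 5
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 1] + rng.normal(size=n)   # feature 1 truly matters

folds = np.array_split(rng.permutation(n), V)
estimates = []
for v in range(V):
    est_idx = folds[v]                   # estimation sample
    gen_idx = np.concatenate([folds[u] for u in range(V) if u != v])
    # parameter-generating sample: data-adaptively define the target
    # as "the OLS slope of the feature most correlated with y"
    cors = [abs(np.corrcoef(X[gen_idx, j], y[gen_idx])[0, 1])
            for j in range(X.shape[1])]
    j_star = int(np.argmax(cors))
    # estimation sample: estimate the fold-specific target parameter
    estimates.append(np.polyfit(X[est_idx, j_star], y[est_idx], 1)[0])

theta = float(np.mean(estimates))   # sample-split data adaptive parameter
```

    Averaging the V fold-specific estimates yields the sample-split data adaptive parameter; in this toy setup it should land near the true coefficient of 2.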

  3. A Framework and a Methodology for Developing Authentic Constructivist e-Learning Environments

    ERIC Educational Resources Information Center

    Zualkernan, Imran A.

    2006-01-01

    Semantically rich domains require operative knowledge to solve complex problems in real-world settings. These domains provide an ideal environment for developing authentic constructivist e-learning environments. In this paper we present a framework and a methodology for developing authentic learning environments for such domains. The framework is…

  4. Where does good quality qualitative health care research get published?

    PubMed

    Richardson, Jane C; Liddle, Jennifer

    2017-09-01

    This short report aims to give some insight into current publication patterns for high-quality qualitative health research, using the Research Excellence Framework (REF) 2014 database. We explored patterns of publication by range and type of journal, by date and by methodological focus. We also looked at variations between the publications submitted to different Units of Assessment, focussing particularly on the one most closely aligned with our own research area of primary care. Our brief analysis demonstrates that general medical/health journals with high impact factors are the dominant routes of publication, but there is variation according to the methodological approach adopted by articles. The number of qualitative health articles submitted to REF 2014 overall was small, and even more so for articles based on mixed methods research, qualitative methodology or reviews/syntheses that included qualitative articles.

  5. Epidemiology Characteristics, Methodological Assessment and Reporting of Statistical Analysis of Network Meta-Analyses in the Field of Cancer

    PubMed Central

    Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu

    2016-01-01

    Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analyses. Our study aims to investigate the epidemiological characteristics, conduct of literature searches, methodological quality, and reporting of the statistical analysis process in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer, 61 of which were conducted using a Bayesian framework. Of these, more than half did not report an assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report an assessment of similarity (86.89%) and did not use the GRADE tool to assess the quality of evidence (95.08%). 43 NMAs were adjusted indirect comparisons; the methods used were described in 53.49% of them. Only 4.65% of NMAs described the details of handling multi-group trials, and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00–8.25). Methodological quality and reporting of statistical analysis did not substantially differ by selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable. PMID:27848997

  6. Causal Analysis After Haavelmo

    PubMed Central

    Heckman, James; Pinto, Rodrigo

    2014-01-01

    Haavelmo's seminal 1943 and 1944 papers are the first rigorous treatment of causality. In them, he distinguished the definition of causal parameters from their identification. He showed that causal parameters are defined using hypothetical models that assign variation to some of the inputs determining outcomes while holding all other inputs fixed. He thus formalized and made operational Marshall's (1890) ceteris paribus analysis. We embed Haavelmo's framework into the recursive framework of Directed Acyclic Graphs (DAGs) used in one influential recent approach to causality (Pearl, 2000) and in the related literature on Bayesian nets (Lauritzen, 1996). We compare the simplicity of an analysis of causality based on Haavelmo's methodology with the complex and nonintuitive approach used in the causal literature of DAGs—the “do-calculus” of Pearl (2009). We discuss the severe limitations of DAGs and in particular of the do-calculus of Pearl in securing identification of economic models. We extend our framework to consider models for simultaneous causality, a central contribution of Haavelmo. In general cases, DAGs cannot be used to analyze models for simultaneous causality, but Haavelmo's approach naturally generalizes to cover them. PMID:25729123

  7. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    NASA Astrophysics Data System (ADS)

    Tabibzadeh, Maryam

    According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) to accident causation. Yet a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures shows that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method of ascertaining well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and the crew's NPT is discussed. The risk analysis methodology in this dissertation consists of three approaches whose integration constitutes the overall methodology. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew. This analysis helps identify the discrepancies between the two test procedures. 
The second approach is a conceptual risk assessment framework to analyze the causal factors of the mismatches identified in the previous step, as the main contributors to negative pressure test misinterpretation. Finally, a rational decision making model is introduced to quantify a section of the developed conceptual framework and analyze the impact of different decision making biases on negative pressure test results. Consistent with the findings of previous studies, the analysis of the developed conceptual framework indicates that organizational factors are root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis of this framework identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the highest influence on misinterpreting a negative pressure test. It is noteworthy that the organizational factors captured in the introduced conceptual framework are not specific to the scope of the NPT. Most of these organizational factors have been identified as common contributing causes not only of other offshore drilling accidents but also of accidents in other oil and gas operations as well as high-risk operations in other industries. In addition, the proposed rational decision making model introduces a quantitative structure for analyzing the results of a conducted NPT. This model provides a structure and parametrically derived formulas to determine a cut-off point value, which assists personnel in accepting or rejecting an implemented negative pressure test. Moreover, it enables analysts to assess different decision making biases involved in the process of interpreting a conducted negative pressure test as well as the root organizational factors of those biases. 
In general, although the proposed integrated research methodology in this dissertation is developed for the risk assessment of human and organizational factors contributions in negative pressure test misinterpretation, it can be generalized and be potentially useful for other well control situations, both offshore and onshore; e.g. fracking. In addition, this methodology can be applied for the analysis of any high-risk operations, in not only the oil and gas industry but also in other industries such as nuclear power plants, aviation industry, and transportation sector.

  8. Research in assessment: consensus statement and recommendations from the Ottawa 2010 Conference.

    PubMed

    Schuwirth, Lambert; Colliver, Jerry; Gruppen, Larry; Kreiter, Clarence; Mennin, Stewart; Onishi, Hirotaka; Pangaro, Louis; Ringsted, Charlotte; Swanson, David; Van Der Vleuten, Cees; Wagner-Menghin, Michaela

    2011-01-01

    Medical education research in general is a young scientific discipline that is still finding its position among the sciences. It is rooted in both the biomedical sciences and the social sciences, each with its own scientific language. A more unique feature of medical education (and assessment) research is that it has to be both locally and internationally relevant. This is not always easy, and sometimes leads to purely idiographic descriptions of an assessment procedure from which insufficient general lessons or generalised scientific knowledge can be drawn, or vice versa. For medical education research, a plethora of methodologies is available to cater to many different research questions. This article contains consensus positions and suggestions on various elements of medical education (assessment) research. Overarching is the position that without a good theoretical underpinning and good knowledge of the existing literature, good research and sound conclusions are impossible to produce, and that there is no inherently superior methodology: the best methodology is the one best suited to answering the research question unambiguously. Although the positions should not be perceived as dogmas, they should be taken as very serious recommendations. Topics covered are: types of research, theoretical frameworks, designs and methodologies, instrument properties or psychometrics, costs/acceptability, ethics, infrastructure, and support.

  9. Assessing the Benefits and Costs of Motion for C-17 Flight Simulators: Technical Appendixes.

    DTIC Science & Technology

    1986-06-01

    Conference, NAECON, 1983. Instructional System Development, AF Manual 50-2, USAF, May 25, 1979. Irish, P.A., and G.H. Buckland, "Effects of…control augmentation system; (4) the fidelity of different simulator motion cueing alternatives; (5) a suggested methodology for assessing the…evaluating the benefits and costs of incorporating motion systems in C-17 transport aircraft flight simulators and in developing a general framework

  10. Differences in Learning Style Preferences, Environmental Press Perceptions and Job Satisfaction between Surgical Intensive Care and General Surgical Unit Nurses

    DTIC Science & Technology

    1991-01-01

    well-defined theoretical basis (Bonham, 1988; Kirby, 1979). Although Kolb (1984) incorporated the works of Kurt Lewin, Jean Piaget and Carl Jung, John …styles influence how individuals assimilate environmental demands. Kolb's (1984) Experiential Learning Theory, whose roots are traced to John Dewey, Kurt … Lewin, and Jean Piaget, provides a methodological framework to understand and strengthen the relationships between education, personal development

  11. Application of Executable Architecture in Early Concept Evaluation using the DoD Architecture Framework

    DTIC Science & Technology

    2016-09-15

    Methodology Overview … III. Methodology … Overview of Research Methodology … Implementation of Methodology

  12. A Methodological Framework for Enterprise Information System Requirements Derivation

    NASA Astrophysics Data System (ADS)

    Caplinskas, Albertas; Paškevičiūtė, Lina

    Current information systems (IS) are enterprise-wide systems that support the strategic goals of the enterprise and meet its operational business needs. They are supported by information and communication technologies (ICT) and other software that should be fully integrated. To develop software responding to real business needs, we need a requirements engineering (RE) methodology that ensures the alignment of requirements across all levels of the enterprise system. The main contribution of this chapter is a requirement-oriented methodological framework that allows business requirements to be transformed, level by level, into software requirements. The structure of the proposed framework reflects the structure of Zachman's framework; however, it has different intentions, being designed to support requirements engineering rather than design.

  13. MPHASYS: a mouse phenotype analysis system

    PubMed Central

    Calder, R Brent; Beems, Rudolf B; van Steeg, Harry; Mian, I Saira; Lohman, Paul HM; Vijg, Jan

    2007-01-01

    Background Systematic, high-throughput studies of mouse phenotypes have been hampered by the inability to analyze individual animal data from a multitude of sources in an integrated manner. Studies generally make comparisons at the level of genotype or treatment thereby excluding associations that may be subtle or involve compound phenotypes. Additionally, the lack of integrated, standardized ontologies and methodologies for data exchange has inhibited scientific collaboration and discovery. Results Here we introduce a Mouse Phenotype Analysis System (MPHASYS), a platform for integrating data generated by studies of mouse models of human biology and disease such as aging and cancer. This computational platform is designed to provide a standardized methodology for working with animal data; a framework for data entry, analysis and sharing; and ontologies and methodologies for ensuring accurate data capture. We describe the tools that currently comprise MPHASYS, primarily ones related to mouse pathology, and outline its use in a study of individual animal-specific patterns of multiple pathology in mice harboring a specific germline mutation in the DNA repair and transcription-specific gene Xpd. Conclusion MPHASYS is a system for analyzing multiple data types from individual animals. It provides a framework for developing data analysis applications, and tools for collecting and distributing high-quality data. The software is platform independent and freely available under an open-source license [1]. PMID:17553167

  14. Health information systems: a survey of frameworks for developing countries.

    PubMed

    Marcelo, A B

    2010-01-01

    The objective of this paper is to perform a survey of excellent research on health information systems (HIS) analysis and design, and their underlying theoretical frameworks. It classifies these frameworks along major themes, and analyzes the different approaches to HIS development that are practical in resource-constrained environments. Literature review based on PubMed citations and conference proceedings, as well as Internet searches on information systems in general, and health information systems in particular. The field of health information systems development has been studied extensively. Despite this, failed implementations are still common. Theoretical frameworks for HIS development are available that can guide implementers. As awareness, acceptance, and demand for health information systems increase globally, the variety of approaches and strategies will also follow. For developing countries with scarce resources, a trial-and-error approach can be very costly. Lessons from the successes and failures of initial HIS implementations have been abstracted into theoretical frameworks. These frameworks organize complex HIS concepts into methodologies that standardize techniques in implementation. As globalization continues to impact healthcare in the developing world, demand for more responsive health systems will become urgent. More comprehensive frameworks and practical tools to guide HIS implementers will be imperative.

  15. Efficient particle-in-cell simulation of auroral plasma phenomena using a CUDA enabled graphics processing unit

    NASA Astrophysics Data System (ADS)

    Sewell, Stephen

    This thesis introduces a software framework that effectively utilizes low-cost commercially available Graphic Processing Units (GPUs) to simulate complex scientific plasma phenomena that are modeled using the Particle-In-Cell (PIC) paradigm. The software framework that was developed conforms to the Compute Unified Device Architecture (CUDA), a standard for general purpose graphic processing that was introduced by NVIDIA Corporation. This framework has been verified for correctness and applied to advance the state of understanding of the electromagnetic aspects of the development of the Aurora Borealis and Aurora Australis. For each phase of the PIC methodology, this research has identified one or more methods to exploit the problem's natural parallelism and effectively map it for execution on the graphic processing unit and its host processor. The sources of overhead that can reduce the effectiveness of parallelization for each of these methods have also been identified. One of the novel aspects of this research was the utilization of particle sorting during the grid interpolation phase. The final representation resulted in simulations that executed about 38 times faster than simulations that were run on a single-core general-purpose processing system. The scalability of this framework to larger problem sizes and future generation systems has also been investigated.
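
    As a rough, CPU-side illustration of one PIC phase (plain NumPy, not the CUDA framework itself), the sketch below performs first-order "cloud-in-cell" charge deposition on a periodic 1-D grid, sorting particles by position first, the same locality-improving idea the thesis exploits during grid interpolation:

```python
import numpy as np

def deposit_charge(x, q, n_cells, L):
    """First-order (cloud-in-cell) charge deposition on a periodic 1-D grid."""
    dx = L / n_cells
    order = np.argsort(x)            # sort particles by position for locality
    x, q = x[order], q[order]
    s = x / dx
    i = np.floor(s).astype(int)      # index of the grid node to the left
    w = s - i                        # fractional distance past that node
    rho = np.zeros(n_cells)
    np.add.at(rho, i % n_cells, q * (1 - w))        # share to left node
    np.add.at(rho, (i + 1) % n_cells, q * w)        # share to right node
    return rho / dx                  # charge density per cell

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 1000)      # particle positions in a unit domain
rho = deposit_charge(x, q=np.ones(1000), n_cells=16, L=1.0)
```

    Linear weighting conserves total charge exactly, which makes a convenient correctness check for both CPU and GPU implementations.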

  16. Evaluation of stormwater harvesting sites using multi criteria decision methodology

    NASA Astrophysics Data System (ADS)

    Inamdar, P. M.; Sharma, A. K.; Cook, Stephen; Perera, B. J. C.

    2018-07-01

    Selection of suitable urban stormwater harvesting (SWH) sites and associated project planning are often complex owing to spatial, temporal, economic, environmental, and social factors, among other variables. This paper develops a comprehensive methodological framework for evaluating SWH sites in urban areas using Multi Criteria Decision Analysis (MCDA). In the first phase, the framework selects potential SWH sites using spatial characteristics in a GIS environment. In the second phase, an MCDA methodology is used to evaluate and rank SWH sites in a multi-objective, multi-stakeholder environment. The paper briefly describes the first phase and focuses chiefly on the second. The application of the methodology is demonstrated on a case study of the local government area of the City of Melbourne (CoM), Australia, for the benefit of water professionals engaged in this area. Nine performance measures (PMs) were identified to characterise the objectives and system performance related to the eight alternative SWH sites. To reflect stakeholder interests, four stakeholder participant groups were identified: water authorities (WA), academics (AC), consultants (CS), and councils (CL). The decision analysis broadly consisted of deriving PROMETHEE II rankings of the eight alternative SWH sites in the CoM case study under two distinct group decision-making scenarios. The major innovation of this work is the development and application of a comprehensive methodological framework that assists in the selection of potential SWH sites and facilitates their ranking in a multi-objective, multi-stakeholder environment. It is expected that the proposed methodology will provide water professionals and managers with better knowledge, reducing subjectivity in the selection and evaluation of SWH sites.
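
    The PROMETHEE II step can be sketched compactly. The site scores and criterion weights below are invented, and the "usual" (step) preference function is assumed, so the example illustrates only the mechanics of net outranking flows, not the case-study data:

```python
import numpy as np

def promethee_ii(scores, weights):
    """Net outranking flows; all criteria assumed maximized, weights sum to 1."""
    scores = np.asarray(scores, dtype=float)
    n = scores.shape[0]
    pi = np.zeros((n, n))                 # aggregated preference of a over b
    for a in range(n):
        for b in range(n):
            if a != b:
                pref = (scores[a] > scores[b]).astype(float)  # usual criterion
                pi[a, b] = np.dot(weights, pref)
    phi_plus = pi.sum(axis=1) / (n - 1)   # how strongly a outranks the rest
    phi_minus = pi.sum(axis=0) / (n - 1)  # how strongly a is outranked
    return phi_plus - phi_minus           # rank sites by descending net flow

# four hypothetical SWH sites scored on three criteria
net = promethee_ii([[7, 2, 9], [8, 6, 5], [3, 8, 4], [6, 5, 7]],
                   np.array([0.5, 0.3, 0.2]))
```

    Because the positive and negative flows average the same pairwise preferences, the net flows always sum to zero; group decision-making scenarios would rerun this with each stakeholder group's weights.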

  17. Applying TOGAF for e-government implementation based on service oriented architecture methodology towards good government governance

    NASA Astrophysics Data System (ADS)

    Hodijah, A.; Sundari, S.; Nugraha, A. C.

    2018-05-01

    As a local government agency that performs public services, the General Government Office already utilizes the Reporting Information System of Local Government Implementation (E-LPPD). However, E-LPPD has integration limitations that cannot accommodate the General Government Office's needs in achieving Good Government Governance (GGG), and successful e-government implementation ultimately requires good governance practices. Citizens now demand public services of the quality the private sector provides, which calls for service innovation that utilizes the legacy system in a service-based e-government implementation: Service Oriented Architecture (SOA) redefines business processes as a set of IT-enabled services, while Enterprise Architecture from The Open Group Architecture Framework (TOGAF) provides a comprehensive approach to redefining business processes as service innovation towards GGG. This paper takes as a case study the Performance Evaluation of Local Government Implementation (EKPPD) system at the General Government Office. The results show that TOGAF will guide the development of integrated business processes for the EKPPD system that fit good governance practices to attain GGG, with the SOA methodology as the technical approach.

  18. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique.

    PubMed

    Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever

    2015-01-01

    In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.

  20. An autonomous satellite architecture integrating deliberative reasoning and behavioural intelligence

    NASA Technical Reports Server (NTRS)

    Lindley, Craig A.

    1993-01-01

    This paper describes a method for the design of autonomous spacecraft, based upon behavioral approaches to intelligent robotics. First, a number of previous spacecraft automation projects are reviewed. A methodology for the design of autonomous spacecraft is then presented, drawing upon both the European Space Agency technological center (ESTEC) automation and robotics methodology and the subsumption architecture for autonomous robots. A layered competency model for autonomous orbital spacecraft is proposed. A simple example of low level competencies and their interaction is presented in order to illustrate the methodology. Finally, the general principles adopted for the control hardware design of the AUSTRALIS-1 spacecraft are described. This system will provide an orbital experimental platform for spacecraft autonomy studies, supporting the exploration of different logical control models, different computational metaphors within the behavioral control framework, and different mappings from the logical control model to its physical implementation.
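
    The layered-competency idea can be sketched in a few lines. The behaviours and sensor keys below are invented for illustration (they are not the AUSTRALIS-1 design): each layer either issues a command or stays silent, and the highest-priority active layer subsumes those beneath it:

```python
class Behaviour:
    """One competency layer: return a command, or None when inactive."""
    def act(self, sensors):
        raise NotImplementedError

class MaintainAttitude(Behaviour):
    def act(self, sensors):
        return "stabilize"                 # lowest layer is always active

class ManagePower(Behaviour):
    def act(self, sensors):
        return "sun_point" if sensors.get("battery", 1.0) < 0.3 else None

class Downlink(Behaviour):
    def act(self, sensors):
        return "transmit" if sensors.get("ground_contact") else None

def subsume(layers, sensors):
    """The highest-priority active layer suppresses all layers below it."""
    for layer in layers:                   # ordered highest priority first
        command = layer.act(sensors)
        if command is not None:
            return command

layers = [Downlink(), ManagePower(), MaintainAttitude()]
```

    With this ordering a ground pass preempts power management, which in turn preempts routine attitude control, mirroring how higher behavioral competencies subsume lower ones.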

  1. A modified eco-efficiency framework and methodology for advancing the state of practice of sustainability analysis as applied to green infrastructure.

    PubMed

    Ghimire, Santosh R; Johnston, John M

    2017-09-01

    We propose a modified eco-efficiency (EE) framework and novel sustainability analysis methodology for green infrastructure (GI) practices used in water resource management. Green infrastructure practices such as rainwater harvesting (RWH), rain gardens, porous pavements, and green roofs are emerging as viable strategies for climate change adaptation. The modified framework includes 4 economic, 11 environmental, and 3 social indicators. Using 6 indicators from the framework, at least 1 from each dimension of sustainability, we demonstrate the methodology to analyze RWH designs. We use life cycle assessment and life cycle cost assessment to calculate the sustainability indicators of 20 design configurations as Decision Management Objectives (DMOs). Five DMOs emerged as relatively more sustainable along the EE analysis Tradeoff Line, and we used Data Envelopment Analysis (DEA), a widely applied statistical approach, to quantify the modified EE measures as DMO sustainability scores. We also addressed the subjectivity and sensitivity analysis requirements of sustainability analysis, and we evaluated the performance of 10 weighting schemes that included classical DEA, equal weights, National Institute of Standards and Technology's stakeholder panel, Eco-Indicator 99, Sustainable Society Foundation's Sustainable Society Index, and 5 derived schemes. We improved upon classical DEA by applying the weighting schemes to identify sustainability scores that ranged from 0.18 to 1.0, avoiding the nonuniqueness problem and revealing the least to most sustainable DMOs. Our methodology provides a more comprehensive view of water resource management and is generally applicable to GI and industrial, environmental, and engineered systems to explore the sustainability space of alternative design configurations. Integr Environ Assess Manag 2017;13:821-831. Published 2017. This article is a US Government work and is in the public domain in the USA. 
Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
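The DEA scoring described above can be sketched with a minimal input-oriented CCR model solved by linear programming. The four hypothetical design configurations, their inputs and outputs, and all variable names below are invented for illustration; this is a sketch of the general DEA technique, not the study's actual model, data, or weighting schemes.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 4 design configurations (DMOs), each consuming
# 2 inputs (e.g. cost, embodied energy) to produce 1 output
# (e.g. volume of water captured). All values are invented.
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])  # inputs
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                      # outputs

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency of unit `o` (1.0 = on the frontier)."""
    n = X.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.concatenate(([1.0], np.zeros(n)))
    # Input constraints: sum_j lambda_j * x_ij <= theta * x_io
    A_in = np.hstack([-X[o : o + 1].T, X.T])
    # Output constraints: sum_j lambda_j * y_rj >= y_ro
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
    res = linprog(
        c,
        A_ub=np.vstack([A_in, A_out]),
        b_ub=np.concatenate([np.zeros(X.shape[1]), -Y[o]]),
        bounds=[(0, None)] * (n + 1),
    )
    return res.x[0]

scores = [dea_ccr_input(X, Y, o) for o in range(len(X))]  # one score per DMO
```

Units on the efficient frontier score 1.0; dominated configurations score below 1.0, which mirrors the ranking role the scores play in the abstract.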

  2. From individual coping strategies to illness codification: the reflection of gender in social science research on multiple chemical sensitivities (MCS).

    PubMed

    Nadeau, Geneviève; Lippel, Katherine

    2014-09-10

    Emerging fields such as environmental health have been challenged, in recent years, to answer the growing methodological calls for a finer integration of sex and gender in health-related research and policy-making. Through a descriptive examination of 25 peer-reviewed social science papers published between 1996 and 2011, we explore, by examining methodological designs and theoretical standpoints, how the social sciences have integrated gender sensitivity in empirical work on Multiple Chemical Sensitivities (MCS). MCS is a "diagnosis" associated with sensitivities to chronic and low-dose chemical exposures, which remains contested in both the medical and institutional arenas, and is reported to disproportionately affect women. We highlighted important differences between papers that did integrate a gender lens and those that did not. These included characteristics of the authorship, purposes, theoretical frameworks and methodological designs of the studies. Reviewed papers that integrated gender tended to focus on the gender roles and identity of women suffering from MCS, emphasizing personal strategies of adaptation. More generally, terminological confusion in the use of sex and gender language and concepts, such as a conflation of women and gender, was observed. Although some men were included in most of the study samples reviewed, specific data relating to men were underreported in results and only one paper discussed issues specifically experienced by men suffering from MCS. Papers that overlooked gender dimensions generally addressed more systemic social issues such as the dynamics of expertise and the medical codification of MCS, from more consistently outlined theoretical frameworks. 
Results highlight the place for a critical, systematic and reflexive problematization of gender and for the development of methodological and theoretical tools on how to integrate gender in research designs when looking at both micro and macro social dimensions of environmental health conditions. This paper contributes to a discussion on the methodological and policy implications of taking sex and gender into account appropriately in order to contribute to better equity in health, especially where the critical social contexts of definition and medico-legal recognition play a major role such as in the case of MCS.

  3. Moral judgment as information processing: an integrative review.

    PubMed

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

  4. Integrated city as a model for a new wave urban tourism

    NASA Astrophysics Data System (ADS)

    Ariani, V.

    2018-03-01

    Cities are major players as urban tourism destinations. The massive growth of urban tourism intensifies competition among cities with similar characteristics. A new framework model for new wave urban tourism is crucial to give tourists a richer experience and to add value to the city itself. The integrated city is the answer for creating a new model for an urban tourism destination. The purpose of this preliminary research is to define an integrated city framework for urban tourism development. It provides a rationale for tourism planners pursuing an innovative approach, competitive advantages, and a general urban tourism destination model. The methodology applied in this research includes a desk survey, a literature review, and focus group discussion. A conceptual framework is proposed, discussed, and exemplified. The framework model adopts a place-based approach to the tourism destination and suggests an integrated city model for urban tourism development. This model is a tool for strategy making in re-inventing the integrated city as an urban tourism destination.

  5. Moral judgment as information processing: an integrative review

    PubMed Central

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  6. Unified approach to redshift in cosmological/black hole spacetimes and synchronous frame

    NASA Astrophysics Data System (ADS)

    Toporensky, A. V.; Zaslavskii, O. B.; Popov, S. B.

    2018-01-01

    Usually, interpretation of redshift in static spacetimes (for example, near black holes) is opposed to that in cosmology. In this methodological note, we show that both explanations are unified in a natural picture. This is achieved if, considering the static spacetime, one (i) makes a transition to a synchronous frame, and (ii) returns to the original frame by means of local Lorentz boost. To reach our goal, we consider a rather general class of spherically symmetric spacetimes. In doing so, we construct frames that generalize the well-known Lemaitre and Painlevé-Gullstand ones and elucidate the relation between them. This helps us to understand, in a unifying approach, how gravitation reveals itself in different branches of general relativity. This framework can be useful for general relativity university courses.

  7. Interpersonal distance modeling during fighting activities.

    PubMed

    Dietrich, Gilles; Bredin, Jonathan; Kerlirzin, Yves

    2010-10-01

    The aim of this article is to elaborate a general framework for modeling dual opposition activities, or more generally, dual interaction. The main hypothesis is that opposition behavior can be measured directly from a global variable and that the relative distance between the two subjects can be this parameter. Moreover, this parameter should be considered a multidimensional parameter depending not only on the dynamics of the subjects but also on their "internal" parameters, such as sociological and/or emotional states. A standard and simple mechanical formalization will be used to model this multifactorial distance. To illustrate such a general modeling methodology, the model was compared with actual data from an opposition activity, Japanese fencing (kendo). This model captures not only coupled coordination but also, more generally, interaction in two-subject activities.
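The global distance parameter described above can be computed directly from tracked positions. A minimal sketch, assuming two synthetic 2D trajectories (the data are invented, not actual kendo recordings):

```python
import numpy as np

# Invented 2D position tracks for two subjects at 5 time samples.
t = np.linspace(0.0, 2.0, 5)
subject_a = np.column_stack([t, np.zeros_like(t)])        # advances along x
subject_b = np.column_stack([2.0 - t, np.zeros_like(t)])  # closes in from x=2

# The global opposition variable: Euclidean distance at each time step.
distance = np.linalg.norm(subject_a - subject_b, axis=1)
# distance shrinks as the subjects converge and grows again after they pass.
```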

  8. Measurement of Workload: Physics, Psychophysics, and Metaphysics

    NASA Technical Reports Server (NTRS)

    Gopher, D.

    1984-01-01

    The present paper reviews the results of two experiments in which workload analysis was conducted based upon performance measures, brain evoked potentials and magnitude estimations of subjective load. The three types of measures were jointly applied to the description of the behavior of subjects in a wide battery of experimental tasks. Data analysis shows both instances of association and dissociation between types of measures. A general conceptual framework and methodological guidelines are proposed to account for these findings.

  9. Development of an integrated economic and ecological framework for ecosystem-based fisheries management in New England

    NASA Astrophysics Data System (ADS)

    Jin, D.; Hoagland, P.; Dalton, T. M.; Thunberg, E. M.

    2012-09-01

    We present an integrated economic-ecological framework designed to help assess the implementation of ecosystem-based fisheries management (EBFM) in New England. We develop the framework by linking a computable general equilibrium (CGE) model of a coastal economy to an end-to-end (E2E) model of a marine food web for Georges Bank. We focus on the New England region using coastal county economic data for a restricted set of industry sectors and marine ecological data for three top level trophic feeding guilds: planktivores, benthivores, and piscivores. We undertake numerical simulations to model the welfare effects of changes in alternative combinations of yields from feeding guilds and alternative manifestations of biological productivity. We estimate the economic and distributional effects of these alternative simulations across a range of consumer income levels. This framework could be used to extend existing methodologies for assessing the impacts on human communities of groundfish stock rebuilding strategies, such as those expected through the implementation of the sector management program in the US northeast fishery. We discuss other possible applications of and modifications and limitations to the framework.

  10. A visual study of computers on doctors' desks.

    PubMed

    Pearce, Christopher; Walker, Hannah; O'Shea, Carolyn

    2008-01-01

    General practice has rapidly computerised over the past ten years, thereby changing the nature of general practice rooms. Most general practice consulting rooms were designed and created in an era without computer hardware, establishing a pattern of work around maximising the doctor-patient relationship. General practitioners (GPs) and patients have had to integrate the computer into this environment. Twenty GPs allowed access to their rooms and consultations as part of a larger study. The results are based on an analysis of still shots of the consulting rooms. Analysis used dramaturgical methodology; thus the room is described as though it is the setting for a play. First, several desk areas were identified: a shared or patient area, a working area, a clinical area and an administrative area. Then, within that framework, we were able to identify two broad categories of setting, one inclusive of the patient and one exclusive. With the increasing significance of the computer in the three-way doctor-patient-computer relationship, an understanding of the social milieu in which the three players in the consultation interact (the staging) will inform further analysis of the interaction, and allow a framework for assessing the effects of different computer placements.

  11. Don't fear 'fear conditioning': Methodological considerations for the design and analysis of studies on human fear acquisition, extinction, and return of fear.

    PubMed

    Lonsdorf, Tina B; Menz, Mareike M; Andreatta, Marta; Fullana, Miguel A; Golkar, Armita; Haaker, Jan; Heitland, Ivo; Hermann, Andrea; Kuhn, Manuel; Kruse, Onno; Meir Drexler, Shira; Meulders, Ann; Nees, Frauke; Pittig, Andre; Richter, Jan; Römer, Sonja; Shiban, Youssef; Schmitz, Anja; Straube, Benjamin; Vervliet, Bram; Wendt, Julia; Baas, Johanna M P; Merz, Christian J

    2017-06-01

    The so-called 'replicability crisis' has sparked methodological discussions in many areas of science in general, and in psychology in particular. This has led to recent endeavours to promote the transparency, rigour, and ultimately, replicability of research. Originating from this zeitgeist, the challenge to discuss critical issues on terminology, design, methods, and analysis considerations in fear conditioning research is taken up by this work, which involved representatives from fourteen of the major human fear conditioning laboratories in Europe. This compendium is intended to provide a basis for the development of a common procedural and terminology framework for the field of human fear conditioning. Whenever possible, we give general recommendations. When this is not feasible, we provide evidence-based guidance for methodological decisions on study design, outcome measures, and analyses. Importantly, this work is also intended to raise awareness and initiate discussions on crucial questions with respect to data collection, processing, statistical analyses, the impact of subtle procedural changes, and data reporting specifically tailored to the research on fear conditioning. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. On the Performance Evaluation of 3D Reconstruction Techniques from a Sequence of Images

    NASA Astrophysics Data System (ADS)

    Eid, Ahmed; Farag, Aly

    2005-12-01

    The performance evaluation of 3D reconstruction techniques is not a simple problem to solve. This is not only due to the increased dimensionality of the problem but also due to the lack of standardized and widely accepted testing methodologies. This paper presents a unified framework for the performance evaluation of different 3D reconstruction techniques. This framework includes a general problem formalization, different measuring criteria, and a classification method as a first step in standardizing the evaluation process. Performance characterization of two standard 3D reconstruction techniques, stereo and space carving, is also presented. The evaluation is performed on the same data set using an image reprojection testing methodology to reduce the dimensionality of the evaluation domain. Also, different measuring strategies are presented and applied to the stereo and space carving techniques. These measuring strategies have shown consistent results in quantifying the performance of these techniques. Additional experiments are performed on the space carving technique to study the effect of the number of input images and the camera pose on its performance.

  13. Detection and mapping of delays in early cortical folding derived from in utero MRI

    NASA Astrophysics Data System (ADS)

    Habas, Piotr A.; Rajagopalan, Vidya; Scott, Julia A.; Kim, Kio; Roosta, Ahmad; Rousseau, Francois; Barkovich, A. James; Glenn, Orit A.; Studholme, Colin

    2011-03-01

    Understanding human brain development in utero and detecting cortical abnormalities related to specific clinical conditions is an important area of research. In this paper, we describe and evaluate methodology for detection and mapping of delays in early cortical folding from population-based studies of fetal brain anatomies imaged in utero. We use a general linear modeling framework to describe spatiotemporal changes in curvature of the developing brain and explore the ability to detect and localize delays in cortical folding in the presence of uncertainty in estimation of the fetal age. We apply permutation testing to examine which regions of the brain surface provide the most statistical power to detect a given folding delay at a given developmental stage. The presented methodology is evaluated using MR scans of fetuses with normal brain development and gestational ages ranging from 20.57 to 27.86 weeks. This period is critical in early cortical folding and the formation of the primary and secondary sulci. Finally, we demonstrate a clinical application of the framework for detection and localization of folding delays in fetuses with isolated mild ventriculomegaly.
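The permutation-testing step described above can be sketched in generic form. A minimal two-sided, two-sample permutation test on a difference of means, using synthetic data rather than fetal curvature measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-region curvature summaries for two groups
# (e.g. typically developing vs. delayed); values are invented.
group_a = rng.normal(loc=0.0, scale=1.0, size=30)
group_b = rng.normal(loc=1.5, scale=1.0, size=30)

def permutation_test(a, b, n_perm=5000):
    """Two-sided permutation test on the difference of group means."""
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # relabel subjects at random
        diff = abs(pooled[: len(a)].mean() - pooled[len(a):].mean())
        if diff >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one smoothed p-value

p_value = permutation_test(group_a, group_b)
```

In the study's setting this test is run per surface region, which is what allows the authors to map where the statistical power to detect a folding delay is greatest.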

  14. Global/local methods research using a common structural analysis framework

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  15. Random forest feature selection approach for image segmentation

    NASA Astrophysics Data System (ADS)

    Lefkovits, László; Lefkovits, Szidónia; Emerich, Simina; Vaida, Mircea Florin

    2017-03-01

    In the field of image segmentation, discriminative models have shown promising performance. Generally, every such model begins with the extraction of numerous features from annotated images. Most authors create their discriminative model by using many features without applying any selection criteria. A more reliable model can be built by using a framework that selects the variables that are important from the point of view of the classification and eliminates the unimportant ones. In this article we present a framework for feature selection and data dimensionality reduction. The methodology is built around the random forest (RF) algorithm and its variable importance evaluation. In order to deal with datasets so large as to be practically unmanageable, we propose an algorithm based on RF that reduces the dimension of the database by eliminating irrelevant features. Furthermore, this framework is applied to optimize our discriminative model for brain tumor segmentation.
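RF variable-importance-based selection of the kind the abstract describes can be sketched with scikit-learn; the dataset below is synthetic and the top-3 cutoff is an arbitrary illustration, not the authors' criterion:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic annotated data: 10 candidate features, only 3 of them informative.
X, y = make_classification(n_samples=500, n_features=10, n_informative=3,
                           n_redundant=0, shuffle=False, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank features by impurity-based variable importance and keep the top 3.
importances = rf.feature_importances_
keep = np.argsort(importances)[::-1][:3]
X_reduced = X[:, keep]  # reduced-dimension dataset for the final model
```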

  16. Symbolic interactionism in grounded theory studies: women surviving with HIV/AIDS in rural northern Thailand.

    PubMed

    Klunklin, Areewan; Greenwood, Jennifer

    2006-01-01

    Although it is generally acknowledged that symbolic interactionism and grounded theory are connected, the precise nature of their connection remains implicit and unexplained. As a result, many grounded theory studies are undertaken without an explanatory framework. This in turn results in the description rather than the explanation of data determined. In this report, the authors make explicit and explain the nature of the connections between symbolic interactionism and grounded theory research. Specifically, they make explicit the connection between Blumer's methodological principles and processes and grounded theory methodology. In addition, the authors illustrate the explanatory power of symbolic interactionism in grounded theory using data from a study of the HIV/AIDS experiences of married and widowed Thai women.

  17. Assessing the impact of healthcare research: A systematic review of methodological frameworks.

    PubMed

    Cruz Rivera, Samantha; Kyte, Derek G; Aiyegbusi, Olalekan Lee; Keeley, Thomas J; Calvert, Melanie J

    2017-08-01

    Increasingly, researchers need to demonstrate the impact of their research to their sponsors, funders, and fellow academics. However, the most appropriate way of measuring the impact of healthcare research is subject to debate. We aimed to identify the existing methodological frameworks used to measure healthcare research impact and to summarise the common themes and metrics in an impact matrix. Two independent investigators systematically searched the Medical Literature Analysis and Retrieval System Online (MEDLINE), the Excerpta Medica Database (EMBASE), the Cumulative Index to Nursing and Allied Health Literature (CINAHL+), the Health Management Information Consortium, and the Journal of Research Evaluation from inception until May 2017 for publications that presented a methodological framework for research impact. We then summarised the common concepts and themes across methodological frameworks and identified the metrics used to evaluate differing forms of impact. Twenty-four unique methodological frameworks were identified, addressing 5 broad categories of impact: (1) 'primary research-related impact', (2) 'influence on policy making', (3) 'health and health systems impact', (4) 'health-related and societal impact', and (5) 'broader economic impact'. These categories were subdivided into 16 common impact subgroups. Authors of the included publications proposed 80 different metrics aimed at measuring impact in these areas. The main limitation of the study was the potential exclusion of relevant articles, as a consequence of the poor indexing of the databases searched. The measurement of research impact is an essential exercise to help direct the allocation of limited research resources, to maximise research benefit, and to help minimise research waste. 
This review provides a collective summary of existing methodological frameworks for research impact, which funders may use to inform the measurement of research impact and researchers may use to inform study design decisions aimed at maximising the short-, medium-, and long-term impact of their research.

  18. Promoting Conditional Use of Communication Skills for Learners With Complex Communication Needs: A Tutorial.

    PubMed

    Simacek, Jessica; Reichle, Joe; Byiers, Breanne J; Parker-McGowan, Quannah; Dimian, Adele F; Elmquist, Marianne

    2018-05-03

    Conditional use of communication skills refers to the ability of a learner to appropriately generalize and discriminate when, where, and how to communicate based on constant variation and shifts in environmental cues. We describe discrimination and generalization challenges encountered by learners with complex communication needs and ways in which these challenges are fostered through traditional communication intervention programming. We address arrangements in instruction that maximize the probability of learners acquiring the conditional use of new vocabulary and the modest instructional technology implemented when planning for generalization. We propose establishing well-discriminated and generalized use of new vocabulary items through the application of a general case instruction framework to communication intervention programming. We provide intervention methodology, including intervention steps for general case instruction, a plethora of functional examples, and graphic displays to assess and intervene to promote conditional use of communication skills for learners with complex communication needs.

  19. Sustainable Supply Chain Design by the P-Graph Framework

    EPA Science Inventory

    The present work proposes a computer-aided methodology for designing sustainable supply chains in terms of sustainability metrics by resorting to the P-graph framework. The methodology is an outcome of the collaboration between the Office of Research and Development (ORD) of the ...

  20. On methodological standards in training and transfer experiments.

    PubMed

    Green, C Shawn; Strobach, Tilo; Schubert, Torsten

    2014-11-01

    The past two decades have seen a tremendous surge in scientific interest in the extent to which certain types of training-be it aerobic, athletic, musical, video game, or brain trainer-can result in general enhancements in cognitive function. While there are certainly active debates regarding the results in these domains, what is perhaps more pressing is the fact that key aspects of methodology remain unsettled. Here we discuss a few of these areas including expectation effects, test-retest effects, the size of the cognitive test battery, the selection of control groups, group assignment methods, difficulties in comparing results across studies, and in interpreting null results. Specifically, our goal is to highlight points of contention as well as areas where the most commonly utilized methods could be improved upon. Furthermore, because each of the sub-areas above (aerobic training through brain training) share strong similarities in goal, theoretical framework, and experimental approach, we seek to discuss these issues from a general perspective that considers each as members of the same broad "training" domain.

  1. Synthesis of Sustainable Energy Supply Chain by the P-Graph Framework

    EPA Science Inventory

    The present work proposes a computer-aided methodology for designing sustainable supply chains in terms of sustainability metrics by utilizing the P-graph framework. The methodology is an outcome of the collaboration between the Office of Research and Development (ORD) of the U.S...

  2. Methodology Evaluation Framework for Component-Based System Development.

    ERIC Educational Resources Information Center

    Dahanayake, Ajantha; Sol, Henk; Stojanovic, Zoran

    2003-01-01

    Explains component-based development (CBD) for distributed information systems and presents an evaluation framework, which highlights the extent to which a methodology is component oriented. Compares prominent CBD methods, discusses ways of modeling, and suggests that this is a first step towards a components-oriented systems development…

  3. Exploring How Globalization Shapes Education: Methodology and Theoretical Framework

    ERIC Educational Resources Information Center

    Pan, Su-Yan

    2010-01-01

    This is a commentary on some major issues raised in Carter and Dediwalage's "Globalisation and science education: The case of "Sustainability by the bay"" (this issue), particularly their methodology and theoretical framework for understanding how globalisation shapes education (including science education). While acknowledging the authors'…

  4. PRECEPT: an evidence assessment framework for infectious disease epidemiology, prevention and control.

    PubMed

    Harder, Thomas; Takla, Anja; Eckmanns, Tim; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Rehfuess, Eva; Schünemann, Holger; Zuiderent-Jerak, Teun; de Carvalho Gomes, Helena; Wichmann, Ole

    2017-10-01

    Decisions in public health should be based on the best available evidence, reviewed and appraised using a rigorous and transparent methodology. The Project on a Framework for Rating Evidence in Public Health (PRECEPT) defined a methodology for evaluating and grading evidence in infectious disease epidemiology, prevention and control that takes different domains and question types into consideration. The methodology rates evidence in four domains: disease burden, risk factors, diagnostics and intervention. The framework guiding it has four steps going from overarching questions to an evidence statement. In step 1, approaches for identifying relevant key areas and developing specific questions to guide systematic evidence searches are described. In step 2, methodological guidance for conducting systematic reviews is provided; 15 study quality appraisal tools are proposed and an algorithm is given for matching a given study design with a tool. In step 3, a standardised evidence-grading scheme using the Grading of Recommendations Assessment, Development and Evaluation Working Group (GRADE) methodology is provided, whereby findings are documented in evidence profiles. Step 4 consists of preparing a narrative evidence summary. Users of this framework should be able to evaluate and grade scientific evidence from the four domains in a transparent and reproducible way.

  5. PRECEPT: an evidence assessment framework for infectious disease epidemiology, prevention and control

    PubMed Central

    Harder, Thomas; Takla, Anja; Eckmanns, Tim; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Rehfuess, Eva; Schünemann, Holger; Zuiderent-Jerak, Teun; de Carvalho Gomes, Helena; Wichmann, Ole

    2017-01-01

    Decisions in public health should be based on the best available evidence, reviewed and appraised using a rigorous and transparent methodology. The Project on a Framework for Rating Evidence in Public Health (PRECEPT) defined a methodology for evaluating and grading evidence in infectious disease epidemiology, prevention and control that takes different domains and question types into consideration. The methodology rates evidence in four domains: disease burden, risk factors, diagnostics and intervention. The framework guiding it has four steps going from overarching questions to an evidence statement. In step 1, approaches for identifying relevant key areas and developing specific questions to guide systematic evidence searches are described. In step 2, methodological guidance for conducting systematic reviews is provided; 15 study quality appraisal tools are proposed and an algorithm is given for matching a given study design with a tool. In step 3, a standardised evidence-grading scheme using the Grading of Recommendations Assessment, Development and Evaluation Working Group (GRADE) methodology is provided, whereby findings are documented in evidence profiles. Step 4 consists of preparing a narrative evidence summary. Users of this framework should be able to evaluate and grade scientific evidence from the four domains in a transparent and reproducible way. PMID:29019317

  6. Agile methodology selection criteria: IT start-up case study

    NASA Astrophysics Data System (ADS)

    Micic, Lj

    2017-05-01

    Project management in modern IT companies is often based on agile methodologies, which have several advantages over traditional methodologies such as waterfall. Because clients sometimes change a project during development, it is crucial for an IT company to choose carefully which methodology to implement and whether to rely mostly on one methodology or on a combination of several. Among the many methodologies in current use, Scrum, Kanban and XP (extreme programming) are the most common. Companies sometimes use tools and procedures mostly from one of these, but quite often they combine elements of several. Because these methodologies are only frameworks, they allow companies to adapt them to their specific projects as well as to other constraints. Agile methodologies are still in limited use in Bosnia, but more and more IT companies are adopting them, not only because they are common practice among clients abroad but also because they are increasingly the only way to deliver a quality product on time. It remains challenging, however, to decide which methodology, or combination of methodologies, a company should implement and how to connect it to its own projects, organizational framework and HR management. This paper presents a case study of a local IT start-up and delivers a solution based on a theoretical framework and the practical limitations of the case company.

  7. What is adaptive about adaptive decision making? A parallel constraint satisfaction account.

    PubMed

    Glöckner, Andreas; Hilbig, Benjamin E; Jekel, Marc

    2014-12-01

    There is broad consensus that human cognition is adaptive. However, the vital question of how exactly this adaptivity is achieved has remained largely open. Herein, we contrast two frameworks that account for adaptive decision making, namely broad and general single-mechanism accounts vs. multi-strategy accounts. We propose and fully specify a single-mechanism model for decision making based on parallel constraint satisfaction processes (PCS-DM) and contrast it theoretically and empirically against a multi-strategy account. To achieve sufficiently sensitive tests, we rely on a multiple-measure methodology including choice, reaction time, and confidence data as well as eye-tracking. Results show that manipulating the environmental structure produces clear adaptive shifts in choice patterns - as both frameworks would predict. However, results on the process level (reaction time, confidence), in information acquisition (eye-tracking), and from cross-predicting choice consistently corroborate single-mechanism accounts in general, and the proposed parallel constraint satisfaction model for decision making in particular. Copyright © 2014 Elsevier B.V. All rights reserved.
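    A parallel constraint satisfaction account like the one above rests on a network that settles by spreading activation until one option dominates. As a rough, hypothetical sketch (the weights, damping factor and update rule below are invented for illustration and are not the authors' PCS-DM specification), a settling pass over a two-cue binary choice might look like:

```python
import numpy as np

# Nodes: [cue1, cue2, optionA, optionB]; symmetric cue-option links,
# mutual inhibition between the two options. All values are invented.
W = np.array([
    [0.0, 0.0, 0.9, 0.1],   # cue1 supports option A strongly
    [0.0, 0.0, 0.3, 0.7],   # cue2 supports option B moderately
    [0.9, 0.3, 0.0, -1.0],  # option A; inhibits option B
    [0.1, 0.7, -1.0, 0.0],  # option B; inhibits option A
])
ext = np.array([1.0, 1.0, 0.0, 0.0])  # only the cues receive external input

a = np.zeros(4)
for _ in range(200):
    # Damped update toward a squashed function of each node's net input.
    a = 0.9 * np.tanh(0.1 * (W @ a + ext)) + 0.1 * a

print("choose A" if a[2] > a[3] else "choose B")
```

Because cue 1 favors A more strongly than cue 2 favors B, the mutual inhibition amplifies A's initial advantage as the network settles.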

  8. Experimental Validation of L1 Adaptive Control: Rohrs' Counterexample in Flight

    NASA Technical Reports Server (NTRS)

    Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Issac; Kitsios, Ioannis; Cao, Chengyu; Gregory, Irene M.; Valavani, Lena

    2010-01-01

    The paper presents new results on the verification and in-flight validation of an L1 adaptive flight control system, and proposes a general methodology for verification and validation of adaptive flight control algorithms. The proposed framework is based on Rohrs counterexample, a benchmark problem presented in the early 80s to show the limitations of adaptive controllers developed at that time. In this paper, the framework is used to evaluate the performance and robustness characteristics of an L1 adaptive control augmentation loop implemented onboard a small unmanned aerial vehicle. Hardware-in-the-loop simulations and flight test results confirm the ability of the L1 adaptive controller to maintain stability and predictable performance of the closed loop adaptive system in the presence of general (artificially injected) unmodeled dynamics. The results demonstrate the advantages of L1 adaptive control as a verifiable robust adaptive control architecture with the potential of reducing flight control design costs and facilitating the transition of adaptive control into advanced flight control systems.

  9. A hierarchical-multiobjective framework for risk management

    NASA Technical Reports Server (NTRS)

    Haimes, Yacov Y.; Li, Duan

    1991-01-01

    A broad hierarchical-multiobjective framework is established and used to address the management of risk methodologically. The framework unites the hierarchical character of decision-making, the multiple decision-makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, and both the quantitative/empirical and the qualitative/normative/judgmental aspects. The methodological components consist essentially of hierarchical-multiobjective coordination, the risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as provided in the hierarchical-multiobjective framework, to successfully address inherent risk.

  10. A general regression framework for a secondary outcome in case-control studies.

    PubMed

    Tchetgen Tchetgen, Eric J

    2014-01-01

    Modern case-control studies typically involve the collection of data on a large number of outcomes, often at considerable logistical and monetary expense. These data are of potentially great value to subsequent researchers, who, although not necessarily concerned with the disease that defined the case series in the original study, may want to use the available information for a regression analysis involving a secondary outcome. Because cases and controls are selected with unequal probability, regression analysis involving a secondary outcome generally must acknowledge the sampling design. In this paper, the author presents a new framework for the analysis of secondary outcomes in case-control studies. The approach is based on a careful re-parameterization of the conditional model for the secondary outcome given the case-control outcome and regression covariates, in terms of (a) the population regression of interest of the secondary outcome given covariates and (b) the population regression of the case-control outcome on covariates. The error distribution for the secondary outcome given covariates and case-control status is otherwise unrestricted. For a continuous outcome, the approach sometimes reduces to extending model (a) by including a residual of (b) as a covariate. However, the framework is general in the sense that models (a) and (b) can take any functional form, and the methodology allows for an identity, log or logit link function for model (a).
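    For a continuous secondary outcome, the residual-inclusion special case mentioned in the abstract can be sketched in a few lines: fit model (b) for the case-control outcome given covariates, then extend model (a) with the residual of (b) as an extra covariate. The simulation below uses random sampling rather than true case-control sampling, with invented coefficients, and is only meant to illustrate the mechanics, not the author's general estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Simulated data: covariate x, disease status d, continuous secondary outcome y.
x = rng.normal(size=n)
p_true = 1 / (1 + np.exp(-(-1.0 + 0.8 * x)))       # P(d = 1 | x)
d = rng.binomial(1, p_true)
y = 1.0 + 0.5 * x + 0.3 * (d - p_true) + rng.normal(scale=0.5, size=n)

# Model (b): logistic regression of d on x, fitted by Newton-Raphson.
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (d - p))

# Model (a), extended with the residual of model (b) as a covariate.
resid = d - 1 / (1 + np.exp(-X @ beta))
Xa = np.column_stack([np.ones(n), x, resid])
coef = np.linalg.lstsq(Xa, y, rcond=None)[0]
print(coef)  # intercept, slope on x, coefficient on the residual
```

Because the residual has mean zero given x, the coefficients on the intercept and x recover the population regression of y on x.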

  11. A modified eco-efficiency framework and methodology for advancing the state of practice of sustainability analysis as applied to green infrastructure

    EPA Science Inventory

    We propose a modified eco-efficiency (EE) framework and novel sustainability analysis methodology for green infrastructure (GI) practices used in water resource management. Green infrastructure practices such as rainwater harvesting (RWH), rain gardens, porous pavements, and gree...

  12. IT Portfolio Selection and IT Synergy

    ERIC Educational Resources Information Center

    Cho, Woo Je

    2010-01-01

    This dissertation consists of three chapters. The primary objectives of this dissertation are: (1) to provide a methodological framework of IT (Information Technology) portfolio management, and (2) to identify the effect of IT synergy on IT portfolio selection of a firm. The first chapter presents a methodological framework for IT project…

  13. A decomposition model and voxel selection framework for fMRI analysis to predict neural response of visual stimuli.

    PubMed

    Raut, Savita V; Yadav, Dinkar M

    2018-03-28

    This paper presents an fMRI signal analysis methodology using geometric mean curve decomposition (GMCD) and a mutual information-based voxel selection framework. Previously, fMRI signal analysis has been conducted using the empirical mean curve decomposition (EMCD) model and voxel selection on the raw fMRI signal. The former methodology loses frequency components, while the latter suffers from signal redundancy. Both challenges are addressed by our methodology: the frequency component is retained by decomposing the raw fMRI signal using the geometric mean rather than the arithmetic mean, and voxels are selected from the EMCD signal using GMCD components rather than the raw fMRI signal. The proposed methodologies are adopted for predicting the neural response. Experiments are conducted on openly available fMRI data from six subjects, and comparisons are made with existing decomposition models and voxel selection frameworks. Subsequently, the effects of the number of selected voxels and of the selection constraints are analyzed. The comparative results and the analysis demonstrate the superiority and reliability of the proposed methodology.
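    The mutual information-based voxel selection step can be illustrated with a toy ranking: binarize each voxel's time course, score it by its mutual information with a binary stimulus sequence, and keep the top-scoring voxels. The data, the thresholding rule and the number of voxels below are invented and far simpler than the paper's pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
stim = rng.integers(0, 2, 200)                 # binary stimulus sequence
voxels = rng.normal(size=(50, 200))            # 50 voxel time courses
voxels[:5] += 2.0 * stim                       # five voxels track the stimulus

def mutual_info(x, y):
    """Empirical mutual information (nats) between two binary sequences."""
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((x == a) & (y == b))
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (np.mean(x == a) * np.mean(y == b)))
    return mi

# Binarize each voxel at its own mean, then rank by MI with the stimulus.
binar = (voxels > voxels.mean(axis=1, keepdims=True)).astype(int)
scores = np.array([mutual_info(v, stim) for v in binar])
top5 = set(np.argsort(scores)[-5:])
print(top5)  # should recover the five informative voxels
```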

  14. How to Measure Costs and Benefits of eHealth Interventions: An Overview of Methods and Frameworks.

    PubMed

    Bergmo, Trine Strand

    2015-11-09

    Information on the costs and benefits of eHealth interventions is needed, not only to document value for money and to support decision making in the field, but also to form the basis for developing business models and to facilitate payment systems to support large-scale services. In the absence of solid evidence of its effects, key decision makers may doubt the effectiveness, which, in turn, limits investment in, and the long-term integration of, eHealth services. However, it is not realistic to conduct economic evaluations of all eHealth applications and services in all situations, so we need to be able to generalize from those we do conduct. This implies that we have to select the most appropriate methodology and data collection strategy in order to increase the transferability across evaluations. This paper aims to contribute to the understanding of how to apply economic evaluation methodology in the eHealth field. It provides a brief overview of basic health economics principles and frameworks and discusses some methodological issues and challenges in conducting cost-effectiveness analysis of eHealth interventions. Issues regarding the identification, measurement, and valuation of costs and benefits are outlined. Furthermore, this work describes the established techniques of combining costs and benefits, presents the decision rules for identifying the preferred option, and outlines approaches to data collection strategies. Issues related to transferability and complexity are also discussed.
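    A cost-effectiveness analysis of the kind discussed above typically combines costs and benefits through an incremental cost-effectiveness ratio (ICER), judged against a willingness-to-pay threshold as the decision rule. A minimal sketch with hypothetical figures, not data from any study:

```python
# Hypothetical per-patient comparison of an eHealth service vs usual care.
cost_ehealth, cost_usual = 5200.0, 4000.0   # costs
qaly_ehealth, qaly_usual = 1.90, 1.75       # quality-adjusted life years

# ICER: incremental cost per extra QALY gained.
icer = (cost_ehealth - cost_usual) / (qaly_ehealth - qaly_usual)
print(icer)

# Decision rule: adopt if the ICER falls below the willingness-to-pay threshold.
threshold = 20_000.0  # hypothetical threshold per QALY
print(icer <= threshold)
```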

  15. Bayesian data fusion for spatial prediction of categorical variables in environmental sciences

    NASA Astrophysics Data System (ADS)

    Gengler, Sarah; Bogaert, Patrick

    2014-12-01

    First developed to predict continuous variables, Bayesian Maximum Entropy (BME) has become a complete framework in the context of space-time prediction since it has been extended to predict categorical variables and mixed random fields. This method proposes solutions for combining several sources of data whatever the nature of the information. However, the various attempts that were made to adapt the BME methodology to categorical variables and mixed random fields faced some limitations, such as a high computational burden. The main objective of this paper is to overcome this limitation by generalizing the Bayesian Data Fusion (BDF) theoretical framework to categorical variables; BDF is essentially a simplification of the BME method through the convenient conditional independence hypothesis. The BDF methodology for categorical variables is first described and then applied to a practical case study: the estimation of soil drainage classes using a soil map and point observations in the sandy area of Flanders around the city of Mechelen (Belgium). The BDF approach is compared to BME along with more classical approaches, such as Indicator CoKriging (ICK) and logistic regression. Estimators are compared using various indicators, namely the Percentage of Correctly Classified locations (PCC) and the Average Highest Probability (AHP). Although the BDF methodology for categorical variables is a simplification of the BME approach, both methods lead to similar results and have strong advantages compared to ICK and logistic regression.
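    Under the conditional independence hypothesis that defines BDF, fusing categorical evidence at a location reduces to multiplying the prior class probabilities by each source's likelihood and renormalizing. The class probabilities below are invented for illustration and do not come from the case study:

```python
import numpy as np

# Hypothetical setup: three drainage classes, two data sources (a soil map
# and a point observation) assumed conditionally independent given the class.
prior = np.array([0.5, 0.3, 0.2])        # P(class)
like_map = np.array([0.7, 0.2, 0.1])     # P(map reading | class)
like_obs = np.array([0.6, 0.3, 0.1])     # P(observation | class)

# Fused posterior: P(class | both sources) ∝ P(class) · Π P(source | class)
post = prior * like_map * like_obs
post /= post.sum()
print(post)  # the predicted class is the argmax; its value is the AHP term
```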

  16. A framework for grouping nanoparticles based on their measurable characteristics.

    PubMed

    Sayes, Christie M; Smith, P Alex; Ivanov, Ivan V

    2013-01-01

    There is a need to take a broader look at nanotoxicological studies. Eventually, the field will demand that some generalizations be made. To begin to address this issue, we posed a question: are metal colloids on the nanometer-size scale a homogeneous group? In general, most people can agree that the physicochemical properties of nanomaterials can be linked and related to the toxicological responses they induce. The focus of this study was to determine how a set of selected physicochemical properties of five specific metal-based colloidal materials on the nanometer-size scale - silver, copper, nickel, iron, and zinc - could be used as nanodescriptors that facilitate the grouping of these metal-based colloids. The example of framework pipeline processing provided in this paper shows the utility of specific statistical and pattern recognition techniques in grouping nanoparticles based on experimental data about their physicochemical properties. Interestingly, the results of the analyses suggest that a seemingly homogeneous group of nanoparticles could be separated into sub-groups depending on interdependencies observed in their nanodescriptors. These particles represent an important category of nanomaterials that are currently mass produced. Each has been reputed to induce toxicological and/or cytotoxicological effects. Here, we propose an experimental methodology coupled with mathematical and statistical modeling that can serve as a prototype for a rigorous framework for grouping nanomaterials and for facilitating the subsequent analysis of trends in data based on quantitative modeling of nanoparticle-specific structure-activity relationships. The computational part of the proposed framework is rather general and can be applied to other groups of nanomaterials as well.
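    The grouping step can be sketched as a small clustering exercise on a standardized descriptor matrix. The descriptor values, the choice of two descriptors and the two-cluster assumption below are all hypothetical, not the study's data:

```python
import numpy as np

# Hypothetical descriptor matrix: rows = colloids, cols = (size nm, zeta mV).
names = ["Ag", "Cu", "Ni", "Fe", "Zn"]
X = np.array([[20, -30.0], [25, -28.0], [80, 12.0], [85, 15.0], [22, -33.0]])

# Standardize each descriptor, then run a tiny 2-means pass with fixed seeds.
Z = (X - X.mean(0)) / X.std(0)
centers = Z[[0, 2]]  # seed one center in each apparent sub-group
for _ in range(10):
    labels = np.argmin(((Z[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([Z[labels == g].mean(0) for g in (0, 1)])

groups = {g: [n for n, l in zip(names, labels) if l == g] for g in (0, 1)}
print(groups)  # sub-groups emerging from the descriptor interdependencies
```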

  17. A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care

    PubMed Central

    Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis

    2017-01-01

    This article describes a framework that has been developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to incorporating the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds on three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan, and Immersion Medical. All 3 source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods, describes how they are integrated into a single framework, and explains the concept and application of the developed framework. As a test of its applicability, a real case study has been used to demonstrate the application of the framework. This case study provides real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. It is important to point out that the proposed framework offers the capability to consider a wide range of benefits and values; on the other hand, there are several limitations that have been discussed and need to be taken into consideration. PMID:28133988
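    Once intangible benefits have been monetized, the core return-on-investment calculation is a one-line formula. The figures below are hypothetical, not from the case study:

```python
# Hypothetical annual figures for a simulation-based training program.
tangible = 120_000.0    # e.g. monetized avoided adverse-event costs
intangible = 30_000.0   # e.g. monetized qualitative benefits (confidence, teamwork)
cost = 100_000.0        # program cost

# ROI (%) = (total monetized benefits - costs) / costs x 100
roi_pct = (tangible + intangible - cost) / cost * 100
print(roi_pct)
```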

  18. A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care.

    PubMed

    Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis

    2017-01-01

    This article describes a framework that has been developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to incorporating the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds on three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan, and Immersion Medical. All 3 source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods, describes how they are integrated into a single framework, and explains the concept and application of the developed framework. As a test of its applicability, a real case study has been used to demonstrate the application of the framework. This case study provides real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. It is important to point out that the proposed framework offers the capability to consider a wide range of benefits and values; on the other hand, there are several limitations that have been discussed and need to be taken into consideration.

  19. Advancing the integration of spatial data to map human and natural drivers on coral reefs

    PubMed Central

    Gove, Jamison M.; Walecka, Hilary R.; Donovan, Mary K.; Williams, Gareth J.; Jouffray, Jean-Baptiste; Crowder, Larry B.; Erickson, Ashley; Falinski, Kim; Friedlander, Alan M.; Kappel, Carrie V.; Kittinger, John N.; McCoy, Kaylyn; Norström, Albert; Nyström, Magnus; Oleson, Kirsten L. L.; Stamoulis, Kostantinos A.; White, Crow; Selkoe, Kimberly A.

    2018-01-01

    A major challenge for coral reef conservation and management is understanding how a wide range of interacting human and natural drivers cumulatively impact and shape these ecosystems. Despite the importance of understanding these interactions, a methodological framework to synthesize spatially explicit data of such drivers is lacking. To fill this gap, we established a transferable data synthesis methodology to integrate spatial data on environmental and anthropogenic drivers of coral reefs, and applied this methodology to a case study location–the Main Hawaiian Islands (MHI). Environmental drivers were derived from time series (2002–2013) of climatological ranges and anomalies of remotely sensed sea surface temperature, chlorophyll-a, irradiance, and wave power. Anthropogenic drivers were characterized using empirically derived and modeled datasets of spatial fisheries catch, sedimentation, nutrient input, new development, habitat modification, and invasive species. Within our case study system, resulting driver maps showed high spatial heterogeneity across the MHI, with anthropogenic drivers generally greatest and most widespread on O‘ahu, where 70% of the state’s population resides, while sedimentation and nutrients were dominant in less populated islands. Together, the spatial integration of environmental and anthropogenic driver data described here provides a first-ever synthetic approach to visualize how the drivers of coral reef state vary in space and demonstrates a methodological framework for implementation of this approach in other regions of the world. By quantifying and synthesizing spatial drivers of change on coral reefs, we provide an avenue for further research to understand how drivers determine reef diversity and resilience, which can ultimately inform policies to protect coral reefs. PMID:29494613
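    The climatological ranges and anomalies used for the environmental drivers follow a standard recipe: compute a long-term mean per calendar month, then subtract it from each observation. A minimal sketch on an invented monthly SST series for one reef pixel:

```python
import numpy as np

# Hypothetical monthly SST series: 12 years x 12 months for one location.
rng = np.random.default_rng(1)
months = np.tile(np.arange(12), 12)
sst = 26 + 2 * np.sin(2 * np.pi * months / 12) + rng.normal(scale=0.3, size=144)

# Climatology: long-term mean for each calendar month.
clim = np.array([sst[months == m].mean() for m in range(12)])

# Anomaly: departure of each observation from its month's climatology.
anom = sst - clim[months]
print(anom.mean())  # near zero by construction
```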

  20. The conceptual framework and assessment methodology for the systematic reviews of community-based interventions for the prevention and control of infectious diseases of poverty.

    PubMed

    Lassi, Zohra S; Salam, Rehana A; Das, Jai K; Bhutta, Zulfiqar A

    2014-01-01

    This paper describes the conceptual framework and the methodology used to guide the systematic reviews of community-based interventions (CBIs) for the prevention and control of infectious diseases of poverty (IDoP). We adapted the conceptual framework from the 3ie work on the 'Community-Based Intervention Packages for Preventing Maternal Morbidity and Mortality and Improving Neonatal Outcomes' to aid in analyzing the existing CBIs for IDoP. The conceptual framework revolves around objectives, inputs, processes, outputs, outcomes, and impacts, showing the theoretical linkages between the delivery of the interventions targeting these diseases through various community delivery platforms and the consequent health impacts. We also describe the methodology undertaken to conduct the systematic reviews and the meta-analyses.

  1. A graph-based approach to detect spatiotemporal dynamics in satellite image time series

    NASA Astrophysics Data System (ADS)

    Guttler, Fabio; Ienco, Dino; Nin, Jordi; Teisseire, Maguelonne; Poncelet, Pascal

    2017-08-01

    Enhancing the frequency of satellite acquisitions represents a key issue for the Earth Observation community nowadays. Repeated observations are crucial for monitoring purposes, particularly when intra-annual processes should be taken into account. Time series of images constitute a valuable source of information in these cases. The goal of this paper is to propose a new methodological framework to automatically detect and extract spatiotemporal information from satellite image time series (SITS). Existing methods dealing with this kind of data are usually classification-oriented and cannot provide information about evolutions and temporal behaviors. In this paper we propose a graph-based strategy that combines object-based image analysis (OBIA) with data mining techniques. Image objects computed at each individual timestamp are connected across the time series, generating a set of evolution graphs. Each evolution graph is associated with a particular area within the study site and stores information about its temporal evolution. Such information can be explored in depth at the evolution graph scale or used to compare the graphs and supply a general picture at the study site scale. We validated our framework on two study sites located in the South of France and involving different types of natural, semi-natural and agricultural areas. The results obtained from a Landsat SITS support the quality of the methodological approach and illustrate how the framework can be employed to extract and characterize spatiotemporal dynamics.
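    The linking of image objects across timestamps can be sketched as an overlap test between consecutive segmentations; the evolution graphs are then built from the resulting edges. The object footprints (as pixel-id sets) and the overlap threshold below are invented for illustration:

```python
# Toy version: objects are sets of pixel ids; two objects in consecutive
# timestamps are linked when their overlap ratio exceeds a threshold.
t0 = {"a": {1, 2, 3, 4}, "b": {10, 11}}
t1 = {"c": {2, 3, 4, 5}, "d": {10, 11, 12}}

def links(prev, curr, thresh=0.5):
    """Edges of the evolution graph between two consecutive timestamps."""
    edges = []
    for p, pix_p in prev.items():
        for c, pix_c in curr.items():
            overlap = len(pix_p & pix_c) / min(len(pix_p), len(pix_c))
            if overlap >= thresh:
                edges.append((p, c))
    return edges

print(links(t0, t1))
```

Chaining these edges over the whole series yields one evolution graph per area, which stores that area's temporal trajectory.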

  2. Leveraging the Zachman framework implementation using action - research methodology - a case study: aligning the enterprise architecture and the business goals

    NASA Astrophysics Data System (ADS)

    Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo

    2013-02-01

    With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value for constructing an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires an important effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate to this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework, in order to assist and facilitate its implementation. After explaining the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using an action-research approach.

  3. A Framework for Integrating Oceanographic Data Repositories

    NASA Astrophysics Data System (ADS)

    Rozell, E.; Maffei, A. R.; Beaulieu, S. E.; Fox, P. A.

    2010-12-01

    Oceanographic research covers a broad range of science domains and requires a tremendous amount of cross-disciplinary collaboration. Advances in cyberinfrastructure are making it easier to share data across disciplines through the use of web services and community vocabularies. Best practices in the design of web services and vocabularies to support interoperability amongst science data repositories are only starting to emerge. Strategic design decisions in these areas are crucial to the creation of end-user data and application integration tools. We present S2S, a novel framework for deploying customizable user interfaces to support the search and analysis of data from multiple repositories. Our research methods follow the Semantic Web methodology and technology development process developed by Fox et al. This methodology stresses the importance of close scientist-technologist interactions when developing scientific use cases, keeping the project well scoped and ensuring the result meets a real scientific need. The S2S framework motivates the development of standardized web services with well-described parameters, as well as the integration of existing web services and applications in the search and analysis of data. S2S also encourages the use and development of community vocabularies and ontologies to support federated search and reduce the amount of domain expertise required in the data discovery process. S2S utilizes the Web Ontology Language (OWL) to describe the components of the framework, including web service parameters, and OpenSearch as a standard description for web services, particularly search services for oceanographic data repositories. We have created search services for an oceanographic metadata database, for a large set of quality-controlled ocean profile measurements, and for a biogeographic data repository.
S2S provides an application programming interface (API) that can be used to generate custom user interfaces, supporting data and application integration across these repositories and other web resources. Although initially targeted towards a general oceanographic audience, the S2S framework shows promise in many science domains, inspired in part by the broad disciplinary coverage of oceanography. This presentation will cover the challenges addressed by the S2S framework, the research methods used in its development, and the resulting architecture for the system. It will demonstrate how S2S is remarkably extensible, and can be generalized to many science domains. Given these characteristics, the framework can simplify the process of data discovery and analysis for the end user, and can help to shift the responsibility of search interface development away from data managers.

  4. Harnessing the Power of Education Research Databases with the Pearl-Harvesting Methodological Framework for Information Retrieval

    ERIC Educational Resources Information Center

    Sandieson, Robert W.; Kirkpatrick, Lori C.; Sandieson, Rachel M.; Zimmerman, Walter

    2010-01-01

    Digital technologies enable the storage of vast amounts of information, accessible with remarkable ease. However, along with this facility comes the challenge to find pertinent information from the volumes of nonrelevant information. The present article describes the pearl-harvesting methodological framework for information retrieval. Pearl…

  5. Application of Resource Description Framework to Personalise Learning: Systematic Review and Methodology

    ERIC Educational Resources Information Center

    Jevsikova, Tatjana; Berniukevicius, Andrius; Kurilovas, Eugenijus

    2017-01-01

    The paper is aimed to present a methodology of learning personalisation based on applying Resource Description Framework (RDF) standard model. Research results are two-fold: first, the results of systematic literature review on Linked Data, RDF "subject-predicate-object" triples, and Web Ontology Language (OWL) application in education…
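    RDF models knowledge as "subject-predicate-object" triples, the structure the personalisation methodology above builds on. A toy illustration in which learner preferences and learning-object metadata are matched through shared triple objects (the predicates and resource names are invented, not from the paper):

```python
# Toy triple store: each entry is a (subject, predicate, object) triple.
triples = {
    ("student:42", "hasLearningStyle", "visual"),
    ("lo:intro-fractions", "suitableFor", "visual"),
    ("lo:fractions-podcast", "suitableFor", "auditory"),
}

def recommend(student, triples):
    """Learning objects whose suitability matches the student's styles."""
    styles = {o for s, p, o in triples if s == student and p == "hasLearningStyle"}
    return sorted(s for s, p, o in triples if p == "suitableFor" and o in styles)

print(recommend("student:42", triples))
```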

  6. Review article: A systematic review of emergency department incident classification frameworks.

    PubMed

    Murray, Matthew; McCarthy, Sally

    2018-06-01

    As in any part of the hospital system, safety incidents can occur in the ED. These incidents arguably have a distinct character, as the ED involves unscheduled flows of urgent patients who require disparate services. To aid understanding of safety issues and support risk management of the ED, a comparison of published ED-specific incident classification frameworks was performed. A review of emergency medicine, health management and general medical publications, using Ovid SP to interrogate Medline (1976-2016), was undertaken to identify any type of taxonomy or classification-like framework for ED-related incidents. These frameworks were then analysed and compared. The review identified 17 publications containing an incident classification framework. Comparison of the factors and themes making up the classifications' constituent elements revealed some commonality, but no overall consistency, nor evolution towards an ideal framework. Inconsistency arises from differences in the evidential basis and design methodology of classifications, with design itself being an inherently subjective process. It was not possible to identify an 'ideal' incident classification framework for ED risk management, and there is significant variation in the selection of categories used by frameworks. This variation in classification could risk an unbalanced emphasis in findings through application of a particular framework. Design of an ED-specific, ideal incident classification framework should be informed by a much wider range of theories of how organisations and systems work, in addition to clinical and human factors. © 2017 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  7. A modeling framework for exposing risks in complex systems.

    PubMed

    Sharit, J

    2000-08-01

    This article introduces and develops a modeling framework for exposing risks in the form of human errors and adverse consequences in high-risk systems. The modeling framework is based on two components: a two-dimensional theory of accidents in systems developed by Perrow in 1984, and the concept of multiple system perspectives. The theory of accidents differentiates systems on the basis of two sets of attributes. One set characterizes the degree to which systems are interactively complex; the other emphasizes the extent to which systems are tightly coupled. The concept of multiple perspectives provides alternative descriptions of the entire system that serve to enhance insight into system processes. The usefulness of these two model components derives from a modeling framework that cross-links them, enabling a variety of work contexts to be exposed and understood that would otherwise be very difficult or impossible to identify. The model components and the modeling framework are illustrated in the case of a large and comprehensive trauma care system. In addition to its general utility in the area of risk analysis, this methodology may be valuable in applications of current methods of human and system reliability analysis in complex and continually evolving high-risk systems.

  8. A linear framework for time-scale separation in nonlinear biochemical systems.

    PubMed

    Gunawardena, Jeremy

    2012-01-01

    Cellular physiology is implemented by formidably complex biochemical systems with highly nonlinear dynamics, presenting a challenge for both experiment and theory. Time-scale separation has been one of the few theoretical methods for distilling general principles from such complexity. It has provided essential insights in areas such as enzyme kinetics, allosteric enzymes, G-protein coupled receptors, ion channels, gene regulation and post-translational modification. In each case, internal molecular complexity has been eliminated, leading to rational algebraic expressions among the remaining components. This has yielded familiar formulas such as those of Michaelis-Menten in enzyme kinetics, Monod-Wyman-Changeux in allostery and Ackers-Johnson-Shea in gene regulation. Here we show that these calculations are all instances of a single graph-theoretic framework. Despite the biochemical nonlinearity to which it is applied, this framework is entirely linear, yet requires no approximation. We show that elimination of internal complexity is feasible when the relevant graph is strongly connected. The framework provides a new methodology with the potential to subdue combinatorial explosion at the molecular level.
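
    The graph-theoretic reduction described in this abstract can be illustrated with a small sketch (the graph and rate constants below are invented for illustration, not taken from the paper): for a strongly connected labeled digraph, the steady state of the linear Laplacian dynamics lies in the one-dimensional kernel of the graph Laplacian, which is spanned by a strictly positive vector.

```python
import numpy as np

# Edge labels (rate constants) for a 3-vertex strongly connected graph,
# e.g. a minimal enzyme cycle E -> ES -> E* -> E. Values are illustrative.
rates = {(0, 1): 2.0, (1, 2): 1.0, (2, 0): 3.0,
         (1, 0): 0.5, (2, 1): 0.25, (0, 2): 0.1}

n = 3
L = np.zeros((n, n))
for (i, j), k in rates.items():
    L[j, i] += k      # inflow to vertex j from vertex i
    L[i, i] -= k      # outflow from vertex i

# For a strongly connected graph the Laplacian has a one-dimensional
# kernel spanned by a positive vector (Matrix-Tree theorem); that kernel
# vector, normalised, is the exact steady state of dx/dt = L x.
_, _, vh = np.linalg.svd(L)
rho = vh[-1]
rho = rho / rho.sum()     # normalise (also fixes the overall sign)

assert np.allclose(L @ rho, 0, atol=1e-12)
assert np.all(rho > 0)
```

    No approximation is involved: the nonlinearity of the full biochemical system is confined to how the edge labels are formed, while the elimination itself is linear.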

  9. Assessing the impact of healthcare research: A systematic review of methodological frameworks

    PubMed Central

    Keeley, Thomas J.; Calvert, Melanie J.

    2017-01-01

    Background Increasingly, researchers need to demonstrate the impact of their research to their sponsors, funders, and fellow academics. However, the most appropriate way of measuring the impact of healthcare research is subject to debate. We aimed to identify the existing methodological frameworks used to measure healthcare research impact and to summarise the common themes and metrics in an impact matrix. Methods and findings Two independent investigators systematically searched the Medical Literature Analysis and Retrieval System Online (MEDLINE), the Excerpta Medica Database (EMBASE), the Cumulative Index to Nursing and Allied Health Literature (CINAHL+), the Health Management Information Consortium, and the Journal of Research Evaluation from inception until May 2017 for publications that presented a methodological framework for research impact. We then summarised the common concepts and themes across methodological frameworks and identified the metrics used to evaluate differing forms of impact. Twenty-four unique methodological frameworks were identified, addressing 5 broad categories of impact: (1) ‘primary research-related impact’, (2) ‘influence on policy making’, (3) ‘health and health systems impact’, (4) ‘health-related and societal impact’, and (5) ‘broader economic impact’. These categories were subdivided into 16 common impact subgroups. Authors of the included publications proposed 80 different metrics aimed at measuring impact in these areas. The main limitation of the study was the potential exclusion of relevant articles, as a consequence of the poor indexing of the databases searched. Conclusions The measurement of research impact is an essential exercise to help direct the allocation of limited research resources, to maximise research benefit, and to help minimise research waste. 
This review provides a collective summary of existing methodological frameworks for research impact, which funders may use to inform the measurement of research impact and researchers may use to inform study design decisions aimed at maximising the short-, medium-, and long-term impact of their research. PMID:28792957

  10. Agile Software Development in the Department of Defense Environment

    DTIC Science & Technology

    2017-03-31

    The study examines how the defense acquisition framework can enable greater adoption of Agile methodologies. Chapter 3 defines the research methodology and processes used in the study.

  11. Assessing Similarity Among Individual Tumor Size Lesion Dynamics: The CICIL Methodology

    PubMed Central

    Girard, Pascal; Ioannou, Konstantinos; Klinkhardt, Ute; Munafo, Alain

    2018-01-01

    Mathematical models of tumor dynamics generally omit information on individual target lesions (iTLs), and consider the most important variable to be the sum of tumor sizes (TS). However, differences in lesion dynamics might be predictive of tumor progression. To exploit this information, we have developed a novel and flexible approach for the non‐parametric analysis of iTLs, which integrates knowledge from signal processing and machine learning. We called this new methodology ClassIfication Clustering of Individual Lesions (CICIL). We used CICIL to assess similarities among the TS dynamics of 3,223 iTLs measured in 1,056 patients with metastatic colorectal cancer treated with cetuximab combined with irinotecan, in two phase II studies. We mainly observed similar dynamics among lesions within the same tumor site classification. In contrast, lesions in anatomic locations with different features showed different dynamics in about 35% of patients. The CICIL methodology has also been implemented in a user‐friendly and efficient Java‐based framework. PMID:29388396

  12. Xpey' Relational Environments: an analytic framework for conceptualizing Indigenous health equity.

    PubMed

    Kent, Alexandra; Loppie, Charlotte; Carriere, Jeannine; MacDonald, Marjorie; Pauly, Bernie

    2017-12-01

    Both health equity research and Indigenous health research are driven by the goal of promoting equitable health outcomes among marginalized and underserved populations. However, the two fields often operate independently, without collaboration. As a result, Indigenous populations are underrepresented in health equity research relative to the disproportionate burden of health inequities they experience. In this methodological article, we present Xpey' Relational Environments, an analytic framework that maps some of the barriers and facilitators to health equity for Indigenous peoples. Health equity research needs to include a focus on Indigenous populations and Indigenized methodologies, a shift that could fill gaps in knowledge with the potential to contribute to 'closing the gap' in Indigenous health. With this in mind, the Equity Lens in Public Health (ELPH) research program adopted the Xpey' Relational Environments framework to add a focus on Indigenous populations to our research on the prioritization and implementation of health equity. The analytic framework introduced an Indigenized health equity lens to our methodology, which facilitated the identification of social, structural and systemic determinants of Indigenous health. To test the framework, we conducted a pilot case study of one of British Columbia's regional health authorities, which included a review of core policies and plans as well as interviews and focus groups with frontline staff, managers and senior executives. ELPH's application of Xpey' Relational Environments serves as an example of the analytic framework's utility for exploring and conceptualizing Indigenous health equity in BC's public health system. Future applications of the framework should be embedded in Indigenous research methodologies.

  13. Assessing the Accuracy of Generalized Inferences From Comparison Group Studies Using a Within-Study Comparison Approach: The Methodology.

    PubMed

    Jaciw, Andrew P

    2016-06-01

    Various studies have examined bias in impact estimates from comparison group studies (CGSs) of job training programs, and in education, where results are benchmarked against experimental results. Such within-study comparison (WSC) approaches investigate levels of bias in CGS-based impact estimates, as well as the success of various design and analytic strategies for reducing bias. This article reviews past literature and summarizes conditions under which CGSs replicate experimental benchmark results. It extends the framework to, and develops the methodology for, situations where results from CGSs are generalized to untreated inference populations. Past research is summarized; methods are developed to examine bias in program impact estimates based on cross-site comparisons in a multisite trial that are evaluated against site-specific experimental benchmarks. Students in Grades K-3 in 79 schools in Tennessee; students in Grades 4-8 in 82 schools in Alabama. Grades K-3 Stanford Achievement Test (SAT) in reading and math scores; Grades 4-8 SAT10 reading scores. Past studies show that bias in CGS-based estimates can be limited through strong design, with local matching, and appropriate analysis involving pretest covariates and variables that represent selection processes. Extension of the methodology to investigate accuracy of generalized estimates from CGSs shows bias from confounders and effect moderators. CGS results, when extrapolated to untreated inference populations, may be biased due to variation in outcomes and impact. Accounting for effects of confounders or moderators may reduce bias. © The Author(s) 2016.

  14. A generic Transcriptomics Reporting Framework (TRF) for 'omics data processing and analysis.

    PubMed

    Gant, Timothy W; Sauer, Ursula G; Zhang, Shu-Dong; Chorley, Brian N; Hackermüller, Jörg; Perdichizzi, Stefania; Tollefsen, Knut E; van Ravenzwaay, Ben; Yauk, Carole; Tong, Weida; Poole, Alan

    2017-12-01

    A generic Transcriptomics Reporting Framework (TRF) is presented that lists parameters that should be reported in 'omics studies used in a regulatory context. The TRF encompasses the processes of transcriptome profiling, from data generation to a processed list of differentially expressed genes (DEGs) ready for interpretation. Included within the TRF is a reference baseline analysis (RBA) that encompasses raw data selection; data normalisation; recognition of outliers; and statistical analysis. The TRF itself does not dictate the methodology for data processing, but deals with what should be reported. Its principles are also applicable to sequencing data and other 'omics. In contrast, the RBA specifies a simple data processing and analysis methodology that is designed to provide a comparison point for other approaches and is exemplified here by a case study. By providing transparency on the steps applied during 'omics data processing and analysis, the TRF will increase confidence in the processing of 'omics data and in its regulatory use. Applicability of the TRF is ensured by its simplicity and generality. The TRF can be applied to all types of regulatory 'omics studies, and it can be executed using different commonly available software tools. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  15. An implementation framework for wastewater treatment models requiring a minimum programming expertise.

    PubMed

    Rodríguez, J; Premier, G C; Dinsdale, R; Guwy, A J

    2009-01-01

    Mathematical modelling in environmental biotechnology has been a traditionally difficult resource to access for researchers and students without programming expertise. The great degree of flexibility required from model implementation platforms to be suitable for research applications restricts their use to programming-expert users. More user-friendly software packages, however, do not normally incorporate the necessary flexibility for most research applications. This work presents a methodology based on Excel and Matlab-Simulink for both flexible and accessible implementation of mathematical models by researchers with and without programming expertise. The models are almost fully defined in an Excel file in which the names and values of the state variables and parameters are easily created. This information is automatically processed in Matlab to create the model structure, and almost immediate model simulation is possible after only minimal Matlab code definition. The framework proposed also provides programming-expert researchers with a highly flexible and modifiable platform on which to base more complex model implementations. The method takes advantage of structural generalities in most mathematical models of environmental bioprocesses while enabling the integration of advanced elements (e.g. heuristic functions, correlations). The methodology has already been successfully used in a number of research studies.
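
    A hypothetical Python analogue of this idea (the model, names, and rate expressions below are invented for illustration): the model lives entirely in a plain data structure, as it would in a spreadsheet, and the ODE right-hand side is assembled automatically before integration, so no per-model solver code is required.

```python
import numpy as np
from scipy.integrate import solve_ivp

# The whole model is declared as data, as it would be in an Excel sheet:
# state variables with initial values, parameters, and rate expressions.
# This is a simple Monod growth model, chosen for illustration.
model = {
    "states": {"S": 10.0, "X": 0.5},                # substrate, biomass
    "params": {"mu_max": 0.4, "Ks": 2.0, "Y": 0.5},
    "rates": {"S": "-(mu_max * S / (Ks + S)) * X / Y",
              "X": "(mu_max * S / (Ks + S)) * X"},
}

names = list(model["states"])

def rhs(t, y):
    # Build a namespace from current state values plus parameters,
    # then evaluate each declared rate expression in it.
    env = dict(zip(names, y))
    env.update(model["params"])
    return [eval(model["rates"][n], {}, env) for n in names]

sol = solve_ivp(rhs, (0.0, 24.0), list(model["states"].values()))
S_end, X_end = sol.y[:, -1]
assert S_end < 10.0 and X_end > 0.5   # substrate consumed, biomass grows
```

    Swapping in a different model means editing only the `model` dictionary, which mirrors how the Excel file isolates non-programmers from the Matlab code in the framework described above.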

  16. A Conceptual Framework for Systematic Reviews of Research in Educational Leadership and Management

    ERIC Educational Resources Information Center

    Hallinger, Philip

    2013-01-01

    Purpose: The purpose of this paper is to present a framework for scholars carrying out reviews of research that meet international standards for publication. Design/methodology/approach: This is primarily a conceptual paper focusing on the methodology of conducting systematic reviews of research. However, the paper draws on a database of reviews…

  17. Unmanned Tactical Autonomous Control and Collaboration Situation Awareness

    DTIC Science & Technology

    2017-06-01

    The authors apply situation awareness (SA) principles to Coactive Design in order to inform the design of UTACC robotics. The result is a methodology framework using interdependence analysis (IA) tables for informing design requirements based on SA requirements.

  18. The Perceptions of U.S.-Based IT Security Professionals about the Effectiveness of IT Security Frameworks: A Quantitative Study

    ERIC Educational Resources Information Center

    Warfield, Douglas L.

    2011-01-01

    The evolution of information technology has included new methodologies that use information technology to control and manage various industries and government activities. Information Technology has also evolved as its own industry with global networks of interconnectivity, such as the Internet, and frameworks, models, and methodologies to control…

  19. Integrating Social Activity Theory and Critical Discourse Analysis: A Multilayered Methodological Model for Examining Knowledge Mediation in Mentoring

    ERIC Educational Resources Information Center

    Becher, Ayelet; Orland-Barak, Lily

    2016-01-01

    This study suggests an integrative qualitative methodological framework for capturing complexity in mentoring activity. Specifically, the model examines how historical developments of a discipline direct mentors' mediation of professional knowledge through the language that they use. The model integrates social activity theory and a framework of…

  20. Developing an evidence-based methodological framework to systematically compare HTA coverage decisions: A mixed methods study.

    PubMed

    Nicod, Elena; Kanavos, Panos

    2016-01-01

    Health Technology Assessment (HTA) often results in different coverage recommendations across countries for the same medicine despite similar methodological approaches. This paper develops and pilots a methodological framework that systematically identifies the reasons for these differences using an exploratory sequential mixed methods research design. The study countries were England, Scotland, Sweden and France. The methodological framework was built around three stages of the HTA process: (a) evidence, (b) its interpretation, and (c) its influence on the final recommendation; and was applied to two orphan medicinal products. The criteria accounted for at each stage were qualitatively analyzed through thematic analysis. In piloting the framework for two medicines, eight trials, 43 clinical endpoints and seven economic models were coded 155 times. Eighteen different uncertainties about this evidence were coded 28 times, 56% of which pertained to evidence commonly appraised and 44% to evidence considered by only some agencies. The poor agreement in interpreting this evidence (κ=0.183) was partly explained by stakeholder input (n_s = 48 times), or by agency-specific risk (n_u = 28 uncertainties) and value preferences (n_oc = 62 "other considerations"), derived through correspondence analysis. Accounting for variability at each stage of the process can be achieved by codifying its existence and quantifying its impact through the application of this framework. The transferability of this framework to other disease areas, medicines and countries is ensured by its iterative and flexible nature, and detailed description. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
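
    The agreement statistic reported above (κ=0.183) is Cohen's kappa, which corrects observed agreement for agreement expected by chance. A minimal sketch of its computation, on invented ratings rather than the study's data:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters classifying the same items."""
    assert len(r1) == len(r2)
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n      # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[c] * c2[c] for c in set(c1) | set(c2)) / n**2  # chance agreement
    return (po - pe) / (1 - pe)

# Illustrative agency recommendations (accept / restrict / reject),
# not the study's data.
a = ["acc", "acc", "rej", "res", "acc", "rej", "res", "acc"]
b = ["acc", "res", "rej", "rej", "acc", "res", "res", "rej"]
kappa = cohens_kappa(a, b)
assert -1.0 <= kappa <= 1.0
```

    Values near zero, like the 0.183 reported in the abstract, indicate agreement only slightly better than chance.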

  1. A methodological framework to support the initiation, design and institutionalization of participatory modeling processes in water resources management

    NASA Astrophysics Data System (ADS)

    Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan

    2018-01-01

    Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.

  2. Virtual Levels and Role Models: N-Level Structural Equations Model of Reciprocal Ratings Data.

    PubMed

    Mehta, Paras D

    2018-01-01

    A general latent variable modeling framework called n-Level Structural Equations Modeling (NL-SEM) for dependent data-structures is introduced. NL-SEM is applicable to a wide range of complex multilevel data-structures (e.g., cross-classified, switching membership, etc.). Reciprocal dyadic ratings obtained in a round-robin design involve a complex set of dependencies that cannot be modeled within Multilevel Modeling (MLM) or Structural Equations Modeling (SEM) frameworks. The Social Relations Model (SRM) for round-robin data is used as an example to illustrate key aspects of the NL-SEM framework. NL-SEM introduces novel constructs such as 'virtual levels' that allow a natural specification of latent variable SRMs. An empirical application of an explanatory SRM for personality using xxM, a software package implementing NL-SEM, is presented. Results show that person perceptions are an integral aspect of personality. Methodological implications of NL-SEM for the analyses of an emerging class of contextual- and relational-SEMs are discussed.

  3. Review of Building Data Frameworks across Countries: Lessons for India

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iyer, Maithili; Stratton, Hannah; Mathew, Sangeeta

    The report outlines the initial explorations carried out by LBNL on available examples of energy data collection frameworks for buildings. Specifically, this monograph deals with European experience in the buildings sector, the US experience in the commercial buildings sector, and examples of data collection effort in Singapore and China to capture the Asian experience in the commercial sector. The review also provides a summary of the past efforts in India to collect and use commercial building energy data and its strengths and weaknesses. The overall aim of this activity is to help understand the use cases that drive the granularity of data being collected and the range of methodologies adopted for the data collection effort. This review is a key input and reference for developing a data collection framework for India, and also clarifies general thinking on the institutional structure that may be amenable for data collection effort to match the needs and requirements of commercial building sector in India.

  4. On the pursuit of a nuclear development capability: The case of the Cuban nuclear program

    NASA Astrophysics Data System (ADS)

    Benjamin-Alvarado, Jonathan Calvert

    1998-09-01

    While there have been many excellent descriptive accounts of modernization schemes in developing states, energy development studies based on prevalent modernization theory have been rare. Moreover, heretofore there have been very few analyses of efforts to develop a nuclear energy capability by developing states. Rarely have these analyses employed social science research methodologies. The purpose of this study was to develop a general analytical framework, based on such a methodology to analyze nuclear energy development and to utilize this framework for the study of the specific case of Cuba's decision to develop nuclear energy. The analytical framework developed focuses on a qualitative tracing of the process of Cuban policy objectives and implementation to develop a nuclear energy capability, and analyzes the policy in response to three models of modernization offered to explain the trajectory of policy development. These different approaches are the politically motivated modernization model, the economic and technological modernization model and the economic and energy security model. Each model provides distinct and functionally differentiated expectations for the path of development toward this objective. Each model provides expected behaviors to external stimuli that would result in specific policy responses. In the study, Cuba's nuclear policy responses to stimuli from domestic constraints and intensities, institutional development, and external influences are analyzed. The analysis revealed that in pursuing the nuclear energy capability, Cuba primarily responded by filtering most of the stimuli through the twin objectives of economic rationality and technological advancement. 
Based upon the Cuban policy responses to the domestic and international stimuli, the study concluded that the economic and technological modernization model of nuclear energy development offered a more complete explanation of the trajectory of policy development than either the politically-motivated or economic and energy security models. The findings of this case pose some interesting questions for the general study of energy programs in developing states. By applying the analytical framework employed in this study to a number of other cases, perhaps the understanding of energy development schemes may be expanded through future research.

  5. Health risks of energy systems.

    PubMed

    Krewitt, W; Hurley, F; Trukenmüller, A; Friedrich, R

    1998-08-01

    Health risks from fossil, renewable and nuclear reference energy systems are estimated following a detailed impact pathway approach. Using a set of appropriate air quality models and exposure-effect functions derived from the recent epidemiological literature, a methodological framework for risk assessment has been established and consistently applied across the different energy systems, including the analysis of consequences from a major nuclear accident. A wide range of health impacts resulting from increased air pollution and ionizing radiation is quantified, and the transferability of results derived from specific power plants to a more general context is discussed.
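
    At its simplest, the impact-pathway chain described above (emission, ambient concentration, population exposure, health effect) reduces to multiplying an attributable concentration increment by the exposed population, a baseline incidence rate, and an exposure-response slope. A sketch with purely illustrative numbers, not the study's values:

```python
# Minimal impact-pathway calculation. All numbers are illustrative.
delta_conc = 0.8          # attributable pollutant increment, ug/m3
population = 2_000_000    # exposed population
baseline_rate = 0.01      # baseline cases per person-year
err = 0.006               # assumed relative increase in cases per ug/m3
                          # (an exposure-response slope from epidemiology)

attributable_cases = population * baseline_rate * err * delta_conc
# Attributable cases must be a small fraction of the baseline burden.
assert 0 < attributable_cases < population * baseline_rate
```

    Real assessments replace each factor with the outputs of air quality models and published exposure-effect functions, as the abstract describes, but the chain of multiplications is the same.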

  6. An investigation into the use of recorded music as a surgical intervention: A systematic, critical review of methodologies used in recent adult controlled trials.

    PubMed

    Williams, Courtney; Hine, Trevor

    2018-04-01

    While music is being increasingly used as a surgical intervention, the types of music used and the reasons underlying their selection remain inconsistent. Empirical research into the efficacy of such musical interventions is therefore problematic. To provide clear guidelines for musical selection and employment in surgical interventions, created through a synthesis of the literature. The aim is to examine how music is implemented in surgical situations, and to provide guidance for the selection and composition of music for future interventions. English language quantitative surgical intervention studies from Science Direct, ProQuest, and Sage Journals Online, all published within the last 10 years and featuring recorded music, were systematically reviewed. Variables investigated included: the time the intervention was performed, the intervention length, the outcomes targeted, music description (general and specific), theoretical frameworks underlying the selection of the music, whether or not a musical expert was involved, participant music history, and the participants' feedback on the chosen music. Several aspects contribute to the lack of scientific rigour regarding music selection in this field, including the lack of a theoretical framework or frameworks, no involvement of musical experts, failure to list the music tracks used, and the use of vague and subjective terms in general music descriptions. Patients are frequently allowed to select music (risking both choosing music that has an adverse effect and making study replication difficult), and patient music history and listening habits are rarely considered. Crucially, five primary theoretical frameworks underlying the effectiveness of music arose in the literature (distraction, relaxation, emotional shift, entrainment, and endogenous analgesia), however music was rarely selected to enhance any of these mechanisms. 
Further research needs to be conducted to ensure that music is selected according to a theoretical framework and more rigorous and replicable methodology. Music interventions can be made more effective at improving psychological states and reducing physiological arousal by selecting music conducive to specific mechanisms, and also by considering at what point during the surgical experience the music would be most effective. Greater involvement of music experts in interventions would help to ensure that the most appropriate music was chosen, and that it is clearly and precisely described. Copyright © 2018. Published by Elsevier Ltd.

  7. A network-based analysis of CMIP5 "historical" experiments

    NASA Astrophysics Data System (ADS)

    Bracco, A.; Foudalis, I.; Dovrolis, C.

    2012-12-01

    In computer science, "complex network analysis" refers to a set of metrics, modeling tools and algorithms commonly used in the study of complex nonlinear dynamical systems. Its main premise is that the underlying topology or network structure of a system has a strong impact on its dynamics and evolution. By allowing investigation of local and non-local statistical interactions, network analysis provides a powerful, but only marginally explored, framework to validate climate models and investigate teleconnections, assessing their strength, range, and impacts on the climate system. In this work we propose a new, fast, robust and scalable methodology to examine, quantify, and visualize climate sensitivity, while constraining general circulation model (GCM) outputs with observations. The goal of our novel approach is to uncover relations in the climate system that are not (or not fully) captured by more traditional methodologies used in climate science and often adopted from nonlinear dynamical systems analysis, and to explain known climate phenomena in terms of the network structure or its metrics. Our methodology is based on a solid theoretical framework and employs mathematical and statistical tools, exploited only tentatively in climate research so far. Suitably adapted to the climate problem, these tools can assist in visualizing the trade-offs in representing global links and teleconnections among different data sets. Here we present the methodology, and compare network properties for different reanalysis data sets and a suite of CMIP5 coupled GCM outputs. With an extensive model intercomparison in terms of the climate network that each model leads to, we quantify how each model reproduces major teleconnections, rank model performances, and identify common or specific errors in comparing model outputs and observations.
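
    A minimal sketch of the kind of correlation-network construction such analyses typically start from (synthetic data stands in for reanalysis or CMIP5 fields, and the correlation threshold is an assumption of this sketch, not the authors' method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for gridded climate anomalies: 40 grid points over
# 200 time steps, with one embedded "teleconnection" (points 0-4 share a
# common signal on top of independent noise).
n_points, n_time = 40, 200
data = rng.standard_normal((n_points, n_time))
signal = rng.standard_normal(n_time)
data[:5] += 2.0 * signal

# Climate network: nodes are grid points; edges link pairs whose time
# series are strongly correlated.
corr = np.corrcoef(data)
threshold = 0.5                      # assumed cutoff for this sketch
adj = (np.abs(corr) > threshold) & ~np.eye(n_points, dtype=bool)
degree = adj.sum(axis=1)

# The teleconnected cluster emerges as the highest-degree nodes.
assert degree[:5].mean() > degree[5:].mean()
```

    Network metrics computed on `adj` (degree fields, clustering, path lengths) then provide the basis for comparing models against reanalysis, as described above.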

  8. Establishing a Culture of Learning: A Mixed Methodology Exploration regarding the Phases of Change for Professional Learning Communities and Literacy Strategies

    ERIC Educational Resources Information Center

    Gray, James E.

    2010-01-01

    This research serves as a mixed methodological study that presents a conceptual framework which focuses on the relationship between professional learning communities, high yield literacy strategies, and their phases of change. As a result, the purpose of this study is threefold. First, a conceptual framework integrating professional learning…

  9. Comparing the Similarities and Differences of PISA 2003 and TIMSS. OECD Education Working Papers, No. 32

    ERIC Educational Resources Information Center

    Wu, Margaret

    2010-01-01

    This paper makes an in-depth comparison of the PISA (OECD) and TIMSS (IEA) mathematics assessments conducted in 2003. First, a comparison of survey methodologies is presented, followed by an examination of the mathematics frameworks in the two studies. The methodologies and the frameworks in the two studies form the basis for providing…

  10. When Going Hybrid Is Not Enough: Statistical Analysis of Effectiveness of Blended Courses Piloted within Tempus BLATT Project

    ERIC Educational Resources Information Center

    Jovanovic, Aleksandar; Jankovic, Anita; Jovanovic, Snezana Markovic; Peric, Vladan; Vitosevic, Biljana; Pavlovic, Milos

    2015-01-01

    The paper describes the delivery of the courses in the framework of the project implementation and presents the effect the change in the methodology had on student performance as measured by final grade. Methodology: University of Pristina piloted blended courses in 2013 under the framework of the Tempus BLATT project. The blended learning…

  11. Estimating breeding proportions and testing hypotheses about costs of reproduction with capture-recapture data

    USGS Publications Warehouse

    Nichols, James D.; Hines, James E.; Pollock, Kenneth H.; Hinz, Robert L.; Link, William A.

    1994-01-01

    The proportion of animals in a population that breeds is an important determinant of population growth rate. Usual estimates of this quantity from field sampling data assume that the probability of appearing in the capture or count statistic is the same for animals that do and do not breed. A similar assumption is required by most existing methods used to test ecologically interesting hypotheses about reproductive costs using field sampling data. However, in many field sampling situations breeding and nonbreeding animals are likely to exhibit different probabilities of being seen or caught. In this paper, we propose the use of multistate capture-recapture models for these estimation and testing problems. This methodology permits a formal test of the hypothesis of equal capture/sighting probabilities for breeding and nonbreeding individuals. Two estimators of breeding proportion (and associated standard errors) are presented, one for the case of equal capture probabilities and one for the case of unequal capture probabilities. The multistate modeling framework also yields formal tests of hypotheses about reproductive costs to future reproduction, survival, or both fitness components. The general methodology is illustrated using capture-recapture data on female meadow voles, Microtus pennsylvanicus. Resulting estimates of the proportion of reproductively active females showed strong seasonal variation, as expected, with low breeding proportions in midwinter. We found no evidence of reproductive costs exacted in subsequent survival or reproduction. We believe that this methodological framework has wide application to problems in animal ecology concerning breeding proportions and phenotypic reproductive costs.
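
    The core bias addressed above can be shown without the full multistate likelihood: a naive breeding proportion computed from raw captures is biased whenever detection differs by state, while dividing counts by state-specific capture probabilities recovers the truth. In practice those probabilities are estimated from the recapture histories themselves; all numbers below are illustrative.

```python
# True state counts and state-specific capture probabilities
# (breeders at nests are easier to catch than nonbreeders, say).
n_breeders, n_nonbreeders = 600, 400
p_breeder, p_nonbreeder = 0.8, 0.4

caught_b = n_breeders * p_breeder        # expected breeder captures
caught_n = n_nonbreeders * p_nonbreeder  # expected nonbreeder captures

# Naive estimate: ignores detection, so it is biased upward here.
naive = caught_b / (caught_b + caught_n)

# Corrected estimate: inflate each count by its capture probability.
corrected = (caught_b / p_breeder) / (caught_b / p_breeder
                                      + caught_n / p_nonbreeder)

true_prop = n_breeders / (n_breeders + n_nonbreeders)
assert abs(corrected - true_prop) < 1e-12
assert naive > true_prop   # the naive estimate overstates breeding
```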

  12. How to Measure Costs and Benefits of eHealth Interventions: An Overview of Methods and Frameworks

    PubMed Central

    2015-01-01

    Information on the costs and benefits of eHealth interventions is needed, not only to document value for money and to support decision making in the field, but also to form the basis for developing business models and to facilitate payment systems to support large-scale services. In the absence of solid evidence of its effects, key decision makers may doubt the effectiveness, which, in turn, limits investment in, and the long-term integration of, eHealth services. However, it is not realistic to conduct economic evaluations of all eHealth applications and services in all situations, so we need to be able to generalize from those we do conduct. This implies that we have to select the most appropriate methodology and data collection strategy in order to increase the transferability across evaluations. This paper aims to contribute to the understanding of how to apply economic evaluation methodology in the eHealth field. It provides a brief overview of basic health economics principles and frameworks and discusses some methodological issues and challenges in conducting cost-effectiveness analysis of eHealth interventions. Issues regarding the identification, measurement, and valuation of costs and benefits are outlined. Furthermore, this work describes the established techniques of combining costs and benefits, presents the decision rules for identifying the preferred option, and outlines approaches to data collection strategies. Issues related to transferability and complexity are also discussed. PMID:26552360
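
    The decision rules mentioned above center on the incremental cost-effectiveness ratio (ICER). A minimal sketch, with illustrative numbers rather than values from any eHealth evaluation:

```python
# Standard decision rule for combining costs and effects: the incremental
# cost-effectiveness ratio of an intervention versus a comparator, judged
# against a willingness-to-pay threshold per unit of effect (e.g. per QALY).
# All numbers in the example are illustrative.

def preferred(cost_new, effect_new, cost_old, effect_old, threshold):
    """Return 'new' if the intervention is preferred at the given threshold."""
    d_cost = cost_new - cost_old
    d_eff = effect_new - effect_old
    if d_eff > 0:
        # more effective: adopt if the extra cost per extra effect is acceptable
        return "new" if d_cost / d_eff <= threshold else "old"
    if d_eff < 0:
        # less effective: adopt only if the saving per effect lost exceeds threshold
        return "new" if d_cost / d_eff >= threshold else "old"
    return "new" if d_cost < 0 else "old"   # equally effective: cheaper wins

# ICER = (10000 - 2000) / (1.5 - 1.0) = 16000 per QALY, below a 20000 threshold.
print(preferred(10000, 1.5, 2000, 1.0, threshold=20000))   # new
```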

  13. A general gridding, discretization, and coarsening methodology for modeling flow in porous formations with discrete geological features

    NASA Astrophysics Data System (ADS)

    Karimi-Fard, M.; Durlofsky, L. J.

    2016-10-01

    A comprehensive framework for modeling flow in porous media containing thin, discrete features, which could be high-permeability fractures or low-permeability deformation bands, is presented. The key steps of the methodology are mesh generation, fine-grid discretization, upscaling, and coarse-grid discretization. Our specialized gridding technique combines a set of intersecting triangulated surfaces by constructing approximate intersections using existing edges. This procedure creates a conforming mesh of all surfaces, which defines the internal boundaries for the volumetric mesh. The flow equations are discretized on this conforming fine mesh using an optimized two-point flux finite-volume approximation. The resulting discrete model is represented by a list of control-volumes with associated positions and pore-volumes, and a list of cell-to-cell connections with associated transmissibilities. Coarse models are then constructed by the aggregation of fine-grid cells, and the transmissibilities between adjacent coarse cells are obtained using flow-based upscaling procedures. Through appropriate computation of fracture-matrix transmissibilities, a dual-continuum representation is obtained on the coarse scale in regions with connected fracture networks. The fine and coarse discrete models generated within the framework are compatible with any connectivity-based simulator. The applicability of the methodology is illustrated for several two- and three-dimensional examples. In particular, we consider gas production from naturally fractured low-permeability formations, and transport through complex fracture networks. In all cases, highly accurate solutions are obtained with significant model reduction.
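
    The discrete-model representation described above (control volumes plus a cell-to-cell connection list with transmissibilities) can be sketched as follows; the half-transmissibility formula is the generic two-point flux approximation, not the paper's optimized variant.

```python
# Generic two-point flux building block: each side of an interface contributes
# a half-transmissibility k*A/d (permeability times interface area over the
# cell-center-to-interface distance), and the two halves combine harmonically.

def transmissibility(k1, k2, area, d1, d2):
    """Harmonic combination of the two half-transmissibilities k*A/d."""
    t1 = k1 * area / d1
    t2 = k2 * area / d2
    return t1 * t2 / (t1 + t2)

# Minimal discrete model in the connectivity-list form: three cells in a row,
# matrix - fracture - matrix, with a thin high-permeability fracture cell.
pore_volumes = [1.0, 0.01, 1.0]
connections = [
    (0, 1, transmissibility(1.0, 1000.0, 1.0, 0.5, 0.0005)),
    (1, 2, transmissibility(1000.0, 1.0, 1.0, 0.0005, 0.5)),
]
```

    This cell/connection list is exactly the kind of input a connectivity-based simulator consumes, and coarsening amounts to aggregating cells and recomputing the connection transmissibilities by flow-based upscaling.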

  14. Brain-Mind Operational Architectonics Imaging: Technical and Methodological Aspects

    PubMed Central

    Fingelkurts, Andrew A; Fingelkurts, Alexander A

    2008-01-01

    This review paper deals with the methodological and technical foundations of the Operational Architectonics framework of brain and mind functioning. This theory provides a framework for mapping and understanding important aspects of the brain mechanisms that constitute perception, cognition, and eventually consciousness. The methods utilized within the Operational Architectonics framework allow detailed analysis of the operational behavior of local neuronal assemblies and of their joint activity in the form of unified and metastable operational modules, which constitute the whole hierarchy of brain operations, including the operations of cognition and phenomenal consciousness. PMID:19526071

  15. Cognitive training and plasticity: Theoretical perspective and methodological consequences

    PubMed Central

    Willis, Sherry L.; Schaie, K. Warner

    2013-01-01

    Purpose To provide an overview of cognitive plasticity concepts and findings from a lifespan developmental perspective. Methods After an evaluation of the general concept of cognitive plasticity, the most important approaches to studying behavioral and brain plasticity are reviewed. These include intervention studies, experimental approaches, cognitive training, the study of facilitating factors for strategy learning and strategy use, practice, and person-environment interactions. Transfer and durability of training-induced plasticity are discussed. Results The review indicates that methodological and conceptual advances are needed to improve the match between levels of behavioral and brain plasticity targeted in current developmental research and study designs. Conclusions The results suggest that the emphasis of plasticity studies on treatment effectiveness needs to be complemented by a strong commitment to the grounding of the intervention in a conceptual framework. PMID:19847065

  16. The coastal use structure within the coastal system. A sustainable development-consistent approach

    NASA Astrophysics Data System (ADS)

    Vallega, A.

    1996-01-01

    To contribute to the development of methodological approaches to coastal area management consistent with the sustainable development concept and the guidelines provided by UNCED Agenda 21, Chapter 17, the classifications of coastal uses provided in the literature and those adopted by coastal management programmes are first presented and discussed. Building on this basis, and reasoning in terms of a general systems approach, the following concepts and methodological issues are considered: a goal-oriented concept of coastal use; the sustainable development-grounded coastal use framework and the role of the discriminants through which it is conceived and described; the relationships between coastal uses, in particular conflicting relationships, focusing attention on conflicts between decision-making centres as well as on users, motivations, and the tractability of uses; the relationships between coastal uses and the ecosystem; and the basic options for sustainability-consistent coastal use development.

  17. Pitting corrosion as a mixed system: coupled deterministic-probabilistic simulation of pit growth

    NASA Astrophysics Data System (ADS)

    Ibrahim, Israr B. M.; Fonna, S.; Pidaparti, R.

    2018-05-01

    The stochastic behavior of pitting corrosion poses a unique challenge for computational analysis, even though it stems from the same electrochemical activity that causes general corrosion. In this paper, a framework for corrosion pit growth simulation based on the coupling of the Cellular Automaton (CA) and Boundary Element Method (BEM) is presented. The framework assumes that pitting corrosion is controlled by electrochemical activity inside the pit cavity. The BEM provides the prediction of electrochemical activity given the geometrical data and polarization curves, while the CA is used to simulate the evolution of pit shapes based on the electrochemical activity provided by the BEM. To demonstrate the methodology, a sample case of local corrosion cells formed in pitting corrosion with varied dimensions and polarization functions is considered. Results show that certain shapes tend to grow in certain types of environments. Some pit shapes appear to pose a higher risk by being potentially significant stress raisers or by potentially increasing the rate of corrosion under the surface. Furthermore, these pits are comparable to commonly observed pit shapes in general corrosion environments.
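
    The CA half of the coupling can be caricatured as follows; the corrosion-rate function here is a hypothetical stand-in for the electrochemical activity that the BEM solve would supply at each pit-wall cell.

```python
# Toy CA pit-growth step on a 2-D grid (1 = metal, 0 = cavity): metal cells
# on the cavity wall dissolve with a probability given by a local corrosion
# rate. In the paper's framework that rate comes from the BEM electrochemistry
# solve; here it is a hypothetical stand-in function of depth.
import random

def ca_step(grid, rate, rng):
    """Dissolve wall cells with probability rate(depth); synchronous update."""
    rows, cols = len(grid), len(grid[0])
    to_remove = []
    for y in range(rows):
        for x in range(cols):
            if grid[y][x] == 1 and any(
                0 <= y + dy < rows and 0 <= x + dx < cols
                and grid[y + dy][x + dx] == 0
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
            ):
                if rng.random() < rate(y):
                    to_remove.append((y, x))
    for y, x in to_remove:
        grid[y][x] = 0

rng = random.Random(0)                 # seeded for a reproducible realization
grid = [[1] * 5 for _ in range(5)]
grid[0][2] = 0                         # initial pit mouth at the surface
for _ in range(10):
    ca_step(grid, lambda depth: 0.3, rng)
print(sum(row.count(0) for row in grid))
```

    Each stochastic realization yields a different pit shape, which is the behavior the coupled CA-BEM framework then classifies against environment type.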

  18. Micromechanics and effective elastoplastic behavior of two-phase metal matrix composites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ju, J.W.; Chen, T.M.

    A micromechanical framework is presented to predict effective (overall) elasto-(visco-)plastic behavior of two-phase particle-reinforced metal matrix composites (PRMMC). In particular, the inclusion phase (particle) is assumed to be elastic and the matrix material is elasto-(visco-)plastic. Emanating from Ju and Chen's (1994a,b) work on effective elastic properties of composites containing many randomly dispersed inhomogeneities, effective elastoplastic deformations and responses of PRMMC are estimated by means of the "effective yield criterion" derived micromechanically by considering effects due to elastic particles embedded in the elastoplastic matrix. The matrix material is elastic or plastic, depending on local stress and deformation, and obeys a general plastic flow rule and hardening law. Arbitrary (general) loadings and unloadings are permitted in the framework through the elastic predictor-plastic corrector two-step operator splitting methodology. The proposed combined micromechanical and computational approach allows one to estimate overall elastoplastic responses of PRMMCs by accounting for microstructural information (such as the spatial distribution and micro-geometry of particles), the elastic properties of the constituent phases, and the plastic behavior of the matrix-only materials.

  19. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    PubMed

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

    We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heart beat data. Modeling the process's non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and true partition, respectively. Extensions to general history-dependent point processes are discussed.
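
    The core of the generalized likelihood ratio statistic for a piecewise-constant rate template can be sketched as follows; the two-segment template and the bin counts are illustrative, and the paper's template families and multiple-testing scheme are richer.

```python
# Generalized likelihood ratio for unit-width bin counts: a constant rate
# (null) versus a two-segment piecewise-constant rate template, with segment
# rates fitted by maximum likelihood (the MLE rate is the segment's mean count).
import math

def loglik(counts, rate):
    """Poisson log-likelihood of unit-width bin counts (dropping log n! terms)."""
    return sum(n * math.log(rate) - rate for n in counts)

def glr_two_segment(counts, split):
    """2 * (max loglik under the split template - loglik under constant rate)."""
    left, right = counts[:split], counts[split:]
    r0 = sum(counts) / len(counts)
    r1 = sum(left) / len(left)
    r2 = sum(right) / len(right)
    return 2 * (loglik(left, r1) + loglik(right, r2) - loglik(counts, r0))

counts = [2, 3, 2, 3, 9, 8, 10, 9]     # rate roughly quadruples mid-record
print(glr_two_segment(counts, 4))      # large: the split template is favored
```

    Model selection across a family of templates then amounts to comparing such statistics with a multiple-testing correction for the number of templates tried.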

  20. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  1. Situated phenomenology and biological systems: Eastern and Western synthesis.

    PubMed

    Schroeder, Marcin J; Vallverdú, Jordi

    2015-12-01

    Phenomenology was born with the mission to give foundations for a science of experience and to open consciousness to scientific study. The influence of phenomenology, initiated in the works of Husserl and continued in a wide range of works of others, was immense, but mainly within the confines of philosophy and the humanities. The actual attempts to develop a scientific discipline of the study of consciousness and to carry out research on cognition and consciousness were always based on the methods of traditional science, in which elimination of the subjective has always been a primary tenet. Thus, focus was mainly on neurological correlates of conscious phenomena. The present paper is an attempt to initiate an extension and revision of phenomenological methodology with the use of philosophical and scientific experience and knowledge accumulated in a century of inquiry and research in relevant disciplines. The question of which disciplines are relevant is crucial, and our answer is innovative. The range of disciplines involved here extends from information science and studies of computation to cultural psychology and the studies of the philosophical traditions of the East. Concepts related to information and computation studies provide a general conceptual framework free from the limitations of particular languages and of linguistic analysis. This conceptual framework extends the original perspective of phenomenology to issues of modern technology and science. Cultural psychology gives us tools to root out what in phenomenology was considered universal for humanity but was a result of European ethnocentrism. Most important here is the contrast between individualistic and collectivistic cultural determinants of consciousness. Finally, the philosophical tradition of the East gives alternatives in seeking solutions for fundamental problems. 
This general outline of the research methodology is illustrated by an example of its use when phenomenology is studied within the conceptual framework of information. Copyright © 2015. Published by Elsevier Ltd.

  2. Addressing location uncertainties in GPS-based activity monitoring: A methodological framework

    PubMed Central

    Wan, Neng; Lin, Ge; Wilson, Gaines J.

    2016-01-01

    Location uncertainty has been a major barrier in information mining from location data. Although the development of electronic and telecommunication equipment has led to an increased amount and refined resolution of data about individuals’ spatio-temporal trajectories, the potential of such data, especially in the context of environmental health studies, has not been fully realized due to the lack of methodology that addresses location uncertainties. This article describes a methodological framework for deriving information about people’s continuous activities from individual-collected Global Positioning System (GPS) data, which is vital for a variety of environmental health studies. This framework is composed of two major methods that address critical issues at different stages of GPS data processing: (1) a fuzzy classification method for distinguishing activity patterns; and (2) a scale-adaptive method for refining activity locations and outdoor/indoor environments. Evaluation of this framework based on smartphone-collected GPS data indicates that it is robust to location errors and is able to generate useful information about individuals’ life trajectories. PMID:28943777
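
    The fuzzy classification step can be illustrated with graded membership functions over speed; the class names and breakpoints below are hypothetical, not those of the framework.

```python
# Fuzzy membership sketch for activity patterns: a GPS fix's speed (m/s) gets
# graded membership in "static", "walking", and "vehicle" classes instead of a
# hard label, which is what makes the classification robust to positional
# error. Breakpoints are illustrative, not the paper's.

def memberships(speed):
    """Shoulder/trapezoidal membership functions over speed in m/s."""
    def down(x, a, b):   # 1 below a, falls linearly to 0 at b
        return max(0.0, min(1.0, (b - x) / (b - a)))
    def up(x, a, b):     # 0 below a, rises linearly to 1 at b
        return max(0.0, min(1.0, (x - a) / (b - a)))
    return {
        "static":  down(speed, 0.3, 1.0),
        "walking": min(up(speed, 0.3, 1.0), down(speed, 2.0, 6.0)),
        "vehicle": up(speed, 2.0, 6.0),
    }

print(memberships(0.1))   # clearly static
print(memberships(4.0))   # ambiguous between walking and vehicle
```

    A noisy fix near a class boundary then carries partial weight in both classes rather than a possibly wrong hard label, and the final activity segmentation can resolve the ambiguity using neighboring fixes.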

  3. Biodiversity impact assessment (BIA+) - methodological framework for screening biodiversity.

    PubMed

    Winter, Lisa; Pflugmacher, Stephan; Berger, Markus; Finkbeiner, Matthias

    2018-03-01

    For the past 20 years, the life cycle assessment (LCA) community has sought to integrate impacts on biodiversity into the LCA framework. However, existing impact assessment methods still fail to do so comprehensively because they quantify only a few impacts related to specific species and regions. This paper proposes a methodological framework that will allow LCA practitioners to assess currently missing impacts on biodiversity on a global scale. Building on existing models that seek to quantify the impacts of human activities on biodiversity, the herein proposed methodological framework consists of 2 components: a habitat factor for 14 major habitat types and the impact on the biodiversity status in those major habitat types. The habitat factor is calculated by means of indicators that characterize each habitat. The biodiversity status depends on parameters from impact categories. The impact functions, relating these different parameters to a given response in the biodiversity status, rely on expert judgments. To ensure the applicability for LCA practitioners, the components of the framework can be regionalized on a country scale for which LCA inventory data is more readily available. The weighting factors for the 14 major habitat types range from 0.63 to 1.82. By means of area weighting of the major habitat types in a country, country-specific weighting factors are calculated. In order to demonstrate the main part of the framework, examples of impact functions are given for the categories "freshwater eutrophication" and "freshwater ecotoxicity" in 1 major habitat type. The results confirm suitability of the methodological framework. The major advantages are the framework's user-friendliness, given that data can be used from LCA databases directly, and the complete inclusion of all levels of biodiversity (genetic, species, and ecosystem). It is applicable for the whole world and a wide range of impact categories. Integr Environ Assess Manag 2018;14:282-297. 
© 2017 SETAC.
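
    The area-weighting step for country-specific factors can be sketched as follows; the habitat factors and area shares are invented for illustration (the paper's habitat factors range from 0.63 to 1.82).

```python
# Country-specific weighting factor as the area-weighted mean of the habitat
# factors of the major habitat types present in that country. The factor
# values and area shares below are hypothetical.

def country_factor(habitat_shares):
    """habitat_shares: list of (habitat_factor, area_share) pairs."""
    total_area = sum(share for _, share in habitat_shares)
    return sum(f * share for f, share in habitat_shares) / total_area

# Hypothetical country: 70% of its area in a habitat with factor 0.8,
# 30% in one with factor 1.6.
print(country_factor([(0.8, 0.7), (1.6, 0.3)]))   # 1.04
```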

  4. An Approach for Implementation of Project Management Information Systems

    NASA Astrophysics Data System (ADS)

    Běrziša, Solvita; Grabis, Jānis

    Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and processes and on existing XML-based representations of project management. A demonstration example of configuring a project management information system is provided.

  5. A dynamic systems approach to psychotherapy: A meta-theoretical framework for explaining psychotherapy change processes.

    PubMed

    Gelo, Omar Carlo Gioacchino; Salvatore, Sergio

    2016-07-01

    Notwithstanding the many methodological advances made in the field of psychotherapy research, at present a metatheoretical, school-independent framework to explain psychotherapy change processes taking into account their dynamic and complex nature is still lacking. Over the last years, several authors have suggested that a dynamic systems (DS) approach might provide such a framework. In the present paper, we review the main characteristics of a DS approach to psychotherapy. After an overview of the general principles of the DS approach, we describe the extent to which psychotherapy can be considered as a self-organizing open complex system, whose developmental change processes are described in terms of a dialectic dynamics between stability and change over time. Empirical evidence in support of this conceptualization is provided and discussed. Finally, we propose a research design strategy for the empirical investigation of psychotherapy from a DS approach, together with a research case example. We conclude that a DS approach may provide a metatheoretical, school-independent framework allowing us to constructively rethink and enhance the way we conceptualize and empirically investigate psychotherapy. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. Heat-Passing Framework for Robust Interpretation of Data in Networks

    PubMed Central

    Fang, Yi; Sun, Mengtian; Ramani, Karthik

    2015-01-01

    Researchers are regularly interested in interpreting the multipartite structure of data entities according to their functional relationships. Data is often heterogeneous, with intricately hidden inner structure. With limited prior knowledge, researchers are likely to confront the problem of transforming this data into knowledge. We develop a new framework, called heat-passing, which exploits intrinsic similarity relationships within noisy and incomplete raw data and constructs a meaningful map of the data. The proposed framework is able to rank, cluster, and visualize the data all at once. The novelty of this framework derives from an analogy between the process of data interpretation and that of heat transfer, in which all data points contribute simultaneously and globally to reveal intrinsic similarities between regions of data, meaningful coordinates for embedding the data, and exemplar data points that lie at optimal positions for heat transfer. We demonstrate the effectiveness of the heat-passing framework for robustly partitioning complex networks, analyzing the globin family of proteins, and determining conformational states of macromolecules in the presence of high levels of noise. The results indicate that the methodology is able to reveal functionally consistent relationships in a robust fashion with no reference to prior knowledge. The heat-passing framework is very general and has the potential for applications to a broad range of research fields, for example, biological networks, social networks, and semantic analysis of documents. PMID:25668316
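
    The heat-transfer analogy can be illustrated with a toy diffusion over a similarity graph; the graph and update rule here are simplified stand-ins for the framework's actual construction.

```python
# Toy "heat" diffusion over a similarity graph by repeated local averaging.
# After a few steps, nodes in the seed's cluster are warmer than nodes across
# the weak link, which is the kind of intrinsic-similarity signal used for
# ranking and clustering. The graph and update rule are illustrative only.

def diffuse(adj, heat, steps=20, alpha=0.5):
    """Keep alpha of a node's own heat, average the neighbors' heat into the rest."""
    n = len(adj)
    for _ in range(steps):
        new = []
        for i in range(n):
            nbrs = [j for j in range(n) if adj[i][j]]
            incoming = sum(heat[j] for j in nbrs) / len(nbrs)
            new.append(alpha * heat[i] + (1 - alpha) * incoming)
        heat = new
    return heat

# Two triangles (nodes 0-2 and 3-5) joined by a single weak edge 2-3.
adj = [[0, 1, 1, 0, 0, 0],
       [1, 0, 1, 0, 0, 0],
       [1, 1, 0, 1, 0, 0],
       [0, 0, 1, 0, 1, 1],
       [0, 0, 0, 1, 0, 1],
       [0, 0, 0, 1, 1, 0]]
heat = diffuse(adj, [1.0, 0, 0, 0, 0, 0], steps=5)
print(heat)   # nodes 0-2 end up warmer than nodes 3-5
```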

  7. A Framework for Evaluating and Enhancing Alignment in Self-Regulated Learning Research

    PubMed Central

    Dent, Amy L.; Hoyle, Rick H.

    2015-01-01

    We discuss the articles of this special issue with reference to an important yet previously only implicit dimension of study quality: alignment across the theoretical and methodological decisions that collectively define an approach to self-regulated learning. Integrating and extending work by leaders in the field, we propose a framework for evaluating alignment in the way self-regulated learning research is both conducted and reported. Within this framework, the special issue articles provide a springboard for discussing methodological promises and pitfalls of increasingly sophisticated research on the dynamic, contingent, and contextualized features of self-regulated learning. PMID:25825589

  8. A Novel Framework for Identifying the Interactions between Biophysical and Social Components of an Agricultural System: A Guide for Improving Wheat Production in Haryana, NW India

    ERIC Educational Resources Information Center

    Coventry, D. R.; Poswal, R. S.; Yadav, Ashok; Zhou, Yi; Riar, Amritbir; Kumar, Anuj; Sharma, R. K.; Chhokar, R. S.; Gupta, R. K.; Mehta, A. K.; Chand, Ramesh; Denton, M. D.; Cummins, J. A.

    2018-01-01

    Purpose: The purpose of this study is to develop a conceptual framework with related analysis methodologies that identifies the influence of social environment on an established cropping system. Design/Methodology/Approach: A stratified survey including 103 villages and 823 farmers was conducted in all districts of Haryana (India). Firstly,…

  9. Proposed Risk-Informed Seismic Hazard Periodic Reevaluation Methodology for Complying with DOE Order 420.1C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kammerer, Annie

    Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities—which is a function not only of the level of the seismic hazard but also the capacity of the facility to withstand vibratory ground motions—the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order. 
While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the “seismic hazard periodic review methodology,” or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk-objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be directly applied to other types of natural hazards.

  11. A Statistical Framework for Analyzing Cyber Threats

    DTIC Science & Technology

    defender cares most about the attacks against certain ports or services). The grey-box statistical framework formulates a new methodology of Cybersecurity ...the design of prediction models. Our research showed that the grey-box framework is effective in predicting cybersecurity situational awareness.

  12. Flavoured tobacco products and the public's health: lessons from the TPSAC menthol report.

    PubMed

    Samet, Jonathan M; Pentz, Mary Ann; Unger, Jennifer B

    2016-11-01

    The menthol report developed by the Tobacco Products Scientific Advisory Committee (TPSAC) of the Center for Tobacco Products elaborated a methodology for considering the public health impact of menthol in cigarettes that has relevance to flavourings generally. The TPSAC report was based on a conceptual framework for how menthol in cigarettes has public health impact, results of evidence from related systematic reviews, and an evidence-based statistical model. In extending this approach to flavourings generally, consideration will need to be given to the existence of multiple flavourings, a very dynamic marketplace, and regulatory interventions and industry activities. Now is the time to begin developing the research strategies and models needed to extend the TPSAC approach to flavoured tobacco products generally. Published by the BMJ Publishing Group Limited.

  13. [Review of the gender research in cross-cultural psychology since 1990: conceptual definitions and methodology].

    PubMed

    Suzuki, Atsuko

    2004-06-01

    A review of the cross-cultural research on gender in psychology since 1990 reveals (1) conceptual confusion of the definitions of sex, gender, man, and woman; (2) diversification, refinement, reification, and a problem-solving orientation in the research topics; and (3) the possibility of the elucidation of the psychological sex-difference mechanism in relation to the biological sex differences. A comparison of 1990 and 2000 cross-cultural psychological articles published in "Sex Roles" found that overall, the research is Western-centered and some methodological problems remain to be solved concerning the measures and the sampling. These findings lead to the following suggestions for cross-cultural research on gender to resolve the problems and contribute to the development of psychology in general: (1) use of an operational definition for conceptual equivalence; (2) conducting more etic-approach research; (3) avoiding ethnocentric or androcentric research attitudes; (4) use of a theoretical framework; (5) strict examination of methodologies; and (6) examination of the specific context of participants in terms of cultural diversity, dynamics of husband-wife relationships, and relationships with husbands and fathers.

  14. Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa

    NASA Astrophysics Data System (ADS)

    Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu

    2013-04-01

    Hurricane Mitch in 1998 caused a devastating flood in Tegucigalpa, the capital city of Honduras. Because of the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high-water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated using the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties in both the model parameters and the evaluation data.
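
    The GLUE workflow the abstract refers to can be sketched in a few steps: sample parameter sets, score each with an informal likelihood against the uncertain observations, keep "behavioural" sets above a threshold, and form likelihood-weighted prediction bounds. The sketch below uses a toy linear stage-discharge "model" and synthetic observations as stand-ins for LISFLOOD-FP and the high-water marks; all numbers are invented.

```python
import random
import math

random.seed(42)

# Toy "model": water level h = a * q + b for discharge q (stand-in for LISFLOOD-FP)
def model(a, b, q):
    return a * q + b

# Synthetic post-event observations (e.g. high-water marks) with unknown error
obs_q = [1.0, 2.0, 3.0, 4.0]
obs_h = [2.1, 4.0, 5.9, 8.2]   # roughly a=2, b=0

# 1. Monte Carlo sampling of parameter sets from uniform priors
samples = [(random.uniform(0.5, 3.5), random.uniform(-1.0, 1.0)) for _ in range(5000)]

# 2. Informal likelihood for each parameter set (one common GLUE choice)
def likelihood(a, b):
    sse = sum((model(a, b, q) - h) ** 2 for q, h in zip(obs_q, obs_h))
    return math.exp(-sse)

weighted = [(likelihood(a, b), a, b) for a, b in samples]

# 3. Keep only "behavioural" parameter sets above a likelihood threshold
behavioural = [(w, a, b) for w, a, b in weighted if w > 0.05]

# 4. Likelihood-weighted prediction quantiles at a new discharge
q_new = 5.0
preds = sorted((model(a, b, q_new), w) for w, a, b in behavioural)
total = sum(w for _, w in preds)

def weighted_quantile(p):
    acc = 0.0
    for value, w in preds:
        acc += w
        if acc >= p * total:
            return value
    return preds[-1][0]

lower, upper = weighted_quantile(0.05), weighted_quantile(0.95)
```

The informal likelihood and the behavioural threshold are subjective choices, which is precisely the point of GLUE: the prediction bounds are conditional on those choices rather than on a formal error model.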

  15. Failure detection system design methodology. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.

    1980-01-01

    The design of a failure detection and identification (FDI) system consists of designing a robust residual generation process and a high-performance decision-making process. The designs of these two processes are examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high-performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
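
    A toy illustration of analytical redundancy (not Chow's generalized parity space itself): with three redundant sensors measuring the same quantity, the pairwise differences are parity relations that should be near zero, and a fault in one sensor fires exactly the two residuals involving it. The sensors, readings, and threshold below are hypothetical.

```python
# Three redundant sensors measure the same quantity; analytical redundancy
# says pairwise differences (parity relations) should be near zero.
def residuals(y1, y2, y3):
    return (y1 - y2, y2 - y3, y1 - y3)

def diagnose(y1, y2, y3, threshold=0.5):
    r12, r23, r13 = residuals(y1, y2, y3)
    # A fault in one sensor fires exactly the two residuals involving it.
    fired = {"r12": abs(r12) > threshold,
             "r23": abs(r23) > threshold,
             "r13": abs(r13) > threshold}
    if fired["r12"] and fired["r13"] and not fired["r23"]:
        return "sensor 1 faulty"
    if fired["r12"] and fired["r23"] and not fired["r13"]:
        return "sensor 2 faulty"
    if fired["r23"] and fired["r13"] and not fired["r12"]:
        return "sensor 3 faulty"
    if not any(fired.values()):
        return "no fault"
    return "ambiguous"

healthy = diagnose(10.0, 10.1, 9.9)
fault1  = diagnose(12.5, 10.1, 9.9)
```

The thresholded decision rule here is the crudest possible stand-in for the Bayes sequential decision process the thesis develops; it only shows where such a rule plugs into the residual pipeline.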

  16. CNN based approach for activity recognition using a wrist-worn accelerometer.

    PubMed

    Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R

    2017-07-01

    In recent years, significant advancements have taken place in human activity recognition using various machine learning approaches. However, conventional methods have been dominated by feature engineering, which involves the difficult process of optimal feature selection. This problem has been mitigated by a novel methodology based on a deep learning framework, which automatically extracts useful features and reduces the computational cost. As a proof of concept, we have attempted to design a generalized model for recognition of three fundamental movements of the human forearm performed in daily life, with data collected from four different subjects using a single wrist-worn accelerometer sensor. The proposed model is validated under different pre-processing and noisy data conditions, evaluated using three possible methods. The results show that our proposed methodology achieves an average recognition rate of 99.8%, as opposed to conventional methods based on K-means clustering, linear discriminant analysis and support vector machines.

  17. Ultrasensitivity in signaling cascades revisited: Linking local and global ultrasensitivity estimations.

    PubMed

    Altszyler, Edgar; Ventura, Alejandra C; Colman-Lerner, Alejandro; Chernomoretz, Ariel

    2017-01-01

    Ultrasensitive response motifs, capable of converting graded stimuli into binary responses, are well-conserved in signal transduction networks. Although it has been shown that a cascade arrangement of multiple ultrasensitive modules can enhance the system's ultrasensitivity, how a given combination of layers affects a cascade's ultrasensitivity remains an open question for the general case. Here, we introduce a methodology that allows us to determine the presence of sequestration effects and to quantify the relative contribution of each module to the overall cascade's ultrasensitivity. The proposed analysis framework provides a natural link between global and local ultrasensitivity descriptors and it is particularly well-suited to characterize and understand mathematical models used to study real biological systems. As a case study, we have considered three mathematical models introduced by O'Shaughnessy et al. to study a tunable synthetic MAPK cascade, and we show how our methodology can help modelers better understand alternative models.
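
    The global ultrasensitivity measure mentioned here is commonly quantified by an effective Hill coefficient derived from the EC10/EC90 ratio, n_eff = log 81 / log(EC90/EC10). The sketch below composes two Hill-type modules into a toy two-layer cascade (not the authors' models) and shows numerically that the cascade's global ultrasensitivity exceeds that of a single layer.

```python
import math

# Hill-type input/output response of one signaling module
def hill(u, n, K=1.0):
    return u**n / (K**n + u**n)

# Two-layer cascade: output of layer 1 drives layer 2
def cascade(u, n1, n2):
    return hill(hill(u, n1), n2)

# Input giving a requested fraction of maximal output, by bisection in log space
def ec(f, level, lo=1e-6, hi=1e6):
    fmax = f(hi)
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if f(mid) < level * fmax:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Global effective Hill coefficient: n_eff = log(81) / log(EC90 / EC10)
def n_eff(f):
    return math.log(81) / math.log(ec(f, 0.9) / ec(f, 0.1))

single = n_eff(lambda u: hill(u, 2))          # exactly 2 for a Hill function
stacked = n_eff(lambda u: cascade(u, 2, 2))   # larger: cascading sharpens the response
```

For a single Hill function the estimator recovers the Hill coefficient exactly, which is a useful sanity check; the stacked value is what the paper's local-vs-global decomposition seeks to attribute to the individual layers.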

  18. Structural equation modeling: building and evaluating causal models: Chapter 8

    USGS Publications Warehouse

    Grace, James B.; Scheiner, Samuel M.; Schoolmaster, Donald R.

    2015-01-01

    Scientists frequently wish to study hypotheses about causal relationships, rather than just statistical associations. This chapter addresses the question of how scientists might approach this ambitious task. Here we describe structural equation modeling (SEM), a general modeling framework for the study of causal hypotheses. Our goals are to (a) concisely describe the methodology, (b) illustrate its utility for investigating ecological systems, and (c) provide guidance for its application. Throughout our presentation, we rely on a study of the effects of human activities on wetland ecosystems to make our description of methodology more tangible. We begin by presenting the fundamental principles of SEM, including both its distinguishing characteristics and the requirements for modeling hypotheses about causal networks. We then illustrate SEM procedures and offer guidelines for conducting SEM analyses. Our focus in this presentation is on basic modeling objectives and core techniques. Pointers to additional modeling options are also given.

  19. USE Efficiency: an innovative educational programme for energy efficiency in buildings

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Theofilos A.; Christoforidis, Georgios C.; Papagiannis, Grigoris K.

    2017-10-01

    Power engineers are expected to play a pivotal role in transforming buildings into smart and energy-efficient structures, which is necessary since buildings are responsible for a considerable amount of the total energy consumption. To fulfil this role, a holistic approach in education is required, tackling subjects traditionally related to other engineering disciplines. In this context, USE Efficiency is an inter-institutional and interdisciplinary educational programme implemented in nine European Universities targeting energy efficiency in buildings. The educational programme effectively links professors, students, engineers and industry experts, creating a unique learning environment. The scope of the paper is to present the methodology and the general framework followed in the USE Efficiency programme. The proposed methodology can be adopted for the design and implementation of educational programmes on energy efficiency and sustainable development in higher education. End-of-course survey results showed positive feedback from the participating students, indicating the success of the programme.

  20. Ultrasensitivity in signaling cascades revisited: Linking local and global ultrasensitivity estimations

    PubMed Central

    Altszyler, Edgar; Ventura, Alejandra C.; Colman-Lerner, Alejandro; Chernomoretz, Ariel

    2017-01-01

    Ultrasensitive response motifs, capable of converting graded stimuli into binary responses, are well-conserved in signal transduction networks. Although it has been shown that a cascade arrangement of multiple ultrasensitive modules can enhance the system’s ultrasensitivity, how a given combination of layers affects a cascade’s ultrasensitivity remains an open question for the general case. Here, we introduce a methodology that allows us to determine the presence of sequestration effects and to quantify the relative contribution of each module to the overall cascade’s ultrasensitivity. The proposed analysis framework provides a natural link between global and local ultrasensitivity descriptors and it is particularly well-suited to characterize and understand mathematical models used to study real biological systems. As a case study, we have considered three mathematical models introduced by O’Shaughnessy et al. to study a tunable synthetic MAPK cascade, and we show how our methodology can help modelers better understand alternative models. PMID:28662096

  1. Toward an Ecological Risk Assessment Framework for Special Education

    ERIC Educational Resources Information Center

    Trepanier, Nathalie Sonia

    2005-01-01

    This paper suggests a new framework for conducting research in the field of special education. This framework is inspired by the ecological risk assessment frameworks of the U.S. Environmental Protection Agency (1995) and G.W. Suter (1993), which are primarily used in ecotoxicology and environmental toxicology. The methodology used to develop the…

  2. Examining the effectiveness of municipal solid waste management systems: An integrated cost-benefit analysis perspective with a financial cost modeling in Taiwan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weng, Yu-Chi, E-mail: clyde.weng@gmail.com; Fujiwara, Takeshi

    2011-06-15

    In order to develop a sound material-cycle society, cost-effective municipal solid waste (MSW) management systems are required for the municipalities in the context of the integrated accounting system for MSW management. Firstly, this paper attempts to establish an integrated cost-benefit analysis (CBA) framework for evaluating the effectiveness of MSW management systems. In this paper, detailed cost/benefit items due to waste problems are particularly clarified. The stakeholders of MSW management systems, including the decision-makers of the municipalities and the citizens, are expected to reconsider the waste problems in depth and thus take wise actions with the aid of the proposed CBA framework. Secondly, focusing on the financial cost, this study develops a generalized methodology to evaluate the financial cost-effectiveness of MSW management systems, simultaneously considering the treatment technological levels and policy effects. The impacts of the influencing factors on the annual total and average financial MSW operation and maintenance (O and M) costs are analyzed in the Taiwanese case study with a demonstrative short-term future projection of the financial costs under scenario analysis. The established methodology would contribute to the evaluation of the current policy measures and to the modification of the policy design for the municipalities.

  3. Unsupervised Detection of Planetary Craters by a Marked Point Process

    NASA Technical Reports Server (NTRS)

    Troglio, G.; Benediktsson, J. A.; Le Moigne, J.; Moser, G.; Serpico, S. B.

    2011-01-01

    With the launch of several planetary missions in the last decade, a large number of planetary images is being acquired. Because of the huge amount of acquired data, automatic and robust processing techniques are preferable for data analysis. Here, the aim is to achieve a robust and general methodology for crater detection. A novel technique based on a marked point process is proposed. First, the contours in the image are extracted. The object boundaries are modeled as a configuration of an unknown number of random ellipses, i.e., the contour image is considered a realization of a marked point process. Then, an energy function is defined, containing both an a priori energy and a likelihood term. The global minimum of this function is estimated by using reversible-jump Markov chain Monte Carlo dynamics and a simulated annealing scheme. The main idea behind marked point processes is to model objects within a stochastic framework: marked point processes represent a very promising current approach in stochastic image modeling and provide a powerful and methodologically rigorous framework to efficiently map and detect objects and structures in an image with excellent robustness to noise. The proposed method for crater detection has several feasible applications, one of which is image registration by matching the extracted features.
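
    The simulated annealing step can be sketched generically. The energy below is an arbitrary multimodal function standing in for the marked-point-process energy (prior plus likelihood term), not the paper's actual model; the Metropolis acceptance rule and geometric cooling schedule are the transferable parts.

```python
import math
import random

random.seed(7)

# Multimodal "energy" standing in for the prior + likelihood energy of a
# marked point process configuration; deep minima sit near x = 2.
def energy(x):
    return (x - 2.0) ** 2 + math.sin(8.0 * x)

def simulated_annealing(x0, t0=2.0, cooling=0.999, steps=20000):
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(steps):
        cand = x + random.gauss(0.0, 0.3)        # random perturbation ("jump")
        de = energy(cand) - e
        # Metropolis acceptance: always accept downhill, sometimes uphill
        if de < 0 or random.random() < math.exp(-de / t):
            x, e = cand, e + de
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling                              # geometric cooling schedule
    return best_x, best_e

x_min, e_min = simulated_annealing(x0=-5.0)
```

In the paper the state is a whole configuration of ellipses and the proposal kernel includes birth/death moves (the "reversible jump" part); the scalar state here keeps the annealing logic visible without that machinery.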

  4. A review of methodological factors in performance assessments of time-varying aircraft noise effects. [with annotated bibliography

    NASA Technical Reports Server (NTRS)

    Coates, G. D.; Alluisi, E. A.; Adkins, C. J., Jr.

    1977-01-01

    Literature on the effects of general noise on human performance is reviewed in an attempt to identify (1) those characteristics of noise that have been found to affect human performance; (2) those characteristics of performance most likely to be affected by the presence of noise; and (3) those characteristics of the performance situation typically associated with noise effects. Based on the characteristics identified, a theoretical framework is proposed that will permit predictions of possible effects of time-varying aircraft-type noise on complex human performance. An annotated bibliography of 50 articles is included.

  5. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, Qifeng, E-mail: liaoqf@shanghaitech.edu.cn; Lin, Guang, E-mail: guanglin@purdue.edu

    2016-07-15

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.
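
    The decomposition idea behind the ANOVA method can be illustrated without PDEs or reduced bases: a first-order anchored (cut-HDMR) ANOVA approximates a high-dimensional function by a constant plus one-dimensional terms evaluated along cuts through an anchor point. The two-variable function below is hypothetical, chosen with a deliberately weak interaction so the first-order surrogate is accurate.

```python
import math

# Anchored (cut-HDMR) ANOVA decomposition around an anchor point c:
#   f(x) ~ f(c) + sum_i [ f(c with x_i varied) - f(c) ]
# Low-order terms capture most of the variability when interactions are weak.

def f(x1, x2):
    return math.exp(0.1 * x1) + math.sin(x2) + 0.01 * x1 * x2   # weak interaction

c = (0.0, 0.0)                       # anchor point

def f0():
    return f(*c)

def f1(x1):                          # first-order term in x1
    return f(x1, c[1]) - f0()

def f2(x2):                          # first-order term in x2
    return f(c[0], x2) - f0()

def anova1(x1, x2):                  # first-order ANOVA surrogate
    return f0() + f1(x1) + f2(x2)

# The surrogate error is exactly the neglected interaction term, 0.01*x1*x2
err = max(abs(f(a, b) - anova1(a, b)) for a in (-1, 0, 1) for b in (-1, 0, 1))
```

In the paper this decomposition is applied to the PDE's random inputs, with each low-dimensional sub-problem further compressed by stochastic collocation and hierarchically generated reduced bases.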

  6. RRegrs: an R package for computer-aided model selection with multiple regression models.

    PubMed

    Tsiliki, Georgia; Munteanu, Cristian R; Seoane, Jose A; Fernandez-Lozano, Carlos; Sarimveis, Haralambos; Willighagen, Egon L

    2015-01-01

    Predictive regression models can be created with many different modelling approaches. Choices need to be made for data set splitting, cross-validation methods, specific regression parameters and best-model criteria, as they all affect the accuracy and efficiency of the produced predictive models, thereby raising issues of model reproducibility and comparison. Cheminformatics and bioinformatics use predictive modelling extensively and exhibit a need for standardization of these methodologies in order to assist model selection and speed up the process of predictive model development. A tool accessible to all users, irrespective of their statistical knowledge, would be valuable if it tested several simple and complex regression models and validation schemes, produced unified reports, and offered the option to be integrated into more extensive studies. Additionally, such a methodology should be implemented as a free programming package, in order to be continuously adapted and redistributed by others. We propose an integrated framework for creating multiple regression models, called RRegrs. The tool offers the option of ten simple and complex regression methods combined with repeated 10-fold and leave-one-out cross-validation. Methods include multiple linear regression, generalized linear model with stepwise feature selection, partial least squares regression, lasso regression, and support vector machines with recursive feature elimination. The new framework is an automated, fully validated procedure which produces standardized reports to quickly oversee the impact of choices in modelling algorithms and assess the model and cross-validation results. The methodology was implemented as an open-source R package, available at https://www.github.com/enanomapper/RRegrs, by reusing and extending the caret package. The universality of the new methodology is demonstrated using five standard data sets from different scientific fields.
    Its efficiency in cheminformatics and QSAR modelling is shown with three use cases: proteomics data for surface-modified gold nanoparticles, nano-metal oxide descriptor data, and molecular descriptors for acute aquatic toxicity data. The results show that for all data sets RRegrs reports models with equal or better performance for both training and test sets than those reported in the original publications. Its good performance, as well as its adaptability in terms of parameter optimization, could make RRegrs a popular framework to assist the initial exploration of predictive models and, with that, the design of more comprehensive in silico screening applications. Graphical abstract: RRegrs is a computer-aided model-selection framework for R multiple regression models; it is a fully validated procedure with application to QSAR modelling.
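
    RRegrs itself is an R package built on caret; as a language-neutral illustration of the repeated k-fold cross-validation it automates, the sketch below compares two toy regressors (a closed-form least-squares line versus a mean predictor) on synthetic data and picks the one with the lower cross-validated RMSE.

```python
import random

random.seed(0)

# Synthetic data with a linear trend plus noise
xs = [i / 10.0 for i in range(50)]
ys = [2.0 * x + 1.0 + random.gauss(0, 0.3) for x in xs]

def fit_linear(xtr, ytr):
    # ordinary least squares for y = a*x + b (closed form)
    n = len(xtr)
    mx, my = sum(xtr) / n, sum(ytr) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xtr, ytr)) / \
        sum((x - mx) ** 2 for x in xtr)
    b = my - a * mx
    return lambda x: a * x + b

def fit_mean(xtr, ytr):
    m = sum(ytr) / len(ytr)
    return lambda x: m

def repeated_kfold_rmse(fit, k=10, repeats=3):
    errs = []
    idx = list(range(len(xs)))
    for _ in range(repeats):
        random.shuffle(idx)
        folds = [idx[i::k] for i in range(k)]     # k disjoint test folds
        for fold in folds:
            train = [i for i in idx if i not in fold]
            model = fit([xs[i] for i in train], [ys[i] for i in train])
            errs.extend((model(xs[i]) - ys[i]) ** 2 for i in fold)
    return (sum(errs) / len(errs)) ** 0.5

rmse_linear = repeated_kfold_rmse(fit_linear)
rmse_mean = repeated_kfold_rmse(fit_mean)
best = "linear" if rmse_linear < rmse_mean else "mean"
```

The repetition over reshuffled folds is the point: a single split can favour one model by chance, and the standardized reports RRegrs produces summarize exactly this kind of repeated comparison across its ten regression methods.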

  7. Diverse Ways to Fore-Ground Methodological Insights about Qualitative Research

    ERIC Educational Resources Information Center

    Koro-Ljungberg, Mirka; Mazzei, Lisa A.; Ceglowski, Deborah

    2013-01-01

    Texts and articles that put epistemological theories and methodologies to work in the context of qualitative research can stimulate scholarship in various ways such as through methodological innovations, transferability of theories and methods, interdisciplinarity, and transformative reflections across traditions and frameworks. Such…

  8. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  9. Cost-Effectiveness of HBV and HCV Screening Strategies – A Systematic Review of Existing Modelling Techniques

    PubMed Central

    Geue, Claudia; Wu, Olivia; Xin, Yiqiao; Heggie, Robert; Hutchinson, Sharon; Martin, Natasha K.; Fenwick, Elisabeth; Goldberg, David

    2015-01-01

    Introduction: Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening interventions, outcomes and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches. Methods: A structured search strategy was developed and a systematic review carried out. A critical assessment of the decision-analytic models was carried out according to the guidelines and framework developed for assessment of decision-analytic models in Health Technology Assessment of health care interventions. Results: The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix evaluating model components and methodology. Conclusion: When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimens and test methods and had better and more complete data on which to base their models. In addition to parameter selection and associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases (eg, HCV and HIV at the same time) might prove important for decision makers. PMID:26689908
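
    Whatever the model structure, the cost-effectiveness arithmetic these models share is the incremental cost-effectiveness ratio (ICER) compared against a willingness-to-pay threshold. The cohort numbers and threshold below are invented purely for illustration.

```python
# Incremental cost-effectiveness ratio (ICER) comparing a screening strategy
# with no screening: ICER = (C_screen - C_none) / (E_screen - E_none),
# i.e. extra cost per quality-adjusted life year (QALY) gained.
def icer(cost_new, effect_new, cost_old, effect_old):
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical cohort totals (illustrative only, not from any study)
cost_per_qaly = icer(cost_new=1_200_000, effect_new=85.0,
                     cost_old=900_000, effect_old=70.0)

# Decision rule against a willingness-to-pay threshold per QALY
threshold = 30_000
cost_effective = cost_per_qaly <= threshold
```

The methodological differences the review highlights (dynamic vs static transmission, parameter choices) all act upstream of this ratio, by changing the cost and effect totals that feed into it.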

  10. A novel methodology for strengthening human rights based monitoring in public health: Family planning indicators as an illustrative example.

    PubMed

    Gruskin, Sofia; Ferguson, Laura; Kumar, Shubha; Nicholson, Alexandra; Ali, Moazzam; Khosla, Rajat

    2017-01-01

    The last few years have seen a rise in the number of global and national initiatives that seek to incorporate human rights into public health practice. Nonetheless, a lack of clarity persists regarding the most appropriate indicators to monitor rights concerns in these efforts. The objective of this work was to develop a systematic methodology for use in determining the extent to which indicators commonly used in public health capture human rights concerns, using contraceptive services and programmes as a case study. The approach used to identify, evaluate, select and review indicators for their human rights sensitivity built on processes undertaken in previous work led by the World Health Organization (WHO). With advice from an expert advisory group, an analytic framework was developed to identify and evaluate quantitative, qualitative, and policy indicators in relation to contraception for their sensitivity to human rights. To test the framework's validity, indicators were reviewed to determine their feasibility to provide human rights analysis with attention to specific rights principles and standards. This exercise resulted in the identification of indicators that could be used to monitor human rights concerns as well as key gaps where additional indicators are required. While indicators generally used to monitor contraception programmes have some degree of sensitivity to human rights, breadth and depth are lacking. The proposed methodology can be useful to practitioners, researchers, and policy makers working in any area of health who are interested in monitoring and evaluating attention to human rights in commonly used health indicators.

  11. Tomographic imaging of non-local media based on space-fractional diffusion models

    NASA Astrophysics Data System (ADS)

    Buonocore, Salvatore; Semperlotti, Fabio

    2018-06-01

    We investigate a generalized tomographic imaging framework applicable to a class of inhomogeneous media characterized by non-local diffusive energy transport. Under these conditions, the transport mechanism is well described by fractional-order continuum models capable of capturing anomalous diffusion that would otherwise remain undetected when using traditional integer-order models. Although the underlying idea of the proposed framework is applicable to any transport mechanism, the case of fractional heat conduction is presented as a specific example to illustrate the methodology. By using numerical simulations, we show how complex inhomogeneous media involving non-local transport can be successfully imaged if fractional-order models are used. In particular, the results show that properly recognizing and accounting for the fractional character of the host medium not only achieves increased resolution but, in the case of strong and spatially distributed non-locality, represents the only viable approach to a successful reconstruction.

  12. A new approach to the concept of "relevance" in information retrieval (IR).

    PubMed

    Kagolovsky, Y; Möhr, J R

    2001-01-01

    The concept of "relevance" is the fundamental concept of information science in general and of information retrieval in particular. Although "relevance" is extensively used in the evaluation of information retrieval, there are considerable problems in reaching agreement on its definition, meaning, evaluation, and application in information retrieval. There are a number of different views on "relevance" and its use for evaluation. Based on a review of the literature, the main problems associated with the concept of "relevance" in information retrieval are identified. The authors argue that a solution to these problems can be based on a conceptual IR framework built using a systems-analytic approach to IR. Using this framework, different kinds of "relevance" relationships in the IR process are identified, and a methodology for the evaluation of "relevance" based on methods of semantics capturing and comparison is proposed.

  13. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.
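
    A stripped-down version of the Monte Carlo approach described here, with a one-parameter linear forward model standing in for radiative transfer: simulate many noisy observations of the same true state, apply the retrieval to each, and characterize the sampling distribution (bias and spread) of the retrieved estimates. All quantities are synthetic.

```python
import random

random.seed(1)

# Forward model: radiance = A * state + B (stand-in for radiative transfer)
A, B, NOISE = 2.0, 0.5, 0.1
true_state = 3.0

def observe(state):
    # one simulated instrument observation with additive Gaussian noise
    return A * state + B + random.gauss(0.0, NOISE)

def retrieve(radiance):
    # the retrieval inverts the assumed forward model
    return (radiance - B) / A

# Monte Carlo: approximate the sampling distribution of the retrieved estimates
estimates = [retrieve(observe(true_state)) for _ in range(20000)]
mean_est = sum(estimates) / len(estimates)
bias = mean_est - true_state
spread = (sum((e - mean_est) ** 2 for e in estimates) / len(estimates)) ** 0.5
```

In a real setting such as OCO-2 the forward model is nonlinear and the retrieval is an iterative optimizer, but the logic is the same: the Monte Carlo ensemble quantifies uncertainty without relying on the approximations baked into the retrieval itself.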

  14. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
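
    The notional product at the heart of the CAPRA formula can be sketched directly; the asset names and numbers below are invented for illustration, and the simple sum over assets ignores the interdependency effects the article's portfolio consequence model adds.

```python
# All-hazards risk per asset following the notional product
#   risk = consequence x vulnerability x threat,
# then aggregated over a portfolio.
portfolio = [
    # (asset, consequence ($M loss), vulnerability P(damage|attack), threat P(attack/yr))
    ("power substation", 50.0, 0.4, 0.02),
    ("water plant",      80.0, 0.2, 0.01),
    ("data center",     120.0, 0.1, 0.05),
]

def asset_risk(consequence, vulnerability, threat):
    return consequence * vulnerability * threat   # expected $M loss per year

risks = {name: asset_risk(c, v, t) for name, c, v, t in portfolio}
portfolio_risk = sum(risks.values())
highest = max(risks, key=risks.get)               # where mitigation matters most
```

Note how the ranking can be counterintuitive: the least vulnerable asset can still dominate portfolio risk if its consequence and threat likelihood are high, which is why the article insists on clear meanings for each parameter.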

  15. Translating Behavioral Science into Practice: A Framework to Determine Science Quality and Applicability for Police Organizations.

    PubMed

    McClure, Kimberley A; McGuire, Katherine L; Chapan, Denis M

    2018-05-07

    Policy on officer-involved shootings is critically reviewed, and errors in applying scientific knowledge are identified. Identifying and evaluating the science most relevant to a field-based problem is challenging. Law enforcement administrators with a clear understanding of valid science and its application are in a better position to utilize scientific knowledge for the benefit of their organizations and officers. A framework is proposed for considering the validity of science and its application. Valid science emerges via hypothesis testing, replication, and extension, and is marked by peer review, known error rates, and general acceptance in its field of origin. Valid application of behavioral science requires an understanding of the methodology employed, the measures used, and the participants recruited, in order to determine whether the science is ready for application. Fostering a science-practitioner partnership and an organizational culture that embraces quality, empirically based policy, and sound practices improves science-to-practice translation. © 2018 American Academy of Forensic Sciences.

  16. A user exposure based approach for non-structural road network vulnerability analysis

    PubMed Central

    Jin, Lei; Wang, Haizhong; Yu, Le; Liu, Lin

    2017-01-01

    Aiming at dense urban road network vulnerability without structural negative consequences, this paper proposes a novel non-structural road network vulnerability analysis framework. Three aspects of the framework are described: (i) the rationale for non-structural road network vulnerability, (ii) metrics for negative consequences accounting for varying road conditions, and (iii) the introduction of a new vulnerability index based on user exposure. Based on the proposed methodology, a case study of the Sioux Falls network, which is regularly threatened by heavy snow during wintertime, is discussed in detail. The vulnerability ranking of links of the Sioux Falls network with respect to the heavy snow scenario is identified. As a result of non-structural consequences accompanied by conceivable degeneration of the network, there are significant increases in generalized travel time costs, which serve as measurements of the "emotional hurt" to the topological road network. PMID:29176832
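
    A minimal sketch of the user-exposure idea on a hypothetical four-node network (not the Sioux Falls network): degrade one link at a time, with heavy snow modelled as a travel-time multiplier, and rank links by the resulting increase in shortest travel time between an origin and a destination.

```python
import heapq

# Toy road network: the travel-time increase when one link degrades (e.g. snow)
# measures that link's contribution to user exposure.
graph = {
    "A": {"B": 2.0, "C": 5.0},
    "B": {"A": 2.0, "C": 1.0, "D": 4.0},
    "C": {"A": 5.0, "B": 1.0, "D": 2.0},
    "D": {"B": 4.0, "C": 2.0},
}

def shortest(graph, src, dst):
    # Dijkstra's algorithm with a binary heap
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

def degrade(graph, a, b, factor=3.0):
    # heavy snow multiplies travel time on one (undirected) link
    g = {u: dict(nbrs) for u, nbrs in graph.items()}
    g[a][b] *= factor
    g[b][a] *= factor
    return g

base = shortest(graph, "A", "D")                     # undisturbed travel time
exposure = {
    (a, b): shortest(degrade(graph, a, b), "A", "D") - base
    for a, b in [("A", "B"), ("B", "C"), ("C", "D")]
}
most_vulnerable = max(exposure, key=exposure.get)
```

The key point matches the abstract: no link is removed (no structural consequence), yet generalized travel costs rise, and links can be ranked by that non-structural degradation alone.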

  17. An Interoperability Framework and Capability Profiling for Manufacturing Software

    NASA Astrophysics Data System (ADS)

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

    ISO/TC184/SC5/WG4 is working on ISO 16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.

  18. When Playing Meets Learning: Methodological Framework for Designing Educational Games

    NASA Astrophysics Data System (ADS)

    Linek, Stephanie B.; Schwarz, Daniel; Bopp, Matthias; Albert, Dietrich

    Game-based learning builds upon the idea of using the motivational potential of video games in the educational context. Thus, the design of educational games has to address optimizing enjoyment as well as optimizing learning. Within the EC project ELEKTRA, a methodological framework for the conceptual design of educational games was developed. State-of-the-art psycho-pedagogical approaches were combined with insights from media psychology as well as with best-practice game design. This science-based interdisciplinary approach was enriched by accompanying empirical research to answer open questions on educational game design. Additionally, several evaluation cycles were implemented to achieve further improvements. The psycho-pedagogical core of the methodology can be summarized by ELEKTRA's 4Ms: Macroadaptivity, Microadaptivity, Metacognition, and Motivation. The conceptual framework is structured in eight phases which have several interconnections and feedback cycles that enable a close interdisciplinary collaboration between game design, pedagogy, cognitive science and media psychology.

  19. Composite generalized Langevin equation for Brownian motion in different hydrodynamic and adhesion regimes.

    PubMed

    Yu, Hsiu-Yu; Eckmann, David M; Ayyaswamy, Portonovo S; Radhakrishnan, Ravi

    2015-05-01

    We present a composite generalized Langevin equation as a unified framework for bridging the hydrodynamic, Brownian, and adhesive spring forces associated with a nanoparticle at different distances from a wall, namely, a bulklike regime, a near-wall regime, and a lubrication regime. The particle velocity autocorrelation function dictates the dynamical interplay between the aforementioned forces, and our proposed methodology successfully captures the well-known hydrodynamic long-time tail with context-dependent scaling exponents, as well as oscillatory behavior due to the binding interaction. Employing the reactive flux formalism, we analyze the effect of hydrodynamic variables on the particle trajectory and characterize the transient kinetics of a particle crossing a predefined milestone. The results suggest that both wall hydrodynamic interactions and adhesion strength impact the particle kinetics.
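
    A generic schematic of such a generalized Langevin equation can help fix ideas (the form below is a standard textbook sketch with a memory kernel, a colored random force obeying fluctuation-dissipation, and a harmonic adhesion spring; it is not the authors' exact composite formulation):

```latex
m \frac{dv}{dt} = -\int_0^{t} \zeta(t - t')\, v(t')\, dt' + F_R(t) - k\, x(t),
\qquad
\langle F_R(t)\, F_R(t') \rangle = k_B T \, \zeta(t - t')
```

    The memory kernel \zeta would carry the position-dependent hydrodynamics (bulklike, near-wall, or lubrication), while the spring constant k models the adhesive binding that produces the oscillatory behavior in the velocity autocorrelation function.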

  20. Confidence intervals for differences between volumes under receiver operating characteristic surfaces (VUS) and generalized Youden indices (GYIs).

    PubMed

    Yin, Jingjing; Nakas, Christos T; Tian, Lili; Reiser, Benjamin

    2018-03-01

    This article explores both existing and new methods for the construction of confidence intervals for differences of indices of diagnostic accuracy of competing pairs of biomarkers in three-class classification problems and fills the methodological gaps for both parametric and non-parametric approaches in the receiver operating characteristic surface framework. The most widely used such indices are the volume under the receiver operating characteristic surface and the generalized Youden index. We describe implementation of all methods and offer insight regarding the appropriateness of their use through a large simulation study with different distributional and sample size scenarios. Methods are illustrated using data from the Alzheimer's Disease Neuroimaging Initiative study, where assessment of cognitive function naturally results in a three-class classification setting.
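
    For intuition, the volume under the ROC surface admits a simple empirical estimator: the proportion of cross-class triples that are correctly ordered. A minimal sketch (illustrative only; the function and names are our own, and this is not the authors' implementation, which also covers parametric estimation and confidence-interval construction):

```python
import numpy as np

def empirical_vus(x, y, z):
    """Empirical VUS for a three-class problem with ordered classes:
    the probability that a random triple satisfies x < y < z,
    counting ties with weight 1/2 (a common convention)."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    total = 0.0
    for yi in y:
        # P(X < yi) and P(Z > yi), estimated from the samples
        p_low = np.mean((x < yi) + 0.5 * (x == yi))
        p_high = np.mean((z > yi) + 0.5 * (z == yi))
        total += p_low * p_high
    return total / len(y)
```

    Perfectly separated classes give a VUS of 1; percentile-bootstrap intervals for differences between two markers follow by recomputing the difference of such estimates over resampled subjects.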

  1. ProFUSO: Business process and ontology-based framework to develop ubiquitous computing support systems for chronic patients' management.

    PubMed

    Jimenez-Molina, Angel; Gaete-Villegas, Jorge; Fuentes, Javier

    2018-06-01

    New advances in telemedicine, ubiquitous computing, and artificial intelligence have supported the emergence of more advanced applications and support systems for chronic patients. This trend addresses the important problem of chronic illnesses, highlighted by multiple international organizations as a core issue in future healthcare. Despite the myriad of exciting new developments, each application and system is designed and implemented for specific purposes and lacks the flexibility to support different healthcare concerns. Some of the known problems of such developments are the integration issues between applications and existing healthcare systems, the reusability of technical knowledge in the creation of new and more sophisticated systems, and the use of data gathered from multiple sources in the generation of new knowledge. This paper proposes a framework for the development of chronic disease support systems and applications as an answer to these shortcomings. Through this framework, we aim to create a common methodological ground upon which new developments can be built and easily integrated to provide better support to chronic patients, medical staff, and other relevant participants. General requirements for any support system are inferred from the primary care process of chronic patients using the Business Process Model and Notation (BPMN). Several technical approaches are proposed to design a general architecture that considers the medical organizational requirements in the treatment of a patient. A framework is presented for any application in support of chronic patients and is evaluated through a case study to test the applicability and pertinence of the solution. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. A Framework for Developing the Structure of Public Health Economic Models.

    PubMed

    Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P

    2016-01-01

    A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems, so modeling them requires broader considerations than modeling clinical interventions. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of public health economic models, supporting efficient allocation of scarce resources.
Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  3. Suicide Risk Assessment Training for Psychology Doctoral Programs: Core Competencies and a Framework for Training

    PubMed Central

    Cramer, Robert J.; Johnson, Shara M.; McLaughlin, Jennifer; Rausch, Emilie M.; Conroy, Mary Alice

    2014-01-01

    Clinical and counseling psychology programs currently lack adequate evidence-based competency goals and training in suicide risk assessment. To begin to address this problem, this article proposes core competencies and an integrated training framework that can form the basis for training and research in this area. First, we evaluate the extent to which current training is effective in preparing trainees for suicide risk assessment. Within this discussion, sample and methodological issues are reviewed. Second, as an extension of these methodological training issues, we integrate empirically- and expert-derived suicide risk assessment competencies from several sources with the goal of streamlining core competencies for training purposes. Finally, a framework for suicide risk assessment training is outlined. The approach employs Objective Structured Clinical Examination (OSCE) methodology, an approach commonly utilized in medical competency training. The training modality also proposes the Suicide Competency Assessment Form (SCAF), a training tool evaluating self- and observer-ratings of trainee core competencies. The training framework and SCAF are ripe for empirical evaluation and potential training implementation. PMID:24672588

  4. Suicide Risk Assessment Training for Psychology Doctoral Programs: Core Competencies and a Framework for Training.

    PubMed

    Cramer, Robert J; Johnson, Shara M; McLaughlin, Jennifer; Rausch, Emilie M; Conroy, Mary Alice

    2013-02-01

    Clinical and counseling psychology programs currently lack adequate evidence-based competency goals and training in suicide risk assessment. To begin to address this problem, this article proposes core competencies and an integrated training framework that can form the basis for training and research in this area. First, we evaluate the extent to which current training is effective in preparing trainees for suicide risk assessment. Within this discussion, sample and methodological issues are reviewed. Second, as an extension of these methodological training issues, we integrate empirically- and expert-derived suicide risk assessment competencies from several sources with the goal of streamlining core competencies for training purposes. Finally, a framework for suicide risk assessment training is outlined. The approach employs Objective Structured Clinical Examination (OSCE) methodology, an approach commonly utilized in medical competency training. The training modality also proposes the Suicide Competency Assessment Form (SCAF), a training tool evaluating self- and observer-ratings of trainee core competencies. The training framework and SCAF are ripe for empirical evaluation and potential training implementation.

  5. Development of a theoretical framework for analyzing cerebrospinal fluid dynamics

    PubMed Central

    Cohen, Benjamin; Voorhees, Abram; Vedel, Søren; Wei, Timothy

    2009-01-01

    Background: To date, hydrocephalus researchers acknowledge the need for rigorous but utilitarian fluid mechanics understanding and methodologies in studying normal and hydrocephalic intracranial dynamics. Pressure-volume models and electric circuit analogs introduced pressure into volume conservation, but control volume analysis enforces independent conditions on pressure and volume. Previously, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume, and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Methods: Control volume analysis is presented to introduce the reader to the theoretical background of this foundational fluid mechanics technique for application to general control volumes. This approach is able to directly incorporate the diverse measurements obtained by clinicians to better elucidate intracranial dynamics and progression to disorder. Results: Several examples of meaningful intracranial control volumes and the particular measurement sets needed for the analysis are discussed. Conclusion: Control volume analysis provides a framework to guide the type and location of measurements and also a way to interpret the resulting data within a fundamental fluid physics analysis. PMID:19772652
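
    The control-volume technique rests on conservation statements of the following standard form (mass conservation shown; generic notation, not specific to this paper):

```latex
\frac{d}{dt} \int_{CV} \rho \, dV \;+\; \oint_{CS} \rho \, (\mathbf{v} \cdot \mathbf{n}) \, dA = 0
```

    Here CV is the chosen control volume (e.g., a cranial or ventricular compartment) and CS its bounding surface; clinically measured flows enter through the surface integral while volume change enters through the time-derivative term, which is what allows pressure, flow, and volume measurements to be combined consistently.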

  6. Evaluation of environmental aspects significance in ISO 14001.

    PubMed

    Põder, Tõnis

    2006-05-01

    The methodological framework set by the standards ISO 14001 and ISO 14004 gives only general principles for environmental aspects assessment, which is regarded as one of the most critical stages of implementing an environmental management system. In Estonia, about 100 organizations have been certified to ISO 14001. Experience from numerous companies has demonstrated that limited transparency and reproducibility of the assessment process is a common shortcoming. Despite the rather complicated assessment schemes sometimes used, the evaluation procedures have been largely based on subjective judgments because of ill-defined and inadequate assessment criteria. A comparison with similar studies in other countries indicates the general nature of the observed inconsistencies. The diversity of approaches to aspects assessment in the concept literature, and the related problems, are discussed. The general structure of basic assessment criteria, compatible with environmental impact assessment and environmental risk analysis, is also outlined. Based on this general structure, the article presents a tiered approach to help organize the assessment in a more consistent manner.

  7. On process optimization considering LCA methodology.

    PubMed

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to survey the state of the art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled with optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept for defining the system boundaries is the most used approach in practice, rather than the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is expressed linearly through the characterization factors; synergistic effects of the contaminants are then neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most used approach for dealing with this kind of problem, with the ε-constraint method the most applied technique for generating the Pareto set. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and by progress made in assessing the costs of environmental externalities. Finally, a trend towards dealing with multi-period scenarios in integrated LCA-optimization frameworks can be distinguished, providing more accurate results as data become available. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. Teaching Sustainability to Business Students: Shifting Mindsets

    ERIC Educational Resources Information Center

    Stubbs, Wendy; Cocklin, Chris

    2008-01-01

    Purpose: This paper seeks to describe a framework used to help MBA students understand and reconcile the different sustainability perspectives. Design/methodology/approach: A review of the corporate sustainability literature is undertaken to develop the sustainability framework. Findings: The sustainability framework relates basic concepts and…

  9. A step-by-step methodology for enterprise interoperability projects

    NASA Astrophysics Data System (ADS)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process that involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  10. Value Frameworks in Oncology: Comparative Analysis and Implications to the Pharmaceutical Industry.

    PubMed

    Slomiany, Mark; Madhavan, Priya; Kuehn, Michael; Richardson, Sasha

    2017-07-01

    As the cost of oncology care continues to rise, composite value models that variably capture the diverse concerns of patients, physicians, payers, policymakers, and the pharmaceutical industry have begun to take shape. To review the capabilities and limitations of 5 of the most notable value frameworks in oncology that have emerged in recent years and to compare their relative value and application among the intended stakeholders. We compared the methodology of the American Society of Clinical Oncology (ASCO) Value Framework (version 2.0), the National Comprehensive Cancer Network Evidence Blocks, Memorial Sloan Kettering Cancer Center DrugAbacus, the Institute for Clinical and Economic Review Value Assessment Framework, and the European Society for Medical Oncology Magnitude of Clinical Benefit Scale, using a side-by-side comparative approach in terms of the input, scoring methodology, and output of each framework. In addition, we gleaned stakeholder insights about these frameworks and their potential real-world applications through dialogues with physicians and payers, as well as through secondary research and an aggregate analysis of previously published survey results. The analysis identified several framework-specific themes in their respective focus on clinical trial elements, breadth of evidence, evidence weighting, scoring methodology, and value to stakeholders. Our dialogues with physicians and our aggregate analysis of previous surveys revealed a varying level of awareness of, and use of, each of the value frameworks in clinical practice. For example, although the ASCO Value Framework appears nascent in clinical practice, physicians believe that the frameworks will be more useful in practice in the future as they become more established and as their outputs are more widely accepted. 
Along with patients and payers, who bear the burden of treatment costs, physicians and policymakers have waded into the discussion of defining value in oncology care, as have pharmaceutical companies seeking to understand the impact of these value frameworks on each stakeholder as they model the value and financial thresholds of innovative, high-cost drugs.

  11. Benchmarking initiatives in the water industry.

    PubMed

    Parena, R; Smeets, E

    2001-01-01

    Customer satisfaction and service quality push professionals in the water industry every day to improve their performance, lowering costs while raising the level of service provided. Process benchmarking is generally recognised as a systematic mechanism of comparing one's own utility with other utilities or businesses with the intent of self-improvement by adopting structures or methods used elsewhere. The IWA Task Force on Benchmarking, operating inside the Statistics and Economics Committee, has been committed to developing a generally accepted concept of process benchmarking to support water decision-makers in addressing issues of efficiency. As a first step, the Task Force disseminated among the Committee members a questionnaire seeking suggestions about the type, degree of evolution, and main concepts of benchmarking adopted in the countries represented. A comparison of the guidelines adopted in the Netherlands and Scandinavia has recently challenged the Task Force to draft a methodology for worldwide process benchmarking in the water industry. The paper provides a framework of the most interesting benchmarking experiences in the water sector and describes in detail both the final results of the survey and the methodology, focused on the identification of possible improvement areas.

  12. Analysis of in vitro fertilization data with multiple outcomes using discrete time-to-event analysis

    PubMed Central

    Maity, Arnab; Williams, Paige; Ryan, Louise; Missmer, Stacey; Coull, Brent; Hauser, Russ

    2014-01-01

    In vitro fertilization (IVF) is an increasingly common method of assisted reproductive technology. Because of the careful observation and follow-up required as part of the procedure, IVF studies provide an ideal opportunity to identify and assess clinical and demographic factors, along with environmental exposures, that may impact successful reproduction. A major challenge in analyzing data from IVF studies is handling the complexity and multiplicity of outcomes, which result from multiple opportunities for pregnancy loss within a single IVF cycle as well as from multiple IVF cycles. To date, most evaluations of IVF studies do not make use of the full data because of their complex structure. In this paper, we develop statistical methodology for the analysis of IVF data with multiple cycles and possibly multiple failure types observed for each individual. We develop a general analysis framework based on a generalized linear modeling formulation that allows implementation of various types of models, including shared frailty models, failure-specific frailty models, and transitional models, using standard software. We apply our methodology to data from an IVF study conducted at the Brigham and Women's Hospital, Massachusetts. We also summarize the performance of our proposed methods based on a simulation study. PMID:24317880
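
    The key device that makes discrete time-to-event models fit within a generalized linear modeling formulation is expanding each subject's history into person-period (here, person-cycle) records, to which an ordinary binary-response GLM can be fit. A minimal sketch of the expansion (names are illustrative, not from the paper):

```python
def person_period(cycles_observed, event_occurred):
    """Expand one subject's IVF history into discrete-time records:
    one row per cycle at risk, with a binary indicator that is 1 only
    at the cycle where the event occurred. Covariates (and failure
    type, for competing risks) would be attached to each row."""
    return [{"cycle": t,
             "event": int(event_occurred and t == cycles_observed)}
            for t in range(1, cycles_observed + 1)]
```

    Stacking these rows over subjects yields the design matrix on which shared frailty, failure-specific frailty, or transitional models can be specified with standard software.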

  13. General methodology for nonlinear modeling of neural systems with Poisson point-process inputs.

    PubMed

    Marmarelis, V Z; Berger, T W

    2005-07-01

    This paper presents a general methodological framework for the practical modeling of neural systems with point-process inputs (sequences of action potentials or, more broadly, identical events) based on the Volterra and Wiener theories of functional expansions and system identification. The paper clarifies the distinctions between Volterra and Wiener kernels obtained from Poisson point-process inputs. It shows that only the Wiener kernels can be estimated via cross-correlation, but must be defined as zero along the diagonals. The Volterra kernels can be estimated far more accurately (and from shorter data-records) by use of the Laguerre expansion technique adapted to point-process inputs, and they are independent of the mean rate of stimulation (unlike their P-W counterparts that depend on it). The Volterra kernels can also be estimated for broadband point-process inputs that are not Poisson. Useful applications of this modeling approach include cases where we seek to determine (model) the transfer characteristics between one neuronal axon (a point-process 'input') and another axon (a point-process 'output') or some other measure of neuronal activity (a continuous 'output', such as population activity) with which a causal link exists.
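
    The core estimation idea, recovering a kernel from point-process input-output data, can be sketched for a first-order Volterra kernel by least squares over lagged inputs (a deliberately simplified, noiseless illustration; the paper's methodology uses Laguerre expansions and handles higher-order kernels):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse 0/1 spike train as a point-process-like input
n, mem = 2000, 5
x = (rng.random(n) < 0.1).astype(float)

# Noiseless output generated by a known first-order kernel
k_true = np.array([1.0, 0.6, 0.3, 0.1, 0.05])
y = np.convolve(x, k_true)[:n]

# Design matrix of lagged inputs; least squares recovers the kernel
X = np.column_stack([np.concatenate([np.zeros(lag), x[:n - lag]])
                     for lag in range(mem)])
k_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

    In the noiseless case the recovery is exact; with noise and limited data, expanding the kernel on a small Laguerre basis (as the authors do) reduces the number of estimated coefficients and improves accuracy.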

  14. Envelope analysis of rotating machine vibrations in variable speed conditions: A comprehensive treatment

    NASA Astrophysics Data System (ADS)

    Abboud, D.; Antoni, J.; Sieg-Zieba, S.; Eltabach, M.

    2017-02-01

    Nowadays, the vibration analysis of rotating machine signals is a well-established methodology, rooted in powerful tools offered, in particular, by the theory of cyclostationary (CS) processes. Among them, the squared envelope spectrum (SES) is probably the most popular for detecting random CS components, which are typical symptoms, for instance, of rolling element bearing faults. Recent research has shifted towards extending existing CS tools - originally devised for constant speed conditions - to the case of variable speed conditions. Many of these works combine the SES with computed order tracking after some preprocessing steps. The principal objective of this paper is to organize this dispersed research into a structured, comprehensive framework. Three original contributions are provided. First, a model of rotating machine signals is introduced which sheds light on the various components to be expected in the SES. Second, a critical comparison is made of three sophisticated methods, namely the improved synchronous average, cepstrum prewhitening, and the generalized synchronous average, used for suppressing the deterministic part. Also, a general envelope enhancement methodology which combines the latter two techniques with a time-domain filtering operation is revisited. All theoretical findings are experimentally validated on simulated and real-world vibration signals.
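
    The SES itself is simple to compute: take the squared modulus of the analytic signal, then its spectrum. A minimal sketch (function name and normalization are our own choices, not the paper's):

```python
import numpy as np
from scipy.signal import hilbert

def squared_envelope_spectrum(x, fs):
    """Squared envelope spectrum: spectrum of |analytic signal|^2.
    Peaks at a fault's cyclic frequency expose random cyclostationary
    components that are invisible in the raw spectrum."""
    env2 = np.abs(hilbert(x)) ** 2   # squared envelope
    env2 = env2 - env2.mean()        # drop the DC component
    ses = np.abs(np.fft.rfft(env2)) / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, ses
```

    For an amplitude-modulated carrier, the SES peaks at the modulation frequency; under variable speed, this computation is preceded by angular resampling (computed order tracking) so that cyclic orders stay fixed.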

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandor, Debra; Chung, Donald; Keyser, David

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  16. Solid Waste Management Planning--A Methodology

    ERIC Educational Resources Information Center

    Theisen, Hilary M.; And Others

    1975-01-01

    This article presents a twofold solid waste management plan consisting of a basic design methodology and a decision-making methodology. The former provides a framework for the developing plan while the latter builds flexibility into the design so that there is a model for use during the planning process. (MA)

  17. Six methodological steps to build medical data warehouses for research.

    PubMed

    Szirbik, N B; Pelletier, C; Chaussalet, T

    2006-09-01

    We propose a simple methodology for heterogeneous data collection and central repository-style database design in healthcare. Our method can be used with or without other software development frameworks, and we argue that its application can save a considerable amount of implementation effort. We also believe that the method can be used in other fields of research, especially those with a strongly interdisciplinary nature. The idea emerged during a healthcare research project that consisted, among other things, in grouping information from heterogeneous and distributed information sources. We developed this methodology from the lessons learned while building a data repository containing information about elderly patient flows in the UK's long-term care (LTC) system. We explain thoroughly the aspects that influenced the building of the methodology. The methodology is defined by six steps, which can be aligned with various iterative development frameworks; we describe here its alignment with the RUP (rational unified process) framework. The methodology emphasizes current trends, such as early identification of critical requirements, data modelling, close and timely interaction with users and stakeholders, ontology building, quality management, and exception handling. Of special interest is the ontological engineering aspect, which had the highest-impact effects after the project: it helped stakeholders perform better collaborative negotiations that brought better solutions for the overall system investigated. Insight into the problems faced by others helps lead the negotiators to win-win situations. We consider that this should be the social result of any project that collects data for better decision making, leading finally to enhanced global outcomes.

  18. Quasi-classical approaches to vibronic spectra revisited

    NASA Astrophysics Data System (ADS)

    Karsten, Sven; Ivanov, Sergei D.; Bokarev, Sergey I.; Kühn, Oliver

    2018-03-01

    The framework to approach quasi-classical dynamics in the electronic ground state is well established and is based on the Kubo-transformed time correlation function (TCF), being the most classical-like quantum TCF. Here we discuss whether the choice of the Kubo-transformed TCF as a starting point for simulating vibronic spectra is as unambiguous as it is for vibrational ones. Employing imaginary-time path integral techniques in combination with the interaction representation allowed us to formulate a method for simulating vibronic spectra in the adiabatic regime that takes nuclear quantum effects and dynamics on multiple potential energy surfaces into account. Further, a generalized quantum TCF is proposed that contains many well-established TCFs, including the Kubo one, as particular cases. Importantly, it also provides a framework to construct new quantum TCFs. Applying the developed methodology to the generalized TCF leads to a plethora of simulation protocols, which are based on the well-known TCFs as well as on new ones. Their performance is investigated on 1D anharmonic model systems at finite temperatures. It is shown that the protocols based on the new TCFs may lead to superior results with respect to those based on the common ones. The strategies to find the optimal approach are discussed.
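
    For reference, the Kubo-transformed TCF that serves as the starting point has the standard form (generic operators; the paper's generalized TCF contains it as a particular case):

```latex
C^{\mathrm{Kubo}}_{AB}(t)
= \frac{1}{\beta Z} \int_0^{\beta} d\lambda \,
  \mathrm{Tr}\!\left[ e^{-(\beta - \lambda)\hat{H}} \, \hat{A} \,
  e^{-\lambda \hat{H}} \, \hat{B}(t) \right],
\qquad Z = \mathrm{Tr}\, e^{-\beta \hat{H}}
```

    The imaginary-time integral over \lambda is what makes this the "most classical-like" quantum TCF: it symmetrizes the operator ordering and, for Hermitian operators, yields a real, even function of time whose classical limit is the classical correlation function.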

  19. Children facing a family member's acute illness: a review of intervention studies.

    PubMed

    Spath, Mary L

    2007-07-01

    A review of psycho-educational intervention studies to benefit children adapting to a close (parent, sibling, or grandparent) family member's serious illness was conducted. To review the literature on studies addressing this topic, critique research methods, describe clinical outcomes, and make recommendations for future research efforts. Research citations from 1990 to 2005 from Medline, CINAHL, Health Source: Nursing/Academic Edition, PsycARTICLES, and PsycINFO databases were identified. Citations were reviewed and evaluated for sample, design, theoretical framework, intervention, threats to validity, and outcomes. Reviewed studies were limited to those that included statistical analysis to evaluate interventions and outcomes. Six studies were reviewed. Positive outcomes were reported for all of the interventional strategies used in the studies. Reviewed studies generally lacked a theoretical framework and a control group, were generally composed of small convenience samples, and primarily used non-tested investigator instruments. They were diverse in terms of intervention length and intensity, and measured short-term outcomes related to participant program satisfaction, rather than participant cognitive and behavioral change. The paucity of interventional studies and lack of systematic empirical precision to evaluate intervention effectiveness necessitates future studies that are methodologically rigorous.

  20. A human-dimensions review of human-wildlife disturbance: A literature review of impacts, frameworks, and management solutions

    USGS Publications Warehouse

    Cline, Robert; Sexton, Natalie; Stewart, Susan C.

    2007-01-01

    The following report was prepared for the U.S. Fish and Wildlife Service National Refuge System in support of their Comprehensive Conservation Planning (CCP) efforts by the Policy Analysis and Science Assistance Branch (PASA), Fort Collins Science Center, U.S. Geological Survey. While this document provides a summary of contemporary recreation management literature and methodologies relevant to the subject of managing wildlife disturbances on national wildlife refuges, it should be viewed as a starting point for management administrators. This document identifies general issues relating to wildlife disturbance and visitor impacts, including a description of disturbance, recreational impacts, related human dimensions applications, management frameworks, and a general summary of management solutions. The section on descriptions of wildlife disturbance and impacts draws heavily from the report entitled 'Managing the Impacts of Visitor Use on Waterbirds -- A Literature Review of Impacts and Mitigation' (DeLong, 2002; DeLong and Adamcik, in press), which is referenced in the text and is more comprehensive in its review of wildlife responses to disturbance. This document is intended to discuss the human-dimensions aspect of wildlife disturbance, summarizing human dimensions and recreation management literature as it applies to this topic.

  1. Modeling exposure–lag–response associations with distributed lag non-linear models

    PubMed Central

    Gasparrini, Antonio

    2014-01-01

    In biomedical research, a health effect is frequently associated with protracted exposures of varying intensity sustained in the past. The main complexity of modeling and interpreting such phenomena lies in the additional temporal dimension needed to express the association, as the risk depends on both intensity and timing of past exposures. This type of dependency is defined here as exposure–lag–response association. In this contribution, I illustrate a general statistical framework for such associations, established through the extension of distributed lag non-linear models, originally developed in time series analysis. This modeling class is based on the definition of a cross-basis, obtained by the combination of two functions to flexibly model linear or nonlinear exposure-responses and the lag structure of the relationship, respectively. The methodology is illustrated with an example application to cohort data and validated through a simulation study. This modeling framework generalizes to various study designs and regression models, and can be applied to study the health effects of protracted exposures to environmental factors, drugs or carcinogenic agents, among others. © 2013 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd. PMID:24027094
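    The cross-basis at the core of this modeling class is the combination of an exposure basis and a lag basis applied to a matrix of lagged exposures. A minimal Python sketch of that idea, assuming a quadratic exposure basis and a linear lag basis (the function name and bases are illustrative, not the paper's implementation):

    ```python
    import numpy as np

    def cross_basis(x, max_lag, exp_basis, lag_basis):
        """Build a DLNM-style cross-basis matrix.

        x         : 1-D exposure series (length n)
        max_lag   : maximum lag L
        exp_basis : callable mapping exposure values -> (m, p) basis matrix
        lag_basis : callable mapping lags 0..L       -> (L+1, q) basis matrix
        Returns an (n - L, p*q) design matrix.
        """
        n, L = len(x), max_lag
        # Matrix of lagged exposures: row t holds x[t], x[t-1], ..., x[t-L]
        Q = np.column_stack([x[L - l : n - l] for l in range(L + 1)])  # (n-L, L+1)
        C = lag_basis(np.arange(L + 1))                                # (L+1, q)
        p = exp_basis(np.array([0.0])).shape[1]
        cols = []
        for j in range(p):
            # Apply exposure basis elementwise, then contract over the lag dimension
            Bj = exp_basis(Q.ravel())[:, j].reshape(Q.shape)           # (n-L, L+1)
            cols.append(Bj @ C)                                        # (n-L, q)
        return np.hstack(cols)

    # Example bases: quadratic in exposure, linear in lag
    exp_b = lambda v: np.column_stack([v, v ** 2])
    lag_b = lambda l: np.column_stack([np.ones_like(l, dtype=float), l.astype(float)])

    x = np.random.default_rng(0).normal(size=100)
    X = cross_basis(x, max_lag=5, exp_basis=exp_b, lag_basis=lag_b)
    print(X.shape)  # -> (95, 4)
    ```

    The resulting columns enter any regression model as ordinary covariates, which is why the abstract says the framework generalizes across study designs.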

  2. Object-oriented philosophy in designing adaptive finite-element package for 3D elliptic differential equations

    NASA Astrophysics Data System (ADS)

    Zhengyong, R.; Jingtian, T.; Changsheng, L.; Xiao, X.

    2007-12-01

    Although adaptive finite-element (AFE) analysis is receiving more and more attention in scientific and engineering fields, its efficient implementation remains an open problem because of the complexity of the procedures involved. In this paper, we present a clear C++ framework that demonstrates the power of object-oriented programming (OOP) in designing such complex adaptive procedures. Using the modular facilities of an OOP language, the whole adaptive system is divided into several separate parts, such as mesh generation and refinement, a posteriori error estimation, the adaptive strategy, and the final post-processing. After these separate modules are designed individually, they are connected into a complete adaptive framework. Because the framework is based on the general elliptic differential equation, little additional effort is needed to run practical simulations. To show the favorable properties of OOP-based adaptive design, two numerical examples are tested. The first is a 3D direct-current resistivity problem, in which the power of the framework is evident because only small additions are required. In the second, an induced polarization (IP) exploration case, a new adaptive procedure is easily added, which demonstrates the strong extensibility and reusability afforded by an OOP language. Finally, we believe that, with a modular adaptive implementation based on OOP methodology, more advanced adaptive analysis systems will become available in the future.
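    The modular solve-estimate-mark-refine decomposition the authors describe can be sketched in any object-oriented language. A toy 1D sketch in Python (the class names, the midpoint-interpolation indicator, and the marking fraction are all illustrative, not the paper's C++ design):

    ```python
    from abc import ABC, abstractmethod

    class Estimator(ABC):
        """Interface for a-posteriori error estimators."""
        @abstractmethod
        def estimate(self, mesh, f): ...

    class MidpointJumpEstimator(Estimator):
        """Toy indicator: linear-interpolation error of f at each cell midpoint."""
        def estimate(self, mesh, f):
            return [abs(f((a + b) / 2) - (f(a) + f(b)) / 2)
                    for a, b in zip(mesh, mesh[1:])]

    class FractionStrategy:
        """Adaptive strategy: mark cells whose indicator exceeds a fraction of the max."""
        def __init__(self, frac=0.5):
            self.frac = frac
        def mark(self, eta):
            cut = self.frac * max(eta)
            return {i for i, e in enumerate(eta) if e >= cut}

    def refine(mesh, marked):
        """Mesh module: bisect every marked cell."""
        out = []
        for i, (a, b) in enumerate(zip(mesh, mesh[1:])):
            out.append(a)
            if i in marked:
                out.append((a + b) / 2)
        out.append(mesh[-1])
        return out

    def adapt(f, mesh, estimator, strategy, tol=1e-3, max_iter=20):
        """The connected solve-estimate-mark-refine loop."""
        for _ in range(max_iter):
            eta = estimator.estimate(mesh, f)
            if max(eta) < tol:
                break
            mesh = refine(mesh, strategy.mark(eta))
        return mesh

    mesh = adapt(lambda x: x ** 3, [0.0, 0.5, 1.0],
                 MidpointJumpEstimator(), FractionStrategy())
    print(len(mesh))
    ```

    Swapping in a different estimator or marking strategy touches only one class, which is the extensibility argument the paper makes for its C++ design.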

  3. Integrated Systems Oriented Student-Centric Learning Environment: A Framework for Curriculum Development

    ERIC Educational Resources Information Center

    Mayur, S. Desai

    2013-01-01

    Purpose: The purpose of this paper is to propose a framework that serves as a guide to develop a curriculum and instructional strategy that is systems oriented and student-centric. Design/methodology/approach: The framework is based on the theories in the field of education by prominent researchers. The framework is divided into four sub-systems,…

  4. A methodology for statewide intermodal freight transportation planning.

    DOT National Transportation Integrated Search

    1998-01-01

    The researchers developed a methodology for statewide freight transportation planning that focuses on identifying and prioritizing infrastructure needs to improve the intermodal freight transportation system. It is designed to provide the framework f...

  5. Revisiting the Concepts "Approach", "Design" and "Procedure" According to the Richards and Rodgers (2011) Framework

    ERIC Educational Resources Information Center

    Cumming, Brett

    2012-01-01

    The three concepts of Approach, Design, and Procedure, as proposed in the Richards and Rodgers framework, are considered particularly effective as a framework for second language teaching, both for the specific aim of developing communicative competence and for better understanding the methodology of communicative language use.

  6. An Experience-Based Learning Framework: Activities for the Initial Development of Sustainability Competencies

    ERIC Educational Resources Information Center

    Caniglia, Guido; John, Beatrice; Kohler, Martin; Bellina, Leonie; Wiek, Arnim; Rojas, Christopher; Laubichler, Manfred D.; Lang, Daniel

    2016-01-01

    Purpose: This paper aims to present an experience-based learning framework that provides a bottom-up, student-centered entrance point for the development of systems thinking, normative and collaborative competencies in sustainability. Design/methodology/approach: The framework combines mental mapping with exploratory walking. It interweaves…

  7. Translation Accommodations Framework for Testing English Language Learners in Mathematics

    ERIC Educational Resources Information Center

    Solano-Flores, Guillermo

    2012-01-01

    The present framework is developed under contract with the Smarter Balanced Assessment Consortium (SBAC) as a conceptual and methodological tool for guiding the reasonings and actions of contractors in charge of developing and providing test translation accommodations for English language learners. The framework addresses important challenges in…

  8. A user-centred methodology for designing an online social network to motivate health behaviour change.

    PubMed

    Kamal, Noreen; Fels, Sidney

    2013-01-01

    Positive health behaviour is critical to preventing illness and managing chronic conditions. A user-centred methodology was employed to design an online social network to motivate health behaviour change. The methodology was augmented with the Appeal, Belonging, Commitment (ABC) Framework, which is based on theoretical models of health behaviour change and of online social network use. The user-centred methodology included four phases: 1) an initial user inquiry on health behaviour and use of online social networks; 2) interview feedback on paper prototypes; 3) a laboratory study on a medium-fidelity prototype; and 4) a field study on the high-fidelity prototype. The points of inquiry through these phases were based on the ABC Framework. This yielded an online social network system, deployed to users as an interactive website, that linked to external third-party databases.

  9. Using hidden Markov models and observed evolution to annotate viral genomes.

    PubMed

    McCauley, Stephen; Hein, Jotun

    2006-06-01

    ssRNA (single-stranded RNA) viral genomes are generally constrained in length and utilize overlapping reading frames to maximally exploit the coding potential within the genome length restrictions. This overlapping coding phenomenon leads to complex evolutionary constraints operating on the genome. In regions which code for more than one protein, silent mutations in one reading frame generally have a protein-coding effect in another. To maximize coding flexibility in all reading frames, overlapping regions are often compositionally biased towards amino acids whose codons are 6-fold degenerate with respect to the 64-codon alphabet. Previous methodologies have used this fact in an ad hoc manner to look for overlapping genes by motif matching. In this paper, differentiated nucleotide compositional patterns in overlapping regions are incorporated into a probabilistic hidden Markov model (HMM) framework which is used to annotate ssRNA viral genomes. This work focuses on single-sequence annotation and applies an HMM framework to ssRNA viral annotation. A description is given of how the HMM is parameterized while annotating within a missing-data framework. A phylogenetic HMM (Phylo-HMM) extension, as applied to 14 aligned HIV2 sequences, is also presented. This evolutionary extension serves as an illustration of the potential of the Phylo-HMM framework for ssRNA viral genomic annotation. The single-sequence annotation (SSA) procedure is applied to 14 different strains of the HIV2 virus. Further results on other ssRNA viral genomes are presented to illustrate more generally the performance of the method. The results of the SSA method are encouraging; however, there is still room for improvement, and since there is overwhelming evidence that comparative methods can improve coding sequence (CDS) annotation, the SSA method is extended to a Phylo-HMM to incorporate evolutionary information. The Phylo-HMM extension is applied to the same set of 14 pre-aligned HIV2 sequences. The performance improvement that results from including the evolutionary information in the analysis is illustrated.
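    As a toy illustration of the single-sequence HMM idea, a log-space Viterbi decoder over a GC-biased "coding" state can be sketched as follows (the states, transition values, and emission biases are invented for the example, not the paper's parameterization):

    ```python
    import numpy as np

    def viterbi(obs, states, log_start, log_trans, log_emit):
        """Most probable state path through an HMM, computed in log space."""
        n, k = len(obs), len(states)
        V = np.empty((n, k))
        back = np.zeros((n, k), dtype=int)
        V[0] = log_start + log_emit[:, obs[0]]
        for t in range(1, n):
            for j in range(k):
                scores = V[t - 1] + log_trans[:, j]   # arrive in j from each state
                back[t, j] = int(np.argmax(scores))
                V[t, j] = scores[back[t, j]] + log_emit[j, obs[t]]
        path = [int(np.argmax(V[-1]))]
        for t in range(n - 1, 0, -1):                 # trace the best path back
            path.append(back[t, path[-1]])
        return [states[s] for s in reversed(path)]

    # Two invented states; the "coding" state emits G/C more often (A=0, C=1, G=2, T=3)
    states = ["noncoding", "coding"]
    log_start = np.log([0.5, 0.5])
    log_trans = np.log([[0.8, 0.2],
                        [0.2, 0.8]])
    log_emit = np.log([[0.25, 0.25, 0.25, 0.25],   # noncoding: uniform
                       [0.05, 0.45, 0.45, 0.05]])  # coding: GC-biased
    seq = [0, 3, 3, 0, 1, 2, 2, 1, 2, 1, 3, 0, 0, 3]   # AT-rich, GC-rich, AT-rich
    print(viterbi(seq, states, log_start, log_trans, log_emit))
    ```

    With these invented parameters the GC-rich middle run tends to be labeled coding; the paper's annotation additionally handles reading frames, overlapping genes, and, in the Phylo-HMM, column likelihoods from an alignment.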

  10. Methodological Approaches for Estimating the Benefits and Costs of Smart Grid Demonstration Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Russell

    This report presents a comprehensive framework for estimating the benefits and costs of Smart Grid projects and a step-by-step approach for making these estimates. The framework identifies the basic categories of benefits, the beneficiaries of these benefits, and the Smart Grid functionalities that lead to different benefits and proposes ways to estimate these benefits, including their monetization. The report covers cost-effectiveness evaluation, uncertainty, and issues in estimating baseline conditions against which a project would be compared. The report also suggests metrics suitable for describing principal characteristics of a modern Smart Grid to which a project can contribute. The first section of the report presents background information on the motivation for the report and its purpose. Section 2 introduces the methodological framework, focusing on the definition of benefits and a sequential, logical process for estimating them. Beginning with the Smart Grid technologies and functions of a project, it maps these functions to the benefits they produce. Section 3 provides a hypothetical example to illustrate the approach. Section 4 describes each of the 10 steps in the approach. Section 5 covers issues related to estimating benefits of the Smart Grid. Section 6 summarizes the next steps. The methods developed in this study will help improve future estimates - both retrospective and prospective - of the benefits of Smart Grid investments. These benefits, including those to consumers, society in general, and utilities, can then be weighed against the investments. Such methods would be useful in total resource cost tests and in societal versions of such tests. As such, the report will be of interest not only to electric utilities, but also to a broad constituency of stakeholders. Significant aspects of the methodology were used by the U.S. Department of Energy (DOE) to develop its methods for estimating the benefits and costs of its renewable and distributed systems integration demonstration projects as well as its Smart Grid Investment Grant projects and demonstration projects funded under the American Recovery and Reinvestment Act (ARRA). The goal of this report, which was cofunded by the Electric Power Research Institute (EPRI) and DOE, is to present a comprehensive set of methods for estimating the benefits and costs of Smart Grid projects. By publishing this report, EPRI seeks to contribute to the development of methods that will establish the benefits associated with investments in Smart Grid technologies. EPRI does not endorse the contents of this report or make any representations as to the accuracy and appropriateness of its contents. The purpose of this report is to present a methodological framework that will provide a standardized approach for estimating the benefits and costs of Smart Grid demonstration projects. The framework also has broader application to larger projects, such as those funded under the ARRA. Moreover, with additional development, it will provide the means for extrapolating the results of pilots and trials to at-scale investments in Smart Grid technologies. The framework was developed by a panel whose members provided a broad range of expertise.

  11. Numerical 3+1 General Relativistic Magnetohydrodynamics: A Local Characteristic Approach

    NASA Astrophysics Data System (ADS)

    Antón, Luis; Zanotti, Olindo; Miralles, Juan A.; Martí, José M.; Ibáñez, José M.; Font, José A.; Pons, José A.

    2006-01-01

    We present a general procedure to solve numerically the general relativistic magnetohydrodynamics (GRMHD) equations within the framework of the 3+1 formalism. The work reported here extends our previous investigation in general relativistic hydrodynamics (Banyuls et al. 1997) where magnetic fields were not considered. The GRMHD equations are written in conservative form to exploit their hyperbolic character in the solution procedure. All theoretical ingredients necessary to build up high-resolution shock-capturing schemes based on the solution of local Riemann problems (i.e., Godunov-type schemes) are described. In particular, we use a renormalized set of regular eigenvectors of the flux Jacobians of the relativistic MHD equations. In addition, the paper describes a procedure based on the equivalence principle of general relativity that allows the use of Riemann solvers designed for special relativistic MHD in GRMHD. Our formulation and numerical methodology are assessed by performing various test simulations recently considered by different authors. These include magnetized shock tubes, spherical accretion onto a Schwarzschild black hole, equatorial accretion onto a Kerr black hole, and magnetized thick disks accreting onto a black hole and subject to the magnetorotational instability.
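    The Godunov-type ingredient common to such high-resolution shock-capturing schemes can be illustrated on the simplest conservation law. A sketch for linear advection, where the exact interface Riemann solution reduces to the upwind state (this is only the scalar building block, not the GRMHD system or its renormalized eigenvectors):

    ```python
    import numpy as np

    def godunov_upwind(u, c, steps):
        """First-order Godunov scheme for u_t + a u_x = 0 with a > 0 and periodic
        boundaries; c = a*dt/dx is the Courant number. At each interface the exact
        Riemann solution is the upwind state, so the update uses a*(u_i - u_{i-1})."""
        u = np.asarray(u, dtype=float)
        for _ in range(steps):
            u = u - c * (u - np.roll(u, 1))
        return u

    u0 = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
    # With c = 1 the scheme propagates the profile exactly, one cell per step
    print(godunov_upwind(u0, c=1.0, steps=2))   # -> [0. 0. 0. 0. 1.]
    ```

    Godunov-type GRMHD codes replace this scalar upwind flux with an approximate Riemann solver built from the characteristic (eigenvector) structure of the full equations.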

  12. Curriculum planning for the development of graphicacy capability: three case studies from Europe and the USA

    NASA Astrophysics Data System (ADS)

    Danos, Xenia; Barr, Ronald; Górska, Renata; Norman, Eddie

    2014-11-01

    Curriculum planning for the development of graphicacy capability has not been systematically included in general education to coincide with the graphicacy needs of human society. In higher education, graphicacy curricula have been developed to meet the needs of certain disciplines, for example medical and teacher training and engineering, among others. A framework for graphicacy curricula, anticipating the graphicacy needs in higher education, has yet to be strategically planned for general education. This is partly a result of lack of research effort in this area, but also a result of lack of systematic curriculum planning in general. This paper discusses these issues in the context of graphicacy curricula for engineering. The paper presents three broad individual case studies spanning Europe and the USA, brought together by the common denominator, graphicacy. The case studies are based on: an analysis of graphicacy within general education curricula, an analysis of graphicacy for engineering education in Europe and an analysis of graphicacy for engineering education in the USA. These three papers were originally presented in a plenary session at the American Society for Engineering Education, Engineering Design Graphics Division at the University of Limerick in November 2012. The case studies demonstrate the potential for strategic curriculum planning in regard to the development of graphicacy in general education and an overview of a methodology to achieve that. It also offers further evidence towards the importance of the systematic classification of graphics capabilities in Engineering and how the lack of a developed theoretical framework in this area undermines the case for the importance of graphics within engineering education.

  13. Using Design-Based Research in Gifted Education

    ERIC Educational Resources Information Center

    Jen, Enyi; Moon, Sidney; Samarapungavan, Ala

    2015-01-01

    Design-based research (DBR) is a new methodological framework that was developed in the context of the learning sciences; however, it has not been used very often in the field of gifted education. Compared with other methodologies, DBR is more process-oriented and context-sensitive. In this methodological brief, the authors introduce DBR and…

  14. Linked population and economic models: some methodological issues in forecasting, analysis, and policy optimization.

    PubMed

    Madden, M; Batey Pwj

    1983-05-01

    Some problems associated with demographic-economic forecasting include finding models appropriate for a declining economy with unemployment, using a multiregional approach in an interregional model, finding a way to show differential consumption while endogenizing unemployment, and avoiding unemployment inconsistencies. The solution to these problems involves the construction of an activity-commodity framework, locating it within a group of forecasting models, and indicating possible routes towards dynamization of the framework. The authors demonstrate the range of impact multipliers that can be derived from the framework and show how these multipliers relate to Leontief input-output multipliers. It is shown that a desired population distribution may be obtained by selecting instruments from the economic sphere to produce, through the constraints vector of an activity-commodity framework, targets selected from demographic activities. The next step in this process, empirical exploitation, was carried out by the authors in the United Kingdom, linking an input-output model with a wide selection of demographic and demographic-economic variables. The generally tenuous control which government has over any variables in systems of this type, especially in market economies, makes application of the optimization approach in the policy field a partly conjectural exercise, although the analytic capacity of the approach can provide clear indications of policy directions.
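    For reference, the Leontief input-output multipliers against which the framework's impact multipliers are compared can be computed directly from a matrix of technical coefficients. A small sketch with a hypothetical two-sector economy:

    ```python
    import numpy as np

    # Hypothetical 2-sector matrix of technical coefficients a_ij
    # (input from sector i needed per unit of output of sector j)
    A = np.array([[0.2, 0.3],
                  [0.1, 0.4]])

    # Leontief inverse (I - A)^-1: total (direct + indirect) output
    # required per unit of final demand
    L = np.linalg.inv(np.eye(2) - A)

    # Column sums are the conventional output multipliers per sector
    multipliers = L.sum(axis=0)
    print(multipliers.round(3))   # -> [1.556 2.444]
    ```

    The activity-commodity framework extends this logic so that the multipliers also propagate through demographic variables via the constraints vector.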

  15. A framework for multi-stakeholder decision-making and ...

    EPA Pesticide Factsheets

    We propose a decision-making framework to compute compromise solutions that balance conflicting priorities of multiple stakeholders on multiple objectives. In our setting, we shape the stakeholder dissatisfaction distribution by solving a conditional-value-at-risk (CVaR) minimization problem. The CVaR problem is parameterized by a probability level that shapes the tail of the dissatisfaction distribution. The proposed approach allows us to compute a family of compromise solutions and generalizes multi-stakeholder settings previously proposed in the literature that minimize average and worst-case dissatisfactions. We use the concept of the CVaR norm to give a geometric interpretation to this problem and use the properties of this norm to prove that the CVaR minimization problem yields Pareto-optimal solutions for any choice of the probability level. We discuss a broad range of potential applications of the framework that involve complex decision-making processes. We demonstrate the developments using a biowaste facility location case study in which we seek to balance stakeholder priorities on transportation, safety, water quality, and capital costs. This manuscript describes the methodology of a new decision-making framework that computes compromise solutions balancing conflicting stakeholder priorities on multiple objectives, as needed for the SHC Decision Science and Support Tools project. A biowaste facility location is employed as the case study.
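    The role of the probability level can be seen in a discrete sketch. Assuming equally weighted dissatisfaction samples (the function and the candidate designs below are illustrative, not the paper's optimization model), the empirical CVaR interpolates between the average dissatisfaction at alpha = 0 and the worst case as alpha approaches 1:

    ```python
    import numpy as np

    def cvar(losses, alpha):
        """Empirical CVaR: mean of the worst (1 - alpha) fraction of the samples."""
        losses = np.sort(np.asarray(losses, dtype=float))[::-1]     # descending
        m = max(1, int(np.ceil((1 - alpha) * len(losses))))
        return losses[:m].mean()

    # Dissatisfaction (0-10 scale) of four stakeholders under two candidate designs
    design_a = [1, 2, 3, 9]   # low average, one very unhappy stakeholder
    design_b = [4, 4, 4, 5]   # mediocre but even-handed

    print(cvar(design_a, 0.0), cvar(design_b, 0.0))      # averages:   3.75 4.25
    print(cvar(design_a, 0.75), cvar(design_b, 0.75))    # worst case: 9.0 5.0
    ```

    At alpha = 0 design A looks best, while at alpha = 0.75 the even-handed design B wins: exactly the average-to-worst-case family of compromises the abstract describes.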

  16. Web Based Prognostics and 24/7 Monitoring

    NASA Technical Reports Server (NTRS)

    Strautkalns, Miryam; Robinson, Peter

    2013-01-01

    We created a general framework for analysts to store and view data in a way that removes the boundaries created by operating systems, programming languages, and physical proximity. With the advent of HTML5 and CSS3 with JavaScript, access to information is limited only to those who lack a browser. We created a framework based on the methodology of one server, one web-based application. An additional benefit is increased opportunity for collaboration: today, the idea of a group confined to a single room is antiquated, and groups communicate and collaborate with colleagues at other universities and organizations, and even on other continents across time zones. There are many varieties of data-gathering and condition-monitoring software available, as well as companies who specialize in customizing software for individual applications. A single group may depend on multiple languages, environments, and computers for recording data and collaborating within one lab. The heterogeneous nature of such a system creates challenges for the seamless exchange of data and ideas between members. To address these limitations, we designed a framework that gives users seamless access to their data. Our framework was deployed using the data feed on the NASA Ames planetary rover testbed. Our paper demonstrates the process and implementation we followed on the rover.

  17. A novel water quality data analysis framework based on time-series data mining.

    PubMed

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging methodology for water resource management research. In this paper, based on time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks on the water quality time-series dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly dissolved oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The analysis uncovered the relationship between water quality in the mainstream and its tributaries, as well as the main patterns of change in DO. The experimental results show that the proposed analysis framework is a feasible and efficient method for mining hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
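    The two-part pipeline can be caricatured in a few lines: windows are summarized by (mean, std) pairs, a crude stand-in for the paper's two-dimensional normal-cloud granules, and anomaly detection then operates on the similarity matrix of those summaries (all names and parameters below are illustrative):

    ```python
    import numpy as np

    def granulate(series, win):
        """Summarize each non-overlapping window by (mean, std): a crude stand-in
        for the paper's two-dimensional normal-cloud granules."""
        n = len(series) // win
        segs = np.asarray(series[: n * win]).reshape(n, win)
        return np.column_stack([segs.mean(axis=1), segs.std(axis=1)])

    def similarity_matrix(granules):
        """Pairwise similarity decaying with Euclidean distance between granules."""
        d = np.linalg.norm(granules[:, None, :] - granules[None, :, :], axis=2)
        return np.exp(-d)

    def most_anomalous(series, win):
        """Index of the window least similar, on average, to all the others."""
        S = similarity_matrix(granulate(series, win))
        np.fill_diagonal(S, 0.0)
        return int(np.argmin(S.mean(axis=1)))

    # A stable series (e.g. weekly DO readings) with one disturbed window
    rng = np.random.default_rng(1)
    x = rng.normal(8.0, 0.2, size=120)
    x[60:72] += 3.0                       # inject an anomaly into window 5
    print(most_anomalous(x, win=12))      # -> 5
    ```

    Similarity search and pattern discovery reuse the same matrix: nearest rows for search, clusters of rows for recurring patterns.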

  18. A risk assessment methodology for critical transportation infrastructure.

    DOT National Transportation Integrated Search

    2002-01-01

    Infrastructure protection typifies a problem of risk assessment and management in a large-scale system. This study offers a methodological framework to identify, prioritize, assess, and manage risks. It includes the following major considerations: (1...

  19. Global dynamic optimization approach to predict activation in metabolic pathways.

    PubMed

    de Hijas-Liste, Gundián M; Klipp, Edda; Balsa-Canto, Eva; Banga, Julio R

    2014-01-06

    During the last decade, a number of authors have shown that the genetic regulation of metabolic networks may follow optimality principles. Optimal control theory has been successfully used to compute optimal enzyme profiles considering simple metabolic pathways. However, applying this optimal control framework to more general networks (e.g. branched networks, or networks incorporating enzyme production dynamics) yields problems that are analytically intractable and/or numerically very challenging. Further, these previous studies have only considered a single-objective framework. In this work we consider a more general multi-objective formulation and we present solutions based on recent developments in global dynamic optimization techniques. We illustrate the performance and capabilities of these techniques considering two sets of problems. First, we consider a set of single-objective examples of increasing complexity taken from the recent literature. We analyze the multimodal character of the associated nonlinear optimization problems, and we also evaluate different global optimization approaches in terms of numerical robustness, efficiency and scalability. Second, we consider generalized multi-objective formulations for several examples, and we show how this framework yields more biologically meaningful results. The proposed strategy was used to solve a set of single-objective case studies related to unbranched and branched metabolic networks of different levels of complexity. All problems were successfully solved in reasonable computation times with our global dynamic optimization approach, reaching solutions which were comparable or better than those reported in the previous literature. Further, we considered, for the first time, multi-objective formulations, illustrating how activation in metabolic pathways can be explained in terms of the best trade-offs between conflicting objectives. This new methodology can be applied to metabolic networks with arbitrary topologies, non-linear dynamics and constraints.
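    The multi-objective notion of a best trade-off can be made concrete with a tiny non-dominated filter (illustrative only; the paper's case studies solve dynamic optimization problems rather than filtering point sets):

    ```python
    def pareto_front(points):
        """Return the non-dominated points, minimizing every objective:
        p is dominated if some other q is <= p in all coordinates (and differs)."""
        def dominated(p):
            return any(q != p and all(qi <= pi for qi, pi in zip(q, p))
                       for q in points)
        return [p for p in points if not dominated(p)]

    # Hypothetical (enzyme cost, transition time) pairs for candidate activation profiles
    candidates = [(1, 5), (2, 2), (5, 1), (4, 4), (3, 6)]
    print(pareto_front(candidates))   # -> [(1, 5), (2, 2), (5, 1)]
    ```

    Each surviving point is a "best trade-off" in the sense the abstract uses: no other candidate improves one objective without worsening the other.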

  20. A Bayesian Framework for Generalized Linear Mixed Modeling Identifies New Candidate Loci for Late-Onset Alzheimer’s Disease

    PubMed Central

    Wang, Xulong; Philip, Vivek M.; Ananda, Guruprasad; White, Charles C.; Malhotra, Ankit; Michalski, Paul J.; Karuturi, Krishna R. Murthy; Chintalapudi, Sumana R.; Acklin, Casey; Sasner, Michael; Bennett, David A.; De Jager, Philip L.; Howell, Gareth R.; Carter, Gregory W.

    2018-01-01

    Recent technical and methodological advances have greatly enhanced genome-wide association studies (GWAS). The advent of low-cost, whole-genome sequencing facilitates high-resolution variant identification, and the development of linear mixed models (LMM) allows improved identification of putatively causal variants. While essential for correcting false positive associations due to sample relatedness and population stratification, LMMs have commonly been restricted to quantitative variables. However, phenotypic traits in association studies are often categorical, coded as binary case-control or ordered variables describing disease stages. To address these issues, we have devised a method for genomic association studies that implements a generalized LMM (GLMM) in a Bayesian framework, called Bayes-GLMM. Bayes-GLMM has four major features: (1) support of categorical, binary, and quantitative variables; (2) cohesive integration of previous GWAS results for related traits; (3) correction for sample relatedness by mixed modeling; and (4) model estimation by both Markov chain Monte Carlo sampling and maximum likelihood estimation. We applied Bayes-GLMM to the whole-genome sequencing cohort of the Alzheimer’s Disease Sequencing Project. This study contains 570 individuals from 111 families, each with Alzheimer’s disease diagnosed at one of four confidence levels. Using Bayes-GLMM we identified four variants in three loci significantly associated with Alzheimer’s disease. Two variants, rs140233081 and rs149372995, lie between PRKAR1B and PDGFA. The coded proteins are localized to the glial-vascular unit, and PDGFA transcript levels are associated with Alzheimer’s disease-related neuropathology. In summary, this work provides implementation of a flexible, generalized mixed-model approach in a Bayesian framework for association studies. PMID:29507048

  1. Tools reference manual for a Requirements Specification Language (RSL), version 2.0

    NASA Technical Reports Server (NTRS)

    Fisher, Gene L.; Cohen, Gerald C.

    1993-01-01

    This report describes a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages, such as SADT (Structured Analysis and Design Technique), PSL (Problem Statement Language), and RMF (Requirements Modeling Framework). While languages such as these have useful features for structuring a specification, they generally lack formality. To overcome the deficiencies of informal requirements languages, RSL has constructs for formal mathematical specification. These constructs are similar to those found in formal specification languages such as EHDM (Enhanced Hierarchical Development Methodology), Larch, and OBJ3.

  2. A Methodological Framework to Estimate the Site Fidelity of Tagged Animals Using Passive Acoustic Telemetry

    PubMed Central

    Capello, Manuela; Robert, Marianne; Soria, Marc; Potin, Gael; Itano, David; Holland, Kim; Deneubourg, Jean-Louis; Dagorn, Laurent

    2015-01-01

    The rapid expansion of the use of passive acoustic telemetry technologies has facilitated unprecedented opportunities for studying the behavior of marine organisms in their natural environment. This technological advance would greatly benefit from the parallel development of dedicated methodologies accounting for the variety of timescales involved in the remote detection of tagged animals related to instrumental, environmental and behavioral events. In this paper we propose a methodological framework for estimating the site fidelity (“residence times”) of acoustic tagged animals at different timescales, based on the survival analysis of continuous residence times recorded at multiple receivers. Our approach is validated through modeling and applied on two distinct datasets obtained from a small coastal pelagic species (bigeye scad, Selar crumenophthalmus) and a large, offshore pelagic species (yellowfin tuna, Thunnus albacares), which show very distinct spatial scales of behavior. The methodological framework proposed herein allows estimating the most appropriate temporal scale for processing passive acoustic telemetry data depending on the scientific question of interest. Our method provides residence times free of the bias inherent to environmental and instrumental noise that can be used to study the small scale behavior of acoustic tagged animals. At larger timescales, it can effectively identify residence times that encompass the diel behavioral excursions of fish out of the acoustic detection range. This study provides a systematic framework for the analysis of passive acoustic telemetry data that can be employed for the comparative study of different species and study sites. The same methodology can be used each time discrete records of animal detections of any nature are employed for estimating the site fidelity of an animal at different timescales. PMID:26261985
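    The core of the residence-time computation can be sketched simply: detections at a receiver are merged into continuous residence times, with a gap threshold that sets the timescale (the function and threshold below are illustrative, not the authors' code):

    ```python
    def residence_times(detections, max_gap):
        """Split sorted detection times into continuous residence times:
        a gap longer than max_gap ends the current visit and starts a new one."""
        if not detections:
            return []
        visits, start, last = [], detections[0], detections[0]
        for t in detections[1:]:
            if t - last > max_gap:
                visits.append(last - start)
                start = t
            last = t
        visits.append(last - start)
        return visits

    # Detection times (hours) of one tagged fish at one receiver
    det = [0, 1, 2, 10, 11, 30]
    print(residence_times(det, max_gap=6))    # fine scale   -> [2, 1, 0]
    print(residence_times(det, max_gap=20))   # coarse scale -> [30]
    ```

    Choosing the gap threshold large enough absorbs diel excursions out of detection range into a single long residence time; the paper's contribution is a principled, survival-analysis-based way to pick that scale.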

  4. Patient influences on satisfaction and loyalty for GP services.

    PubMed

    Rundle-Thiele, Sharyn; Russell-Bennett, Rebekah

    2010-04-01

Little is known about the influence that patients themselves have on their loyalty to a general practitioner (GP). Consequently, a theoretical framework drawing on diverse literature is proposed, suggesting that, along with satisfaction, patient loyalty is an important outcome for GPs. In a study of 174 Australian patients, knowledgeable patients reported lower levels of loyalty, while older patients and patients visiting a GP more frequently reported higher levels. The results suggest that extending patient-centered care practices to encompass all patients may be warranted in order to improve patient satisfaction and loyalty. Further, future research opportunities abound, with intervention and dyadic research methodologies recommended.

  5. On-Orbit Range Set Applications

    NASA Astrophysics Data System (ADS)

    Holzinger, M.; Scheeres, D.

    2011-09-01

    History and methodology of Δv range set computation is briefly reviewed, followed by a short summary of the Δv optimal spacecraft servicing problem literature. Service vehicle placement is approached from a Δv range set viewpoint, providing a framework under which the analysis becomes quite geometric and intuitive. The optimal servicing tour design problem is shown to be a specific instantiation of the metric- Traveling Salesman Problem (TSP), which in general is an NP-hard problem. The Δv-TSP is argued to be quite similar to the Euclidean-TSP, for which approximate optimal solutions may be found in polynomial time. Applications of range sets are demonstrated using analytical and simulation results.
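A flavor of the polynomial-time tour approximations the abstract alludes to can be given with a nearest-neighbour heuristic over a pairwise transfer-cost matrix. This is a generic TSP sketch under assumed names and data, not the paper's algorithm.

```python
def nearest_neighbour_tour(cost, start=0):
    """Greedy tour construction over a cost matrix (e.g. pairwise Δv transfer
    costs between client spacecraft): from each vehicle, visit the cheapest
    unvisited one next. A simple polynomial-time TSP heuristic."""
    n = len(cost)
    tour, visited = [start], {start}
    while len(tour) < n:
        last = tour[-1]
        nxt = min((j for j in range(n) if j not in visited),
                  key=lambda j: cost[last][j])
        tour.append(nxt)
        visited.add(nxt)
    return tour

# Hypothetical Δv costs between three clients:
dv = [[0, 1, 4],
      [1, 0, 2],
      [4, 2, 0]]
tour = nearest_neighbour_tour(dv)   # from 0, cheapest is 1, then 2 -> [0, 1, 2]
```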

  6. A quantum framework for likelihood ratios

    NASA Astrophysics Data System (ADS)

    Bond, Rachael L.; He, Yang-Hui; Ormerod, Thomas C.

    The ability to calculate precise likelihood ratios is fundamental to science, from Quantum Information Theory through to Quantum State Estimation. However, there is no assumption-free statistical methodology to achieve this. For instance, in the absence of data relating to covariate overlap, the widely used Bayes’ theorem either defaults to the marginal probability driven “naive Bayes’ classifier”, or requires the use of compensatory expectation-maximization techniques. This paper takes an information-theoretic approach in developing a new statistical formula for the calculation of likelihood ratios based on the principles of quantum entanglement, and demonstrates that Bayes’ theorem is a special case of a more general quantum mechanical expression.
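The "naive Bayes' classifier" default the abstract mentions can be made concrete with a short sketch: under the independence assumption (covariate overlap ignored), the joint likelihood ratio factorises into per-feature ratios. This is a standard textbook construction for context, not the paper's quantum formula.

```python
import math

def naive_bayes_log_lr(features, p_given_h1, p_given_h0):
    """Log likelihood ratio for binary features under the 'naive'
    independence assumption: the joint likelihood factorises over
    features, so per-feature log ratios simply add."""
    log_lr = 0.0
    for x, p1, p0 in zip(features, p_given_h1, p_given_h0):
        l1 = p1 if x else 1 - p1
        l0 = p0 if x else 1 - p0
        log_lr += math.log(l1) - math.log(l0)
    return log_lr

# Two features observed present, each twice as likely under H1:
lr = math.exp(naive_bayes_log_lr([1, 1], [0.8, 0.6], [0.4, 0.3]))
# lr ≈ (0.8/0.4) * (0.6/0.3) = 4.0
```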

  7. Spatial Operator Algebra for multibody system dynamics

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Jain, A.; Kreutz-Delgado, K.

    1992-01-01

The Spatial Operator Algebra framework for the dynamics of general multibody systems is described. The use of a spatial operator-based methodology permits the formulation of the dynamical equations of motion of multibody systems in a concise and systematic way. The dynamical equations of progressively more complex rigid multibody systems are developed in an evolutionary manner, beginning with a serial-chain system, followed by a tree-topology system, and finally systems with arbitrary closed loops. Operator factorizations and identities are used to develop novel recursive algorithms for the forward dynamics of systems with closed loops. Extensions required to deal with flexible elements are also discussed.

  8. Pareto frontier analyses based decision making tool for transportation of hazardous waste.

    PubMed

    Das, Arup; Mazumder, T N; Gupta, A K

    2012-08-15

Transportation of hazardous wastes through a region poses an immense threat to development along its road network. The risk to populations exposed to such activities has been documented in the past. However, a comprehensive framework for routing hazardous wastes has often been overlooked. A regional hazardous waste management scheme should incorporate a comprehensive framework for hazardous waste transportation, one that accounts for the various stakeholders involved in decision making. Hence, a multi-objective approach is required to safeguard the interests of all the concerned stakeholders. The objective of this study is to design a methodology for routing hazardous wastes between the generating units and the disposal facilities through a capacity-constrained network. The proposed methodology uses an a posteriori, multi-objective approach to find non-dominated solutions for a system consisting of multiple origins and destinations. A case study of transportation of hazardous wastes in the Kolkata Metropolitan Area is also provided to illustrate the methodology. Copyright © 2012 Elsevier B.V. All rights reserved.
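The core of a posteriori multi-objective routing is filtering candidate routes down to the non-dominated (Pareto) set. A minimal sketch of that filter, with hypothetical (risk, cost) scores rather than the study's data:

```python
def pareto_front(solutions):
    """Return the non-dominated solutions of a minimisation problem.
    Each solution is a tuple of objective values, e.g. (risk, cost).
    A solution is dominated if another is no worse in every objective
    and strictly better in at least one."""
    front = []
    for s in solutions:
        dominated = any(
            all(o <= v for o, v in zip(other, s))
            and any(o < v for o, v in zip(other, s))
            for other in solutions
            if other is not s
        )
        if not dominated:
            front.append(s)
    return front

# Candidate waste routes scored on (population risk, transport cost):
routes = [(3, 10), (5, 4), (4, 9), (6, 12)]
front = pareto_front(routes)
# (6, 12) is dominated by (3, 10); the other three trade risk against cost.
```

The decision makers then negotiate among the surviving trade-off solutions rather than being handed a single "optimal" route.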

  9. Ultra-Structure database design methodology for managing systems biology data and analyses

    PubMed Central

    Maier, Christopher W; Long, Jeffrey G; Hemminger, Bradley M; Giddings, Morgan C

    2009-01-01

Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research.
Conclusion We find Ultra-Structure offers substantial benefits for biological information systems, the largest being the integration of diverse information sources into a common framework. This facilitates systems biology research by integrating data from disparate high-throughput techniques. It also enables us to readily incorporate new data types, sources, and domain knowledge with no change to the database structure or associated computer code. Ultra-Structure may be a significant step towards solving the hard problem of data management and integration in the systems biology era. PMID:19691849

  10. A framework to assess the impacts of Climate Change for different hazards at local and regional scale through probabilistic multi-model approaches

    NASA Astrophysics Data System (ADS)

    Mercogliano, P.; Reder, A.; Rianna, G.

    2017-12-01

Extreme weather events (EWEs) are projected to become more frequent and severe across the globe because of global warming. This poses challenging problems for critical infrastructures (CIs), which can be dramatically affected by EWEs and require adaptation countermeasures against changing climate conditions. In this work, we present the main results achieved in the framework of the FP7 European project INTACT, aimed at analyzing the resilience of CIs against shocks and stresses due to climate change. To identify variations in hazard induced by climate change, appropriate Extreme Weather Indicators (EWIs) are defined for several case studies and different approaches are analyzed to obtain local climate projections. The different approaches, with increasing refinement depending on the local information available and the methodologies selected, are investigated considering raw versus bias-corrected data and weighted versus equiprobable ensemble mean projections given by the regional climate models within the Euro-CORDEX program. Specifically, this work focuses on two case studies selected from the five proposed within the INTACT project, for which local station data are available: • rainfall-induced landslides affecting the Campania region (Southern Italy), with a special view on the Nocera municipality; • storms and heavy rainfall/winds in the port of Rotterdam (Netherlands). In general, our results show a small sensitivity to the weighting approach and a large sensitivity to bias correction in the future projections. For landslides in the Campania region, the Euro-CORDEX simulations projected a generalized worsening of safety conditions depending on the scenario (RCP4.5/8.5) and period (2011-2040/2041-2070/2071-2100) considered. For the port of Rotterdam, the Euro-CORDEX simulations projected an increase in intense daily and weekly precipitation events, again depending on the scenario and period considered.
In terms of framework, methodologies, and results, the activities developed within the INTACT project, including an intense effort of knowledge co-production between researchers and stakeholders, provide a theoretically grounded starting point for CI owners, operators, and protection policy makers in setting up protection systems against present and future climatic hazards.

  11. When is good, good enough? Methodological pragmatism for sustainable guideline development.

    PubMed

    Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C

    2015-03-06

Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders, in addition to methodologists, should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production, to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle 'when good is good enough' can serve as a basis for this approach. The conceptual tool, the 'Efficiency-Validity Methodological Continuum', acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.

  12. Valuing public sector risk exposure in transportation public-private partnerships.

    DOT National Transportation Integrated Search

    2010-10-01

    This report presents a methodological framework to evaluate public sector financial risk exposure when : delivering transportation infrastructure through public-private partnership (PPP) agreements in the United : States (U.S.). The framework is base...

  13. Distance Learning Courses on the Web: The Authoring Approach.

    ERIC Educational Resources Information Center

    Santos, Neide; Diaz, Alicia; Bibbo, Luis Mariano

    This paper proposes a framework for supporting the authoring process of distance learning courses. An overview of distance learning courses and the World Wide Web is presented. The proposed framework is then described, including: (1) components of the framework--a hypermedia design methodology for authoring the course, links to related Web sites,…

  14. Problem Solving Frameworks for Mathematics and Software Development

    ERIC Educational Resources Information Center

    McMaster, Kirby; Sambasivam, Samuel; Blake, Ashley

    2012-01-01

    In this research, we examine how problem solving frameworks differ between Mathematics and Software Development. Our methodology is based on the assumption that the words used frequently in a book indicate the mental framework of the author. We compared word frequencies in a sample of 139 books that discuss problem solving. The books were grouped…

  15. Evidence-Based Leadership Development: The 4L Framework

    ERIC Educational Resources Information Center

    Scott, Shelleyann; Webber, Charles F.

    2008-01-01

    Purpose: This paper aims to use the results of three research initiatives to present the life-long learning leader 4L framework, a model for leadership development intended for use by designers and providers of leadership development programming. Design/methodology/approach: The 4L model is a conceptual framework that emerged from the analysis of…

  16. Sequential Schooling or Lifelong Learning? International Frameworks through the Lens of English Higher Professional and Vocational Education

    ERIC Educational Resources Information Center

    Lester, Stan

    2018-01-01

    Purpose: The purpose of this paper is to review three international frameworks, including the International Standard Classification of Education (ISCED), in relation to one country's higher professional and vocational education system. Design/methodology/approach: The frameworks were examined in the context of English higher work-related…

  17. Synthesizing Middle Grades Research on Cultural Responsiveness: The Importance of a Shared Conceptual Framework

    ERIC Educational Resources Information Center

    Kennedy, Brianna L.; Brinegar, Kathleen; Hurd, Ellis; Harrison, Lisa

    2016-01-01

    In conducting a literature review of 133 articles on cultural responsiveness in middle level education, we identified a lack of shared definitions, theoretical frameworks, methodological approaches, and foci, which made it difficult to synthesize across articles. Using a conceptual framework that required: a) clear definitions of terms; b) a…

  18. European Qualifications Framework: Weighing Some Pros and Cons out of a French Perspective

    ERIC Educational Resources Information Center

    Bouder, Annie

    2008-01-01

    Purpose: The purpose of this paper is to question the appropriateness of a proposal for a new European Qualifications Framework. The framework has three perspectives: historical; analytical; and national. Design/methodology/approach: The approaches are diverse since the first insists on the institutional and decision-making processes at European…

  19. A comprehensive risk assessment framework for offsite transportation of inflammable hazardous waste.

    PubMed

    Das, Arup; Gupta, A K; Mazumder, T N

    2012-08-15

A framework for risk assessment due to offsite transportation of hazardous wastes is designed based on the type of event that can be triggered by an accident of a hazardous waste carrier. The objective of this study is to design a framework for computing the risk to population associated with offsite transportation of inflammable and volatile wastes. The framework is based on the traditional definition of risk and is designed for conditions where accident databases are not available: the probability-based variable in the risk assessment framework is substituted by a composite accident index proposed in this study. The framework computes the impacts of a volatile cloud explosion based on the TNO Multi-Energy model. The methodology also estimates the vulnerable population in terms of disability-adjusted life years (DALY), which takes into consideration the demographic profile of the population and the degree of mortality and morbidity sustained. The methodology is illustrated using a case study of a pharmaceutical industry in the Kolkata metropolitan area. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. A mathematical framework for combining decisions of multiple experts toward accurate and remote diagnosis of malaria using tele-microscopy.

    PubMed

    Mavandadi, Sam; Feng, Steve; Yu, Frank; Dimitrov, Stoyan; Nielsen-Saines, Karin; Prescott, William R; Ozcan, Aydogan

    2012-01-01

    We propose a methodology for digitally fusing diagnostic decisions made by multiple medical experts in order to improve accuracy of diagnosis. Toward this goal, we report an experimental study involving nine experts, where each one was given more than 8,000 digital microscopic images of individual human red blood cells and asked to identify malaria infected cells. The results of this experiment reveal that even highly trained medical experts are not always self-consistent in their diagnostic decisions and that there exists a fair level of disagreement among experts, even for binary decisions (i.e., infected vs. uninfected). To tackle this general medical diagnosis problem, we propose a probabilistic algorithm to fuse the decisions made by trained medical experts to robustly achieve higher levels of accuracy when compared to individual experts making such decisions. By modelling the decisions of experts as a three component mixture model and solving for the underlying parameters using the Expectation Maximisation algorithm, we demonstrate the efficacy of our approach which significantly improves the overall diagnostic accuracy of malaria infected cells. Additionally, we present a mathematical framework for performing 'slide-level' diagnosis by using individual 'cell-level' diagnosis data, shedding more light on the statistical rules that should govern the routine practice in examination of e.g., thin blood smear samples. This framework could be generalized for various other tele-pathology needs, and can be used by trained experts within an efficient tele-medicine platform.
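The fusion step described above can be sketched with a simple probabilistic vote-combination rule. This is an illustrative stand-in under assumed per-expert sensitivities and specificities (which, in the paper, come from mixture parameters fitted by Expectation Maximisation); it is not the authors' exact algorithm.

```python
import math

def fuse_decisions(votes, sens, spec, prior=0.5):
    """Fuse binary expert votes (1 = infected) into a posterior probability,
    assuming votes are conditionally independent given the true label.
    Each vote contributes a log-likelihood-ratio term weighted by that
    expert's sensitivity (sens) and specificity (spec)."""
    log_odds = math.log(prior / (1 - prior))
    for v, se, sp in zip(votes, sens, spec):
        if v:                                   # expert says "infected"
            log_odds += math.log(se / (1 - sp))
        else:                                   # expert says "uninfected"
            log_odds += math.log((1 - se) / sp)
    return 1 / (1 + math.exp(-log_odds))

# Three experts, two of whom vote "infected" (hypothetical accuracies):
p = fuse_decisions([1, 1, 0], sens=[0.9, 0.8, 0.7], spec=[0.9, 0.85, 0.8])
```

Because more reliable experts carry larger log-odds weights, this fusion can outperform simple majority voting when expert skill is uneven.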

  1. A determination of the fragmentation functions of pions, kaons, and protons with faithful uncertainties. The NNPDF Collaboration

    NASA Astrophysics Data System (ADS)

    Bertone, Valerio; Carrazza, Stefano; Hartland, Nathan P.; Nocera, Emanuele R.; Rojo, Juan

    2017-08-01

We present NNFF1.0, a new determination of the fragmentation functions (FFs) of charged pions, charged kaons, and protons/antiprotons from an analysis of single-inclusive hadron production data in electron-positron annihilation. This determination, performed at leading, next-to-leading, and next-to-next-to-leading order in perturbative QCD, is based on the NNPDF methodology, a fitting framework designed to provide a statistically sound representation of FF uncertainties and to minimise any procedural bias. We discuss novel aspects of the methodology used in this analysis, namely an optimised parametrisation of FFs and a more efficient χ^2 minimisation strategy, and validate the FF fitting procedure by means of closure tests. We then present the NNFF1.0 sets, and discuss their fit quality, their perturbative convergence, and their stability upon variations of the kinematic cuts and the fitted dataset. We find that the systematic inclusion of higher-order QCD corrections significantly improves the description of the data, especially in the small-z region. We compare the NNFF1.0 sets to other recent sets of FFs, finding in general a reasonable agreement, but also important differences. Together with existing sets of unpolarised and polarised parton distribution functions (PDFs), FFs and PDFs are now available from a common fitting framework for the first time.

  2. Epidemiology of multiple chronic conditions: an international perspective.

    PubMed

    Schellevis, François G

    2013-01-01

The epidemiology of multimorbidity, or multiple chronic conditions (MCCs), is one of the research priority areas of the U.S. Department of Health and Human Services (HHS), as set out in its Strategic Framework on MCCs. A conceptual model addressing methodological issues leading to a valid measurement of the prevalence rates of MCCs has been developed and applied in descriptive epidemiological studies. Comparing these results with those from prevalence studies performed earlier and in other countries is hampered by methodological limitations. Therefore, this paper aims to put the size and patterns of MCCs in the USA, as established within the HHS Strategic Framework on MCCs, into perspective against findings on the prevalence of MCCs in other countries. Common trends can be observed: increasing prevalence rates with increasing age, and multimorbidity being the rule rather than the exception at old age. The most frequent combinations of chronic diseases include the most frequently occurring single chronic diseases. New descriptive epidemiological studies will probably not provide new results; therefore, future descriptive studies should focus on the prevalence rates of MCCs in subpopulations, statistical clustering of chronic conditions, and the development of the prevalence rates of MCCs over time. The finding of common trends also indicates the necessary transition to a next phase of MCC research, addressing the quality of care of patients with MCCs from an organizational perspective and with respect to the content of care. Journal of Comorbidity 2013;3:36-40.

  3. Generative models for clinical applications in computational psychiatry.

    PubMed

    Frässle, Stefan; Yao, Yu; Schöbi, Dario; Aponte, Eduardo A; Heinzle, Jakob; Stephan, Klaas E

    2018-05-01

Despite the success of modern neuroimaging techniques in furthering our understanding of cognitive and pathophysiological processes, translation of these advances into clinically relevant tools has been virtually absent until now. Neuromodeling represents a powerful framework for overcoming this translational deadlock, and the development of computational models to solve clinical problems has become a major scientific goal over the last decade, as reflected by the emergence of clinically oriented neuromodeling fields like Computational Psychiatry, Computational Neurology, and Computational Psychosomatics. Generative models of brain physiology and connectivity in the human brain play a key role in this endeavor, striving for computational assays that can be applied to neuroimaging data from individual patients for differential diagnosis and treatment prediction. In this review, we focus on dynamic causal modeling (DCM) and its use for Computational Psychiatry. DCM is a widely used generative modeling framework for functional magnetic resonance imaging (fMRI) and magneto-/electroencephalography (M/EEG) data. This article reviews the basic concepts of DCM, revisits examples where it has proven valuable for addressing clinically relevant questions, and critically discusses methodological challenges and recent methodological advances. We conclude this review with a more general discussion of the promises and pitfalls of generative models in Computational Psychiatry and highlight the path that lies ahead of us. This article is categorized under: Neuroscience > Computation; Neuroscience > Clinical Neuroscience. © 2018 Wiley Periodicals, Inc.

  4. Examining the effectiveness of municipal solid waste management systems: an integrated cost-benefit analysis perspective with a financial cost modeling in Taiwan.

    PubMed

    Weng, Yu-Chi; Fujiwara, Takeshi

    2011-06-01

    In order to develop a sound material-cycle society, cost-effective municipal solid waste (MSW) management systems are required for the municipalities in the context of the integrated accounting system for MSW management. Firstly, this paper attempts to establish an integrated cost-benefit analysis (CBA) framework for evaluating the effectiveness of MSW management systems. In this paper, detailed cost/benefit items due to waste problems are particularly clarified. The stakeholders of MSW management systems, including the decision-makers of the municipalities and the citizens, are expected to reconsider the waste problems in depth and thus take wise actions with the aid of the proposed CBA framework. Secondly, focusing on the financial cost, this study develops a generalized methodology to evaluate the financial cost-effectiveness of MSW management systems, simultaneously considering the treatment technological levels and policy effects. The impacts of the influencing factors on the annual total and average financial MSW operation and maintenance (O&M) costs are analyzed in the Taiwanese case study with a demonstrative short-term future projection of the financial costs under scenario analysis. The established methodology would contribute to the evaluation of the current policy measures and to the modification of the policy design for the municipalities. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.

  5. The effectiveness of scoliosis screening programs: methods for systematic review and expert panel recommendations formulation

    PubMed Central

    2013-01-01

Background Literature on scoliosis screening is vast; however, because of the observational nature of available data and methodological flaws, data interpretation is often complex, leading to incomplete and sometimes somewhat misleading conclusions. A set of methods for critical appraisal of the literature on scoliosis screening, together with a comprehensive summary and rating of the available evidence, therefore appeared essential. Methods To address these gaps, the study aims were: i) to propose a framework for the assessment of published studies on scoliosis screening effectiveness; ii) to suggest specific questions to be answered on screening effectiveness instead of trying to reach a global position for or against the programs; and iii) to contextualize the knowledge through expert panel consultation and meaningful recommendations. The general methodological approach proceeds through the following steps: elaboration of the conceptual framework; formulation of the review questions; identification of the criteria for the review; selection of the studies; critical assessment of the studies; synthesis of results; and formulation and grading of recommendations in response to the questions. This plan follows as closely as possible the GRADE Group (Grades of Recommendation, Assessment, Development and Evaluation) requirements for systematic reviews, assessing quality of evidence and grading the strength of recommendations. Conclusions In this article, the methods developed in support of this work are presented, since they may be of interest for similar reviews in the scoliosis and orthopaedic fields. PMID:23883346

  6. Recommendations for benefit-risk assessment methodologies and visual representations.

    PubMed

    Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul; Goginsky, Alesia; Chan, Edmond; Downey, Gerald F; Hallgreen, Christine E; Hockley, Kimberley S; Juhaeri, Juhaeri; Lieftucht, Alfons; Metcalf, Marilyn A; Noel, Rebecca A; Phillips, Lawrence D; Ashby, Deborah; Micaleff, Alain

    2016-03-01

    The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. A general pathway through the case studies was evident, with various classes of methodologies having roles to play at different stages. Descriptive and quantitative frameworks were widely used throughout to structure problems, with other methods such as metrics, estimation techniques and elicitation techniques providing ways to incorporate technical or numerical data from various sources. Similarly, tree diagrams and effects tables were universally adopted, with other visualisations available to suit specific methodologies or tasks as required. Every assessment was found to follow five broad stages: (i) Planning, (ii) Evidence gathering and data preparation, (iii) Analysis, (iv) Exploration and (v) Conclusion and dissemination. Adopting formal, structured approaches to benefit-risk assessment was feasible in real-world problems and facilitated clear, transparent decision-making. Prior to this work, no extensive practical application and appraisal of methodologies had been conducted using real-world case examples, leaving users with limited knowledge of their usefulness in the real world. The practical guidance provided here takes us one step closer to a harmonised approach to benefit-risk assessment from multiple perspectives. Copyright © 2016 John Wiley & Sons, Ltd.

  7. Social research design: framework for integrating philosophical and practical elements.

    PubMed

    Cunningham, Kathryn Burns

    2014-09-01

To provide and elucidate a comprehensible framework for the design of social research. An abundance of information exists concerning the process of designing social research. The overall message that can be gleaned is that numerous elements - both philosophical (ontological and epistemological assumptions and theoretical perspective) and practical (issue to be addressed, purpose, aims and research questions) - are influential in the process of selecting a research methodology and methods, and that these elements and their inter-relationships must be considered and explicated to ensure a coherent research design that enables well-founded and meaningful conclusions. There is a lack of guidance concerning the integration of practical and philosophical elements, hindering their consideration and explication. The author's PhD research into loneliness and cancer. This is a methodology paper. A guiding framework that incorporates all of the philosophical and practical elements influential in social research design is presented. The chronological and informative relationships between the elements are discussed. The framework presented can be used by social researchers to consider and explicate the practical and philosophical elements influential in the selection of a methodology and methods. It is hoped that the framework presented will aid social researchers with the design and the explication of the design of their research, thereby enhancing the credibility of their projects and enabling their research to establish well-founded and meaningful conclusions.

  8. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  9. Grounded theory as a method for research in speech and language therapy.

    PubMed

    Skeat, J; Perry, A

    2008-01-01

    The use of qualitative methodologies in speech and language therapy has grown over the past two decades, and there is now a body of literature both describing qualitative research in general and detailing its applicability to health practice. However, there has been only limited profession-specific discussion of qualitative methodologies and their potential application to speech and language therapy. To describe the methodology of grounded theory, and to explain how it might usefully be applied to areas of speech and language research where theoretical frameworks or models are lacking. Grounded theory as a methodology for inductive theory-building from qualitative data is explained and discussed. Some differences between 'modes' of grounded theory are clarified and areas of controversy within the literature are highlighted. The past application of grounded theory to speech and language therapy, and its potential for informing research and clinical practice, are examined. This paper provides an in-depth critique of a qualitative research methodology, including an overview of the main differences between two major 'modes'. The article supports the application of a theory-building approach in the profession, one that is sometimes complex to learn and apply but worthwhile in its results. Grounded theory as a methodology has much to offer speech and language therapists and researchers. Although the majority of research and discussion around this methodology has rested within sociology and nursing, grounded theory can be applied by researchers in any field, including speech and language therapy. The benefit of the grounded theory method to researchers and practitioners lies in its application to social processes and human interactions. The resulting theory may support further research in the speech and language therapy profession.

  10. Discrete Abstractions of Hybrid Systems: Verification of Safety and Application to User-Interface Design

    NASA Technical Reports Server (NTRS)

    Oishi, Meeko; Tomlin, Claire; Degani, Asaf

    2003-01-01

    Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains the information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems that are not fully automated and whose operational constraints can be posed in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.
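    The yellow-light example lends itself to a small worked sketch. The check below is a simplified constant-speed, constant-deceleration kinematic safety condition constructed for illustration, not the paper's hybrid-system computation; all parameter names and values are hypothetical.

    ```python
    def yellow_light_safe(v, d, w, t_yellow, a_max):
        """Illustrative safety check: the state is 'safe' if the driver can
        either stop before the intersection or clear it before the light
        turns red. Units (m, m/s, s, m/s^2) and values are hypothetical."""
        can_stop = v ** 2 / (2 * a_max) <= d                 # braking distance fits
        can_clear = (d + w) / v <= t_yellow if v > 0 else False
        return can_stop or can_clear

    # Fast car close to the intersection with good brakes: safe.
    print(yellow_light_safe(v=15.0, d=20.0, w=10.0, t_yellow=3.0, a_max=6.0))
    # Weak brakes, too far to clear during the yellow: unsafe.
    print(yellow_light_safe(v=10.0, d=40.0, w=10.0, t_yellow=3.0, a_max=1.0))
    ```

    A user-interface that shows only the light's color, and not whether the current speed and distance admit either safe action, would hide exactly the distinction this predicate computes.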

  11. [Conceptual and methodological issues involved in the research field of diagnostic reasoning].

    PubMed

    Di Persia, Francisco N

    2016-05-01

    The psychopathological field is crossed by dilemmas that call into question its methodological, conceptual and philosophical filiations. From the early works of Ey and Jaspers to the recent work of Berrios, the position of psychopathology within medicine in general, and psychiatry in particular, has been in question: should it follow the principles of natural science, or does it occupy an autonomous position between them? This debate has produced two opposing positions corresponding to two different models of psychopathology: the biomedical model and the socio-constructionist model. This work reviews the scope and difficulties of each model along two central axes: diagnostic reasoning and the conceptual problem of mental illness. As a synthesis of the proposed analysis, central concepts of each model are identified that could allow the development of a hybrid model in psychopathology; among them, the comprehensive framework employed in symptom recognition and the social component that characterizes it are highlighted. In conclusion, these concepts are proposed as central to the conceptual and methodological clarification of the research field of diagnostic reasoning in psychopathology.

  12. Assessment of perceptions of clinical management in courses oriented by competency.

    PubMed

    Gomes, Romeu; Padilha, Roberto de Queiroz; Lima, Valéria Vernaschi; Silva, Cosme Marcelo Furtado Passos da

    2018-01-01

    The study aims to assess perceptions of mastery of clinical management abilities among participants in competency-oriented courses based on active teaching and learning methodologies, before and after the training process. Three conceptual frameworks were used: clinical management, self-efficacy expectation, and the holistic concept of competency. Methodologically, an electronic Likert-scale instrument was made available to students of the training courses in two stages: before the courses were undertaken and after their completion. The group of subjects that participated in both stages comprised 825 trainees. Means, medians, standard deviations, and the Wilcoxon test were used in the analysis. Overall, the perceived mastery of clinical management abilities increased after the courses, indicating a positive contribution of the training process. Among other results, it is concluded that the educational initiatives studied, oriented by competency and based on active teaching and learning methodologies, can increase participants' perceived mastery of the abilities in the competency profile, confirming the study's hypothesis.

  13. Multi-scale Gaussian representation and outline-learning based cell image segmentation.

    PubMed

    Farhan, Muhammad; Ruusuvuori, Pekka; Emmenlauer, Mario; Rämö, Pauli; Dehio, Christoph; Yli-Harja, Olli

    2013-01-01

    High-throughput genome-wide screening to study gene-specific functions, e.g. for drug discovery, demands fast automated image analysis methods to assist in unraveling the full potential of such studies. Image segmentation is typically at the forefront of such analysis, as the performance of subsequent steps, for example cell classification and cell tracking, often relies on its results. We present a cell cytoplasm segmentation framework which first separates cell cytoplasm from the image background using a novel approach based on image enhancement and the coefficient of variation of a multi-scale Gaussian scale-space representation. A novel outline-learning-based classification method is developed using regularized logistic regression with embedded feature selection, which classifies image pixels as outline/non-outline to give cytoplasm outlines. Refinement of the detected outlines to separate cells from each other is performed in a post-processing step in which the nuclei segmentation is used as contextual information. We evaluate the proposed segmentation methodology using two challenging test cases, presenting images with completely different characteristics and cells of varying size, shape, texture and degree of overlap. The feature selection and classification framework for outline detection produces very simple sparse models which use only a small subset of the large, generic feature set: only 7 and 5 features for the two cases, respectively. Quantitative comparison of the results for the two test cases against state-of-the-art methods shows that our methodology outperforms them, with an increase of 4-9% in segmentation accuracy and a maximum accuracy of 93%. Finally, the results obtained for diverse datasets demonstrate that our framework not only produces accurate segmentation but also generalizes well to different segmentation tasks.
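    The outline/non-outline pixel classification step can be illustrated with a short sketch: L1-regularized logistic regression, whose sparsity provides the embedded feature selection the abstract describes. The data here are synthetic stand-ins; the feature count, signal structure, and hyperparameters are invented for illustration.

    ```python
    # Sparse outline/non-outline pixel classifier via L1-regularized
    # logistic regression on synthetic stand-in features.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n, n_features = 500, 40                    # pixels x generic filter-bank features
    X = rng.normal(size=(n, n_features))
    # Assume only a few features actually carry outline information:
    w_true = np.zeros(n_features)
    w_true[:5] = [2.0, -1.5, 1.0, 0.8, -0.6]
    y = (X @ w_true + 0.1 * rng.normal(size=n)) > 0   # outline / non-outline labels

    # The L1 penalty drives most coefficients to exactly zero,
    # performing feature selection inside the classifier itself.
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    clf.fit(X, y)
    n_selected = np.count_nonzero(clf.coef_)
    print(f"{n_selected} of {n_features} features selected")
    ```

    The surviving nonzero coefficients play the role of the small feature subsets (7 and 5 features) reported for the two test cases.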

  14. Multi-scale Gaussian representation and outline-learning based cell image segmentation

    PubMed Central

    2013-01-01

    Background High-throughput genome-wide screening to study gene-specific functions, e.g. for drug discovery, demands fast automated image analysis methods to assist in unraveling the full potential of such studies. Image segmentation is typically at the forefront of such analysis, as the performance of subsequent steps, for example cell classification and cell tracking, often relies on its results. Methods We present a cell cytoplasm segmentation framework which first separates cell cytoplasm from the image background using a novel approach based on image enhancement and the coefficient of variation of a multi-scale Gaussian scale-space representation. A novel outline-learning-based classification method is developed using regularized logistic regression with embedded feature selection, which classifies image pixels as outline/non-outline to give cytoplasm outlines. Refinement of the detected outlines to separate cells from each other is performed in a post-processing step in which the nuclei segmentation is used as contextual information. Results and conclusions We evaluate the proposed segmentation methodology using two challenging test cases, presenting images with completely different characteristics and cells of varying size, shape, texture and degree of overlap. The feature selection and classification framework for outline detection produces very simple sparse models which use only a small subset of the large, generic feature set: only 7 and 5 features for the two cases, respectively. Quantitative comparison of the results for the two test cases against state-of-the-art methods shows that our methodology outperforms them, with an increase of 4-9% in segmentation accuracy and a maximum accuracy of 93%. Finally, the results obtained for diverse datasets demonstrate that our framework not only produces accurate segmentation but also generalizes well to different segmentation tasks. PMID:24267488

  15. Applying the AcciMap methodology to investigate the tragic Sewol Ferry accident in South Korea.

    PubMed

    Lee, Samuel; Moh, Young Bo; Tabibzadeh, Maryam; Meshkati, Najmedin

    2017-03-01

    This study applies the AcciMap methodology, originally proposed by Professor Jens Rasmussen (1997), to the analysis of the tragic Sewol Ferry accident in South Korea on April 16, 2014, which killed 304 people, mostly young, and is considered a national disaster in that country. This graphical representation, by incorporating associated socio-technical factors into an integrated framework, provides a big-picture view illustrating the context in which an accident occurred as well as the interactions between different levels of the studied system that resulted in that event. In general, analysis of past accidents within the stated framework can define the patterns of hazards within an industrial sector. Such analysis can lead to the definition of preconditions for safe operations, which is a main focus of proactive risk management systems. In the case of the Sewol Ferry accident, much of the blame has been placed on the Sewol's captain and its crewmembers. However, according to this study, which relied on analyzing all available sources published in English and Korean, the disaster is the result of a series of lapses and disregards for safety across different levels of government and regulatory bodies, the Chonghaejin Company, and the Sewol's crewmembers. The primary layers of the AcciMap framework, which include the political environment and a non-proactive governmental body; inadequate regulations and their lax oversight and enforcement; poor safety culture; inconsideration of human factors issues; and a lack of and/or outdated standard operating and emergency procedures, are not limited to the maritime industry in South Korea and the Sewol Ferry accident, but could afflict any safety-sensitive industry anywhere in the world. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Investigating transport pathways in the ocean

    NASA Astrophysics Data System (ADS)

    Griffa, Annalisa; Haza, Angelique; Özgökmen, Tamay M.; Molcard, Anne; Taillandier, Vincent; Schroeder, Katrin; Chang, Yeon; Poulain, P.-M.

    2013-01-01

    The ocean is a very complex medium with scales of motion that range from thousands of kilometers down to the dissipation scales. Transport by ocean currents plays an important role in many practical applications ranging from climatic problems to coastal management and accident mitigation at sea. Understanding transport is challenging because of the chaotic nature of particle motion. In the last decade, new methods have been put forth to improve our understanding of transport. Powerful tools are provided by dynamical system theory, which allows the identification of the barriers to transport and their time variability for a given flow. A shortcoming of this approach, though, is that it assumes the velocity field is known with good accuracy, which is not always the case in practical applications. Improving model performance in terms of transport can be addressed using another important methodology that has been recently developed, namely the assimilation of Lagrangian data provided by floating buoys. The two methodologies are technically different but in many ways complementary. In this paper, we review examples of applications of both methodologies performed by the authors in the last few years, considering flows at different scales and in various ocean basins. The results are among the very first examples of applications of these methodologies to the real ocean, including testing with Lagrangian in-situ data. The results are discussed in the general framework of the extended fields related to these methodologies, pointing out open questions and potential improvements, with an outlook toward future strategies.

  17. Methodologies in Cultural-Historical Activity Theory: The Example of School-Based Development

    ERIC Educational Resources Information Center

    Postholm, May Britt

    2015-01-01

    Background and purpose: Relatively little research has been conducted on methodology within Cultural-Historical Activity Theory (CHAT). CHAT is mainly used as a framework for developmental processes. The purpose of this article is to discuss both focuses for research and research questions within CHAT and to outline methodologies that can be used…

  18. Teaching and Learning Methodologies Supported by ICT Applied in Computer Science

    ERIC Educational Resources Information Center

    Capacho, Jose

    2016-01-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…

  19. Routes of Knowledge: Toward a Methodological Framework for Tracing the Historical Impact of International Organizations

    ERIC Educational Resources Information Center

    Christensen, Ivan Lind; Ydesen, Christian

    2015-01-01

    Recent trends in the historiography of international organizations are occupied with tracing their historical impact on national contexts. There is, however, no consensus on how to conduct this type of analysis methodologically. This article examines the methodological challenges arising from this type of research. While a great deal of…

  20. A methodology for evaluation of a markup-based specification of clinical guidelines.

    PubMed

    Shalom, Erez; Shahar, Yuval; Taieb-Maimon, Meirav; Lunenfeld, Eitan

    2008-11-06

    We introduce a three-phase, nine-step methodology for specification of clinical guidelines (GLs) by expert physicians, clinical editors, and knowledge engineers, and for quantitative evaluation of the specification's quality. We applied this methodology to a particular framework for incremental GL structuring (mark-up) and to GLs in three clinical domains with encouraging results.

  1. DRUG EVALUATION AND DECISION MAKING IN CATALONIA: DEVELOPMENT AND VALIDATION OF A METHODOLOGICAL FRAMEWORK BASED ON MULTI-CRITERIA DECISION ANALYSIS (MCDA) FOR ORPHAN DRUGS.

    PubMed

    Gilabert-Perramon, Antoni; Torrent-Farnell, Josep; Catalan, Arancha; Prat, Alba; Fontanet, Manel; Puig-Peiró, Ruth; Merino-Montero, Sandra; Khoury, Hanane; Goetghebeur, Mireille M; Badia, Xavier

    2017-01-01

    The aim of this study was to adapt and assess the value of a Multi-Criteria Decision Analysis (MCDA) framework (EVIDEM) for the evaluation of orphan drugs in Catalonia (Catalan Health Service). The standard evaluation and decision-making procedures of CatSalut were compared with the EVIDEM methodology and contents. The EVIDEM framework was adapted to the Catalan context, focusing on the evaluation of orphan drugs (PASFTAC program), during a workshop with sixteen PASFTAC members. Criteria weighting was done using two different techniques (nonhierarchical and hierarchical). Reliability was assessed by re-test. The EVIDEM framework and methodology were found to be useful and feasible for orphan drug evaluation and decision making in Catalonia. All the criteria considered for the development of the CatSalut technical reports and decision making were covered by the framework. Nevertheless, the framework could improve the reporting of some of these criteria (i.e., "unmet needs" or "nonmedical costs"). Some contextual criteria were removed (i.e., "mandate and scope of healthcare system", "environmental impact") or adapted ("population priorities and access") for CatSalut purposes. Independently of the weighting technique considered, the most important evaluation criteria identified for orphan drugs were "disease severity", "unmet needs" and "comparative effectiveness", while "size of the population" had the lowest relevance for decision making. Test-retest analysis showed weight consistency between techniques, supporting reliability over time. MCDA (the EVIDEM framework) could be a useful tool to complement the current evaluation methods of CatSalut, contributing to standardization and pragmatism, providing a method to tackle ethical dilemmas and facilitating discussions related to decision making.
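    The weighted-criteria scoring at the heart of MCDA can be sketched in a few lines. The criteria below echo those named in the abstract, but the weights, scores, scale, and normalization are illustrative assumptions, not EVIDEM's or CatSalut's actual values.

    ```python
    # Illustrative MCDA value estimate: a weighted sum of normalized
    # criterion scores for one hypothetical orphan drug.
    weights = {"disease severity": 0.25, "unmet needs": 0.25,
               "comparative effectiveness": 0.30, "size of population": 0.20}
    scores = {"disease severity": 4, "unmet needs": 5,       # each on a 0-5 scale
              "comparative effectiveness": 3, "size of population": 1}

    total_w = sum(weights.values())
    # Normalize weights to sum to 1 and scores to the 0-1 range:
    value = sum(weights[c] / total_w * scores[c] / 5 for c in weights)
    print(f"normalized MCDA value: {value:.2f}")   # between 0 and 1
    ```

    Re-running the same computation with weights elicited by the two techniques (nonhierarchical vs. hierarchical) is one way to check the weight consistency the abstract reports.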

  2. Multimedia-modeling integration development environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelton, Mitchell A.; Hoopes, Bonnie L.

    2002-09-02

    There are many framework systems available; however, the purpose of the framework presented here is to capitalize on the successes of the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) and the Multi-media Multi-pathway Multi-receptor Risk Assessment (3MRA) methodology as applied to the Hazardous Waste Identification Rule (HWIR), while focusing on the development of software tools to simplify the module developer's effort of integrating a module into the framework.

  3. Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System

    NASA Technical Reports Server (NTRS)

    Yau, M.; Guarro, S.; Apostolakis, G.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
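    The idea of solving a subroutine's equations "in reverse" with the Newton-Raphson method can be sketched as root-finding on the residual f(x) - y_target, here with a finite-difference derivative. The subroutine f below is a hypothetical stand-in, not part of the Titan 2 flight control software.

    ```python
    def newton_invert(f, y_target, x0, tol=1e-10, max_iter=50, h=1e-6):
        """Find x such that f(x) == y_target, i.e. run f 'in reverse'."""
        x = x0
        for _ in range(max_iter):
            r = f(x) - y_target
            if abs(r) < tol:
                return x
            dfdx = (f(x + h) - f(x - h)) / (2 * h)   # central-difference derivative
            x -= r / dfdx                            # Newton-Raphson step
        return x

    f = lambda x: x ** 3 + 2 * x        # stand-in for a subroutine's computation
    x = newton_invert(f, y_target=12.0, x0=1.0)
    print(round(x, 6))                  # the input that produces output 12
    ```

    As the abstract notes, convergence is typically achieved in a few steps; the quadratic convergence of Newton-Raphson is what makes inverting subroutine equations tractable when building decision tables.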

  4. A statistical estimation of Snow Water Equivalent coupling ground data and MODIS images

    NASA Astrophysics Data System (ADS)

    Bavera, D.; Bocchiola, D.; de Michele, C.

    2007-12-01

    The Snow Water Equivalent (SWE) is an important component of the hydrologic balance of mountain basins and snow-fed areas in general. The total cumulated snow water equivalent at the end of the accumulation season represents the water availability at melt. Here, a statistical methodology to estimate the Snow Water Equivalent at April 1st is developed, coupling ground data (snow depth and snow density measurements) and MODIS images. The methodology is applied to the Mallero river basin (about 320 km²), located in the Central Alps, northern Italy, where 11 snow gauges and numerous scattered snow density measurements are available. The application covers seven years, from 2001 to 2007. The analysis identified some problems in the MODIS information due to cloud cover and misclassification caused by orographic shadow. The study was performed in the framework of the AWARE (A tool for monitoring and forecasting Available WAter REsource in mountain environment) EU project, a STREP project in the Sixth Framework Programme, GMES Initiative.
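    The core of such an estimate, converting gauge snow depth and pit density to water equivalent and aggregating over satellite-flagged snow pixels, can be sketched as follows. This is a deliberately naive illustration (a basin mean rather than a proper spatial interpolation), and all numbers are invented, not from the Mallero basin.

    ```python
    # Naive SWE estimate: station SWE = depth * density / rho_water,
    # averaged and spread over the MODIS-flagged snow-covered area.
    import numpy as np

    depth_cm = np.array([120.0, 85.0, 150.0])      # snow depth at gauges
    density = np.array([380.0, 350.0, 400.0])      # snow density, kg/m^3
    swe_mm = depth_cm * 10 * density / 1000.0      # mm of water equivalent

    snow_mask = np.array([[1, 1, 0],               # 1 = MODIS snow-covered pixel
                          [1, 0, 0]])
    pixel_area_km2 = 0.25                          # MODIS 500 m pixel
    mean_swe = swe_mm.mean()                       # naive basin-mean SWE, mm
    volume_m3 = mean_swe / 1000 * snow_mask.sum() * pixel_area_km2 * 1e6
    print(round(volume_m3), "m^3 of water stored as snow")
    ```

    Cloud cover and orographic-shadow misclassification enter exactly here: a wrong snow mask directly scales the estimated volume.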

  5. An automatic and effective parameter optimization method for model tuning

    NASA Astrophysics Data System (ADS)

    Zhang, T.; Li, L.; Lin, Y.; Xue, W.; Xie, F.; Xu, H.; Huang, X.

    2015-05-01

    Physical parameterizations in General Circulation Models (GCMs) have various uncertain parameters that greatly impact model performance and model climate sensitivity. Traditional manual and empirical tuning of these parameters is time-consuming and ineffective. In this study, a "three-step" methodology is proposed to automatically and effectively obtain the optimum combination of some key parameters in cloud and convective parameterizations according to a comprehensive objective evaluation metric. Unlike traditional optimization methods, two extra steps, one determining parameter sensitivity and the other choosing the optimum initial values of the sensitive parameters, are introduced before the downhill simplex method to reduce the computational cost and improve the tuning performance. Atmospheric GCM simulation results show that the optimum combination of these parameters determined using this method improves the model's overall performance by 9%. The proposed methodology and software framework can be easily applied to other GCMs to speed up the model development process, especially regarding the unavoidable comprehensive parameter tuning during the model development stage.
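    The "three-step" procedure can be sketched on a toy objective: screen sensitivity one parameter at a time, pick a good starting point for the sensitive parameters from a coarse grid, then run the downhill simplex (Nelder-Mead) method on those parameters alone. The objective function here is a cheap stand-in for the paper's comprehensive evaluation metric, not a GCM run.

    ```python
    # Three-step tuning sketch: sensitivity screen -> initial-value
    # selection -> Nelder-Mead on the sensitive parameters only.
    import numpy as np
    from scipy.optimize import minimize

    def objective(p):
        # Pretend model-skill metric (lower is better); p[2] is nearly inert.
        return (p[0] - 1.2) ** 2 + 0.5 * (p[1] + 0.4) ** 2 + 1e-6 * p[2] ** 2

    p0 = np.zeros(3)

    # Step 1: one-at-a-time sensitivity screening
    sens = [abs(objective(p0 + 0.1 * e) - objective(p0)) for e in np.eye(3)]
    sensitive = [i for i, s in enumerate(sens) if s > 1e-4]

    def restricted(x):
        q = p0.copy()
        q[sensitive] = x          # vary only the sensitive parameters
        return objective(q)

    # Step 2: best starting point from a coarse grid of candidate values
    best0 = min((np.full(len(sensitive), v) for v in (-1.0, 0.0, 1.0)),
                key=restricted)

    # Step 3: downhill simplex on the reduced parameter space
    res = minimize(restricted, best0, method="Nelder-Mead")
    print(sensitive, np.round(res.x, 3))
    ```

    Dropping insensitive parameters before the simplex search is what keeps the number of expensive model evaluations down, which is the point of the two extra steps.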

  6. Reconciling statistical and systems science approaches to public health.

    PubMed

    Ip, Edward H; Rahmandad, Hazhir; Shoham, David A; Hammond, Ross; Huang, Terry T-K; Wang, Youfa; Mabry, Patricia L

    2013-10-01

    Although systems science has emerged as a set of innovative approaches to study complex phenomena, many topically focused researchers, including clinicians and scientists working in public health, are somewhat befuddled by this methodology, which at times appears radically different from the analytic methods, such as statistical modeling, to which they are accustomed. There also appear to be conflicts between complex systems approaches and traditional statistical methodologies, both in terms of their underlying strategies and the languages they use. We argue that the conflicts are resolvable, and the sooner the better for the field. In this article, we show how statistical and systems science approaches can be reconciled, and how together they can advance solutions to complex problems. We do this by comparing the methods within a theoretical framework based on the work of population biologist Richard Levins. We present different types of models as representing different tradeoffs among the four desiderata of generality, realism, fit, and precision.

  7. Reconciling Statistical and Systems Science Approaches to Public Health

    PubMed Central

    Ip, Edward H.; Rahmandad, Hazhir; Shoham, David A.; Hammond, Ross; Huang, Terry T.-K.; Wang, Youfa; Mabry, Patricia L.

    2016-01-01

    Although systems science has emerged as a set of innovative approaches to study complex phenomena, many topically focused researchers, including clinicians and scientists working in public health, are somewhat befuddled by this methodology, which at times appears radically different from the analytic methods, such as statistical modeling, to which they are accustomed. There also appear to be conflicts between complex systems approaches and traditional statistical methodologies, both in terms of their underlying strategies and the languages they use. We argue that the conflicts are resolvable, and the sooner the better for the field. In this article, we show how statistical and systems science approaches can be reconciled, and how together they can advance solutions to complex problems. We do this by comparing the methods within a theoretical framework based on the work of population biologist Richard Levins. We present different types of models as representing different tradeoffs among the four desiderata of generality, realism, fit, and precision. PMID:24084395

  8. Can environmental impact assessments alone conserve freshwater fish biota? Review of the Chilean experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacy, Shaw Nozaki

    Chile was one of many countries that initiated environmental impact assessments in the 1990s, and has relied on their use for species conservation and territorial planning without the use of larger-scale environmental and ecological planning. The capacity of Chile's environmental impact assessment system (SEIA) to evaluate resident freshwater fishes and the potential impacts of water projects and aquaculture activities - two categories of projects that create direct threats to freshwater fishes - is assessed. Of the 3997 such submissions to the SEIA, only 0.6% conducted any freshwater fish assessment, and only 0.1% conducted any quantitative assessment of expected impacts from the associated project. The small number of assessments was characterized by poor study design, inconsistent sampling methodology, and species misidentification. Traditional assessments failed to include freshwater fish ecology in the general assessment framework. The new strategic environmental evaluation system only underscores the need for vastly improved field sampling protocols and assessment methodologies.

  9. Making Just Tenure and Promotion Decisions Using the Objective Knowledge Growth Framework

    ERIC Educational Resources Information Center

    Chitpin, Stephanie

    2015-01-01

    Purpose: The purpose of this paper is to utilize the Objective Knowledge Growth Framework (OKGF) to promote a better understanding of the evaluating tenure and promotion processes. Design/Methodology/Approach: A scenario is created to illustrate the concept of using OKGF. Findings: The framework aims to support decision makers in identifying the…

  10. Developing Competence Frameworks in UK Healthcare: Lessons from Practice

    ERIC Educational Resources Information Center

    Mitchell, Lindsay; Boak, George

    2009-01-01

    Purpose: The purpose of this article is to review the use of competence frameworks in the UK healthcare sector and to explore characteristics of the sector that may influence the success of projects to develop new frameworks. Design/methodology/approach: The paper draws on project reports and evaluations of practice in a range of recent projects…

  11. A Conceptual Framework of Corporate and Business Ethics across Organizations: Structures, Processes and Performance

    ERIC Educational Resources Information Center

    Svensson, Goran; Wood, Greg

    2011-01-01

    Purpose: The objective of this paper is to introduce and describe a conceptual framework of corporate and business ethics across organizations in terms of ethical structures, ethical processes and ethical performance. Design/methodology/approach: A framework is outlined and positioned incorporating an ethical frame of reference in the field of…

  12. Constructing the principles: Method and metaphysics in the progress of theoretical physics

    NASA Astrophysics Data System (ADS)

    Glass, Lawrence C.

    This thesis presents a new framework for the philosophy of physics focused on methodological differences found in the practice of modern theoretical physics. The starting point for this investigation is the longstanding debate over scientific realism. Some philosophers have argued that it is the aim of science to produce an accurate description of the world including explanations for observable phenomena. These scientific realists hold that our best confirmed theories are approximately true and that the entities they propose actually populate the world, whether or not they have been observed. Others have argued that science achieves only frameworks for the prediction and manipulation of observable phenomena. These anti-realists argue that truth is a misleading concept when applied to empirical knowledge. Instead, focus should be on the empirical adequacy of scientific theories. This thesis argues that the fundamental distinction at issue, a division between true scientific theories and ones which are empirically adequate, is best explored in terms of methodological differences. In analogy with the realism debate, there are at least two methodological strategies. Rather than focusing on scientific theories as wholes, this thesis takes as units of analysis physical principles which are systematic empirical generalizations. The first possible strategy, the conservative, takes the assumption that the empirical adequacy of a theory in one domain serves as good evidence for such adequacy in other domains. This then motivates the application of the principle to new domains. The second strategy, the innovative, assumes that empirical adequacy in one domain does not justify the expectation of adequacy in other domains. New principles are offered as explanations in the new domain. The final part of the thesis is the application of this framework to two examples. 
In the first, Lorentz's use of the aether is reconstructed as an instance of the conservative strategy with respect to the principles of Galilean relativity. The second example contrasts an application of the conservative strategy with TeVeS as an application of the innovative strategy.

  13. Evolution of 3-D geologic framework modeling and its application to groundwater flow studies

    USGS Publications Warehouse

    Blome, Charles D.; Smith, David V.

    2012-01-01

    In this Fact Sheet, the authors discuss the evolution of project 3-D subsurface framework modeling, research in hydrostratigraphy and airborne geophysics, and methodologies used to link geologic and groundwater flow models.

  14. Auditory Hallucinations and the Brain’s Resting-State Networks: Findings and Methodological Observations

    PubMed Central

    Alderson-Day, Ben; Diederen, Kelly; Fernyhough, Charles; Ford, Judith M.; Horga, Guillermo; Margulies, Daniel S.; McCarthy-Jones, Simon; Northoff, Georg; Shine, James M.; Turner, Jessica; van de Ven, Vincent; van Lutterveld, Remko; Waters, Flavie; Jardri, Renaud

    2016-01-01

    In recent years, there has been increasing interest in the potential for alterations to the brain’s resting-state networks (RSNs) to explain various kinds of psychopathology. RSNs provide an intriguing new explanatory framework for hallucinations, which can occur in different modalities and population groups, but which remain poorly understood. This collaboration from the International Consortium on Hallucination Research (ICHR) reports on the evidence linking resting-state alterations to auditory hallucinations (AH) and provides a critical appraisal of the methodological approaches used in this area. In the report, we describe findings from resting connectivity fMRI in AH (in schizophrenia and nonclinical individuals) and compare them with findings from neurophysiological research, structural MRI, and research on visual hallucinations (VH). In AH, various studies show resting connectivity differences in left-hemisphere auditory and language regions, as well as atypical interaction of the default mode network and RSNs linked to cognitive control and salience. As the latter are also evident in studies of VH, this points to a domain-general mechanism for hallucinations alongside modality-specific changes to RSNs in different sensory regions. However, we also observed high methodological heterogeneity in the current literature, affecting the ability to make clear comparisons between studies. To address this, we provide some methodological recommendations and options for future research on the resting state and hallucinations. PMID:27280452

  15. Speed-Accuracy Tradeoffs in Speech Production

    DTIC Science & Technology

    2017-06-01

    imaging data of speech production. A theoretical framework for considering Fitts’ law in the domain of speech production is elucidated. Methodological ...articulatory kinematics conform to Fitts’ law. A second, associated goal is to address the methodological challenges inherent in performing Fitts-style...analysis on rtMRI data of speech production. Methodological challenges include segmenting continuous speech into specific motor tasks, defining key

  16. Broadening the Study of Participation in the Life Sciences: How Critical Theoretical and Mixed-Methodological Approaches Can Enhance Efforts to Broaden Participation

    ERIC Educational Resources Information Center

    Metcalf, Heather

    2016-01-01

    This research methods Essay details the usefulness of critical theoretical frameworks and critical mixed-methodological approaches for life sciences education research on broadening participation in the life sciences. First, I draw on multidisciplinary research to discuss critical theory and methodologies. Then, I demonstrate the benefits of these…

  17. Development of risk-based decision methodology for facility design.

    DOT National Transportation Integrated Search

    2014-06-01

    This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides illustrative examples for the use of the proposed framework. An overview of the current practices and applications to illustrate t...

  18. Technology Assessment for Powertrain Components Final Report CRADA No. TC-1124-95

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tokarz, F.; Gough, C.

    LLNL utilized its defense technology assessment methodologies in combination with its capabilities in the energy, manufacturing, and transportation technologies to demonstrate a methodology that synthesized available but incomplete information on advanced automotive technologies into a comprehensive framework.

  19. Health systems and noncommunicable diseases in the Asia-Pacific region: a review of the published literature.

    PubMed

    Mannava, Priya; Abdullah, Asnawi; James, Chris; Dodd, Rebecca; Annear, Peter Leslie

    2015-03-01

    Addressing the growing burden of noncommunicable diseases (NCDs) in countries of the Asia-Pacific region requires well-functioning health systems. In low- and middle-income countries (LMICs), however, health systems are generally characterized by inadequate financial and human resources, unsuitable service delivery models, and weak information systems. The aims of this review were to identify (a) health systems interventions being implemented to deliver NCD programs and services and their outcomes and (b) the health systems bottlenecks impeding access to or delivery of these programs and services in LMICs of the Asia-Pacific region. A search of 4 databases for literature published between 1990 and 2010 retrieved 36 relevant studies. For each study, information on basic characteristics, type of health systems bottleneck/intervention, and outcome was extracted, and methodological quality appraised. Health systems interventions and bottlenecks were classified as per the World Health Organization health systems building blocks framework. The review identified interventions and bottlenecks in the building blocks of service delivery, health workforce, financing, health information systems, and medical products, vaccines, and technologies. Studies, however, were heterogeneous in methodologies used, and the overall quality was generally low. There are several gaps in the evidence base around NCDs in the Asia-Pacific region that require further investigation. © 2013 APJPH.

  20. Ultrafast Ultrasound Imaging Using Combined Transmissions With Cross-Coherence-Based Reconstruction.

    PubMed

    Zhang, Yang; Guo, Yuexin; Lee, Wei-Ning

    2018-02-01

    Plane-wave-based ultrafast imaging has become the prevalent technique for non-conventional ultrasound imaging. The image quality, especially in terms of the suppression of artifacts, is generally compromised by reducing the number of transmissions for a higher frame rate. We hereby propose a new ultrafast imaging framework that reduces not only the side lobe artifacts but also the axial lobe artifacts using combined transmissions with a new coherence-based factor. The results from simulations, in vitro wire phantoms, the ex vivo porcine artery, and the in vivo porcine heart show that our proposed methodology greatly reduced the axial lobe artifact by 25 ± 5 dB compared with coherent plane-wave compounding (CPWC), which was considered the ultrafast imaging standard, and suppressed side lobe artifacts by 15 ± 5 dB compared with CPWC and coherent spherical-wave compounding. The reduction of artifacts in our proposed ultrafast imaging framework led to a better boundary delineation of soft tissues than CPWC.
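The paper's cross-coherence factor is not specified in the abstract, but the general idea of coherence-based artifact suppression can be sketched with the classical coherence factor, which scores a focal point by the ratio of coherent to total energy across receive channels (a minimal illustration, not the authors' reconstruction):

```python
import numpy as np

def coherence_factor(channel_data):
    """Classical coherence factor across receive channels:
    CF = |sum_i s_i|^2 / (N * sum_i |s_i|^2), bounded in [0, 1]."""
    s = np.asarray(channel_data, dtype=complex)
    num = np.abs(s.sum()) ** 2
    den = len(s) * np.sum(np.abs(s) ** 2)
    return float(num / den) if den > 0 else 0.0

# Coherent echoes (true scatterer) vs. random-phase echoes (artifact energy).
coherent = np.ones(64)
rng = np.random.default_rng(0)
incoherent = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, 64))
```

Weighting each pixel by such a factor keeps coherent targets near full amplitude while down-weighting incoherent side and axial lobe contributions.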

  1. A Framework for Architectural Heritage HBIM Semantization and Development

    NASA Astrophysics Data System (ADS)

    Brusaporci, S.; Maiezza, P.; Tata, A.

    2018-05-01

    Despite the recognized advantages of the use of BIM in the field of architecture and engineering, the extension of this procedure to the architectural heritage is neither immediate nor straightforward. The uniqueness and irregularity of historical architecture, on the one hand, and the great quantity of information necessary for the knowledge of architectural heritage, on the other, require appropriate reflection. The aim of this paper is to define a general framework for the use of BIM procedures for architectural heritage. The proposed methodology distinguishes three Levels of Development (LoD), depending on the characteristics of the building and the objectives of the study: a simplified model with low geometric accuracy and a minimum quantity of information (LoD 200); a model closer to reality but still with a high deviation between the virtual and the real model (LoD 300); and a detailed BIM model that reproduces the geometric irregularities of the building as closely as possible and is enriched with the maximum quantity of information available (LoD 400).

  2. 3D virtual character reconstruction from projections: a NURBS-based approach

    NASA Astrophysics Data System (ADS)

    Triki, Olfa; Zaharia, Titus B.; Preteux, Francoise J.

    2004-05-01

    This work has been carried out within the framework of the industrial project TOON, supported by the French government. TOON aims at developing tools for automating traditional 2D cartoon content production. This paper presents preliminary results of the TOON platform. The proposed methodology addresses the issues of 2D/3D reconstruction from a limited number of drawn projections and of 2D/3D manipulation/deformation/refinement of virtual characters. Specifically, we show that the NURBS-based modeling approach developed here offers a well-suited framework for generating deformable 3D virtual characters from incomplete 2D information. Furthermore, crucial functionalities such as animation and non-rigid deformation can also be efficiently handled and solved. User interaction is enabled exclusively in 2D through a multiview constraint specification method. This is fully consistent with cartoon creators' traditional practice and avoids the use of 3D modeling software packages, which are generally complex to manipulate.
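The TOON platform's internals are not public, but the representation it builds on can be illustrated: a NURBS curve point is a weight-rational combination of control points with B-spline basis functions. A minimal sketch with hypothetical control data, using the Cox-de Boor recursion:

```python
import numpy as np

def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion for the i-th B-spline basis function of degree p."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    val = 0.0
    if knots[i + p] != knots[i]:
        val += (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, u, knots)
    if knots[i + p + 1] != knots[i + 1]:
        val += (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) * bspline_basis(i + 1, p - 1, u, knots)
    return val

def nurbs_point(u, degree, knots, ctrl_pts, weights):
    """Evaluate a NURBS curve point as a weight-rational B-spline combination."""
    ctrl = np.asarray(ctrl_pts, float)
    w = np.asarray(weights, float)
    basis = np.array([bspline_basis(i, degree, u, knots) for i in range(len(ctrl))])
    return (basis * w) @ ctrl / np.dot(basis, w)

# Hypothetical data: a quadratic NURBS arc tracing a quarter circle.
knots = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
ctrl = [[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]]
weights = [1.0, np.sqrt(2.0) / 2.0, 1.0]
p0 = nurbs_point(0.0, 2, knots, ctrl, weights)
pm = nurbs_point(0.5, 2, knots, ctrl, weights)
```

Deforming a character then amounts to moving control points or weights; every evaluated point follows smoothly, which is what makes the representation attractive for 2D-driven manipulation.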

  3. Planning an organizational wellness initiative at a multi-state social service agency.

    PubMed

    Miller, J Jay; Grise-Owens, Erlene; Addison, Donia; Marshall, Midaya; Trabue, Donna; Escobar-Ratliff, Laura

    2016-06-01

    Increasingly, organizations in general, and social service organizations, specifically, are recognizing the importance of planning and evaluating organizational wellness initiatives. Yet, few participatory models for carrying out these aims exist. For this study, researchers utilized concept mapping (CM) to explicate a conceptual framework for planning, and subsequently evaluating, a wellness initiative at a multi-state social service organization. CM is a participatory approach that analyzes qualitative data via multi-dimensional scaling and hierarchical cluster analyses. Outputs include a number of visual depictions that allow researchers to explore complex relationships among sets of the data. Results from this study indicated that participants (N=64), all of whom were employees of the agency, conceptualized organizational wellness via an eight-cluster solution, or Concept Map. Priority areas of this framework, specifically importance and feasibility, were also explored. After a brief review of pertinent literature, this article explicates the CM methodology utilized in this study, describes results, discusses lessons learned, and identifies apt areas for future research. Copyright © 2016 Elsevier Ltd. All rights reserved.
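The hierarchical clustering step of concept mapping can be sketched on a hypothetical co-sorting similarity matrix (the study's actual data and its multidimensional scaling step are omitted):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Hypothetical co-sorting matrix: entry (i, j) = fraction of participants
# who sorted wellness statements i and j into the same pile.
cooccur = np.array([
    [1.0, 0.9, 0.8, 0.1, 0.2],
    [0.9, 1.0, 0.7, 0.2, 0.1],
    [0.8, 0.7, 1.0, 0.1, 0.2],
    [0.1, 0.2, 0.1, 1.0, 0.9],
    [0.2, 0.1, 0.2, 0.9, 1.0],
])
distance = 1.0 - cooccur                      # dissimilarity between statements
np.fill_diagonal(distance, 0.0)
condensed = squareform(distance, checks=False)
tree = linkage(condensed, method="average")   # hierarchical clustering step of CM
labels = fcluster(tree, t=2, criterion="maxclust")
```

Statements that participants frequently sorted together end up in the same cluster; choosing the number of clusters (here 2, in the study 8) is the interpretive step of the method.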

  4. Toward a Predictive Framework for Convergent Evolution: Integrating Natural History, Genetic Mechanisms, and Consequences for the Diversity of Life.

    PubMed

    Agrawal, Anurag A

    2017-08-01

    A charm of biology as a scientific discipline is the diversity of life. Although this diversity can make laws of biology challenging to discover, several repeated patterns and general principles govern evolutionary diversification. Convergent evolution, the independent evolution of similar phenotypes, has been at the heart of one approach to understand generality in the evolutionary process. Yet understanding when and why organismal traits and strategies repeatedly evolve has been a central challenge. These issues were the focus of the American Society of Naturalists Vice Presidential Symposium in 2016 and are the subject of this collection of articles. Although naturalists have long made inferences about convergent evolution and its importance, there has been confusion in the interpretation of the pattern of convergence. Does convergence primarily indicate adaptation or constraint? How often should convergence be expected? Are there general principles that would allow us to predict where and when and by what mechanisms convergent evolution should occur? What role does natural history play in advancing our understanding of general evolutionary principles? In this introductory article, I address these questions, review several generalizations about convergent evolution that have emerged over the past 15 years, and present a framework for advancing the study and interpretation of convergence. Perhaps the most important emerging conclusion is that the genetic mechanisms of convergent evolution are phylogenetically conserved; that is, more closely related species tend to share the same genetic basis of traits, even when independently evolved. Finally, I highlight how the articles in this special issue further develop concepts, methodologies, and case studies at the frontier of our understanding of the causes and consequences of convergent evolution.

  5. Estimating life expectancies for US small areas: a regression framework

    NASA Astrophysics Data System (ADS)

    Congdon, Peter

    2014-01-01

    Analysis of area mortality variations and estimation of area life tables raise methodological questions relevant to assessing spatial clustering, and socioeconomic inequalities in mortality. Existing small area analyses of US life expectancy variation generally adopt ad hoc amalgamations of counties to alleviate potential instability of mortality rates involved in deriving life tables, and use conventional life table analysis which takes no account of correlated mortality for adjacent areas or ages. The alternative strategy here uses structured random effects methods that recognize correlations between adjacent ages and areas, and allows retention of the original county boundaries. This strategy generalizes to include effects of area category (e.g. poverty status, ethnic mix), allowing estimation of life tables according to area category, and providing additional stabilization of estimated life table functions. This approach is used here to estimate stabilized mortality rates, derive life expectancies in US counties, and assess trends in clustering and in inequality according to county poverty category.
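The conventional life table calculation that such random-effects methods stabilize can be sketched with hypothetical abridged age groups and mortality rates (the paper's spatially structured smoothing is not reproduced here):

```python
import numpy as np

def life_expectancy_at_birth(widths, mx):
    """Period life expectancy at birth from age-specific mortality rates m_x,
    using q_x = n*m / (1 + n/2*m), deaths at mid-interval, open last interval."""
    mx = np.asarray(mx, float)
    n = np.asarray(widths, float)
    qx = n * mx / (1.0 + 0.5 * n * mx)       # probability of dying in interval
    qx[-1] = 1.0                              # open-ended final age group
    lx = np.concatenate([[1.0], np.cumprod(1.0 - qx)[:-1]])  # survivors
    dx = lx * qx                              # deaths in interval
    Lx = n * (lx - dx) + 0.5 * n * dx         # person-years lived
    Lx[-1] = lx[-1] / mx[-1]                  # open interval: L = l / m
    return float(Lx.sum())

# Hypothetical abridged rates for age groups 0, 1-4, 5-14, 15-44, 45-64, 65-84, 85+.
widths = [1, 4, 10, 30, 20, 20, 20]
mx = [0.006, 0.0003, 0.0002, 0.001, 0.005, 0.03, 0.15]
e0 = life_expectancy_at_birth(widths, mx)
```

Because each m_x enters multiplicatively through the survival products, noisy county-level rates propagate directly into e0, which is why small-area analyses either amalgamate counties or, as here, borrow strength across adjacent ages and areas.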

  6. Functional Foods and Nutraceuticals in a Market of Bolivian Immigrants in Buenos Aires (Argentina)

    PubMed Central

    Pochettino, María Lelia; Puentes, Jeremías P.; Buet Costantino, Fernando; Arenas, Patricia M.; Ulibarri, Emilio A.; Hurrell, Julio A.

    2012-01-01

    This paper presents the results of research in urban ethnobotany, conducted in a market of Bolivian immigrants in the neighborhood of Liniers, Ciudad Autónoma de Buenos Aires (Argentina). Functional foods and nutraceuticals belonging to 50 species in 18 families, together with their products and uses, were recorded. Some products are exclusive to the Bolivian community; others are frequent within the community but also available in the general commercial circuit, which they generally enter through shops called dietéticas (“health-food stores”), where products associated with the maintenance of health are sold. On this basis, the traditional and nontraditional components of urban botanical knowledge were evaluated, as well as its dynamics in relation to the diffusion of the products. Both the framework and the methodological design are innovative for studies of urban botanical knowledge and traditional markets in metropolitan areas. PMID:22203866

  7. Time-varying Concurrent Risk of Extreme Droughts and Heatwaves in California

    NASA Astrophysics Data System (ADS)

    Sarhadi, A.; Diffenbaugh, N. S.; Ausin, M. C.

    2016-12-01

    Anthropogenic global warming has changed the nature and the risk of extreme climate phenomena such as droughts and heatwaves. The concurrence of these climatic extremes may intensify undesirable consequences for human health and destructive effects on water resources. The present study assesses the risk of concurrent extreme droughts and heatwaves under the dynamic nonstationary conditions arising from climate change in California. To do so, a generalized, fully Bayesian, time-varying multivariate risk framework is proposed that evolves through time in a dynamic, human-induced environment. In this methodology, an extreme-value, Bayesian, dynamic copula (Gumbel) is developed to model the time-varying dependence structure between the two climate extremes. The time-varying extreme marginals are first modeled using a Generalized Extreme Value (GEV) distribution. Bayesian Markov Chain Monte Carlo (MCMC) inference is integrated to estimate the parameters of the nonstationary marginals and copula using a Gibbs sampling method. The modeled marginals and copula are then used to develop a fully Bayesian, time-varying joint return period concept for the estimation of concurrent risk. We argue that climate change has increased the chance of concurrent droughts and heatwaves over recent decades in California. It is also demonstrated that a time-varying multivariate perspective should be incorporated to assess realistic concurrent risk of the extremes for water resources planning and management in a changing climate in this area. The proposed generalized methodology can be applied to other stochastic compound climate extremes that are under the influence of climate change.
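Under stationary simplifications, the core construction (GEV marginals coupled by a Gumbel copula to obtain a concurrent return period) can be sketched as follows; the parameter values are made up, and the paper's time-varying, fully Bayesian estimation is omitted:

```python
import numpy as np
from scipy.stats import genextreme

def gumbel_copula(u, v, theta):
    """Gumbel extreme-value copula C(u, v); theta = 1 is independence."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def concurrent_return_period(x, y, gev_x, gev_y, theta):
    """Return period of drought severity > x AND heat severity > y together."""
    u = genextreme.cdf(x, *gev_x)            # marginal non-exceedance probabilities
    v = genextreme.cdf(y, *gev_y)
    joint_exceed = 1.0 - u - v + gumbel_copula(u, v, theta)
    return 1.0 / joint_exceed

gev_drought = (-0.1, 0.0, 1.0)               # (shape, loc, scale), scipy convention
gev_heat = (-0.1, 30.0, 2.0)
T_dep = concurrent_return_period(3.0, 36.0, gev_drought, gev_heat, theta=2.0)
T_ind = concurrent_return_period(3.0, 36.0, gev_drought, gev_heat, theta=1.0)
```

With positive dependence (theta > 1), the joint exceedance probability rises and the concurrent return period shortens, which is precisely why treating droughts and heatwaves as independent understates compound risk.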

  8. Evaluation of the Majorana phases of a general Majorana neutrino mass matrix: Testability of hierarchical flavour models

    NASA Astrophysics Data System (ADS)

    Samanta, Rome; Chakraborty, Mainak; Ghosal, Ambar

    2016-03-01

    We evaluate the Majorana phases for a general 3 × 3 complex symmetric neutrino mass matrix on the basis of Mohapatra-Rodejohann's phase convention using the three rephasing invariant quantities I12, I13 and I23 proposed by Sarkar and Singh. We find them interesting as they allow us to evaluate each Majorana phase in a model independent way even if one eigenvalue is zero. Utilizing the solution of a general complex symmetric mass matrix for eigenvalues and mixing angles, we determine the Majorana phases for both hierarchies, normal and inverted, taking into account the constraints from neutrino oscillation global fit data as well as the bound on the sum of the three light neutrino masses (Σimi) and the neutrinoless double beta decay (ββ0ν) parameter |m11|. This methodology for finding the Majorana phases is then applied in some predictive models for both hierarchical cases (normal and inverted) to evaluate the corresponding Majorana phases, and it is shown that all the subcases presented in the inverted hierarchy section can be realized in a model with texture zeros and scaling ansatz within the framework of inverse seesaw, although one of the subcases following the normal hierarchy is yet to be established. Except in the case of quasi-degenerate neutrinos, the methodology obtained in this work is able to evaluate the corresponding Majorana phases, given any model of neutrino masses.

  9. Estimating the palliative effect of percutaneous endoscopic gastrostomy in an observational registry using principal stratification and generalized propensity scores

    PubMed Central

    Mishra-Kalyani, Pallavi S.; Johnson, Brent A.; Glass, Jonathan D.; Long, Qi

    2016-01-01

    Clinical disease registries offer a rich collection of valuable patient information but also pose challenges that require special care and attention in statistical analyses. The goal of this paper is to propose a statistical framework for estimating the effect of surgical insertion of a percutaneous endogastrostomy (PEG) tube for patients living with amyotrophic lateral sclerosis (ALS) using data from a clinical registry. Although all ALS patients are informed about PEG, only some patients agree to the procedure, which leads to the potential for selection bias. Assessing the effect of PEG is further complicated by the aggressively fatal disease, such that time to death competes directly with both the opportunity to receive PEG and clinical outcome measurements. Our proposed methodology handles the “censoring by death” phenomenon through principal stratification and selection bias for PEG treatment through generalized propensity scores. We develop a fully Bayesian modeling approach to estimate the survivor average causal effect (SACE) of PEG on BMI, a surrogate outcome measure of nutrition and quality of life. The use of propensity score methods within the principal stratification framework demonstrates a significant and positive effect of PEG treatment, particularly when time of treatment is included in the treatment definition. PMID:27640365
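The basic propensity-score idea behind such frameworks can be illustrated on simulated data: model treatment selection from a confounder, then reweight outcomes by inverse propensities. This is a sketch only; the paper's principal stratification and fully Bayesian machinery are not reproduced:

```python
import numpy as np

def fit_propensity(X, t, iters=5000, lr=0.1):
    """Logistic-regression propensity model P(treated | X), fit by gradient ascent."""
    Xb = np.column_stack([np.ones(len(t)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (t - p) / len(t)
    return 1.0 / (1.0 + np.exp(-Xb @ w))

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)                                    # confounder (e.g. severity)
t = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-x))).astype(float)  # selection
y = 1.0 * t + 2.0 * x + rng.normal(size=n)                # true treatment effect = 1
e = fit_propensity(x[:, None], t)
# Inverse-propensity weighting removes the confounding that biases the naive contrast.
ate = (np.average(y[t == 1], weights=1.0 / e[t == 1])
       - np.average(y[t == 0], weights=1.0 / (1.0 - e[t == 0])))
naive = y[t == 1].mean() - y[t == 0].mean()
```

Here the naive difference in means overstates the effect because sicker patients (higher x) both select treatment and have different outcomes; the weighted contrast recovers the simulated effect of 1.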

  11. Hypersonic Shock Wave Computations Using the Generalized Boltzmann Equation

    NASA Astrophysics Data System (ADS)

    Agarwal, Ramesh; Chen, Rui; Cheremisin, Felix G.

    2006-11-01

    Hypersonic shock structure in diatomic gases is computed by solving the Generalized Boltzmann Equation (GBE), where the internal and translational degrees of freedom are considered in the framework of quantum and classical mechanics respectively [1]. The computational framework available for the standard Boltzmann equation [2] is extended by including both the rotational and vibrational degrees of freedom in the GBE. There are two main difficulties encountered in computation of high Mach number flows of diatomic gases with internal degrees of freedom: (1) a large velocity domain is needed for accurate numerical description of the distribution function resulting in enormous computational effort in calculation of the collision integral, and (2) about 50 energy levels are needed for accurate representation of the rotational spectrum of the gas. Our methodology addresses these problems, and as a result the efficiency of calculations has increased by several orders of magnitude. The code has been validated by computing the shock structure in Nitrogen for Mach numbers up to 25 including the translational and rotational degrees of freedom. [1] Beylich, A., ``An Interlaced System for Nitrogen Gas,'' Proc. of CECAM Workshop, ENS de Lyon, France, 2000. [2] Cheremisin, F., ``Solution of the Boltzmann Kinetic Equation for High Speed Flows of a Rarefied Gas,'' Proc. of the 24th Int. Symp. on Rarefied Gas Dynamics, Bari, Italy, 2004.

  12. Development and Application of a Systems Engineering Framework to Support Online Course Design and Delivery

    ERIC Educational Resources Information Center

    Bozkurt, Ipek; Helm, James

    2013-01-01

    This paper develops a systems engineering-based framework to assist in the design of an online engineering course. Specifically, the purpose of the framework is to provide a structured methodology for the design, development and delivery of a fully online course, either brand new or modified from an existing face-to-face course. The main strength…

  13. Meta-Synthetic Support Frameworks for Reuse of Government Information Resources on City Travel and Traffic: The Case of Beijing

    ERIC Educational Resources Information Center

    An, Xiaomi; Xu, Shaotong; Mu, Yong; Wang, Wei; Bai, Xian Yang; Dawson, Andy; Han, Hongqi

    2012-01-01

    Purpose: The purpose of this paper is to propose meta-synthetic ideas and knowledge asset management approaches to build a comprehensive strategic framework for Beijing City in China. Design/methodology/approach: Methods include a review of relevant literature in both English and Chinese, case studies of different types of support frameworks in…

  14. Research design: the methodology for interdisciplinary research framework.

    PubMed

    Tobi, Hilde; Kampen, Jarl K

    2018-01-01

    Many of today's global scientific challenges require the joint involvement of researchers from different disciplinary backgrounds (social sciences, environmental sciences, climatology, medicine, etc.). Such interdisciplinary research teams face many challenges resulting from differences in training and scientific culture. Interdisciplinary education programs are required to train truly interdisciplinary scientists with respect to the critical factors of skills and competences. For that purpose, this paper presents the Methodology for Interdisciplinary Research (MIR) framework. The MIR framework was developed to help researchers cross disciplinary borders, especially those between the natural sciences and the social sciences. The framework has been specifically constructed to facilitate the design of interdisciplinary scientific research, and can be applied in an educational program, as a reference for monitoring the phases of interdisciplinary research, and as a tool to design such research in a process approach. It is suitable for research projects of different sizes and levels of complexity, and it allows for a range of method combinations (case study, mixed methods, etc.). The different phases of designing interdisciplinary research in the MIR framework are described and illustrated by real-life applications in teaching and research. We further discuss the framework's utility for research design in landscape architecture and mixed methods research, and provide an outlook on the framework's potential in inclusive interdisciplinary research and, last but not least, research integrity.

  15. Eigenspace perturbations for uncertainty estimation of single-point turbulence closures

    NASA Astrophysics Data System (ADS)

    Iaccarino, Gianluca; Mishra, Aashwin Ananda; Ghili, Saman

    2017-02-01

    Reynolds-averaged Navier-Stokes (RANS) models represent the workhorse for predicting turbulent flows in complex industrial applications. However, RANS closures introduce a significant degree of epistemic uncertainty in predictions due to the potential lack of validity of the assumptions utilized in model formulation. Estimating this uncertainty is a fundamental requirement for building confidence in such predictions. We outline a methodology to estimate this structural uncertainty, incorporating perturbations to the eigenvalues and the eigenvectors of the modeled Reynolds stress tensor. The mathematical foundations of this framework are derived and explicated. The framework is then applied to a set of separated turbulent flows, compared against numerical and experimental data, and contrasted with the predictions of the eigenvalue-only perturbation methodology. For separated flows, this framework yields a significant enhancement over the established eigenvalue perturbation methodology in explaining the discrepancy against experimental observations and high-fidelity simulations. Furthermore, uncertainty bounds of potential engineering utility can be estimated by performing five specific RANS simulations, reducing the computational expenditure of such an exercise.
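The eigenvalue-perturbation step can be sketched as follows: decompose the Reynolds stress anisotropy tensor, shift its eigenvalues toward a limiting state, and reconstruct the stress. This is a minimal sketch with a hypothetical stress tensor; the paper's eigenvector perturbations are omitted:

```python
import numpy as np

def perturb_reynolds_stress(R, target_eigs, delta):
    """Shift the anisotropy eigenvalues of a Reynolds stress tensor a fraction
    `delta` toward a limiting state, keeping k and the eigenvectors fixed."""
    k = 0.5 * np.trace(R)                        # turbulent kinetic energy
    b = R / (2.0 * k) - np.eye(3) / 3.0          # anisotropy tensor
    eigval, eigvec = np.linalg.eigh(b)
    new_eig = (1.0 - delta) * eigval + delta * np.asarray(target_eigs, float)
    b_new = eigvec @ np.diag(new_eig) @ eigvec.T
    return 2.0 * k * (b_new + np.eye(3) / 3.0)

R = np.diag([0.8, 0.3, 0.1])                     # hypothetical anisotropic state
R_iso = perturb_reynolds_stress(R, (0.0, 0.0, 0.0), delta=1.0)  # toward isotropy
```

Running the solver with stresses perturbed toward different limiting states (isotropic, one-component, two-component) brackets the prediction and yields the uncertainty bounds described above; the turbulent kinetic energy is preserved by construction.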

  16. A system-of-systems modeling methodology for strategic general aviation design decision-making

    NASA Astrophysics Data System (ADS)

    Won, Henry Thome

    General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through development of a system-of-systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft is complicated by the definition of constraints and requirements, and by the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have been developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As yet, these models treat the contributing systems---aircraft manufacturers and service providers---as independently variable assumptions. In this research, a methodology is developed which seeks to build a strategic design decision-making environment through the construction of a system-of-systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and most importantly the travel demand. Thus represented are the behavior of the consumers and the reactive behavior of the suppliers---the manufacturers and transportation service providers---in a common modeling framework. The results indicate an ability to guide the design process---specifically the selection of design requirements---through the optimization of "capability" metrics.
Additionally, the results indicate the ability to find synergistic solutions, that is, solutions in which two systems might collaborate to achieve a better result than acting independently. Implementation of this methodology can afford engineers a more autonomous perspective in the concept exploration process, providing dynamic feedback about a design's potential success in specific market segments. The method also has potential to strengthen the connection between design and business departments, as well as between manufacturers, service providers, and infrastructure planners---bringing information about how the respective systems interact, and what might be done to improve the synergism of systems.
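Travel demand models of the kind referenced typically predict mode shares from attributes such as door-to-door time and ticket price. A generic multinomial-logit sketch, with made-up modes, attribute values, and utility coefficients (not the thesis's actual demand model):

```python
import numpy as np

def mode_shares(utilities):
    """Multinomial-logit market shares from systematic utilities of travel modes."""
    u = np.asarray(utilities, float)
    e = np.exp(u - u.max())          # shift for numerical stability
    return e / e.sum()

# Hypothetical utility: U = -0.05 * door_to_door_minutes - 0.01 * price_usd
modes = {"car": (240, 60), "airline_hub": (300, 180), "on_demand_ga": (150, 320)}
u = [-0.05 * t - 0.01 * p for t, p in modes.values()]
shares = dict(zip(modes, mode_shares(u)))
```

Feeding aircraft design variables (speed, operating cost) through such a demand model is what closes the loop between the manufacturer, service provider, and consumer models in the system-of-systems framework.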

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ehlen, Mark Andrew; Vugrin, Eric D.; Warren, Drake E.

    In recent years, the nation has recognized that critical infrastructure protection should consider not only the prevention of disruptive events, but also the processes that infrastructure systems undergo to maintain functionality following disruptions. This more comprehensive approach has been termed critical infrastructure resilience (CIR). Given the occurrence of a particular disruptive event, the resilience of a system to that event is the system's ability to efficiently reduce both the magnitude and duration of the deviation from targeted system performance levels. Sandia National Laboratories (Sandia) has developed a comprehensive resilience assessment framework for evaluating the resilience of infrastructure and economic systems. The framework includes a quantitative methodology that measures resilience costs that result from a disruption to infrastructure function. The framework also includes a qualitative analysis methodology that assesses system characteristics that affect resilience in order to provide insight and direction for potential improvements to resilience. This paper describes the resilience assessment framework. This paper further demonstrates the utility of the assessment framework through application to a hypothetical scenario involving the disruption of a petrochemical supply chain by a hurricane.
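One common way to quantify the deviation described above is the time-integrated shortfall of system performance below its target. A minimal sketch with a hypothetical disruption-and-recovery curve (not Sandia's full methodology):

```python
import numpy as np

def resilience_cost(times, performance, target):
    """Time-integrated deviation of performance below its target level
    (the area between the target line and the performance curve)."""
    deficit = np.clip(target - np.asarray(performance, float), 0.0, None)
    widths = np.diff(np.asarray(times, float))
    return float(np.sum(0.5 * (deficit[1:] + deficit[:-1]) * widths))

# Hypothetical throughput (% of target): disruption at day 2, recovery by day 8.
t = [0.0, 2.0, 2.0, 4.0, 6.0, 8.0, 10.0]
perf = [100.0, 100.0, 40.0, 60.0, 80.0, 100.0, 100.0]
cost = resilience_cost(t, perf, target=100.0)    # in percent-days
```

The measure captures both aspects of the definition at once: a deeper drop (magnitude) or a slower recovery (duration) each enlarge the area, and hence the resilience cost.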

  18. Mapping of Drug-like Chemical Universe with Reduced Complexity Molecular Frameworks.

    PubMed

    Kontijevskis, Aleksejs

    2017-04-24

    The emergence of the DNA-encoded chemical libraries (DEL) field in the past decade has attracted the attention of the pharmaceutical industry as a powerful mechanism for the discovery of novel drug-like hits for various biological targets. Nuevolution Chemetics technology enables DNA-encoded synthesis of billions of chemically diverse drug-like small-molecule compounds, and the efficient screening and optimization of these, facilitating effective identification of drug candidates at an unprecedented speed and scale. Although many approaches have been developed by the cheminformatics community for the analysis and visualization of drug-like chemical space, most of them are restricted to the analysis of at most a few million compounds and cannot handle collections of 10^8 to 10^12 compounds typical for DELs. To address this big chemical data challenge, we developed the Reduced Complexity Molecular Frameworks (RCMF) methodology as an abstract and very general way of representing chemical structures. By further introducing RCMF descriptors, we constructed a global framework map of drug-like chemical space and demonstrated how chemical space occupied by multi-million-member drug-like Chemetics DNA-encoded libraries and virtual combinatorial libraries with >10^12 members can be analyzed and mapped without a need for library enumeration. We further validated the approach by performing RCMF-based searches in a drug-like chemical universe and mapping Chemetics library selection outputs for the LSD1 target on a global framework chemical space map.
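The framework-reduction idea can be sketched with a toy graph-pruning routine, in the spirit of the classic Bemis-Murcko scaffold reduction that framework representations build on (this is an illustration of the general concept, not Nuevolution's RCMF code; the adjacency encoding is ours): terminal atoms are pruned repeatedly until only the ring-and-linker framework remains.

```python
def reduce_to_framework(adjacency):
    """Return the node set left after repeatedly deleting degree-1 nodes.

    `adjacency` maps node -> set of neighbouring nodes. What survives is the
    ring-and-linker scaffold; a fully acyclic graph reduces to the empty set.
    """
    adj = {n: set(nb) for n, nb in adjacency.items()}
    while True:
        terminals = [n for n, nb in adj.items() if len(nb) <= 1]
        if not terminals:
            break
        for t in terminals:
            for nb in adj[t]:
                if nb in adj:
                    adj[nb].discard(t)
            del adj[t]
    return set(adj)

# Toluene-like toy graph: a six-ring (nodes 0-5) plus a methyl substituent (6).
toluene_like = {0: {1, 5, 6}, 1: {0, 2}, 2: {1, 3},
                3: {2, 4}, 4: {3, 5}, 5: {4, 0}, 6: {0}}
scaffold = reduce_to_framework(toluene_like)   # the ring survives, the methyl does not
```

Because such reductions collapse many enumerated structures onto one framework, they can characterize library chemical space without enumerating every member.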

  19. A hybridized discontinuous Galerkin framework for high-order particle-mesh operator splitting of the incompressible Navier-Stokes equations

    NASA Astrophysics Data System (ADS)

    Maljaars, Jakob M.; Labeur, Robert Jan; Möller, Matthias

    2018-04-01

    A generic particle-mesh method using a hybridized discontinuous Galerkin (HDG) framework is presented and validated for the solution of the incompressible Navier-Stokes equations. Building upon particle-in-cell concepts, the method is formulated in terms of an operator splitting technique in which Lagrangian particles are used to discretize an advection operator, and an Eulerian mesh-based HDG method is employed for the constitutive modeling to account for the inter-particle interactions. Key to the method is the variational framework provided by the HDG method. This allows the projections between the Lagrangian particle space and the Eulerian finite element space to be formulated efficiently in terms of local (i.e., cellwise) ℓ2-projections. Furthermore, exploiting the HDG framework for solving the constitutive equations results in velocity fields which satisfy the incompressibility constraint to high accuracy in a local sense. By advecting the particles through these velocity fields, the particle distribution remains uniform over time, obviating the need for additional quality control. The presented methodology allows for a straightforward extension to arbitrary-order spatial accuracy on general meshes. A range of numerical examples shows that optimal convergence rates are obtained in space and, given the particular time stepping strategy, second-order accuracy is obtained in time. The model capabilities are further demonstrated by presenting results for the flow over a backward-facing step and for the flow around a cylinder.
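The splitting idea can be sketched in one dimension (a minimal illustration of the concept, not the authors' HDG implementation: piecewise-constant cells stand in for the finite element space, so the local ℓ2-projection reduces to a per-cell mean):

```python
def advect(particles, velocity, dt):
    """Lagrangian step: move each (position, value) particle on a periodic [0,1) domain."""
    return [((x + velocity * dt) % 1.0, v) for x, v in particles]

def project_to_mesh(particles, n_cells):
    """Cellwise l2-projection onto piecewise constants, i.e. the per-cell mean."""
    sums = [0.0] * n_cells
    counts = [0] * n_cells
    for x, v in particles:
        c = min(int(x * n_cells), n_cells - 1)
        sums[c] += v
        counts[c] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# Particles sample a step function u(x) = 1 on [0, 0.5), 0 elsewhere;
# advecting by 0.5 shifts the step onto the upper half of the domain.
particles = [((i + 0.5) / 100.0, 1.0 if (i + 0.5) / 100.0 < 0.5 else 0.0)
             for i in range(100)]
field = project_to_mesh(advect(particles, velocity=0.5, dt=1.0), n_cells=4)
```

In the full method the projection is between particle data and a high-order HDG space, and a mesh-based solve handles the constitutive (non-advective) terms between advection steps.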

  20. Facility Energy Performance Benchmarking in a Data-Scarce Environment

    DTIC Science & Technology

    2017-08-01

    environment, and analyze occupant-, system-, and component-level faults contributing to energy inefficiency. A methodology for developing DoD-specific...Research, Development, Test, and Evaluation (RDTE) Program to develop an intelligent framework, encompassing methodology and modeling, that...energy performers by installation, climate zone, and other criteria. A methodology for creating the DoD-specific EUIs would be an important part of a

  1. Letter to the editor regarding "GRAS from the ground up: Review of the Interim Pilot Program for GRAS notification" by.

    PubMed

    Sewalt, Vincent; LaMarta, James; Shanahan, Diane; Gregg, Lori; Carrillo, Roberto

    2017-09-01

    This letter aims to clarify some critical points highlighted by Hanlon et al. regarding the common knowledge element of the safety of food enzymes in support of their GRAS designation. In particular, we outline the development of a peer-reviewed, generally recognized safety evaluation methodology for microbial enzymes and its adoption by the enzyme industry, which provides the US FDA with a review framework for enzyme GRAS Notices. This approach may serve as a model for other food ingredient categories for a scientifically sound, rigorous, and transparent application of the GRAS concept. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. General Multivariate Linear Modeling of Surface Shapes Using SurfStat

    PubMed Central

    Chung, Moo K.; Worsley, Keith J.; Nacewicz, Brendon M.; Dalton, Kim M.; Davidson, Richard J.

    2010-01-01

    Although there are many imaging studies on traditional ROI-based amygdala volumetry, there are very few studies on modeling amygdala shape variations. This paper presents a unified computational and statistical framework for modeling amygdala shape variations in a clinical population. The weighted spherical harmonic representation is used to parameterize, smooth, and normalize amygdala surfaces. The representation is subsequently used as an input for multivariate linear models accounting for nuisance covariates such as age and brain size difference, using the SurfStat package, which avoids the complexity of specifying design matrices. The methodology has been applied to quantify abnormal local amygdala shape variations in 22 high-functioning autistic subjects. PMID:20620211

  3. Video observations of sensitive caregiving "off the beaten track": introduction to the special issue.

    PubMed

    Mesman, Judi

    2018-03-22

    This introduction to the special issue on video observations of sensitive caregiving in different cultural communities provides a general theoretical and methodological framework for the seven empirical studies that are at the heart of this special issue. It highlights the cross-cultural potential of the sensitivity construct, the importance of research on sensitivity "off the beaten track," the advantages and potential challenges of the use of video in diverse cultural contexts, and the benefits of forming research teams that include local scholars. The paper concludes with an overview of the seven empirical studies of sensitivity in this special issue with video observations from Brazil, Indonesia, Iran, Kenya, Peru, South Africa, and Yemen.

  4. Comparison and Contrast of Two General Functional Regression Modeling Frameworks

    PubMed Central

    Morris, Jeffrey S.

    2017-01-01

    In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past several years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework, but has many differences as well. In this discussion, I compare and contrast these two frameworks, to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable. PMID:28736502

  5. Comparison and Contrast of Two General Functional Regression Modeling Frameworks.

    PubMed

    Morris, Jeffrey S

    2017-02-01

    In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past several years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework, but has many differences as well. In this discussion, I compare and contrast these two frameworks, to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable.

  6. The Contingency Theory of Education.

    ERIC Educational Resources Information Center

    Goodnow, Wilma Elizabeth

    1982-01-01

    Develops a conceptual framework for determining the appropriateness of various methodologies. Concludes that educators should stop switching from one to another and recognize that the best methodology is contingent upon the circumstances. (Falmer Press, Falmer House, Barcombe, Nr Lewes, East Sussex, BN8 5DL, UK) (JOW)

  7. Methodological framework for heart rate variability analysis during exercise: application to running and cycling stress testing.

    PubMed

    Hernando, David; Hernando, Alberto; Casajús, Jose A; Laguna, Pablo; Garatachea, Nuria; Bailón, Raquel

    2018-05-01

    Standard methodologies for heart rate variability (HRV) analysis, and its physiological interpretation as a marker of autonomic nervous system condition, have largely been published for rest conditions, but much less so for exercise. A methodological framework for HRV analysis during exercise is proposed which deals with the non-stationary nature of HRV during exercise, includes respiratory information, and identifies and corrects spectral components related to cardiolocomotor coupling (CC). This is applied to 23 male subjects who underwent different tests: maximal and submaximal, running and cycling; the ECG, respiratory frequency, and oxygen consumption were simultaneously recorded. High-frequency (HF) power estimates obtained with the proposed methodology differ markedly from those obtained with the standard fixed band. For medium and high levels of exercise and for recovery, HF power is 20 to 40% higher. When cycling, HF power increases around 40% with respect to running, while CC power is around 20% stronger in running.
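The core correction can be sketched as follows (an illustration of the idea, not the authors' implementation; the sampling rate, band width, and synthetic series are ours): during exercise the respiratory rate can exceed the standard fixed HF band (0.15-0.4 Hz), so HF power is instead integrated over a band centred on the measured respiratory frequency.

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Integrate DFT power of a uniformly resampled series over [f_lo, f_hi] Hz."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            coef = sum(centered[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                       for t in range(n))
            power += abs(coef) ** 2 / n ** 2
    return power

# Synthetic RR series resampled at 4 Hz, with a respiratory oscillation at
# 0.5 Hz: a breathing rate plausible during exercise, outside the fixed band.
fs, f_resp = 4.0, 0.5
rr = [0.8 + 0.05 * math.sin(2 * math.pi * f_resp * t / fs) for t in range(256)]
hf_fixed = band_power(rr, fs, 0.15, 0.40)                         # misses the peak
hf_centered = band_power(rr, fs, f_resp - 0.125, f_resp + 0.125)  # captures it
```

With the respiratory peak outside 0.15-0.4 Hz, the fixed band captures almost none of the HF power that the respiration-centred band recovers, which is the direction of the 20-40% differences reported above.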

  8. Assessing the economic benefits of vaccines based on the health investment life course framework: a review of a broader approach to evaluate malaria vaccination.

    PubMed

    Constenla, Dagna

    2015-03-24

    Economic evaluations have routinely understated the net benefits of vaccination by not including the full range of economic benefits that accrue over the lifetime of a vaccinated person. Broader approaches for evaluating benefits of vaccination can be used to more accurately calculate the value of vaccination. This paper reflects on the methodology of one such approach - the health investment life course approach - that looks at the impact of vaccine investment on lifetime returns. The role of this approach on vaccine decision-making will be assessed using the malaria health investment life course model example. We describe a framework that measures the impact of a health policy decision on government accounts over many generations. The methodological issues emerging from this approach are illustrated with an example from a recently completed health investment life course analysis of malaria vaccination in Ghana. Beyond the results, various conceptual and practical challenges of applying this framework to Ghana are discussed in this paper. The current framework seeks to understand how disease and available technologies can impact a range of economic parameters such as labour force participation, education, healthcare consumption, productivity, wages or economic growth, and taxation following their introduction. The framework is unique amongst previous economic models in malaria because it considers future tax revenue for governments. The framework is complementary to cost-effectiveness and budget impact analysis. The intent of this paper is to stimulate discussion on how existing and new methodology can add to knowledge regarding the benefits from investing in new and underutilized vaccines. Copyright © 2015 Elsevier Ltd. All rights reserved.
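The government-perspective accounting described above can be caricatured in a few lines (all figures are hypothetical placeholders, not values from the Ghana analysis): an up-front vaccination cost is weighed against discounted future tax revenue accruing once the vaccinated cohort enters the labour force.

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows (year 0 first) at a given discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

vaccination_cost = 50.0      # hypothetical per-child cost at year 0
extra_tax_per_year = 20.0    # hypothetical tax gain during working years 20-40
flows = [-vaccination_cost] + [0.0] * 19 + [extra_tax_per_year] * 21
net_value = npv(flows, rate=0.03)
```

The life course framework extends this toy sum across many more parameters (education, productivity, healthcare consumption) and across generations, but the discounting logic is the same.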

  9. A Review of Research on Driving Styles and Road Safety.

    PubMed

    Sagberg, Fridulv; Selpi; Piccinini, Giulio Francesco Bianchi; Engström, Johan

    2015-11-01

    The aim of this study was to outline a conceptual framework for understanding driving style and, on this basis, review the state-of-the-art research on driving styles in relation to road safety. Previous research has indicated a relationship between the driving styles adopted by drivers and their crash involvement. However, a comprehensive literature review of driving style research is lacking. A systematic literature search was conducted, including empirical, theoretical, and methodological research, on driving styles related to road safety. A conceptual framework was proposed whereby driving styles are viewed in terms of driving habits established as a result of individual dispositions as well as social norms and cultural values. Moreover, a general scheme for categorizing and operationalizing driving styles was suggested. On this basis, existing literature on driving styles and indicators was reviewed. Links between driving styles and road safety were identified and individual and sociocultural factors influencing driving style were reviewed. Existing studies have addressed a wide variety of driving styles, and there is an acute need for a unifying conceptual framework in order to synthesize these results and make useful generalizations. There is a considerable potential for increasing road safety by means of behavior modification. Naturalistic driving observations represent particularly promising approaches to future research on driving styles. Knowledge about driving styles can be applied in programs for modifying driver behavior and in the context of usage-based insurance. It may also be used as a means for driver identification and for the development of driver assistance systems. © 2015, Human Factors and Ergonomics Society.

  10. Making sense of medically unexplained symptoms in general practice: a grounded theory study

    PubMed Central

    2013-01-01

    Background General practitioners often encounter patients with medically unexplained symptoms. These patients share many common features, but there is little agreement about the best diagnostic framework for describing them. Aims This study aimed to explore how GPs make sense of medically unexplained symptoms. Design Semi-structured interviews were conducted with 24 GPs. Each participant was asked to describe a patient with medically unexplained symptoms and discuss their assessment and management. Setting The study was conducted among GPs from teaching practices across Australia. Methods Participants were selected by purposive sampling and all interviews were transcribed. Iterative analysis was undertaken using constructivist grounded theory methodology. Results GPs used a variety of frameworks to understand and manage patients with medically unexplained symptoms. They used different frameworks to reason, to help patients make sense of their suffering, and to communicate with other health professionals. GPs tried to avoid using stigmatising labels such as ‘borderline personality disorder’, which were seen to apply a ‘layer of dismissal’ to patients. They worried about missing serious physical disease, but managed the risk by deliberately attending to physical cues during some consultations, and focusing on coping with medically unexplained symptoms in others. They also used referrals to exclude serious disease, but were wary of triggering a harmful cycle of uncoordinated care. Conclusion GPs were aware of the ethical relevance of psychiatric diagnoses, and attempted to protect their patients from stigma. They crafted helpful explanatory narratives for patients that shaped their experience of suffering. Disease surveillance remained an important role for GPs who were managing medically unexplained symptoms. PMID:24427176

  11. Extraction of fetal ECG signal by an improved method using extended Kalman smoother framework from single channel abdominal ECG signal.

    PubMed

    Panigrahy, D; Sahu, P K

    2017-03-01

    This paper proposes a five-stage methodology to extract the fetal electrocardiogram (FECG) from the single-channel abdominal ECG using a differential evolution (DE) algorithm, an extended Kalman smoother (EKS) and an adaptive neuro-fuzzy inference system (ANFIS) framework. The heart rate of the fetus can easily be detected after estimation of the fetal ECG signal. The abdominal ECG signal contains the fetal ECG signal, the maternal ECG component, and noise. To estimate the fetal ECG signal from the abdominal ECG signal, removal of the noise and the maternal ECG component present in it is necessary. The pre-processing stage is used to remove the noise from the abdominal ECG signal. The EKS framework is used to estimate the maternal ECG signal from the abdominal ECG signal. The optimized parameters of the maternal ECG components are required to develop the state and measurement equations of the EKS framework. These optimized maternal ECG parameters are selected by the differential evolution algorithm. The relationship between the maternal ECG signal and the maternal ECG component available in the abdominal ECG signal is nonlinear. To estimate the actual maternal ECG component present in the abdominal ECG signal and to capture this nonlinear relationship, the ANFIS is used. Inputs to the ANFIS framework are the output of the EKS and the pre-processed abdominal ECG signal. The fetal ECG signal is computed by subtracting the output of the ANFIS from the pre-processed abdominal ECG signal. The non-invasive fetal ECG database and set A of the 2013 PhysioNet/Computing in Cardiology Challenge database (PCDB) are used for validation of the proposed methodology. The proposed methodology shows a sensitivity of 94.21%, accuracy of 90.66%, and positive predictive value of 96.05% on the non-invasive fetal ECG database. It also shows a sensitivity of 91.47%, accuracy of 84.89%, and positive predictive value of 92.18% on set A of the PCDB.
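A greatly simplified stand-in for the estimate-and-subtract step (the real pipeline uses DE, an EKS and ANFIS; here the maternal contribution is approximated as a least-squares scaling of a maternal reference, and the synthetic signals are ours) shows why subtracting the estimated maternal component exposes the fetal one:

```python
import math

def extract_fetal(abdominal, maternal_ref):
    """Subtract the best least-squares fit of the maternal reference from the abdominal signal."""
    scale = (sum(a * m for a, m in zip(abdominal, maternal_ref))
             / sum(m * m for m in maternal_ref))
    return [a - scale * m for a, m in zip(abdominal, maternal_ref)]

# Synthetic demo at fs = 250 Hz: a strong "maternal" oscillation at 1.2 Hz
# (~72 bpm) and a weak "fetal" one at 2.2 Hz (~132 bpm).
fs = 250
t = [i / fs for i in range(fs * 4)]
maternal = [math.sin(2 * math.pi * 1.2 * x) for x in t]
fetal = [0.2 * math.sin(2 * math.pi * 2.2 * x) for x in t]
abdominal = [0.9 * m + f for m, f in zip(maternal, fetal)]
fetal_est = extract_fetal(abdominal, maternal)
```

The paper's EKS/ANFIS stages replace this linear scaling with a model-based, nonlinear estimate of the maternal component, which is what makes the method work on real, noisy abdominal recordings.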

  12. Towards the Implementation of an Assessment-Centred Blended Learning Framework at the Course Level: A Case Study in a Vietnamese National University

    ERIC Educational Resources Information Center

    Nguyen, Viet Anh

    2017-01-01

    Purpose: The purpose of this paper is to build an assessment-centred blended learning (BL) framework to assess learners, to analyse and to evaluate the impact of the technology support in the form of formative assessment in students' positive learning. Design/methodology/approach: This research proposed an assessment-centred BL framework at the…

  13. ASSESSING POPULATION EXPOSURES TO MULTIPLE AIR POLLUTANTS USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provide a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pol...

  14. Process synthesis involving multi-period operations by the P-graph framework

    EPA Science Inventory

    The P-graph (process graph) framework is an effective tool for process-network synthesis (PNS). Here we extended it to multi-period operations. The efficacy of the P-graph methodology has been demonstrated by numerous applications. The unambiguous representation of processes and ...

  15. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
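The nested formulation that the unilevel methodology is compared against can be sketched on a one-variable problem (a hedged illustration with a toy limit state g(d, X) = d - X and X ~ N(0, 1), chosen so the inner reliability analysis has a closed form; real nested RBDO runs a full reliability computation at every design iterate, which is exactly why it is expensive):

```python
import math

def failure_probability(d):
    """Inner reliability analysis: P[g(d, X) <= 0] for g(d, X) = d - X, X ~ N(0, 1)."""
    return 0.5 * math.erfc(d / math.sqrt(2.0))

def nested_rbdo(p_target, tol=1e-8):
    """Outer design loop: smallest (cheapest) d whose failure probability meets the target.

    Cost is taken as d itself, so bisection on the monotone failure
    probability finds the optimum.
    """
    lo, hi = 0.0, 10.0                # bracket: fp(0) = 0.5, fp(10) ~ 0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if failure_probability(mid) > p_target:
            lo = mid
        else:
            hi = mid
    return hi

d_opt = nested_rbdo(p_target=0.1)    # the 90% point of the standard normal
```

The unilevel and decoupled methodologies discussed above aim to avoid re-running the inner loop to convergence at every outer iterate, which is where the reported cost savings come from.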

  16. Towards a more holistic sustainability assessment framework for agro-bioenergy systems — A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arodudu, Oludunsin, E-mail: Oludunsin.Arodudu@zalf.de; Potsdam University, Institute of Earth and Environmental Sciences, Karl-Liebknecht-Straße 24-25, 14476 Potsdam, Golm; Helming, Katharina

    The use of life cycle assessment (LCA) as a sustainability assessment tool for agro-bioenergy systems usually has an industrial-agriculture bias. Furthermore, LCA has often been criticized for being a decision-maker tool that may not consider decision-takers' perceptions. LCA studies are lacking in spatial and temporal depth, and are unable to sufficiently assess some environmental impact categories such as biodiversity and land use, and most economic and social impact categories, e.g. food security, water security, and energy security. This study explored tools, methodologies and frameworks that can be deployed individually, as well as in combination with each other, to bridge these methodological gaps in application to agro-bioenergy systems. Integrating agronomic options, e.g. alternative farm power, tillage, seed sowing options, fertilizer, pesticide, and irrigation, into the boundaries of LCAs for agro-bioenergy systems will not only provide an alternative agro-ecological perspective to previous LCAs, but will also lead to the derivation of indicators for the assessment of some social and economic impact categories. Deploying life-cycle thinking approaches such as energy return on energy invested (EROEI), human appropriation of net primary production (HANPP), net greenhouse gas or carbon balance (NCB), and water footprint, individually and in combination with each other, will also lead to further derivation of indicators suitable for assessing relevant environmental, social and economic impact categories. Also, applying spatio-temporal simulation models has the potential to improve the spatial and temporal depth of LCA analysis.
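One of the life-cycle thinking indicators named above, energy return on energy invested (EROEI), is simple enough to illustrate directly (the input terms and figures below are hypothetical stand-ins for the agronomic options the review proposes to bring inside the LCA boundary):

```python
def eroei(energy_output, energy_inputs):
    """Ratio of usable energy delivered to total life-cycle energy invested."""
    return energy_output / sum(energy_inputs.values())

# Hypothetical per-hectare energy inputs (GJ/ha) for a bioenergy crop.
inputs = {"tillage": 1.2, "fertilizer": 3.5, "irrigation": 0.8, "harvest": 1.0}
ratio = eroei(45.0, inputs)   # 45 GJ/ha of bioenergy delivered
```

A ratio above 1 means the system delivers more energy than it consumes; changing an agronomic option (e.g. reduced tillage) changes the denominator, which is how such options become assessable within the LCA boundary.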

  17. Toward a Framework for Comparative HRD Research

    ERIC Educational Resources Information Center

    Wang, Greg G.; Sun, Judy Y.

    2012-01-01

    Purpose: This paper seeks to address the recent challenges in the international human resource development (HRD) research and the related methodological strategy. Design/methodology/approach: This inquiry is based on a survey of literatures and integrates various comparative research strategies adopted in other major social science disciplines.…

  18. Magnitude and variability of land evaporation and its components at the global scale

    USDA-ARS?s Scientific Manuscript database

    A physics-based methodology is applied to estimate global land-surface evaporation from multi-satellite observations. GLEAM (Global Land-surface Evaporation: the Amsterdam Methodology) combines a wide range of remotely sensed observations within a Priestley and Taylor-based framework. Daily actual e...

  19. Feminist methodologies and engineering education research

    NASA Astrophysics Data System (ADS)

    Beddoes, Kacey

    2013-03-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory. The paper begins with a literature review that covers a broad range of topics featured in the literature on feminist methodologies. Next, data from interviews with engineering educators and researchers who have engaged with feminist methodologies are presented. The ways in which feminist methodologies shape their research topics, questions, frameworks of analysis, methods, practices and reporting are each discussed. The challenges and barriers they have faced are then discussed. Finally, the benefits of further and broader engagement with feminist methodologies within the engineering education community are identified.

  20. Force on Force Modeling with Formal Task Structures and Dynamic Geometry

    DTIC Science & Technology

    2017-03-24

    task framework, derived using the MMF methodology to structure a complex mission. It further demonstrated the integration of effects from a range of...application methodology was intended to support a combined developmental testing (DT) and operational testing (OT) strategy for selected systems under test... methodology to develop new or modify existing Models and Simulations (M&S) to: • Apply data from multiple, distributed sources (including test

  1. Towards more sustainable management of European food waste: Methodological approach and numerical application.

    PubMed

    Manfredi, Simone; Cristobal, Jorge

    2016-09-01

    In response to recent policy needs, the work presented in this article aims at developing a life-cycle-based framework methodology to quantitatively evaluate the environmental and economic sustainability of European food waste management options. The methodology is structured into six steps aimed at defining the boundaries and scope of the evaluation, evaluating environmental and economic impacts, and identifying the best-performing options. The methodology is able to accommodate additional assessment criteria, for example the social dimension of sustainability, thus moving towards a comprehensive sustainability assessment framework. A numerical case study is also developed to provide an example of application of the proposed methodology to an average European context. Different options for food waste treatment are compared, including landfilling, composting, anaerobic digestion and incineration. The environmental dimension is evaluated with the software EASETECH, while the economic assessment is conducted based on different indicators expressing the costs associated with food waste management. Results show that the proposed methodology allows for a straightforward identification of the most sustainable options for food waste and can thus provide factual support to decision- and policy-making. However, it was also observed that results depend markedly on a number of user-defined assumptions, for example the choice of the indicators used to express environmental and economic performance. © The Author(s) 2016.
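The comparison-and-ranking step, and its sensitivity to user-defined choices, can be sketched as a weighted multi-criteria aggregation (the indicator scores and weights below are invented for illustration, not results from the article):

```python
def rank_options(options, weights):
    """Sort option names by weighted aggregate score (lower = more sustainable)."""
    def score(indicators):
        return sum(weights[k] * v for k, v in indicators.items())
    return sorted(options, key=lambda name: score(options[name]))

# Hypothetical normalised impact scores per treatment option (lower is better).
options = {
    "landfilling":         {"environmental": 0.9, "economic": 0.4},
    "composting":          {"environmental": 0.4, "economic": 0.5},
    "anaerobic digestion": {"environmental": 0.3, "economic": 0.6},
    "incineration":        {"environmental": 0.6, "economic": 0.5},
}
ranking = rank_options(options, {"environmental": 0.6, "economic": 0.4})
```

Shifting weight from the environmental to the economic indicator can reorder the ranking, which is precisely the dependence on user-defined assumptions the article cautions about.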

  2. Power-sharing Partnerships: Teachers' Experiences of Participatory Methodology.

    PubMed

    Ferreira, Ronél; Ebersöhn, Liesel; Mbongwe, Bathsheba B

    2015-01-01

    This article reports on the experiences of teachers as coresearchers in a long-term partnership with university researchers, who participated in an asset-based intervention project known as Supportive Teachers, Assets and Resilience (STAR). In an attempt to inform participatory research methodology, the study investigated how coresearchers (teachers) experienced power relations. We utilized Gaventa's power cube as a theoretical framework and participatory research as our methodological paradigm. Ten teachers of a primary school in the Eastern Cape and five teachers of a secondary school in a remote area in the Mpumalanga Province in South Africa participated (n=15). We employed multiple data generation techniques, namely Participatory Reflection and Action (PRA) activities, observation, focus group discussions, and semistructured interviews, using thematic analysis and categorical aggregation for data analysis. We identified three themes, related to (1) the nature of power in participatory partnerships, (2) coresearchers' meaning making of power and partnerships, and (3) their role in taking agency. Based on these findings, we developed a framework of power-sharing partnerships to extend Gaventa's power cube theory. This framework, and its five interrelated elements (leadership as power, identifying vision and mission, synergy, interdependent role of partners, and determination), provides insight into the way coresearchers shared their experiences of participatory research methodology. We theorise power-sharing partnerships as a complementary platform hosting partners' shared strengths, skills, and experience, creating synergy in collaborative projects.

  3. Exploring the squeezed three-point galaxy correlation function with generalized halo occupation distribution models

    NASA Astrophysics Data System (ADS)

    Yuan, Sihan; Eisenstein, Daniel J.; Garrison, Lehman H.

    2018-04-01

    We present the GeneRalized ANd Differentiable Halo Occupation Distribution (GRAND-HOD) routine that generalizes the standard 5-parameter halo occupation distribution (HOD) model with various halo-scale physics and assembly bias. We describe the methodology of four different generalizations: satellite distribution generalization, velocity bias, closest-approach distance generalization, and assembly bias. We showcase the signatures of these generalizations in the 2-point correlation function (2PCF) and the squeezed 3-point correlation function (squeezed 3PCF). We identify generalized HOD prescriptions that are nearly degenerate in the projected 2PCF and demonstrate that these degeneracies are broken in the redshift-space anisotropic 2PCF and the squeezed 3PCF. We also discuss the possibility of identifying degeneracies in the anisotropic 2PCF and further demonstrate the extra constraining power of the squeezed 3PCF on galaxy-halo connection models. We find that within our current HOD framework, the anisotropic 2PCF can predict the squeezed 3PCF better than its statistical error. This implies that a discordant squeezed 3PCF measurement could falsify the particular HOD model space. Alternatively, it is possible that further generalizations of the HOD model would open opportunities for the squeezed 3PCF to provide novel parameter measurements. The GRAND-HOD Python package is publicly available at https://github.com/SandyYuan/GRAND-HOD.
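For reference, the standard 5-parameter HOD being generalized takes the familiar Zheng et al. (2007) form; a minimal sketch (parameter values below are purely illustrative, not fits from the paper):

```python
import math

def n_cen(m, log_m_min=13.3, sigma_log_m=0.85):
    """Mean number of central galaxies: a smoothed step in log halo mass."""
    return 0.5 * (1.0 + math.erf((math.log10(m) - log_m_min) / sigma_log_m))

def n_sat(m, m0=10**13.2, m1=10**14.1, alpha=1.0):
    """Mean number of satellites: a power law above the cutoff mass M0,
    modulated by the central occupation."""
    if m <= m0:
        return 0.0
    return n_cen(m) * ((m - m0) / m1) ** alpha

# Occupation rises smoothly with halo mass (masses in solar-mass units).
occ = [(m, n_cen(m), n_sat(m)) for m in (1e13, 1e14, 1e15)]
```

GRAND-HOD's generalizations then perturb how these mean occupations are realized: where satellites sit in the halo, how fast they move, and how occupation depends on secondary halo properties (assembly bias).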

  4. Using landscape topology to compare continuous metaheuristics: a framework and case study on EDAs and ridge structure.

    PubMed

    Morgan, R; Gallagher, M

    2012-01-01

    In this paper we extend a previously proposed randomized landscape generator in combination with a comparative experimental methodology to study the behavior of continuous metaheuristic optimization algorithms. In particular, we generate two-dimensional landscapes with parameterized, linear ridge structure, and perform pairwise comparisons of algorithms to gain insight into what kind of problems are easy and difficult for one algorithm instance relative to another. We apply this methodology to investigate the specific issue of explicit dependency modeling in simple continuous estimation of distribution algorithms. Experimental results reveal specific examples of landscapes (with certain identifiable features) where dependency modeling is useful, harmful, or has little impact on mean algorithm performance. Heat maps are used to compare algorithm performance over a large number of landscape instances and algorithm trials. Finally, we perform a meta-search in the landscape parameter space to find landscapes which maximize the performance between algorithms. The results are related to some previous intuition about the behavior of these algorithms, but at the same time lead to new insights into the relationship between dependency modeling in EDAs and the structure of the problem landscape. The landscape generator and overall methodology are quite general and extendable and can be used to examine specific features of other algorithms.
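    The generator itself is not reproduced here, but a landscape of the kind described, a two-dimensional surface with a parameterized linear ridge plus a few random local optima, can be sketched as follows (a hypothetical illustration, not the authors' code):

    ```python
    import math, random

    # Hypothetical sketch of a randomized 2-D test landscape with a linear
    # ridge, in the spirit of the generator described above.
    def make_ridge_landscape(angle, width, seed=0):
        """Return f(x, y): a Gaussian ridge of given orientation and width,
        plus a few random Gaussian bumps acting as local optima."""
        rng = random.Random(seed)
        bumps = [(rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(0.1, 0.4))
                 for _ in range(3)]
        def f(x, y):
            # perpendicular distance from the ridge line through the origin
            d = abs(-math.sin(angle) * x + math.cos(angle) * y)
            ridge = math.exp(-(d / width) ** 2)
            noise = sum(0.3 * math.exp(-((x - bx) ** 2 + (y - by) ** 2) / s ** 2)
                        for bx, by, s in bumps)
            return ridge + noise
        return f

    f = make_ridge_landscape(angle=math.pi / 4, width=0.2)
    print(f(0.5, 0.5), f(0.8, -0.8))  # on-ridge point scores higher
    ```

    Varying the ridge angle and width yields landscape instances of controlled difficulty for the pairwise algorithm comparisons described above.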

  5. Discovering objects in a blood recipient information system.

    PubMed

    Qiu, D; Junghans, G; Marquardt, K; Kroll, H; Mueller-Eckhardt, C; Dudeck, J

    1995-01-01

    Application of object-oriented (OO) methodologies has generally been considered a solution to the problem of improving the software development process and managing the so-called software crisis. Among them, object-oriented analysis (OOA) is the most essential and is a vital prerequisite for the successful use of other OO methodologies. Although a good number of OOA methods have been published, the aspect most important and common to all of them, discovering the object classes truly relevant to the given problem domain, remains a subject of intensive research. In this paper, using the successful development of a blood recipient information system as an example, we present our approach, which is based on the conceptual framework of responsibility-driven OOA. In the discussion, we also suggest that it may be inadequate to simply attribute the software crisis to the waterfall model of the software development life-cycle. We are convinced that the real causes of the failure of some software and information systems should be sought in the methodologies used in crucial phases of the software development process. Furthermore, a software system can also fail if object classes essential to the problem domain are not discovered, implemented and visualized, so that the real-world situation cannot be faithfully traced by the system.

  6. Perspectives for elucidating the ethylenediurea (EDU) mode of action for protection against O3 phytotoxicity.

    PubMed

    Agathokleous, Evgenios

    2017-08-01

    Ethylenediurea (EDU) has been widely studied for its effectiveness in protecting plants against injuries caused by surface ozone (O3); however, its mode of action remains unclear. So far, there is no unified methodological approach, and the methodology used is quite arbitrary, making it more difficult to generalize findings and understand the EDU mode of action. This review examines the question of whether potential N addition to plants by EDU is a fundamental underlying mechanism of protection against O3 phytotoxicity. It also proposes an evidence-based hypothesis that EDU may protect plants against the deleterious effects of O3 through EDU-induced hormesis, i.e. by activating plant defense at low doses. This hypothesis suggests directions for future research. Revealing a hormesis-based EDU mode of action in protecting plants against O3 toxicity would have further implications for ecotoxicology and environmental safety. Furthermore, this review discusses the need for further studies of plant metabolism under EDU treatment through relevant experimental approaches, and attempts to set the bases for a unified methodology that will contribute to revealing the EDU mode of action. In this framework, focus is given to the main EDU application methods. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. [The GIPSY-RECPAM model: a versatile approach for integrated evaluation in cardiologic care].

    PubMed

    Carinci, F

    2009-01-01

    The tree-structured methodology applied in the GISSI-PSICOLOGIA project, although developed in the framework of the earliest GISSI studies, represents a powerful tool for analyzing different aspects of cardiologic care. The GISSI-PSICOLOGIA project has delivered a novel methodology based on the joint application of psychometric tools and sophisticated statistical techniques. Its prospective use could allow the building of effective epidemiological models relevant to the prognosis of the cardiologic patient. The various features of the RECPAM method allow versatile use in the framework of modern e-health projects. The study used the Cognitive Behavioral Assessment H Form (CBA-H) psychometric scales. The potential for future application in the framework of Italian cardiology is considerable, particularly for planning systems of integrated care and routine evaluation of the cardiologic patient.

  8. The added value of thorough economic evaluation of telemedicine networks.

    PubMed

    Le Goff-Pronost, Myriam; Sicotte, Claude

    2010-02-01

    This paper proposes a thorough framework for the economic evaluation of telemedicine networks. A standard cost analysis methodology was used as the initial base, similar to the evaluation method currently being applied to telemedicine, and to which we suggest adding subsequent stages that enhance the scope and sophistication of the analytical methodology. We completed the methodology with a longitudinal and stakeholder analysis, followed by the calculation of a break-even threshold, a calculation of the economic outcome based on net present value (NPV), an estimate of the social gain through external effects, and an assessment of the probability of social benefits. In order to illustrate the advantages, constraints and limitations of the proposed framework, we tested it in a paediatric cardiology tele-expertise network. The results demonstrate that the project threshold was not reached after the 4 years of the study. Also, the calculation of the project's NPV remained negative. However, the additional analytical steps of the proposed framework allowed us to highlight alternatives that can make this service economically viable. These included: use over an extended period of time, extending the network to other telemedicine specialties, or including it in the services offered by other community hospitals. In sum, the results presented here demonstrate the usefulness of an economic evaluation framework as a way of offering decision makers the tools they need to make comprehensive evaluations of telemedicine networks.
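    Two of the analytical stages described above, the break-even threshold and the NPV calculation, are standard and can be sketched directly; the cash-flow figures below are invented for illustration.

    ```python
    # Minimal sketch of two steps in the evaluation framework above: net
    # present value of a telemedicine project and its break-even year.
    # Cash-flow figures are hypothetical.
    def npv(rate, cash_flows):
        """Discounted sum of yearly net cash flows (year 0 first)."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    def break_even_year(cash_flows):
        """First year in which cumulative (undiscounted) cash flow turns
        non-negative, or None if it never does within the horizon."""
        total = 0.0
        for t, cf in enumerate(cash_flows):
            total += cf
            if total >= 0:
                return t
        return None

    flows = [-50000, 10000, 15000, 15000, 20000]  # hypothetical network
    print(npv(0.05, flows))
    print(break_even_year(flows))
    ```

    In the paediatric cardiology case above, the analogous calculation yielded a negative NPV over the 4-year study period, motivating the framework's additional stages.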

  9. Transformative Mixed Methods Research

    ERIC Educational Resources Information Center

    Mertens, Donna M.

    2010-01-01

    Paradigms serve as metaphysical frameworks that guide researchers in the identification and clarification of their beliefs with regard to ethics, reality, knowledge, and methodology. The transformative paradigm is explained and illustrated as a framework for researchers who place a priority on social justice and the furtherance of human rights.…

  10. Designing Energy Supply Chains with the P-graph Framework under Cost Constraints and Sustainability Considerations

    EPA Science Inventory

    A computer-aided methodology for designing sustainable supply chains is presented using the P-graph framework to develop supply chain structures which are analyzed using cost, the cost of producing electricity, and two sustainability metrics: ecological footprint and emergy. They...

  12. Analyzing Agricultural Technology Systems: A Research Report.

    ERIC Educational Resources Information Center

    Swanson, Burton E.

    The International Program for Agricultural Knowledge Systems (INTERPAKS) research team is developing a descriptive and analytic framework to examine and assess agricultural technology systems. The first part of the framework is an inductive methodology that organizes data collection and orders data for comparison between countries. It requires and…

  13. Frameworks of Managerial Competence: Limits, Problems and Suggestions

    ERIC Educational Resources Information Center

    Ruth, Damian

    2006-01-01

    Purpose: To offer a coherent critique of the concept of managerial frameworks of competence through the exploration of the problems of generalizability and abstraction and the "scientific" assumptions of management. Design/methodology/approach: Employs the ecological metaphor of intellectual landscape and extends it to examining the…

  14. Satellite-based terrestrial production efficiency modeling

    PubMed Central

    McCallum, Ian; Wagner, Wolfgang; Schmullius, Christiane; Shvidenko, Anatoly; Obersteiner, Michael; Fritz, Steffen; Nilsson, Sten

    2009-01-01

    Production efficiency models (PEMs) are based on the theory of light use efficiency (LUE) which states that a relatively constant relationship exists between photosynthetic carbon uptake and radiation receipt at the canopy level. Challenges remain however in the application of the PEM methodology to global net primary productivity (NPP) monitoring. The objectives of this review are as follows: 1) to describe the general functioning of six PEMs (CASA; GLO-PEM; TURC; C-Fix; MOD17; and BEAMS) identified in the literature; 2) to review each model to determine potential improvements to the general PEM methodology; 3) to review the related literature on satellite-based gross primary productivity (GPP) and NPP modeling for additional possibilities for improvement; and 4) based on this review, propose items for coordinated research. This review noted a number of possibilities for improvement to the general PEM architecture - ranging from LUE to meteorological and satellite-based inputs. Current PEMs tend to treat the globe similarly in terms of physiological and meteorological factors, often ignoring unique regional aspects. Each of the existing PEMs has developed unique methods to estimate NPP and the combination of the most successful of these could lead to improvements. It may be beneficial to develop regional PEMs that can be combined under a global framework. The results of this review suggest the creation of a hybrid PEM could bring about a significant enhancement to the PEM methodology and thus terrestrial carbon flux modeling. 
Key items topping the PEM research agenda identified in this review include the following: LUE should not be assumed constant, but should vary by plant functional type (PFT) or photosynthetic pathway; evidence is mounting that PEMs should incorporate diffuse radiation; relationships between satellite-derived variables and LUE, GPP and autotrophic respiration (Ra) should be pursued further; there is an urgent need for satellite-based biomass measurements to improve Ra estimation; and satellite-based soil moisture data could improve the determination of soil water stress. PMID:19765285
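The light-use-efficiency relation at the core of the PEMs reviewed above is Monteith's formulation, which can be sketched as follows; the stress scalars and parameter values are illustrative, and each PEM implements them differently.

    ```python
    # Sketch of Monteith's light-use-efficiency relation underlying the PEMs
    # reviewed above. Values are illustrative; real PEMs scale epsilon down
    # with temperature and water stress and differ in how Ra is estimated.
    def gpp(epsilon_max, fapar, par, t_scalar=1.0, w_scalar=1.0):
        """Gross primary productivity (g C m-2) from absorbed radiation.

        epsilon_max : maximum light-use efficiency (g C per MJ APAR)
        fapar       : fraction of PAR absorbed by the canopy (0-1)
        par         : incident photosynthetically active radiation (MJ m-2)
        t_scalar, w_scalar : temperature and soil-water stress reducers (0-1)
        """
        return epsilon_max * t_scalar * w_scalar * fapar * par

    def npp(gpp_value, ra):
        """Net primary productivity: GPP minus autotrophic respiration."""
        return gpp_value - ra

    g = gpp(epsilon_max=1.2, fapar=0.6, par=10.0, w_scalar=0.8)
    print(g)           # 5.76
    print(npp(g, 2.0)) # 3.76
    ```

    The agenda items above amount to refining each term: a PFT-dependent epsilon_max, diffuse-radiation-aware PAR, and satellite-informed Ra and w_scalar.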

  15. Balancing benefit and risk of medicines: a systematic review and classification of available methodologies.

    PubMed

    Mt-Isa, Shahrul; Hallgreen, Christine E; Wang, Nan; Callréus, Torbjörn; Genov, Georgy; Hirsch, Ian; Hobbiger, Stephen F; Hockley, Kimberley S; Luciani, Davide; Phillips, Lawrence D; Quartey, George; Sarac, Sinan B; Stoeckert, Isabelle; Tzoulaki, Ioanna; Micaleff, Alain; Ashby, Deborah

    2014-07-01

    The need for formal and structured approaches for benefit-risk assessment of medicines is increasing, as is the complexity of the scientific questions addressed before making decisions on the benefit-risk balance of medicines. We systematically collected, appraised and classified available benefit-risk methodologies to facilitate and inform their future use. A systematic review of publications identified benefit-risk assessment methodologies. Methodologies were appraised on their fundamental principles, features, graphical representations, assessability and accessibility. We created a taxonomy of methodologies to facilitate understanding and choice. We identified 49 methodologies, critically appraised and classified them into four categories: frameworks, metrics, estimation techniques and utility survey techniques. Eight frameworks describe qualitative steps in benefit-risk assessment and eight quantify benefit-risk balance. Nine metric indices include threshold indices to measure either benefit or risk; health indices measure quality-of-life over time; and trade-off indices integrate benefits and risks. Six estimation techniques support benefit-risk modelling and evidence synthesis. Four utility survey techniques elicit robust value preferences from stakeholders relevant to the benefit-risk decisions. Methodologies to help benefit-risk assessments of medicines are diverse and each is associated with different limitations and strengths. There is no 'one-size-fits-all' method, and a combination of methods may be needed for each benefit-risk assessment. The taxonomy introduced herein may guide choice of adequate methodologies. Finally, we recommend 13 of 49 methodologies for further appraisal for use in the real-life benefit-risk assessment of medicines. Copyright © 2014 John Wiley & Sons, Ltd.
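    As an illustration of the quantitative side of this taxonomy, a weighted-sum multi-criteria score, one common way of integrating benefits and risks, can be sketched as follows; the criteria, weights and scores are invented and do not correspond to any particular one of the 49 appraised methodologies.

    ```python
    # Hypothetical sketch of a weighted-sum multi-criteria benefit-risk score.
    # Criteria, weights and scores are invented for illustration only.
    def benefit_risk_score(criteria):
        """criteria: list of (weight, score) pairs with scores in [0, 1];
        risk criteria are entered as 1 minus the normalized risk.
        Weights must sum to 1."""
        total_weight = sum(w for w, _ in criteria)
        assert abs(total_weight - 1.0) < 1e-9, "weights must sum to 1"
        return sum(w * s for w, s in criteria)

    # hypothetical medicine: two benefit criteria and one inverted risk criterion
    drug_a = [(0.4, 0.8),      # efficacy
              (0.2, 0.9),      # quality of life
              (0.4, 1 - 0.3)]  # serious adverse events, inverted
    print(benefit_risk_score(drug_a))  # about 0.78
    ```

    Eliciting the weights is where the utility survey techniques in the taxonomy come in.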

  16. A general framework for the manual teleoperation of kinematically redundant space-based manipulators

    NASA Astrophysics Data System (ADS)

    Dupuis, Erick

    This thesis provides a general framework for the manual teleoperation of kinematically redundant space-based manipulators. It is proposed to break down the task of controlling the motion of a redundant manipulator into a sequence of manageable sub-tasks of lower dimension by imposing constraints on the motion of intermediate bodies of the manipulator. The manipulator then becomes a non-redundant kinematic chain, and the operator controls only a reduced number of degrees of freedom at any time. However, by appropriately changing the imposed constraints, the operator can use the full capability of the manipulator throughout the task. Also, by not restricting the point of teleoperation to the end effector but allowing direct control of intermediate bodies of the robot, it is possible to teleoperate a redundant robot of arbitrary kinematic architecture over its entire configuration space in a predictable and natural fashion. It is rigorously proven that this approach will always work for any kinematically redundant serial manipulator, regardless of its topology, its geometry, and the number of its excess degrees of freedom. Furthermore, a methodology is provided for the selection of task and constraint coordinates to ensure the absence of algorithmic rank-deficiencies. Two novel algorithms are provided for the symbolic determination of the rank-deficiency locus of rectangular Jacobian matrices: the Singular Vector Algorithm and the Recursive Sub-Determinant Algorithm. These algorithms are complementary: the former is more computationally efficient, the latter more robust.
The application of the methodology to sample cases of varying complexity has demonstrated its power and limitations: It has been shown to be powerful enough to generate complete sets of task/constraint coordinate pairs for realistic examples such as the Space Station Remote Manipulator System and a simplified version of the Special Purpose Dexterous Manipulator.

  17. Roadmap for Navy Family Research.

    DTIC Science & Technology

    1980-08-01

    ...methodological limitations, including: small, often non-representative or narrowly defined samples; inadequate statistical controls; inadequate... the Office of Naval Research by the Westinghouse Public Applied Systems Division, and is designed to provide the Navy with a systematic framework for...

  18. The Three Stages of Critical Policy Methodology: An Example from Curriculum Analysis

    ERIC Educational Resources Information Center

    Rata, Elizabeth

    2014-01-01

    The article identifies and discusses three stages in the critical policy methodology used in the sociology of education. These are: firstly, employing a political economy theoretical framework that identifies causal links between global forces and local developments; secondly, analysing educational policy within that theoretically conceptualised…

  19. Developing International Managers: The Contribution of Cultural Experience to Learning

    ERIC Educational Resources Information Center

    Townsend, Peter; Regan, Padraic; Li, Liang Liang

    2015-01-01

    Purpose: The purpose of this paper is to evaluate cultural experience as a learning strategy for developing international managers. Design/methodology/approach: Using an integrated framework, two quantitative studies, based on empirical methodology, are conducted. Study 1, with an undergraduate sample situated in the Asia Pacific, aimed to examine…

  20. Development of a Teaching Methodology for Undergraduate Human Development in Psychology

    ERIC Educational Resources Information Center

    Rodriguez, Maria A.; Espinoza, José M.

    2015-01-01

    The development of a teaching methodology for the undergraduate Psychology course Human Development II in a private university in Lima, Peru is described. The theoretical framework consisted of an integration of Citizen Science and Service Learning, with the application of Information and Communications Technology (ICT), specifically Wikipedia and…

  1. Applying Threshold Concepts to Finance Education

    ERIC Educational Resources Information Center

    Hoadley, Susan; Wood, Leigh N.; Tickle, Leonie; Kyng, Tim

    2016-01-01

    Purpose: The purpose of this paper is to investigate and identify threshold concepts that are the essential conceptual content of finance programmes. Design/Methodology/Approach: Conducted in three stages with finance academics and students, the study uses threshold concepts as both a theoretical framework and a research methodology. Findings: The…

  2. A Proposed Performance-Based System for Teacher Interactive Electronic Continuous Professional Development (TIE-CPD)

    ERIC Educational Resources Information Center

    Razak, Rafiza Abdul; Yusop, Farrah Dina; Idris, Aizal Yusrina; Al-Sinaiyah, Yanbu; Halili, Siti Hajar

    2016-01-01

    The paper introduces Teacher Interactive Electronic Continuous Professional Development (TIE-CPD), an online interactive training system. The framework and methodology of TIE-CPD are designed with functionalities comparable with existing e-training systems. The system design and development literature offers several methodology and framework…

  3. Capturing Individual Uptake: Toward a Disruptive Research Methodology

    ERIC Educational Resources Information Center

    Bastian, Heather

    2015-01-01

    This article presents and illustrates a qualitative research methodology for studies of uptake. It does so by articulating a theoretical framework for qualitative investigations of uptake and detailing a research study designed to invoke and capture students' uptakes in a first-year writing classroom. The research design sought to make uptake…

  4. Context variations and pluri-methodological issues concerning the expression of a social representation: the example of the Gypsy community.

    PubMed

    Piermattéo, Anthony; Lo Monaco, Grégory; Moreau, Laure; Girandola, Fabien; Tavani, Jean-Louis

    2014-11-20

    Within the field of social representations research, the "mute zone" hypothesis holds that some objects are characterized by counternormative content that people usually do not express under standard conditions of production. Within this framework, this study explores variations in expression about the Gypsy community following the manipulation of different contexts, and the issues associated with a pluri-methodological approach to data analysis. Two methodologies were combined: participants were asked to express themselves in public or in private, and the identity of the experimenter was also manipulated, as she presented herself as a Gypsy or not. Then, through a set of analyses based on methodological triangulation, we observed a recurrent modulation of the participants' answers. These analyses highlighted a greater incidence of the expression of counternormative elements when the context of expression was private, and especially when the experimenter did not present herself as a Gypsy (p < .01, ηp² = .06). These results are discussed in terms of the contribution of the methodologies employed and their comparison within the framework of the study of counternormative content.

  5. Development of a competency framework for optometrists with a specialist interest in glaucoma.

    PubMed

    Myint, J; Edgar, D F; Kotecha, A; Crabb, D P; Lawrenson, J G

    2010-09-01

    To develop a competency framework, using a modified Delphi methodology, for optometrists with a specialist interest in glaucoma, which would provide a basis for training and accreditation. A modified iterative Delphi technique was used with a 16-member panel consisting almost exclusively of sub-specialist optometrists and ophthalmologists. The first round involved scoring the relevance of a draft series of competencies using a 9-point Likert scale with a free-text option to modify any competency or suggest additional competencies. The revised framework was subjected to a second round of scoring and free-text comment. The Delphi process was followed by a face-to-face structured workshop to debate and agree the final framework. The version of the framework agreed at the workshop was sent out for a 4-month period of external stakeholder validation. There was a 100% response to round 1 and a 94% response to round 2. All panel members attended the workshop. The final version of the competency framework was validated by a subsequent stakeholder consultation and contained 19 competencies for the diagnosis of glaucoma and 7 further competencies for monitoring and treatment. Application of a consensus methodology consisting of a modified Delphi technique allowed the development of a competency framework for glaucoma specialisation by optometrists. This will help to shape the development of a speciality curriculum and could potentially be adapted for other healthcare professionals.
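    A consensus rule of the kind used in such Delphi rounds can be sketched as follows; the 70% agreement threshold on scores of 7-9 is a common convention, not necessarily the exact criterion used in this study, and the panel ratings are invented.

    ```python
    # Illustrative consensus rule for a Delphi round on a 9-point Likert
    # scale. The 70% / score-of-7-or-more convention is an assumption, not
    # necessarily the rule applied in the study above.
    def consensus_reached(ratings, high=7, threshold=0.7):
        """ratings: 1-9 Likert scores from the panel; consensus is declared
        when the proportion scoring in the high range reaches the threshold."""
        in_agreement = sum(1 for r in ratings if r >= high)
        return in_agreement / len(ratings) >= threshold

    panel = [9, 8, 7, 7, 8, 9, 6, 7, 8, 9, 7, 8, 5, 9, 7, 8]  # 16 members
    print(consensus_reached(panel))  # True: 14 of 16 rate the competency 7-9
    ```

    Competencies failing the rule are reworded from the free-text comments and rescored in the next round.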

  6. Development of a Neural Network-Based Renewable Energy Forecasting Framework for Process Industries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Soobin; Ryu, Jun-Hyung; Hodge, Bri-Mathias

    2016-06-25

    This paper presents a neural network-based forecasting framework for photovoltaic power (PV) generation as a decision-supporting tool to employ renewable energies in the process industry. The applicability of the proposed framework is illustrated by comparing its performance against other methodologies such as linear and nonlinear time series modelling approaches. A case study of an actual PV power plant in South Korea is presented.
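    A toy version of such a neural-network PV forecaster can be sketched as follows: a single hidden layer of random tanh features with a readout trained by gradient descent, fitted to a synthetic noisy daily solar curve rather than the Korean plant's data.

    ```python
    import math, random

    # Toy sketch of a neural-network PV forecaster: last 3 hours of output
    # predict the next hour. Data are a synthetic noisy daily solar curve,
    # not the plant measurements used in the paper.
    random.seed(0)
    pv = [max(0.0, math.sin((h % 24 - 6) / 12 * math.pi)) + random.gauss(0, 0.05)
          for h in range(24 * 10)]

    lags = 3
    samples = [(pv[i:i + lags], pv[i + lags]) for i in range(len(pv) - lags)]

    n_hidden = 8
    W1 = [[random.gauss(0, 1) for _ in range(lags)] for _ in range(n_hidden)]
    b1 = [random.gauss(0, 1) for _ in range(n_hidden)]

    def hidden(x):
        """Random tanh hidden layer, plus a constant bias feature."""
        return [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
                for row, b in zip(W1, b1)] + [1.0]

    w_out = [0.0] * (n_hidden + 1)

    def mse():
        return sum((sum(w * h for w, h in zip(w_out, hidden(x))) - y) ** 2
                   for x, y in samples) / len(samples)

    initial = mse()
    lr = 0.05
    for _ in range(500):  # full-batch gradient descent on the readout layer
        grad = [0.0] * (n_hidden + 1)
        for x, y in samples:
            h = hidden(x)
            err = sum(w * hi for w, hi in zip(w_out, h)) - y
            for j, hj in enumerate(h):
                grad[j] += 2 * err * hj / len(samples)
        w_out = [w - lr * g for w, g in zip(w_out, grad)]

    final = mse()
    print(initial, final)  # training error drops from the untrained baseline
    ```

    A comparison against a linear autoregression or a persistence forecast, as the paper does, would then quantify the value added by the hidden layer.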

  7. Auditory Hallucinations and the Brain's Resting-State Networks: Findings and Methodological Observations.

    PubMed

    Alderson-Day, Ben; Diederen, Kelly; Fernyhough, Charles; Ford, Judith M; Horga, Guillermo; Margulies, Daniel S; McCarthy-Jones, Simon; Northoff, Georg; Shine, James M; Turner, Jessica; van de Ven, Vincent; van Lutterveld, Remko; Waters, Flavie; Jardri, Renaud

    2016-09-01

    In recent years, there has been increasing interest in the potential for alterations to the brain's resting-state networks (RSNs) to explain various kinds of psychopathology. RSNs provide an intriguing new explanatory framework for hallucinations, which can occur in different modalities and population groups, but which remain poorly understood. This collaboration from the International Consortium on Hallucination Research (ICHR) reports on the evidence linking resting-state alterations to auditory hallucinations (AH) and provides a critical appraisal of the methodological approaches used in this area. In the report, we describe findings from resting connectivity fMRI in AH (in schizophrenia and nonclinical individuals) and compare them with findings from neurophysiological research, structural MRI, and research on visual hallucinations (VH). In AH, various studies show resting connectivity differences in left-hemisphere auditory and language regions, as well as atypical interaction of the default mode network and RSNs linked to cognitive control and salience. As the latter are also evident in studies of VH, this points to a domain-general mechanism for hallucinations alongside modality-specific changes to RSNs in different sensory regions. However, we also observed high methodological heterogeneity in the current literature, affecting the ability to make clear comparisons between studies. To address this, we provide some methodological recommendations and options for future research on the resting state and hallucinations. © The Author 2016. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center.

  8. Can metric-based approaches really improve multi-model climate projections? A perfect model framework applied to summer temperature change in France.

    NASA Astrophysics Data System (ADS)

    Boé, Julien; Terray, Laurent

    2014-05-01

    Ensemble approaches for climate change projections have become ubiquitous. Because of large model-to-model variations and, generally, lack of rationale for the choice of a particular climate model against others, it is widely accepted that future climate change and its impacts should not be estimated based on a single climate model. Generally, as a default approach, the multi-model ensemble mean (MMEM) is considered to provide the best estimate of climate change signals. The MMEM approach is based on the implicit hypothesis that all the models provide equally credible projections of future climate change. This hypothesis is unlikely to be true and ideally one would want to give more weight to more realistic models. A major issue with this alternative approach lies in the assessment of the relative credibility of future climate projections from different climate models, as they can only be evaluated against present-day observations: which present-day metric(s) should be used to decide which models are "good" and which models are "bad" in the future climate? Once a supposedly informative metric has been found, other issues arise. What is the best statistical method to combine multiple models results taking into account their relative credibility measured by a given metric? How to be sure in the end that the metric-based estimate of future climate change is not in fact less realistic than the MMEM? It is impossible to provide strict answers to those questions in the climate change context. Yet, in this presentation, we propose a methodological approach based on a perfect model framework that could bring some useful elements of answer to the questions previously mentioned. The basic idea is to take a random climate model in the ensemble and treat it as if it were the truth (results of this model, in both past and future climate, are called "synthetic observations"). 
Then, all the other members of the multi-model ensemble are used to derive, via a metric-based approach, a posterior estimate of climate change based on the synthetic observation of the metric. Finally, it is possible to compare the posterior estimate to the synthetic observation of future climate change to evaluate the skill of the method. The main objective of this presentation is to describe and apply this perfect model framework to test different methodological issues associated with non-uniform model weighting and similar metric-based approaches. The methodology presented is general, but will be applied to the specific case of summer temperature change in France, for which previous works have suggested potentially useful metrics associated with soil-atmosphere and cloud-temperature interactions. The relative performances of different simple statistical approaches to combining multiple model results based on metrics will be tested. The impact of ensemble size, observational errors, internal variability, and model similarity will be characterized. The potential improvements associated with metric-based approaches compared to the MMEM in terms of errors and uncertainties will be quantified.
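One cycle of the perfect-model test described above can be sketched as follows; the ensemble values are invented, and the Gaussian weighting kernel is just one simple choice among the statistical approaches discussed.

    ```python
    import math, random

    # Sketch of one cycle of the perfect-model test. Each "model" is reduced
    # to a (present-day metric, projected summer warming) pair; all numbers
    # are invented for illustration.
    def weighted_projection(obs_metric, models, sigma=0.5):
        """Weight each model by the distance of its metric from the
        (synthetic) observation; return the weighted mean projected change."""
        weights = [math.exp(-((m - obs_metric) / sigma) ** 2) for m, _ in models]
        total = sum(weights)
        return sum(w * dt for w, (_, dt) in zip(weights, models)) / total

    random.seed(1)
    # hypothetical ensemble: metric value, and a warming signal that is
    # partly predictable from the metric (as the useful-metric case assumes)
    ensemble = [(m, 3.0 + 0.8 * m + random.gauss(0, 0.3))
                for m in (random.gauss(0, 1) for _ in range(10))]

    truth_metric, truth_change = ensemble[0]   # one model plays the "truth"
    others = ensemble[1:]
    posterior = weighted_projection(truth_metric, others)
    mmem = sum(dt for _, dt in others) / len(others)
    print(posterior, mmem, truth_change)
    ```

    Repeating the cycle with every model in turn as the synthetic truth, and comparing the posterior and MMEM errors against truth_change, yields the skill assessment described above.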

  9. Progressive failure methodologies for predicting residual strength and life of laminated composites

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Allen, David H.; Obrien, T. Kevin

    1991-01-01

    Two progressive failure methodologies currently under development by the Mechanics of Materials Branch at NASA Langley Research Center are discussed. The damage tolerance/fail safety methodology developed by O'Brien is an engineering approach to ensuring adequate durability and damage tolerance by treating only delamination onset and the subsequent delamination accumulation through the laminate thickness. The continuum damage model developed by Allen and Harris employs continuum damage laws to predict laminate strength and life. The philosophy, mechanics framework, and current implementation status of each methodology are presented.

  10. MODELING FRAMEWORK FOR EVALUATING SEDIMENTATION IN STREAM NETWORKS: FOR USE IN SEDIMENT TMDL ANALYSIS

    EPA Science Inventory

    A modeling framework that can be used to evaluate sedimentation in stream networks is described. This methodology can be used to determine sediment Total Maximum Daily Loads (TMDLs) in sediment impaired waters, and provide the necessary hydrodynamic and sediment-related data t...

  11. New Educational Services Development: Framework for Technology Entrepreneurship Education at Universities in Egypt

    ERIC Educational Resources Information Center

    Abou-Warda, Sherein Hamed

    2016-01-01

    Purpose: The overall objective of the current study is to explore how universities can better developing new educational services. The purpose of this paper is to develop framework for technology entrepreneurship education (TEPE) within universities. Design/Methodology/Approach: Qualitative and quantitative research approaches were employed. This…

  12. A Framework for Implementing TQM in Higher Education Programs

    ERIC Educational Resources Information Center

    Venkatraman, Sitalakshmi

    2007-01-01

    Purpose: This paper aims to provide a TQM framework that stresses continuous improvements in teaching as a plausible means of TQM implementation in higher education programs. Design/methodology/approach: The literature survey of the TQM philosophies and the comparative analysis of TQM adoption in industry versus higher education provide the…

  13. Leveraging Competency Framework to Improve Teaching and Learning: A Methodological Approach

    ERIC Educational Resources Information Center

    Shankararaman, Venky; Ducrot, Joelle

    2016-01-01

    A number of engineering education programs have defined learning outcomes and course-level competencies, and conducted assessments at the program level to determine areas for continuous improvement. However, many of these programs have not implemented a comprehensive competency framework to support the actual delivery and assessment of an…

  14. A Study of the Inter-Organizational Behavior in Consortia. Final Report.

    ERIC Educational Resources Information Center

    Silverman, Robert J.

    In an attempt to formulate hypotheses and administrative guidelines for voluntary consortia in higher education, a heuristic framework was devised through which behavioral patterns of consortia member organizations and their representatives could be ascertained. The rationale, the framework, and the methodology of the study are first discussed.…

  15. A Methodological Framework to Analyze Stakeholder Preferences and Propose Strategic Pathways for a Sustainable University

    ERIC Educational Resources Information Center

    Turan, Fikret Korhan; Cetinkaya, Saadet; Ustun, Ceyda

    2016-01-01

    Building sustainable universities calls for participative management and collaboration among stakeholders. Combining analytic hierarchy and network processes (AHP/ANP) with statistical analysis, this research proposes a framework that can be used in higher education institutions for integrating stakeholder preferences into strategic decisions. The…

  16. Channeling the Innovation Stream: A Decision Framework for Selecting Emerging Technologies

    ERIC Educational Resources Information Center

    Sauer, Philip S.

    2010-01-01

    The proliferation of emerging technologies offers opportunity but also presents challenges to defense acquisition decision makers seeking to incorporate those technologies as part of the acquisition process. Assessment frameworks and methodologies found in the literature typically address the primary focus of a sponsoring organization's interest…

  17. Rigorous Measures of Implementation: A Methodological Framework for Evaluating Innovative STEM Programs

    ERIC Educational Resources Information Center

    Cassata-Widera, Amy; Century, Jeanne; Kim, Dae Y.

    2011-01-01

    The practical need for multidimensional measures of fidelity of implementation (FOI) of reform-based science, technology, engineering, and mathematics (STEM) instructional materials, combined with a theoretical need in the field for a shared conceptual framework that could support accumulating knowledge on specific enacted program elements across…

  18. Scholar-Craftsmanship: Question-Type, Epistemology, Culture of Inquiry, and Personality-Type in Dissertation Research Design

    ERIC Educational Resources Information Center

    Werner, Thomas P.; Rogers, Katrina S.

    2013-01-01

    "Scholar-Craftsmanship" (SC) is a quadrant methodological framework created to help social science doctoral students construct first-time dissertation research. The framework brackets and predicts how epistemological domains, cultures of inquiries, personality indicators, and research question--types can be correlated in dissertation…

  19. Finding the Intersection of the Learning Organization and Learning Transfer: The Significance of Leadership

    ERIC Educational Resources Information Center

    Kim, Jun Hee; Callahan, Jamie L.

    2013-01-01

    Purpose: This article aims to develop a conceptual framework delineating the key dimension of the learning organization which significantly influences learning transfer. Design/methodology/approach: The conceptual framework was developed by analyzing previous studies and synthesizing the results associated with the following four relationships:…

  20. 78 FR 64478 - Request for Comments on the Preliminary Cybersecurity Framework

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ... February 26, 2013, and a series of open public workshops. The preliminary Framework was developed in..., methodologies, procedures and processes that align policy, business, and technological approaches to address....nist.gov/itl/cyberframework.cfm . DATES: Comments must be received by 5:00 p.m. Eastern Time December...

  1. 76 FR 61100 - Notification of a Public Meeting of the Science Advisory Board Biogenic Carbon Emissions Panel

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ... demonstrated expertise in forestry, agriculture, measurement and carbon accounting methodologies, land use... draft Accounting Framework for Biogenic CO 2 Emissions from Stationary Sources (September 2011). DATES... review EPA's draft Accounting Framework for Biogenic CO 2 Emissions from Stationary Sources (September...

  2. A comparative analysis of protected area planning and management frameworks

    Treesearch

    Per Nilsen; Grant Tayler

    1997-01-01

    A comparative analysis of the Recreation Opportunity Spectrum (ROS), Limits of Acceptable Change (LAC), a Process for Visitor Impact Management (VIM), Visitor Experience and Resource Protection (VERP), and the Management Process for Visitor Activities (known as VAMP) decision frameworks examines their origins; methodology; use of factors, indicators, and standards;...

  3. An Instructional Design Framework to Improve Student Learning in a First-Year Engineering Class

    ERIC Educational Resources Information Center

    Yelamarthi, Kumar; Drake, Eron; Prewett, Matthew

    2016-01-01

    Increasingly, numerous universities have identified benefits of flipped learning environments and have been encouraging instructors to adapt such methodologies in their respective classrooms, at a time when departments are facing significant budget constraints. This article proposes an instructional design framework utilized to strategically…

  4. A Content Analysis of Multinationals' Web Communication Strategies: Cross-Cultural Research Framework and Pre-Testing.

    ERIC Educational Resources Information Center

    Okazaki, Shintaro; Alonso Rivas, Javier

    2002-01-01

    Discussion of research methodology for evaluating the degree of standardization in multinational corporations' online communication strategies across differing cultures focuses on a research framework for cross-cultural comparison of corporate Web pages, applying traditional advertising content study techniques. Describes pre-tests that examined…

  5. Socialization Experiences Resulting from Engineering Teaching Assistantships at Purdue University

    ERIC Educational Resources Information Center

    Mena, Irene B.

    2010-01-01

    The purpose of this study was to explore and understand the types of socialization experiences that result from engineering teaching assistantships. Using situated learning as the theoretical framework and phenomenology as the methodological framework, this study highlights the experiences of 28 engineering doctoral students who worked as…

  6. Evaluating Academic Journals without Impact Factors for Collection Management Decisions.

    ERIC Educational Resources Information Center

    Dilevko, Juris; Atkinson, Esther

    2002-01-01

    Discussion of evaluating academic journals for collection management decisions focuses on a methodological framework for evaluating journals not ranked by impact factors in Journal Citation Reports. Compares nonranked journals with ranked journals and then applies this framework to a case study in the field of medical science. (LRW)

  7. Developing a Transliteracies Framework for a Connected World

    ERIC Educational Resources Information Center

    Stornaiuolo, Amy; Smith, Anna; Phillips, Nathan C.

    2017-01-01

    This article introduces a transliteracies framework to conceptually account for the contingency and instability of literacy practices on the move and to offer a set of methodological tools for investigating these mobilities. Taking the paradox of mobility--the simultaneous restricting or regulation of movement that accompanies mobility--as its…

  8. Assessing Quality of Critical Thought in Online Discussion

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa; Baltes, Beate; Lynn, Laura Knight

    2009-01-01

    Purpose: The purpose of this paper is to describe a theoretically based coding framework for an integrated analysis and assessment of critical thinking in online discussion. Design/methodology/approach: The critical thinking assessment framework (TAF) is developed through review of theory and previous research, verified by comparing results to…

  9. Ethical Issues in Instructional Technology: An Exploratory Framework

    ERIC Educational Resources Information Center

    Lucey, Thomas A.; Grant, Michael M.

    2009-01-01

    Purpose: The purpose of this paper is to explore a framework for considering moral K-12 instructional technology. It seeks to examine the extent to which the development of technology policies considers and respects affected parties' interests. Design/methodology/approach: Interpreting morality as an economic concept that involves a reconciliation of…

  10. The Social Construction of Marital Commitment

    ERIC Educational Resources Information Center

    Byrd, Stephanie Ellen

    2009-01-01

    This paper articulates a theoretical framework for understanding how individuals orient themselves toward marital commitment. Using a life history interview methodology and interpretive framework, it examines the orientations toward marital commitment for a sample of women and men, single and married, between the ages of 28 and 35 (N = 75).…

  11. A Nonrigid Kernel-Based Framework for 2D-3D Pose Estimation and 2D Image Segmentation

    PubMed Central

    Sandhu, Romeil; Dambreville, Samuel; Yezzi, Anthony; Tannenbaum, Allen

    2013-01-01

    In this work, we present a nonrigid approach to jointly solving the tasks of 2D-3D pose estimation and 2D image segmentation. In general, most frameworks that couple both pose estimation and segmentation assume that one has exact knowledge of the 3D object. However, under nonideal conditions, this assumption may be violated if only a general class to which a given shape belongs is given (e.g., cars, boats, or planes). Thus, we propose to solve the 2D-3D pose estimation and 2D image segmentation via nonlinear manifold learning of 3D embedded shapes for a general class of objects or deformations for which one may not be able to associate a skeleton model. Thus, the novelty of our method is threefold: First, we present and derive a gradient flow for the task of nonrigid pose estimation and segmentation. Second, due to the possible nonlinear structures of one’s training set, we evolve the preimage obtained through kernel PCA for the task of shape analysis. Third, we show that the derivation for shape weights is general. This allows us to use various kernels, as well as other statistical learning methodologies, with only minimal changes needing to be made to the overall shape evolution scheme. In contrast with other techniques, we approach the nonrigid problem, which is an infinite-dimensional task, with a finite-dimensional optimization scheme. More importantly, we do not explicitly need to know the interaction between various shapes such as that needed for skeleton models as this is done implicitly through shape learning. We provide experimental results on several challenging pose estimation and segmentation scenarios. PMID:20733218
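    The kernel PCA step this abstract builds on can be sketched in a few lines. The following is a generic RBF-kernel PCA of a training set (an illustrative sketch only, not the authors' nonrigid pose/segmentation framework; the kernel choice and `gamma` value are assumptions):

```python
import numpy as np

def rbf_kernel_pca(X, n_components=2, gamma=1.0):
    """Project training samples onto their leading kernel principal components."""
    sq = np.sum(X ** 2, axis=1)
    # Gaussian (RBF) kernel matrix over the training set
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    # Center the kernel matrix in feature space
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition; eigh returns eigenvalues in ascending order
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = np.maximum(vals[idx], 0.0), vecs[:, idx]
    # Projections of the training points onto the kernel principal axes
    return vecs * np.sqrt(vals)
```

    The preimage evolution described in the abstract operates on top of exactly this kind of embedding.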

  12. An Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) from a Researcher’s or Analyst’s Perspective

    DTIC Science & Technology

    2014-12-01

    An Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) from a Researcher’s or Analyst’s Perspective, by Robert A. Sottilare and Anne M. Sinatra, 2014.

  13. Online epistemic communities: theoretical and methodological directions for understanding knowledge co-elaboration in new digital spaces.

    PubMed

    Détienne, Françoise; Barcellini, Flore; Baker, Michael; Burkhardt, Jean-Marie; Fréard, Dominique

    2012-01-01

    This paper presents, illustrates and discusses a generic framework for studying knowledge co-elaboration in online epistemic communities ("OECs"). Our approach is characterised by: considering knowledge co-elaboration as a design activity; distinguishing discussion and production spaces in OECs; characterising participation via the notion of role; fine-grained analyses of meaning, content and communicative functions in interactions. On this basis, three key issues for ergonomics research on OECs are discussed and illustrated by results from our previous studies on OSS and Wikipedia. One issue concerns the interrelation between design (task) and regulation. Whereas design task-oriented activity is distributed among participants, we illustrate that OECs function with specialised emerging roles of group regulation. However, the task-oriented activity also functions at an interpersonal level, as an interplay of knowledge-based discussion with negotiation of competencies. Another issue concerns the foci of activity on the (designed) knowledge object. Based on a generic task model, we illustrate asymmetry and distinctiveness in tasks' foci of participants. The last issue concerns how design-use mediation is ensured by specific forms of mediation roles in OECs. Finally we discuss the degree of generality of our framework and draw some perspectives for extending our framework to other OECs.

  14. Improved Bayesian Infrasonic Source Localization for regional infrasound

    DOE PAGES

    Blom, Philip S.; Marcillo, Omar; Arrowsmith, Stephen J.

    2015-10-20

    The Bayesian Infrasonic Source Localization (BISL) methodology is examined and simplified, providing a generalized method of estimating the source location and time for an infrasonic event, and the mathematical framework used therein is presented. The likelihood function describing an infrasonic detection used in BISL has been redefined to include the von Mises distribution developed in directional statistics and propagation-based, physically derived celerity-range and azimuth deviation models. Frameworks for constructing propagation-based celerity-range and azimuth deviation statistics are presented to demonstrate how stochastic propagation modelling methods can be used to improve the precision and accuracy of the posterior probability density function describing the source localization. Infrasonic signals recorded at a number of arrays in the western United States produced by rocket motor detonations at the Utah Test and Training Range are used to demonstrate the application of the new mathematical framework and to quantify the improvement obtained by using the stochastic propagation modelling methods. Moreover, using propagation-based priors, the spatial and temporal confidence bounds of the source decreased by more than 40 per cent in all cases and by as much as 80 per cent in one case. Further, the accuracy of the estimates remained high, keeping the ground truth within the 99 per cent confidence bounds for all cases.
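    The von Mises likelihood for azimuth deviations mentioned in this abstract is easy to write down. The sketch below combines hypothetical back-azimuth observations into a grid posterior under a flat prior; the concentration parameter `kappa` and the observation values are made-up illustrations, not BISL itself:

```python
import numpy as np

def von_mises_pdf(theta, mu, kappa):
    """Von Mises density: exp(kappa*cos(theta-mu)) / (2*pi*I0(kappa))."""
    return np.exp(kappa * np.cos(theta - mu)) / (2.0 * np.pi * np.i0(kappa))

def azimuth_posterior(grid, observed, kappa=10.0):
    """Grid posterior over source back-azimuth: independent detections are
    combined multiplicatively under a flat prior, then normalized to sum to 1."""
    post = np.ones_like(grid)
    for obs in observed:
        post *= von_mises_pdf(grid, obs, kappa)
    return post / post.sum()
```

    In BISL the azimuth term is only one factor; celerity-range models enter the likelihood the same multiplicative way.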

  15. From framework to action: the DESIRE approach to combat desertification.

    PubMed

    Hessel, R; Reed, M S; Geeson, N; Ritsema, C J; van Lynden, G; Karavitis, C A; Schwilch, G; Jetten, V; Burger, P; van der Werff Ten Bosch, M J; Verzandvoort, S; van den Elsen, E; Witsenburg, K

    2014-11-01

    It has become increasingly clear that desertification can only be tackled through a multi-disciplinary approach that not only involves scientists but also stakeholders. In the DESIRE project such an approach was taken. As a first step, a conceptual framework was developed in which the factors and processes that may lead to land degradation and desertification were described. Many of these factors do not work independently, but can reinforce or weaken one another, and to illustrate these relationships sustainable management and policy feedback loops were included. This conceptual framework can be applied globally, but can also be made site-specific to take into account that each study site has a unique combination of bio-physical, socio-economic and political conditions. Once the conceptual framework was defined, a methodological framework was developed in which the methodological steps taken in the DESIRE approach were listed and their logic and sequence were explained. The last step was to develop a concrete working plan to put the project into action, involving stakeholders throughout the process. This series of steps, in full or in part, offers explicit guidance for other organizations or projects that aim to reduce land degradation and desertification.

  16. A general framework for complete positivity

    NASA Astrophysics Data System (ADS)

    Dominy, Jason M.; Shabani, Alireza; Lidar, Daniel A.

    2016-01-01

    Complete positivity of quantum dynamics is often viewed as a litmus test for physicality; yet, it is well known that correlated initial states need not give rise to completely positive evolutions. This observation spurred numerous investigations over the past two decades attempting to identify necessary and sufficient conditions for complete positivity. Here, we describe a complete and consistent mathematical framework for the discussion and analysis of complete positivity for correlated initial states of open quantum systems. This formalism is built upon a few simple axioms and is sufficiently general to contain all prior methodologies going back to Pechukas (Phys Rev Lett 73:1060-1062, 1994). The key observation is that initial system-bath states with the same reduced state on the system must evolve under all admissible unitary operators to system-bath states with the same reduced state on the system, in order to ensure that the induced dynamical maps on the system are well defined. Once this consistency condition is imposed, related concepts such as the assignment map and the dynamical maps are uniquely defined. In general, the dynamical maps may not be applied to arbitrary system states, but only to those in an appropriately defined physical domain. We show that the constrained nature of the problem gives rise to not one but three inequivalent types of complete positivity. Using this framework, we elucidate the limitations of recent attempts to provide conditions for complete positivity using quantum discord and the quantum data processing inequality. In particular, we correct the claim made by two of us (Shabani and Lidar in Phys Rev Lett 102:100402-100404, 2009) that vanishing discord is necessary for complete positivity, and explain that it is valid only for a particular class of initial states. The problem remains open, and may require fresh perspectives and new mathematical tools. The formalism presented herein may be one step in that direction.
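    The consistency condition described in this abstract can be stated compactly (the notation below is mine, paraphrasing the abstract's wording): two initial system-bath states that agree on the system must be carried by every admissible unitary to states that still agree on the system.

```latex
\operatorname{Tr}_B \rho_{SB} = \operatorname{Tr}_B \rho'_{SB}
\;\Longrightarrow\;
\operatorname{Tr}_B\!\left[ U \rho_{SB} U^\dagger \right]
  = \operatorname{Tr}_B\!\left[ U \rho'_{SB} U^\dagger \right]
\quad \text{for all admissible } U .
```

    Only under this condition is the induced dynamical map on the system well defined.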

  17. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping them to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.
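    Steps 2 through 5 of this method are essentially table lookups, which can be sketched as follows. The mapping tables here are hypothetical placeholders for illustration, not content from the paper:

```python
# Hypothetical mapping tables (illustrative only, not from the paper).
QUALITY_TO_CRITERIA = {
    "high reliability": ["fault tolerance", "testability"],
    "ease of change":   ["modularity"],
}
CRITERIA_TO_PROCESSES = {
    "fault tolerance": ["design reviews", "failure-mode analysis"],
    "testability":     ["unit test plans"],
    "modularity":      ["interface specifications"],
}

def tailor_life_cycle(quality_needs):
    """Map a quality-needs profile to quality criteria (step 3), then to the
    accepted processes and products that achieve each criterion (step 4)."""
    criteria = sorted({c for q in quality_needs for c in QUALITY_TO_CRITERIA.get(q, [])})
    processes = sorted({p for c in criteria for p in CRITERIA_TO_PROCESSES.get(c, [])})
    return criteria, processes
```

    The remaining steps (selecting information products and a design methodology) would chain further lookups onto the returned process set.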

  19. General practitioners' decisions about discontinuation of medication: an explorative study.

    PubMed

    Nixon, Michael Simon; Vendelø, Morten Thanning

    2016-06-20

    Purpose - The purpose of this paper is to investigate how general practitioners' (GPs) decisions about discontinuation of medication are influenced by their institutional context. Design/methodology/approach - In total, 24 GPs were interviewed, three practices were observed and documents were collected. The Gioia methodology was used to analyse data, drawing on a theoretical framework that integrates the sensemaking perspective and institutional theory. Findings - Most GPs, who actively consider discontinuation, are reluctant to discontinue medication, because the safest course of action for GPs is to continue prescriptions, rather than discontinue them. The authors conclude that this is in part due to the ambiguity about the appropriateness of discontinuing medication, experienced by the GPs, and in part because the clinical guidelines do not encourage discontinuation of medication, as they offer GPs a weak frame for discontinuation. Three reasons for this are identified: the guidelines provide dominating triggers for prescribing, they provide weak priming for discontinuation as an option, and they underscore a cognitive constraint against discontinuation. Originality/value - The analysis offers new insights about decision making when discontinuing medication. It also offers one of the first examinations of how the institutional context embedding GPs influences their decisions about discontinuation. For policymakers interested in the discontinuation of medication, the findings suggest that de-stigmatising discontinuation on an institutional level may be beneficial, allowing GPs to better justify discontinuation in light of the ambiguity they experience.

  20. The Researching on Evaluation of Automatic Voltage Control Based on Improved Zoning Methodology

    NASA Astrophysics Data System (ADS)

    Xiao-jun, ZHU; Ang, FU; Guang-de, DONG; Rui-miao, WANG; De-fen, ZHU

    2018-03-01

    As power systems grow in size and structural complexity, hierarchically structured automatic voltage control (AVC) has become a research focus. In the paper, the reduced control model is built and an adaptive reduced control model is investigated to improve the voltage control effect. The theories of HCSD, HCVS, SKC and FCM are introduced, and the effect of different zoning methodologies on coordinated voltage regulation is also examined. A generic framework for evaluating the performance of coordinated voltage regulation is built. Finally, the IEEE-96 system is used to divide the network. The 2383-bus Polish system is used to verify that the selection of a zoning methodology affects not only the coordinated voltage regulation operation but also its robustness to erroneous data, and a comprehensive generic framework for evaluating its performance is proposed. The New England 39-bus network is used to verify the adaptive reduced control models’ performance.

  1. Hazmat transport: a methodological framework for the risk analysis of marshalling yards.

    PubMed

    Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino

    2007-08-17

    A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for the assessment of expected release frequencies and equivalent release diameters, based on the application of HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to the assessment of an extended case-study. The results evidenced that "non-accident-induced" leaks in marshalling yards represent an important contribution to the overall risk associated with these zones. Furthermore, the results confirmed the considerable contribution of these fixed installations to the overall risk associated with "hazmat" transportation.

  2. Increasing the applicability of wind power projects via a multi-criteria approach: methodology and case study

    NASA Astrophysics Data System (ADS)

    Polatidis, Heracles; Morales, Jan Borràs

    2016-11-01

    In this paper a methodological framework for increasing the actual applicability of wind farms is developed and applied. The framework is based on multi-criteria decision aid techniques that perform an integrated technical and societal evaluation of a number of potential wind power projects, each a variation of a pre-existing actual proposal that faces implementation difficulties. A number of evaluation criteria are established and assessed via related software or are comparatively evaluated among each other on a semi-qualitative basis. The preferences of a diverse audience of pertinent stakeholders can also be incorporated in the overall analysis. The result of the process is the identification of a new project that will exhibit increased implementation potential compared with the original proposal. The methodology is tested in a case study of a wind farm in the UK and relevant conclusions are drawn.

  3. Time, frequency, and time-varying Granger-causality measures in neuroscience.

    PubMed

    Cekic, Sezen; Grandjean, Didier; Renaud, Olivier

    2018-05-20

    This article proposes a systematic methodological review and an objective criticism of existing methods enabling the derivation of time, frequency, and time-varying Granger-causality statistics in neuroscience. The capacity to describe the causal links between signals recorded at different brain locations during a neuroscience experiment is indeed of primary interest for neuroscientists, who often have very precise prior hypotheses about the relationships between recorded brain signals. The increasing interest and the huge number of publications related to this topic calls for this systematic review, which describes the very complex methodological aspects underlying the derivation of these statistics. In this article, we first present a general framework that allows us to review and compare Granger-causality statistics in the time domain, and the link with transfer entropy. Then, the spectral and the time-varying extensions are exposed and discussed together with their estimation and distributional properties. Although not the focus of this article, partial and conditional Granger causality, dynamical causal modelling, directed transfer function, directed coherence, partial directed coherence, and their variants are also mentioned.
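    The time-domain Granger statistic reviewed here amounts to comparing a restricted autoregression of a signal on its own past against a full model that adds the other signal's past. A generic NumPy sketch (the AR order and the simulated coupling below are arbitrary illustrations, not from the article):

```python
import numpy as np

def granger_f(x, y, p=2):
    """F-statistic for 'y Granger-causes x': an AR(p) model of x on its own
    past (restricted) versus one augmented with p lags of y (full)."""
    n = len(x)
    T = n - p
    target = x[p:]
    xlags = np.column_stack([x[p - k : n - k] for k in range(1, p + 1)])
    ylags = np.column_stack([y[p - k : n - k] for k in range(1, p + 1)])
    ones = np.ones((T, 1))
    A_r = np.hstack([ones, xlags])          # restricted: own past only
    A_f = np.hstack([ones, xlags, ylags])   # full: adds the other signal's past
    rss = lambda A: np.sum((target - A @ np.linalg.lstsq(A, target, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(A_r), rss(A_f)
    return ((rss_r - rss_f) / p) / (rss_f / (T - 2 * p - 1))
```

    The statistic is referred to an F(p, T-2p-1) distribution; the spectral and time-varying extensions discussed in the article generalize this same model comparison.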

  4. Multi-Disciplinary Knowledge Synthesis for Human Health Assessment on Earth and in Space

    NASA Astrophysics Data System (ADS)

    Christakos, G.

    We discuss methodological developments in multi-disciplinary knowledge synthesis (KS) of human health assessment. A theoretical KS framework can provide the rational means for the assimilation of various information bases (general, site-specific etc.) that are relevant to the life system of interest. KS-based techniques produce a realistic representation of the system, provide a rigorous assessment of the uncertainty sources, and generate informative health state predictions across space-time. The underlying epistemic cognition methodology is based on teleologic criteria and stochastic logic principles. The mathematics of KS involves a powerful and versatile spatiotemporal random field model that accounts rigorously for the uncertainty features of the life system and imposes no restriction on the shape of the probability distributions or the form of the predictors. KS theory is instrumental in understanding natural heterogeneities, assessing crucial human exposure correlations and laws of physical change, and explaining toxicokinetic mechanisms and dependencies in a spatiotemporal life system domain. It is hoped that a better understanding of KS fundamentals would generate multi-disciplinary models that are useful for the maintenance of human health on Earth and in Space.

  5. Network support for turn-taking in multimedia collaboration

    NASA Astrophysics Data System (ADS)

    Dommel, Hans-Peter; Garcia-Luna-Aceves, Jose J.

    1997-01-01

    The effectiveness of collaborative multimedia systems depends on the regulation of access to their shared resources, such as continuous media or instruments used concurrently by multiple parties. Existing applications use only simple protocols to mediate such resource contention. Their cooperative rules follow a strict agenda and are largely application-specific. The inherent problem of floor control lacks a systematic methodology. This paper presents a general model of floor control for correct, scalable, fine-grained and fair resource sharing that integrates user interaction with network conditions, and adaptation to various media types. The notion of turn-taking, known from psycholinguistics in studies of discourse structure, is adapted for this framework. Viewed as a computational analogy to speech communication, online collaboration revolves around dynamically allocated access permissions called floors. The control semantics of floors derives from concurrency control methodology. An explicit specification and verification of a novel distributed Floor Control Protocol are presented. Hosts assume sharing roles that allow for efficient dissemination of control information, agreeing on a floor holder which is granted mutually exclusive access to a resource. Performance analytic aspects of floor control protocols are also briefly discussed.
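    The core floor-control idea, a mutually exclusive turn-taking token with queued requesters, fits in a few lines. This is an illustrative single-process sketch, not the paper's distributed Floor Control Protocol (which must also agree on the holder across hosts):

```python
class FloorControl:
    """Minimal floor manager: one holder at a time, FIFO queue of requesters."""

    def __init__(self):
        self.holder = None
        self.queue = []

    def request(self, user):
        """Grant the floor immediately if free; otherwise queue the requester."""
        if self.holder is None:
            self.holder = user
            return True
        if user != self.holder and user not in self.queue:
            self.queue.append(user)
        return False

    def release(self, user):
        """Release the floor and pass it to the next queued requester, if any."""
        if user != self.holder:
            raise ValueError("only the current holder may release the floor")
        self.holder = self.queue.pop(0) if self.queue else None
        return self.holder
```

    A distributed version replaces the in-memory queue with control messages among hosts, which is where the dissemination roles described in the abstract come in.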

  6. Structural zeros in high-dimensional data with applications to microbiome studies.

    PubMed

    Kaul, Abhishek; Davidov, Ori; Peddada, Shyamal D

    2017-07-01

    This paper is motivated by the recent interest in the analysis of high-dimensional microbiome data. A key feature of these data is the presence of "structural zeros" which are microbes missing from an observation vector due to an underlying biological process and not due to error in measurement. Typical notions of missingness are unable to model these structural zeros. We define a general framework which allows for structural zeros in the model and propose methods of estimating sparse high-dimensional covariance and precision matrices under this setup. We establish error bounds in the spectral and Frobenius norms for the proposed estimators and empirically verify them with a simulation study. The proposed methodology is illustrated by applying it to the global gut microbiome data of Yatsunenko and others (2012. Human gut microbiome viewed across age and geography. Nature 486, 222-227). Using our methodology we classify subjects according to the geographical location on the basis of their gut microbiome. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
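The abstract does not give the estimators in closed form. As a hedged illustration of the general idea, the sketch below computes a pairwise-complete covariance over entries not flagged as structural zeros and hard-thresholds small off-diagonal entries to induce sparsity; hard thresholding is a standard device in sparse covariance estimation, not necessarily the authors' estimator, and all names are invented.

```python
def sparse_cov(data, present, tau):
    """Pairwise-complete covariance with hard thresholding (sketch).

    data[i][j]   : abundance of feature j (e.g. a microbe) in sample i
    present[i][j]: False if the entry is a structural zero (the feature
                   is absent for biological reasons), True otherwise
    tau          : threshold; off-diagonal entries |s| < tau are set to 0
    """
    p = len(data[0])
    cov = [[0.0] * p for _ in range(p)]
    for j in range(p):
        for k in range(j, p):
            # use only samples where neither entry is a structural zero
            pairs = [(row[j], row[k]) for row, ok in zip(data, present)
                     if ok[j] and ok[k]]
            n = len(pairs)
            if n < 2:
                s = 0.0
            else:
                mj = sum(a for a, _ in pairs) / n
                mk = sum(b for _, b in pairs) / n
                s = sum((a - mj) * (b - mk) for a, b in pairs) / (n - 1)
            if j != k and abs(s) < tau:
                s = 0.0              # hard threshold -> sparse estimate
            cov[j][k] = cov[k][j] = s
    return cov
```

The key point mirrored from the abstract is that structural zeros are excluded from estimation rather than treated as observed zero abundances.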

  7. Explosion and/or fire risk assessment methodology: a common approach, structured for underground coalmine environments / Metoda szacowania ryzyka wybuchu i pożarów: podejście ogólne, dostosowane do środowiska kopalni podziemnej

    NASA Astrophysics Data System (ADS)

    Cioca, Ionel-Lucian; Moraru, Roland Iosif

    2012-10-01

In order to meet statutory requirements concerning workers' health and safety, mine managers within the Valea Jiului coal basin in Romania must address the potential for underground fires and explosions and their impact on the workforce and the mine ventilation systems. Highlighting the need for a unified and systematic approach to these specific risks, the authors develop a general framework for fire/explosion risk assessment in gassy mines, based on quantifying the likelihood of occurrence and the gravity of the consequences of such undesired events and employing the root-cause analysis method. It is emphasized that even a small fire should be regarded as a major hazard from the point of view of explosion initiation, should a combustible atmosphere arise. The developed methodology for the assessment of underground fire and explosion risks is based on the known underground explosion hazards, fire engineering principles, and fire test criteria for potentially combustible materials employed in mines.

  8. Applying Bayesian statistics to the study of psychological trauma: A suggestion for future research.

    PubMed

    Yalch, Matthew M

    2016-03-01

Several contemporary researchers have noted the virtues of Bayesian methods of data analysis. Although debates continue about whether conventional or Bayesian statistics is the "better" approach for researchers in general, there are reasons why Bayesian methods may be well suited to the study of psychological trauma in particular. This article describes how Bayesian statistics offers practical solutions to the problems of data non-normality, small sample size, and missing data common in research on psychological trauma. After a discussion of these problems and the effects they have on trauma research, this article explains the basic philosophical and statistical foundations of Bayesian statistics and how it provides solutions to these problems using an applied example. Results of the literature review and the accompanying example indicate the utility of Bayesian statistics in addressing problems common in trauma research. Bayesian statistics provides a set of methodological tools and a broader philosophical framework that is useful for trauma researchers. Methodological resources are also provided so that interested readers can learn more. (c) 2016 APA, all rights reserved.
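As a generic illustration of the small-sample point (not the article's own applied example), a conjugate Beta-Binomial update shows how a prior stabilizes an estimate from a hypothetical small trauma sample; the numbers are invented.

```python
def beta_binomial_posterior(successes, n, a_prior=1.0, b_prior=1.0):
    """Conjugate update: Beta(a, b) prior + Binomial data -> Beta posterior.

    Returns the posterior (a, b) and the posterior mean. With small n,
    the posterior mean is shrunk toward the prior mean rather than the
    raw sample proportion -- one practical benefit of a Bayesian
    approach when trauma samples are small.
    """
    a_post = a_prior + successes
    b_post = b_prior + (n - successes)
    return a_post, b_post, a_post / (a_post + b_post)

# Hypothetical small sample: 9 of 10 patients report symptom relief.
a, b, mean = beta_binomial_posterior(9, 10)   # uniform Beta(1, 1) prior
# Posterior is Beta(10, 2); its mean, 10/12, lies between the raw
# proportion 0.9 and the prior mean 0.5.
```

With a larger sample the posterior mean approaches the sample proportion, so the prior's influence fades exactly when the data can stand on their own.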

  9. First-principles study of metallic iron interfaces

    NASA Astrophysics Data System (ADS)

    Hung, A.; Yarovsky, I.; Muscat, J.; Russo, S.; Snook, I.; Watts, R. O.

    2002-04-01

    Adhesion between clean, bulk-terminated bcc Fe(1 0 0) and Fe(1 1 0) matched and mismatched surfaces was simulated within the theoretical framework of the density functional theory. The generalized-gradient spin approximation exchange-correlation functional was used in conjunction with a plane wave-ultrasoft pseudopotential representation. The structure and properties of bulk bcc Fe were calculated in order to establish the reliability of the methodology employed, as well as to determine suitably converged values of computational parameters to be used in subsequent surface calculations. Interfaces were modelled using a single supercell approach, with the interfacial separation distance manipulated by the size of vacuum separation between vertically adjacent surface cells. The adhesive energies at discrete interfacial separations were calculated for each interface and the resulting data fitted to the universal binding energy relation (UBER) of Rose et al. [Phys. Rev. Lett. 47 (1981) 675]. An interpretation of the values of the fitted UBER parameters for the four Fe interfaces studied is given. In addition, a discussion on the validity of the employed computational methodology is presented.

  10. An automatic and effective parameter optimization method for model tuning

    NASA Astrophysics Data System (ADS)

    Zhang, T.; Li, L.; Lin, Y.; Xue, W.; Xie, F.; Xu, H.; Huang, X.

    2015-11-01

Physical parameterizations in general circulation models (GCMs), having various uncertain parameters, greatly impact model performance and model climate sensitivity. Traditional manual and empirical tuning of these parameters is time-consuming and ineffective. In this study, a "three-step" methodology is proposed to automatically and effectively obtain the optimum combination of some key parameters in cloud and convective parameterizations according to comprehensive objective evaluation metrics. Unlike traditional optimization methods, two extra steps, one determining the model's sensitivity to the parameters and the other choosing the optimum initial values for those sensitive parameters, are introduced before the downhill simplex method. This new method reduces the number of parameters to be tuned and accelerates the convergence of the downhill simplex method. Atmospheric GCM simulation results show that the optimum combination of these parameters determined using this method is able to improve the model's overall performance by 9 %. The proposed methodology and software framework can be easily applied to other GCMs to speed up the model development process, especially regarding the unavoidable comprehensive parameter tuning during the model development stage.
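The three-step workflow can be sketched on a toy objective. This is an assumption-laden stand-in: a simple coordinate search replaces the downhill simplex used in the paper, the objective stands in for the model-skill metric, and all names are hypothetical.

```python
def tune(objective, params, deltas, grid):
    """Sketch of the 'three-step' tuning workflow.

    1. Sensitivity screening: perturb each parameter one at a time and
       keep only those that move the objective noticeably.
    2. Initial values: evaluate each sensitive parameter on a coarse
       grid (others fixed) and start from the best grid point.
    3. Local refinement: coordinate search with shrinking step sizes
       (standing in for the downhill simplex step of the paper).
    """
    base = objective(params)
    # Step 1: keep parameters whose perturbation changes the objective.
    sensitive = []
    for name, d in deltas.items():
        trial = dict(params, **{name: params[name] + d})
        if abs(objective(trial) - base) > 1e-6:
            sensitive.append(name)
    # Step 2: coarse grid search, one sensitive parameter at a time.
    best = dict(params)
    for name in sensitive:
        vals = {v: objective(dict(best, **{name: v})) for v in grid[name]}
        best[name] = min(vals, key=vals.get)
    # Step 3: coordinate search with shrinking steps.
    step = {n: deltas[n] for n in sensitive}
    for _ in range(50):
        for name in sensitive:
            for cand in (best[name] - step[name], best[name] + step[name]):
                if objective(dict(best, **{name: cand})) < objective(best):
                    best[name] = cand
            step[name] *= 0.7
    return best

# Toy objective with minimum at a=2, b=-1; "c" has no effect at all.
obj = lambda p: (p["a"] - 2.0) ** 2 + (p["b"] + 1.0) ** 2 + 0.0 * p["c"]
tuned = tune(obj, {"a": 0.0, "b": 0.0, "c": 5.0},
             {"a": 0.1, "b": 0.1, "c": 0.1},
             {"a": [-2, 0, 2, 4], "b": [-2, -1, 0, 1]})
# "c" is insensitive, so it is screened out and left untouched.
```

The screening step is what keeps the refinement cheap: only the parameters the model actually responds to enter the expensive search, mirroring the paper's rationale.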

  11. Network representation of protein interactions: Theory of graph description and analysis.

    PubMed

    Kurzbach, Dennis

    2016-09-01

A methodological framework is presented for the graph theoretical interpretation of NMR data of protein interactions. The proposed analysis generalizes the idea of network representations of protein structures by expanding it to protein interactions. This approach is based on regularization of residue-resolved NMR relaxation times and chemical shift data and subsequent construction of an adjacency matrix that represents the underlying protein interaction as a graph or network. The network nodes represent protein residues. Two nodes are connected if two residues are functionally correlated during the protein interaction event. The analysis of the resulting network enables the quantification of the importance of each amino acid of a protein for its interactions. Furthermore, the determination of the pattern of correlations between residues yields insights into the functional architecture of an interaction. This is of special interest for intrinsically disordered proteins, since the structural (three-dimensional) architecture of these proteins and their complexes is difficult to determine. The power of the proposed methodology is demonstrated using the example of the interaction between the intrinsically disordered protein osteopontin and its natural ligand heparin. © 2016 The Protein Society.
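A minimal sketch of the graph-construction step, assuming per-residue response profiles as input; the regularization of relaxation and chemical-shift data described in the abstract is not reproduced, and the thresholding rule here is an invented simplification.

```python
def interaction_network(delta, threshold):
    """Build a residue network from per-residue NMR response profiles.

    delta[i] is an assumed response profile for residue i (e.g. chemical
    shift changes across titration points). Two residues are connected
    when their profiles are strongly correlated, i.e. they respond to
    the interaction event in a functionally coupled way.
    """
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        dx = sum((a - mx) ** 2 for a in x) ** 0.5
        dy = sum((b - my) ** 2 for b in y) ** 0.5
        return num / (dx * dy) if dx and dy else 0.0

    n = len(delta)
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if abs(corr(delta[i], delta[j])) >= threshold:
                adj[i][j] = adj[j][i] = 1
    # Degree centrality as a simple proxy for residue importance.
    degree = [sum(row) for row in adj]
    return adj, degree
```

Ranking residues by degree then gives the kind of per-residue importance score the abstract describes, here in its crudest possible form.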

  12. Feature Mining and Health Assessment for Gearboxes Using Run-Up/Coast-Down Signals

    PubMed Central

    Zhao, Ming; Lin, Jing; Miao, Yonghao; Xu, Xiaoqiang

    2016-01-01

Vibration signals measured in the run-up/coast-down (R/C) processes usually carry rich information about the health status of machinery. However, a major challenge in R/C signal analysis lies in how to exploit more diagnostic information, and how this information can be properly integrated to reach a more reliable maintenance decision. To address this problem, a framework of R/C signal analysis is presented for the health assessment of gearboxes. In the proposed methodology, we first investigate the data preprocessing and feature selection issues for R/C signals. Based on that, a sparsity-guided feature enhancement scheme is then proposed to extract the weak phase jitter associated with gear defects. To enable effective feature mining and integration under R/C conditions, a generalized phase demodulation technique is further established to reveal the evolution of the modulation feature with operating speed and rotation angle. The experimental results indicate that the proposed methodology can not only detect the presence of gear damage, but also offer a novel insight into the dynamic behavior of gearboxes. PMID:27827831

  13. Development of a rational scale to assess the harm of drugs of potential misuse.

    PubMed

    Nutt, David; King, Leslie A; Saulsbury, William; Blakemore, Colin

    2007-03-24

    Drug misuse and abuse are major health problems. Harmful drugs are regulated according to classification systems that purport to relate to the harms and risks of each drug. However, the methodology and processes underlying classification systems are generally neither specified nor transparent, which reduces confidence in their accuracy and undermines health education messages. We developed and explored the feasibility of the use of a nine-category matrix of harm, with an expert delphic procedure, to assess the harms of a range of illicit drugs in an evidence-based fashion. We also included five legal drugs of misuse (alcohol, khat, solvents, alkyl nitrites, and tobacco) and one that has since been classified (ketamine) for reference. The process proved practicable, and yielded roughly similar scores and rankings of drug harm when used by two separate groups of experts. The ranking of drugs produced by our assessment of harm differed from those used by current regulatory systems. Our methodology offers a systematic framework and process that could be used by national and international regulatory bodies to assess the harm of current and future drugs of abuse.

  14. Feature Mining and Health Assessment for Gearboxes Using Run-Up/Coast-Down Signals.

    PubMed

    Zhao, Ming; Lin, Jing; Miao, Yonghao; Xu, Xiaoqiang

    2016-11-02

Vibration signals measured in the run-up/coast-down (R/C) processes usually carry rich information about the health status of machinery. However, a major challenge in R/C signal analysis lies in how to exploit more diagnostic information, and how this information can be properly integrated to reach a more reliable maintenance decision. To address this problem, a framework of R/C signal analysis is presented for the health assessment of gearboxes. In the proposed methodology, we first investigate the data preprocessing and feature selection issues for R/C signals. Based on that, a sparsity-guided feature enhancement scheme is then proposed to extract the weak phase jitter associated with gear defects. To enable effective feature mining and integration under R/C conditions, a generalized phase demodulation technique is further established to reveal the evolution of the modulation feature with operating speed and rotation angle. The experimental results indicate that the proposed methodology can not only detect the presence of gear damage, but also offer a novel insight into the dynamic behavior of gearboxes.

  15. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data.

    PubMed

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-05-15

We develop statistical methodology for HARDI, a popular brain imaging technique, based on the high order tensor model of Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves, or fibers, which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense, as it blends linear algebra concepts from high order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids the typical limitations of deterministic tractography methods while delivering the same information as probabilistic tractography methods. Our method is computationally cheap, and it provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions, and tensors can be studied in a systematic and rigorous way.
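The fiber-tracing step that the estimators build on can be sketched as simple Euler integration of a direction field. This toy omits the paper's actual contributions (estimation of the field from HARDI signals and the confidence ellipsoids around the curve); the field and step sizes are invented.

```python
def trace_fiber(field, start, step=0.05, n_steps=200):
    """Trace an integral curve (fiber) through a direction field.

    field(p) returns the assumed principal diffusion direction at point
    p; the curve is advanced by unit-direction Euler steps of length
    `step`.
    """
    curve = [tuple(start)]
    p = list(start)
    for _ in range(n_steps):
        v = field(p)
        norm = sum(c * c for c in v) ** 0.5
        if norm == 0:
            break                      # stop where the field vanishes
        p = [pi + step * vi / norm for pi, vi in zip(p, v)]
        curve.append(tuple(p))
    return curve

# Toy field: constant diagonal direction -> the fiber is a straight line.
fiber = trace_fiber(lambda p: (1.0, 1.0), (0.0, 0.0), step=0.1, n_steps=10)
```

The statistical machinery in the paper quantifies how noise in the estimated field perturbs curves produced by exactly this kind of integration.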

  16. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data

    PubMed Central

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-01-01

We develop statistical methodology for HARDI, a popular brain imaging technique, based on the high order tensor model of Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves, or fibers, which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense, as it blends linear algebra concepts from high order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids the typical limitations of deterministic tractography methods while delivering the same information as probabilistic tractography methods. Our method is computationally cheap, and it provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions, and tensors can be studied in a systematic and rigorous way. PMID:25937674

  17. Child Sexual Abuse, Baby Gender, and Intergenerational Psychic Transmission: An Exploratory, Projective Psychoanalytic Approach.

    PubMed

    de Tychey, Claude; Vandelet, Elena; Laurent, Mélanie; Lighezzolo-Alnot, Joelle; Prudent, Cécile; Evrard, Renaud

    2016-04-01

The aim of this article is to present a French psychoanalytic model of how and to what extent the sequelae of sexual abuse by a male during a girl's childhood are transmitted to the next generation, as a function of the gender of the abused mother's children. The authors conducted a qualitative exploratory study based on the longitudinal follow-up of a woman who had two boys and a girl. They focused on the impact of two general sequelae: separation anxiety and negativity-disqualification of the paternal and/or male figures. From the methodological standpoint, they used a clinical interview to assess the mother, and a projective tool, a storytelling test, to assess the child's personality using content analysis. The results confirm both the merits of the theoretical framework and the relevance of the projective methodology for grasping sequelae transmitted to the child. The sequelae turned out to be markedly different for the two baby genders: rejection for the male, overprotection and ghostly encryption for the female. Avenues for using this tool and model in future quantitative, comparative studies are suggested.

  18. A philosophical analysis of the general methodology of qualitative research: a critical rationalist perspective.

    PubMed

    Rudnick, Abraham

    2014-09-01

Philosophical discussion of the general methodology of qualitative research, such as that used in some health research, has been inductivist or relativist to date, ignoring critical rationalism as a philosophical approach with which to discuss the general methodology of qualitative research. This paper presents a discussion of the general methodology of qualitative research from a critical rationalist perspective (inspired by Popper), using mental health research as an example. The widespread endorsement of induction in qualitative research is positivist and is suspect, if not false, particularly in relation to the context of justification (or rather theory testing) as compared to the context of discovery (or rather theory generation). Relativism is riddled with philosophical weaknesses, and hence it, too, is suspect if not false. Theory testing is compatible with qualitative research, contrary to much writing about and in qualitative research, as theory testing involves learning from trial and error, which is part of qualitative research and which may be the form of learning most conducive to generalization. Generalization involves comparison, which is a fundamental methodological requirement of any type of research (qualitative or other); hence the traditional grounding of quantitative and experimental research in generalization. Comparison--rather than generalization--is necessary for, and hence compatible with, qualitative research; hence, the common opposition to generalization in qualitative research is misdirected, disregarding whether this opposition's claims are true or false. In conclusion, qualitative research, similar to quantitative and experimental research, assumes comparison as a general methodological requirement, which is necessary for health research.

  19. Nearest neighbors by neighborhood counting.

    PubMed

    Wang, Hui

    2006-06-01

Finding nearest neighbors is a general idea that underlies many artificial intelligence tasks, including machine learning, data mining, natural language understanding, and information retrieval. This idea is explicitly used in the k-nearest neighbors algorithm (kNN), a popular classification method. In this paper, this idea is adopted in the development of a general methodology, neighborhood counting, for devising similarity functions. We turn our focus from neighbors to neighborhoods, a region in the data space covering the data point in question. To measure the similarity between two data points, we consider all neighborhoods that cover both data points, and we propose to use the number of such neighborhoods as a measure of similarity. Neighborhood can be defined for different types of data in different ways. Here, we consider one definition of neighborhood for multivariate data and derive a formula for such similarity, called the neighborhood counting measure, or NCM. NCM was tested experimentally in the framework of kNN. Experiments show that NCM is generally comparable to VDM and its variants, the state-of-the-art distance functions for multivariate data, and, at the same time, is consistently better for relatively large k values. Additionally, NCM consistently outperforms HEOM (a mixture of Euclidean and Hamming distances), the "standard" and most widely used distance function for multivariate data. NCM has a computational complexity of the same order as the standard Euclidean distance function, is task independent, and works for numerical and categorical data in a conceptually uniform way. The neighborhood counting methodology is shown experimentally to be sound for multivariate data. We hope it will work for other types of data.
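For categorical vectors, one concrete instantiation of neighborhood counting is easy to derive: take a neighborhood to be a conjunction of per-attribute constraints, each attribute either fixed to one value or left free. A neighborhood then covers both points only if every fixed attribute matches both, giving two choices (fix or free) per matching attribute and one (free) per mismatch. The sketch below uses this count inside kNN; it is an illustrative special case under that specific neighborhood definition, not the paper's full NCM for multivariate data, and the toy dataset is invented.

```python
from collections import Counter

def ncm(a, b):
    """Neighborhood counting similarity for categorical vectors.

    count = 2 ** (number of matching attributes), per the derivation
    in the lead-in: 2 choices per match, 1 per mismatch.
    """
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return 2 ** matches

def knn_predict(query, data, labels, k=3):
    """kNN with NCM as the similarity function (larger = more similar)."""
    ranked = sorted(zip(data, labels), key=lambda d: ncm(query, d[0]),
                    reverse=True)
    votes = Counter(lbl for _, lbl in ranked[:k])
    return votes.most_common(1)[0][0]

data = [("red", "round", "small"), ("red", "round", "large"),
        ("green", "long", "small"), ("green", "long", "large")]
labels = ["apple", "apple", "cucumber", "cucumber"]
pred = knn_predict(("red", "round", "medium"), data, labels, k=3)
```

Note that, unlike a plain Hamming distance, the count grows exponentially in the number of matches, so agreement on many attributes dominates the ranking.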

  20. Analysis of individual risk belief structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tonn, B.E.; Travis, C.B.; Arrowood, L.

An interactive computer program developed at Oak Ridge National Laboratory is presented as a methodology to model individualized belief structures. The logic and general strategy of the model are presented for two risk topics: AIDS and toxic waste. Subjects identified desirable and undesirable consequences for each topic and formulated an associative rule linking topic and consequence in either a causal or correlational framework. Likelihood estimates, generated by subjects in several formats (probability, odds statements, etc.), constituted one outcome measure. Additionally, source of belief (personal experience, news media, etc.) and perceived personal and societal impact are reviewed. Briefly, subjects believe that AIDS causes significant emotional problems and, to a lesser degree, physical health problems, whereas toxic waste causes significant environmental problems.

  1. Tissue multifractality and hidden Markov model based integrated framework for optimum precancer detection

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sabyasachi; Das, Nandan K.; Kurmi, Indrajit; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2017-10-01

    We report the application of a hidden Markov model (HMM) on multifractal tissue optical properties derived via the Born approximation-based inverse light scattering method for effective discrimination of precancerous human cervical tissue sites from the normal ones. Two global fractal parameters, generalized Hurst exponent and the corresponding singularity spectrum width, computed by multifractal detrended fluctuation analysis (MFDFA), are used here as potential biomarkers. We develop a methodology that makes use of these multifractal parameters by integrating with different statistical classifiers like the HMM and support vector machine (SVM). It is shown that the MFDFA-HMM integrated model achieves significantly better discrimination between normal and different grades of cancer as compared to the MFDFA-SVM integrated model.
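The MFDFA step that yields the generalized Hurst exponent h(q) can be sketched compactly; the HMM/SVM classification stage is not reproduced here, and the scales, q values, and test series below are arbitrary choices, not those of the paper.

```python
import math, random

def mfdfa_hurst(series, scales, qs):
    """Generalized Hurst exponent h(q) via multifractal DFA (sketch).

    Steps: build the profile (cumulative sum of the mean-centered
    series), split it into non-overlapping windows of each scale,
    remove a linear trend per window, form the q-th order fluctuation
    function F_q(s), and read h(q) off the slope of log F_q(s) vs log s.
    """
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for v in series:
        acc += v - mean
        profile.append(acc)
    h = {}
    for q in qs:
        pts = []
        for s in scales:
            f2 = []
            for i in range(len(profile) // s):
                seg = profile[i * s:(i + 1) * s]
                t = range(s)
                tm, sm = (s - 1) / 2.0, sum(seg) / s
                slope = (sum((ti - tm) * (si - sm) for ti, si in zip(t, seg))
                         / sum((ti - tm) ** 2 for ti in t))
                icpt = sm - slope * tm
                f2.append(sum((si - (icpt + slope * ti)) ** 2
                              for ti, si in zip(t, seg)) / s)
            if q == 0:
                fq = math.exp(0.5 * sum(math.log(v) for v in f2) / len(f2))
            else:
                fq = (sum(v ** (q / 2.0) for v in f2) / len(f2)) ** (1.0 / q)
            pts.append((math.log(s), math.log(fq)))
        xm = sum(p[0] for p in pts) / len(pts)
        ym = sum(p[1] for p in pts) / len(pts)
        h[q] = (sum((p[0] - xm) * (p[1] - ym) for p in pts)
                / sum((p[0] - xm) ** 2 for p in pts))
    return h

# Uncorrelated noise should give h(q) near 0.5 for all q; a multifractal
# signal shows h varying with q, and the spread of h(q) feeds the
# singularity-spectrum-width biomarker mentioned in the abstract.
random.seed(7)
noise = [random.gauss(0.0, 1.0) for _ in range(2048)]
h = mfdfa_hurst(noise, scales=[16, 32, 64, 128], qs=[1, 2])
```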

  2. Reply [to “Comment on ‘The Zen of Venn’” by Priestley Toulmin

    NASA Astrophysics Data System (ADS)

    Berkman, Paul Arthur

    While Venn diagrams, “strictly speaking,” may not have been designed for the “peritechnical literature” they certainly provide a symbolic framework for integrating concepts beyond the context of “mathematically defined objects.” It is interesting that Toulmin was offended and compelled to protest the application of Venn diagrams that are not bound by his “valid methodology.” Such disciplinary constraints on creativity appear contrary to the original writings of John Venn who esteemed interdisciplinary approaches and argued fiercely against those who objected to his introducing mathematical symbols into logic [Venn, 1894]. “Symbolic Logic” itself was crafted with a view toward a general utility “in the solution of complicated problems” [Venn, 1894].

  3. Feelings and Intersubjectivity in Qualitative Suicide Research.

    PubMed

    Boden, Zoë V R; Gibson, Susanne; Owen, Gareth J; Benson, Outi

    2016-07-01

    In this article, we explore how feelings permeated our qualitative research on suicide. Drawing on phenomenological theory, we argue for the epistemic and ethical importance of the feelings that emerge through research encounters, considering them to be embodied, intersubjective, and multilayered, and requiring careful interpretation through a "reflexivity of feelings." We sketch a tentative framework of the ways that we experienced feelings in our research and give three in-depth examples to illustrate some of the different layers and types of feelings we identified. We reflexively interpret these feelings and their role in our analysis and then discuss some of the ethical and methodological issues related to examining feelings in suicide research, and research more generally. © The Author(s) 2015.

  4. How to apply for research grants in allergology.

    PubMed

    Guillen-Grima, Francisco; Annan, James W; Alvarez, José María Negro; Gómez, José Miguel Sáez; Ontoso, Enrique Aguinaga

    2009-01-01

    This is a guide for grant application for researchers seeking research grants in the field of allergy and related diseases for the first time. It outlines how to organize proposals and the potential issues to be considered in order to fulfil the criteria of the funding bodies and thus improve chances of obtaining the desired funding when applying for a research grant. We will use this paper as an example of a grant proposal to be presented to the FIS "Fondo de Investigación Sanitaria" (Health Research Fund) of Spain. The general framework can be used for a research proposal to any funding agency. The main research designs are reviewed. Other topics such as hypothesis, objectives, methodology, ethics and legal issues, and budget are presented.

  5. Twenty Years of Learning: How To Do Research in Chemical Education. 2003 George C. Pimentel Award, sponsored by Dow Chemical Co.

    NASA Astrophysics Data System (ADS)

    Bodner, George M.

    2004-05-01

    It is twenty years since the first symposium on research in chemical education was held at the American Chemical Society meeting in St. Louis. Over the course of two decades, the number of people who have devoted their careers to doing research on the teaching and learning of chemistry has increased significantly. There have also been significant developments in the methodology for doing research in this area, and in the sophistication of the questions being investigated. This paper tries to summarize some of what the author has learned while working with graduate students pursuing research-based M.S. and/or Ph.D. degrees in chemical education. It describes the three fundamental elements of a good research study—the theoretical framework, the methodological framework, and the guiding research questions—and examines the process by which the choice of theoretical framework is made.

  6. A data-driven feature extraction framework for predicting the severity of condition of congestive heart failure patients.

    PubMed

    Sideris, Costas; Alshurafa, Nabil; Pourhomayoun, Mohammad; Shahmohammadi, Farhad; Samy, Lauren; Sarrafzadeh, Majid

    2015-01-01

In this paper, we propose a novel methodology for utilizing disease diagnostic information to predict severity of condition for Congestive Heart Failure (CHF) patients. Our methodology relies on a novel clustering-based feature extraction framework using disease diagnostic information. To reduce dimensionality, we identify disease clusters using co-occurrence frequencies. We then utilize these clusters as features to predict patient severity of condition. We build our clustering and feature extraction algorithm using the 2012 National Inpatient Sample (NIS), Healthcare Cost and Utilization Project (HCUP), which contains 7 million discharge records and ICD-9-CM codes. The proposed framework is tested on Ronald Reagan UCLA Medical Center Electronic Health Records (EHR) from 3041 patients. We compare our cluster-based feature set with another that incorporates the Charlson comorbidity score as a feature and demonstrate an accuracy improvement of up to 14% in the predictability of the severity of condition.

  7. ALLOCATING ENVIRONMENTAL BURDENS ACROSS CO-PRODUCTS TO CREATE A LIFE CYCLE INVENTORY: IS THERE A BEST WAY?

    EPA Science Inventory

    Allocation methodology for creating life cycle inventories is frequently addressed, discussed and debated, yet the methodology continues to be in a state of flux. ISO 14041 puts perspective on the issues but its one-size fits all framework is being challenged. It is clear that ...

  8. U.S. Comparative and International Graduate Programs: An Overview of Programmatic Size, Relevance, Philosophy, and Methodology

    ERIC Educational Resources Information Center

    Drake, Timothy A.

    2011-01-01

    Previous work has concentrated on the epistemological foundation of comparative and international education (CIE) graduate programs. This study focuses on programmatic size, philosophy, methodology, and pedagogy. It begins by reviewing previous studies. It then provides a theoretical framework and describes the size, relevance, content, and…

  9. Methodological Issues in Documentary Ethnography: A Renewed Call for Putting Cameras in the Hands of the People.

    ERIC Educational Resources Information Center

    Huesca, Robert

    The participatory method of image production holds enormous potential for communication and journalism scholars operating out of a critical/cultural framework. The methodological potentials of mechanical reproduction were evident in the 1930s, when Walter Benjamin contributed three enduring concepts: questioning the art/document dichotomy; placing…

  10. Towards a Trans-Disciplinary Methodology for a Game-Based Intervention Development Process

    ERIC Educational Resources Information Center

    Arnab, Sylvester; Clarke, Samantha

    2017-01-01

    The application of game-based learning adds play into educational and instructional contexts. Even though there is a lack of standard methodologies or formulaic frameworks to better inform game-based intervention development, there exist scientific and empirical studies that can serve as benchmarks for establishing scientific validity in terms of…

  11. Constraints, Resources, and Interpretative Schema: Explorations of Teachers' Decisions to Utilize, Under-Utilize or Ignore Technology

    ERIC Educational Resources Information Center

    Pereira-Leon, Maura J.

    2010-01-01

This three-year study examined how participation in a 10-month technology-enhanced professional development program (PDP) influenced K-12 teachers' decisions to utilize or ignore technology in their teaching practices. Carspecken's (1996) qualitative research methodology of Critical Ethnography provided the theoretical and methodological framework to…

  12. Teaching of Computer Science Topics Using Meta-Programming-Based GLOs and LEGO Robots

    ERIC Educational Resources Information Center

    Štuikys, Vytautas; Burbaite, Renata; Damaševicius, Robertas

    2013-01-01

    The paper's contribution is a methodology that integrates two educational technologies (GLO and LEGO robot) to teach Computer Science (CS) topics at the school level. We present the methodology as a framework of 5 components (pedagogical activities, technology driven processes, tools, knowledge transfer actors, and pedagogical outcomes) and…

  13. [The methodological basis of expert assessment of unfavourable outcomes of the stomatological treatment in the framework of civil law proceedings].

    PubMed

    Pigolkin, Iu I; Murzova, T V; Mirzoev, Kh M

    2011-01-01

    The authors discuss peculiarities of the performance of forensic medical expertise in the cases of unfavourable outcomes of the stomatological treatment. The methodological basis of expert assessment has been created to be applied in situations related to the unfavourable outcomes of dental care.

  14. Research Methods in the Social Sciences

    ERIC Educational Resources Information Center

    Somekh, Bridget, Ed.; Lewin, Cathy, Ed.

    2005-01-01

    This book is intended as a resource and an indispensable companion to welcome educators into the community of social science research. While it is recognized that some methodological frameworks are incompatible with others, the overarching premise of the book is to indicate how a wide range of researchers choose a methodology and methods which are…

  15. Changes in HRM in Europe: A Longitudinal Comparative Study among 18 European Countries

    ERIC Educational Resources Information Center

    Nikandrou, Irene; Apospori, Eleni; Papalexandris, Nancy

    2005-01-01

    Purpose: To examine HRM strategies and practices and HRM position within organizations in various cultural, economic and sociopolitical contexts from a longitudinal perspective. Design/methodology/approach: The study uses the 1995 and 1999 Cranet data in a longitudinal methodological framework to explore the changes and trends in 18 European…

  16. Level-Set Methodology on Adaptive Octree Grids

    NASA Astrophysics Data System (ADS)

    Gibou, Frederic; Guittet, Arthur; Mirzadeh, Mohammad; Theillard, Maxime

    2017-11-01

Numerical simulations of interfacial problems in fluids require a methodology capable of tracking surfaces that can undergo changes in topology and of imposing jump boundary conditions in a sharp manner. In this talk, we will discuss recent advances in the level-set framework, in particular one that is based on adaptive grids.

  17. Troubling the Boundaries: Overcoming Methodological Challenges in a Multi-Sectoral and Multi-Jurisdictional HIV/HCV Policy Scoping Review

    ERIC Educational Resources Information Center

    Hare, Kathleen A.; Dubé, Anik; Marshall, Zack; Gahagan, Jacqueline; Harris, Gregory E.; Tucker, Maryanne; Dykeman, Margaret; MacDonald, Jo-Ann

    2016-01-01

    Policy scoping reviews are an effective method for generating evidence-informed policies. However, when applying guiding methodological frameworks to complex policy evidence, numerous, unexpected challenges can emerge. This paper details five challenges experienced and addressed by a policy trainee-led, multi-disciplinary research team, while…

  18. Exploring Emotion in the Higher Education Workplace: Capturing Contrasting Perspectives Using Q Methodology

    ERIC Educational Resources Information Center

    Woods, Charlotte

    2012-01-01

This article presents an original application of Q methodology in investigating the challenging arena of emotion in the Higher Education (HE) workplace. Q's strength lies in capturing holistic, subjective accounts of complex and contested phenomena; it is unusual in employing a statistical procedure within an interpretivist framework. Here Q is…

  19. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    ERIC Educational Resources Information Center

    Williams, Lawrence H., Jr.

    2013-01-01

This qualitative study analyzed the experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches, others allow software requirements and design to evolve, accommodating ambiguity and uncertainty by…

  20. A Comprehensive Competence-Based Approach in Curriculum Development: Experiences from African and European Contexts

    ERIC Educational Resources Information Center

    Parent, F.; Baulana, R.; Kahombo, G.; Coppieters, Y.; Garant, M.; De Ketele, J.-M.

    2011-01-01

    Objective: To describe the methodological steps of developing an integrated reference guide for competences according to the profile of the healthcare professionals concerned. Design: Human resources in healthcare represent a complex issue, which needs conceptual and methodological frameworks and tools to help one understand reality and the limits…

  1. A Methodological Proposal for Learning Games Selection and Quality Assessment

    ERIC Educational Resources Information Center

    Dondi, Claudio; Moretti, Michela

    2007-01-01

    This paper presents a methodological proposal elaborated in the framework of two European projects dealing with game-based learning, both of which have focused on "quality" aspects in order to create suitable tools that support European educators, practitioners and lifelong learners in selecting and assessing learning games for use in…

  2. Formulating accident occurrence as a survival process.

    PubMed

    Chang, H L; Jovanis, P P

    1990-10-01

    A conceptual framework for accident occurrence is developed based on the principle of the driver as an information processor. The framework underlies the development of a modeling approach that is consistent with the definition of exposure to risk as a repeated trial. Survival theory is proposed as a statistical technique that is consistent with the conceptual structure and allows the exploration of a wide range of factors that contribute to highway operating risk. This survival model of accident occurrence is developed at a disaggregate level, allowing safety researchers to broaden the scope of studies which may be limited by the use of traditional aggregate approaches. An application of the approach to motor carrier safety is discussed as are potential applications to a variety of transportation industries. Lastly, a typology of highway safety research methodologies is developed to compare the properties of four safety methodologies: laboratory experiments, on-the-road studies, multidisciplinary accident investigations, and correlational studies. The survival theory formulation has a mathematical structure that is compatible with each safety methodology, so it may facilitate the integration of findings across methodologies.
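
The survival formulation sketched above treats each trip as a repeated trial whose outcome is either an accident or a safe (censored) completion. As a generic illustration of the kind of estimator survival theory provides, and not the authors' disaggregate model, the following sketch computes a Kaplan-Meier survival curve for hypothetical time-to-accident data:

```python
# Illustrative sketch: Kaplan-Meier (product-limit) survival estimate for
# time-to-accident data with right-censoring (trips ending without an
# accident). The data and function name are hypothetical, not from the paper.

def kaplan_meier(times, observed):
    """Return [(t, S(t))] pairs for the product-limit survival estimate.

    times    : duration driven (e.g., hours) for each trip
    observed : True if an accident occurred, False if censored
    """
    events = sorted(zip(times, observed))
    at_risk = len(events)
    surv = 1.0
    curve = []
    i = 0
    while i < len(events):
        t = events[i][0]
        deaths = 0
        removed = 0
        # group tied event times at t
        while i < len(events) and events[i][0] == t:
            deaths += events[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= removed
    return curve

# Hours driven before an accident (True) or end of observation (False).
hours    = [2, 3, 3, 5, 8, 8, 12, 15]
accident = [True, True, False, True, False, True, True, False]
for t, s in kaplan_meier(hours, accident):
    print(f"S({t}) = {s:.3f}")
```

Because the estimate is computed per trip, it operates at the disaggregate level the abstract emphasizes; covariates would enter through a hazard model rather than this plain curve.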

  3. Ballast water regulations and the move toward concentration-based numeric discharge limits.

    PubMed

    Albert, Ryan J; Lishman, John M; Saxena, Juhi R

    2013-03-01

    Ballast water from shipping is a principal source for the introduction of nonindigenous species. As a result, numerous government bodies have adopted various ballast water management practices and discharge standards to slow or eliminate the future introduction and dispersal of these nonindigenous species. For researchers studying ballast water issues, understanding the regulatory framework is helpful to define the scope of research needed by policy makers to develop effective regulations. However, for most scientists, this information is difficult to obtain because it is outside the standard scientific literature and often difficult to interpret. This paper provides a brief review of the regulatory framework directed toward scientists studying ballast water and aquatic invasive species issues. We describe different approaches to ballast water management in international, U.S. federal and state, and domestic ballast water regulation. Specifically, we discuss standards established by the International Maritime Organization (IMO), the U.S. Coast Guard and U.S. Environmental Protection Agency, and individual states in the United States including California, New York, and Minnesota. Additionally, outside the United States, countries such as Australia, Canada, and New Zealand have well-established domestic ballast water regulatory regimes. Different approaches to regulation have recently resulted in variations between numeric concentration-based ballast water discharge limits, particularly in the United States, as well as reliance on use of ballast water exchange pending development and adoption of rigorous science-based discharge standards. To date, numeric concentration-based discharge limits have not generally been based upon a thorough application of risk-assessment methodologies. Regulators, making decisions based on the available information and methodologies before them, have consequently established varying standards, or not established standards at all. 
The review and refinement of ballast water discharge standards by regulatory agencies will benefit from activity by the scientific community to improve and develop more precise risk-assessment methodologies.

  4. Electronic palliative care coordination systems: Devising and testing a methodology for evaluating documentation

    PubMed Central

    Allsop, Matthew J; Kite, Suzanne; McDermott, Sarah; Penn, Naomi; Millares-Martin, Pablo; Bennett, Michael I

    2016-01-01

    Background: The need to improve coordination of care at end of life has driven electronic palliative care coordination systems implementation across the United Kingdom and internationally. No approaches for evaluating electronic palliative care coordination systems use in practice have been developed. Aim: This study outlines and applies an evaluation framework for examining how and when electronic documentation of advance care planning is occurring in end of life care services. Design: A pragmatic, formative process evaluation approach was adopted. The evaluation drew on the Project Review and Objective Evaluation methodology to guide the evaluation framework design, focusing on clinical processes. Setting/participants: Data were extracted from electronic palliative care coordination systems for 82 of 108 general practices across a large UK city. All deaths (n = 1229) recorded on electronic palliative care coordination systems between April 2014 and March 2015 were included to determine the proportion of all deaths recorded, median number of days prior to death that key information was recorded and observations about routine data use. Results: The evaluation identified 26.8% of all deaths recorded on electronic palliative care coordination systems. The median number of days to death was calculated for initiation of an electronic palliative care coordination systems record (31 days), recording a patient’s preferred place of death (8 days) and entry of Do Not Attempt Cardiopulmonary Resuscitation decisions (34 days). Where preferred and actual place of death was documented, these were matching for 75% of patients. Anomalies were identified in coding used during data entry on electronic palliative care coordination systems. Conclusion: This study reports the first methodology for evaluating how and when electronic palliative care coordination systems documentation is occurring. 
It raises questions about what can be drawn from routine data collected through electronic palliative care coordination systems and outlines considerations for future evaluation. Future evaluations should consider work processes of health professionals using electronic palliative care coordination systems. PMID:27507636

  5. Electronic palliative care coordination systems: Devising and testing a methodology for evaluating documentation.

    PubMed

    Allsop, Matthew J; Kite, Suzanne; McDermott, Sarah; Penn, Naomi; Millares-Martin, Pablo; Bennett, Michael I

    2017-05-01

    The need to improve coordination of care at end of life has driven electronic palliative care coordination systems implementation across the United Kingdom and internationally. No approaches for evaluating electronic palliative care coordination systems use in practice have been developed. This study outlines and applies an evaluation framework for examining how and when electronic documentation of advance care planning is occurring in end of life care services. A pragmatic, formative process evaluation approach was adopted. The evaluation drew on the Project Review and Objective Evaluation methodology to guide the evaluation framework design, focusing on clinical processes. Data were extracted from electronic palliative care coordination systems for 82 of 108 general practices across a large UK city. All deaths ( n = 1229) recorded on electronic palliative care coordination systems between April 2014 and March 2015 were included to determine the proportion of all deaths recorded, median number of days prior to death that key information was recorded and observations about routine data use. The evaluation identified 26.8% of all deaths recorded on electronic palliative care coordination systems. The median number of days to death was calculated for initiation of an electronic palliative care coordination systems record (31 days), recording a patient's preferred place of death (8 days) and entry of Do Not Attempt Cardiopulmonary Resuscitation decisions (34 days). Where preferred and actual place of death was documented, these were matching for 75% of patients. Anomalies were identified in coding used during data entry on electronic palliative care coordination systems. This study reports the first methodology for evaluating how and when electronic palliative care coordination systems documentation is occurring. 
It raises questions about what can be drawn from routine data collected through electronic palliative care coordination systems and outlines considerations for future evaluation. Future evaluations should consider work processes of health professionals using electronic palliative care coordination systems.

  6. A Case Study of Controlling Crossover in a Selection Hyper-heuristic Framework Using the Multidimensional Knapsack Problem.

    PubMed

    Drake, John H; Özcan, Ender; Burke, Edmund K

    2016-01-01

    Hyper-heuristics are high-level methodologies for solving complex problems that operate on a search space of heuristics. In a selection hyper-heuristic framework, a heuristic is chosen from an existing set of low-level heuristics and applied to the current solution to produce a new solution at each point in the search. The use of crossover low-level heuristics is possible in an increasing number of general-purpose hyper-heuristic tools such as HyFlex and Hyperion. However, little work has been undertaken to assess how best to utilise it. Since a single-point search hyper-heuristic operates on a single candidate solution, and two candidate solutions are required for crossover, a mechanism is required to control the choice of the other solution. The frameworks we propose maintain a list of potential solutions for use in crossover. We investigate the use of such lists at two conceptual levels. First, crossover is controlled at the hyper-heuristic level where no problem-specific information is required. Second, it is controlled at the problem domain level where problem-specific information is used to produce good-quality solutions to use in crossover. A number of selection hyper-heuristics are compared using these frameworks over three benchmark libraries with varying properties for an NP-hard optimisation problem: the multidimensional 0-1 knapsack problem. It is shown that allowing crossover to be managed at the domain level outperforms managing crossover at the hyper-heuristic level in this problem domain.
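
The crossover-list mechanism described above can be sketched in a few lines. The toy single-point selection hyper-heuristic below, for a small 0-1 knapsack instance, maintains a bounded list of elite solutions that supplies the second parent for crossover; the problem data, list size, acceptance rule and 50/50 heuristic choice are illustrative assumptions, not the paper's HyFlex/Hyperion setup:

```python
import random

# Minimal sketch of a single-point selection hyper-heuristic for the 0-1
# knapsack problem that keeps a small list of candidate solutions to supply
# the second parent for crossover. All parameter choices are illustrative.

random.seed(1)
values   = [10, 7, 4, 9, 6]
weights  = [ 5, 4, 2, 6, 3]
CAPACITY = 12

def fitness(sol):
    w = sum(wi for wi, s in zip(weights, sol) if s)
    v = sum(vi for vi, s in zip(values, sol) if s)
    return v if w <= CAPACITY else 0   # infeasible solutions score 0

def flip(sol):                          # mutation low-level heuristic
    i = random.randrange(len(sol))
    return sol[:i] + [1 - sol[i]] + sol[i + 1:]

def one_point_crossover(a, b):          # crossover low-level heuristic
    p = random.randrange(1, len(a))
    return a[:p] + b[p:]

def hyper_heuristic(iters=500, list_size=5):
    current = [0] * len(values)
    partners = [current]                # list of potential crossover partners
    for _ in range(iters):
        if len(partners) > 1 and random.random() < 0.5:
            other = random.choice(partners)
            candidate = one_point_crossover(current, other)
        else:
            candidate = flip(current)
        if fitness(candidate) >= fitness(current):   # accept non-worsening
            current = candidate
            partners.append(current)
            # keep only the best few solutions for future crossover
            partners[:] = sorted(partners, key=fitness, reverse=True)[:list_size]
    return current, fitness(current)

best, score = hyper_heuristic()
print(best, score)
```

Managing the list at the domain level, as the paper recommends, would mean ranking `partners` with problem-specific quality measures instead of raw fitness alone.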

  7. Public administration and R&D localisation by pharmaceutical and biotech companies: a theoretical framework and the Italian case-study.

    PubMed

    Jommi, Claudio; Paruzzolo, Silvia

    2007-04-01

This article has two objectives. It firstly provides a general framework for variables that influence R&D (Research and Development) localisation by pharmaceutical and biotech companies. The analysis of R&D localisation includes both in-house R&D and contracted R&D. Following a systematic literature search, these variables were classified into four distinct categories: regulatory environment, institutional framework, national systems of innovation, and local development and specialisation. The authors highlight that some of these factors depend directly on the action of public administrations (e.g., patent protection, price regulation, public investments in research, and incentives to private companies); others are indirectly influenced by public policies (e.g., GDP growth rate, infrastructure). This theoretical framework was used to analyse the Italian case-study. The pros and cons of the Italian context were investigated from the point of view of multinational pharmaceutical companies and the Italian Association of Biotech Companies. Interviews were chosen as the most appropriate data-gathering technique given the exploratory nature of the study of the Italian context. The paper is divided into five parts. A brief introduction provides figures showing that Europe has been losing ground to other continents, as Italy has relative to other EU countries. The second part illustrates the methodology. The third focuses on the variables affecting R&D localisation. In the fourth section the Italian case-study is discussed. Theoretical and empirical findings are summarised and discussed in the conclusions.

  8. Applicability of risk-based management and the need for risk-based economic decision analysis at hazardous waste contaminated sites.

    PubMed

    Khadam, Ibrahim; Kaluarachchi, Jagath J

    2003-07-01

Decision analysis in subsurface contamination management is generally carried out from a traditional engineering economics viewpoint. However, new advances in human health risk assessment, namely probabilistic risk assessment, and the growing awareness of the importance of soft data in the decision-making process require decision analysis methodologies that can accommodate non-technical and politically biased qualitative information. In this work, we discuss the major limitations of the currently practiced decision analysis framework, which revolves around the definition of risk and the cost of risk, and its poor ability to communicate risk-related information. A demonstration using a numerical example was conducted to provide insight into these limitations of the current decision analysis framework. The results from this simple ground water contamination and remediation scenario were identical to those obtained from studies carried out on existing Superfund sites, which suggests serious flaws in the current risk management framework. To provide a perspective on how these limitations may be avoided in future formulations of the management framework, more mature and well-accepted approaches to decision analysis in dam safety and the utility industry, where public health and public investment are of great concern, are presented and their applicability to subsurface remediation management is discussed. Finally, in light of the successful application of risk-based decision analysis in dam safety and the utility industry, potential options for decision analysis in subsurface contamination management are discussed.

  9. A Cultural Evolution Approach to Digital Media.

    PubMed

    Acerbi, Alberto

    2016-01-01

Digital media are enormously widespread today, and their influence on the behavior of a vast part of the human population can hardly be overstated. In this review I propose that cultural evolution theory, including both a sophisticated view of human behavior and a methodological attitude to modeling and quantitative analysis, provides a useful framework to study the effects and developments of media in the digital age. I will first give a general presentation of the cultural evolution framework, and I will then introduce this more specific research program with two illustrative topics. The first topic concerns how cultural transmission biases, that is, simple heuristics such as "copy prestigious individuals" or "copy the majority," operate in the novel context of digital media. The existence of transmission biases is generally justified by their adaptivity in small-scale societies. How do they operate in an environment where, for example, prestigious individuals possess irrelevant skills, or popularity is explicitly quantified and advertised? The second aspect relates to the fidelity of cultural transmission. Digitally mediated interactions support cheap and immediate high-fidelity transmission, in opposition, for example, to oral traditions. How does this change the content that is most likely to spread? Overall, I suggest the usefulness of a "long view" of our contemporary digital environment, contextualized in cognitive science and cultural evolution theory, and I discuss how this perspective could help us understand what is genuinely new and what is not.

  10. A framework for outcome-level evaluation of in-service training of health care workers.

    PubMed

    O'Malley, Gabrielle; Perdue, Thomas; Petracca, Frances

    2013-10-01

    In-service training is a key strategic approach to addressing the severe shortage of health care workers in many countries. However, there is a lack of evidence linking these health care worker trainings to improved health outcomes. In response, the United States President's Emergency Plan for AIDS Relief's Human Resources for Health Technical Working Group initiated a project to develop an outcome-focused training evaluation framework. This paper presents the methods and results of that project. A general inductive methodology was used for the conceptualization and development of the framework. Fifteen key informant interviews were conducted to explore contextual factors, perceived needs, barriers and facilitators affecting the evaluation of training outcomes. In addition, a thematic analysis of 70 published articles reporting health care worker training outcomes identified key themes and categories. These were integrated, synthesized and compared to several existing training evaluation models. This formed an overall typology which was used to draft a new framework. Finally, the framework was refined and validated through an iterative process of feedback, pilot testing and revision. The inductive process resulted in identification of themes and categories, as well as relationships among several levels and types of outcomes. The resulting framework includes nine distinct types of outcomes that can be evaluated, which are organized within three nested levels: individual, organizational and health system/population. The outcome types are: (1) individual knowledge, attitudes and skills; (2) individual performance; (3) individual patient health; (4) organizational systems; (5) organizational performance; (6) organizational-level patient health; (7) health systems; (8) population-level performance; and (9) population-level health. 
The framework also addresses contextual factors which may influence the outcomes of training, as well as the ability of evaluators to determine training outcomes. In addition, a group of user-friendly resources, the Training Evaluation Framework and Tools (TEFT) were created to help evaluators and stakeholders understand and apply the framework. Feedback from pilot users suggests that using the framework and accompanying tools may support outcome evaluation planning. Further assessment will assist in strengthening guidelines and tools for operationalization.

  11. A framework for outcome-level evaluation of in-service training of health care workers

    PubMed Central

    2013-01-01

    Background In-service training is a key strategic approach to addressing the severe shortage of health care workers in many countries. However, there is a lack of evidence linking these health care worker trainings to improved health outcomes. In response, the United States President’s Emergency Plan for AIDS Relief’s Human Resources for Health Technical Working Group initiated a project to develop an outcome-focused training evaluation framework. This paper presents the methods and results of that project. Methods A general inductive methodology was used for the conceptualization and development of the framework. Fifteen key informant interviews were conducted to explore contextual factors, perceived needs, barriers and facilitators affecting the evaluation of training outcomes. In addition, a thematic analysis of 70 published articles reporting health care worker training outcomes identified key themes and categories. These were integrated, synthesized and compared to several existing training evaluation models. This formed an overall typology which was used to draft a new framework. Finally, the framework was refined and validated through an iterative process of feedback, pilot testing and revision. Results The inductive process resulted in identification of themes and categories, as well as relationships among several levels and types of outcomes. The resulting framework includes nine distinct types of outcomes that can be evaluated, which are organized within three nested levels: individual, organizational and health system/population. The outcome types are: (1) individual knowledge, attitudes and skills; (2) individual performance; (3) individual patient health; (4) organizational systems; (5) organizational performance; (6) organizational-level patient health; (7) health systems; (8) population-level performance; and (9) population-level health. 
The framework also addresses contextual factors which may influence the outcomes of training, as well as the ability of evaluators to determine training outcomes. In addition, a group of user-friendly resources, the Training Evaluation Framework and Tools (TEFT) were created to help evaluators and stakeholders understand and apply the framework. Conclusions Feedback from pilot users suggests that using the framework and accompanying tools may support outcome evaluation planning. Further assessment will assist in strengthening guidelines and tools for operationalization. PMID:24083635

  12. Systems resilience : a new analytical framework for nuclear nonproliferation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pregenzer, Arian Leigh

    2011-12-01

This paper introduces the concept of systems resilience as a new framework for thinking about the future of nonproliferation. Resilience refers to the ability of a system to maintain its vital functions in the face of continuous and unpredictable change. The nonproliferation regime can be viewed as a complex system, and key themes from the literature on systems resilience can be applied to the nonproliferation system. Most existing nonproliferation strategies are aimed at stability rather than resilience, and the current nonproliferation system may be over-constrained by the cumulative evolution of strategies, increasing its vulnerability to collapse. The resilience of the nonproliferation system can be enhanced by diversifying nonproliferation strategies to include general international capabilities to respond to proliferation and by focusing more attention on reducing the motivation to acquire nuclear weapons in the first place. Ideas for future research include understanding unintended consequences and feedbacks among nonproliferation strategies, developing methodologies for measuring the resilience of the nonproliferation system, and accounting for interactions of the nonproliferation system with other systems on larger and smaller scales.

  13. Intelligent control of a planning system for astronaut training.

    PubMed

    Ortiz, J; Chen, G

    1999-07-01

    This work intends to design, analyze and solve, from the systems control perspective, a complex, dynamic, and multiconstrained planning system for generating training plans for crew members of the NASA-led International Space Station. Various intelligent planning systems have been developed within the framework of artificial intelligence. These planning systems generally lack a rigorous mathematical formalism to allow a reliable and flexible methodology for their design, modeling, and performance analysis in a dynamical, time-critical, and multiconstrained environment. Formulating the planning problem in the domain of discrete-event systems under a unified framework such that it can be modeled, designed, and analyzed as a control system will provide a self-contained theory for such planning systems. This will also provide a means to certify various planning systems for operations in the dynamical and complex environments in space. The work presented here completes the design, development, and analysis of an intricate, large-scale, and representative mathematical formulation for intelligent control of a real planning system for Space Station crew training. This planning system has been tested and used at NASA-Johnson Space Center.

  14. The domain interface method: a general-purpose non-intrusive technique for non-conforming domain decomposition problems

    NASA Astrophysics Data System (ADS)

    Cafiero, M.; Lloberas-Valls, O.; Cante, J.; Oliver, J.

    2016-04-01

    A domain decomposition technique is proposed which is capable of properly connecting arbitrary non-conforming interfaces. The strategy essentially consists in considering a fictitious zero-width interface between the non-matching meshes which is discretized using a Delaunay triangulation. Continuity is satisfied across domains through normal and tangential stresses provided by the discretized interface and inserted in the formulation in the form of Lagrange multipliers. The final structure of the global system of equations resembles the dual assembly of substructures where the Lagrange multipliers are employed to nullify the gap between domains. A new approach to handle floating subdomains is outlined which can be implemented without significantly altering the structure of standard industrial finite element codes. The effectiveness of the developed algorithm is demonstrated through a patch test example and a number of tests that highlight the accuracy of the methodology and independence of the results with respect to the framework parameters. Considering its high degree of flexibility and non-intrusive character, the proposed domain decomposition framework is regarded as an attractive alternative to other established techniques such as the mortar approach.

  15. The dissimilarity of species interaction networks.

    PubMed

    Poisot, Timothée; Canard, Elsa; Mouillot, David; Mouquet, Nicolas; Gravel, Dominique

    2012-12-01

In a context of global changes, and amidst the perpetual modification of community structure undergone by most natural ecosystems, it is more important than ever to understand how species interactions vary through space and time. The integration of biogeography and network theory will yield important results and further our understanding of species interactions. It has, however, been hampered so far by the difficulty of quantifying variation among interaction networks. Here, we propose a general framework to study the dissimilarity of species interaction networks over time, space or environments, allowing the use of both quantitative and qualitative data. We decompose network dissimilarity into interaction and species turnover components, so that it is immediately comparable to common measures of β-diversity. We emphasise that scaling up from the β-diversity of community composition to the β-diversity of interactions requires only a small methodological step, which we foresee will help empiricists adopt this method. We illustrate the framework with a large dataset of host-parasite interactions and highlight other possible usages. We discuss a research agenda towards a biogeographical theory of species interactions. © 2012 Blackwell Publishing Ltd/CNRS.
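
The decomposition described in this abstract can be illustrated for the qualitative case. The sketch below computes a Sorensen-family dissimilarity for whole networks (beta_WN) and for the sub-networks restricted to species shared by both sites (beta_OS); the toy host-parasite data are invented, and the code follows the spirit of the framework rather than its exact formulas:

```python
# Minimal sketch: dissimilarity of two interaction networks decomposed into
# a whole-network component (beta_WN) and a shared-species component
# (beta_OS); the gap between them reflects species turnover. Toy data only.

def sorensen(set1, set2):
    """Sorensen dissimilarity between two sets of items (0 = identical)."""
    a = len(set1 & set2)
    b = len(set1 - set2)
    c = len(set2 - set1)
    if a + b + c == 0:
        return 0.0
    return (b + c) / (2 * a + b + c)

def network_dissimilarity(net1, net2):
    """net1, net2: sets of (consumer, resource) interaction pairs."""
    beta_wn = sorensen(net1, net2)            # whole-network dissimilarity
    shared = ({x for e in net1 for x in e} &
              {x for e in net2 for x in e})   # species present at both sites
    sub1 = {e for e in net1 if set(e) <= shared}
    sub2 = {e for e in net2 if set(e) <= shared}
    beta_os = sorensen(sub1, sub2)            # shared-species component
    return beta_wn, beta_os

# Invented host-parasite interactions at two sites.
site_a = {("parasite1", "host1"), ("parasite1", "host2"), ("parasite2", "host2")}
site_b = {("parasite1", "host1"), ("parasite2", "host2"), ("parasite2", "host3")}
bwn, bos = network_dissimilarity(site_a, site_b)
print(f"beta_WN = {bwn:.3f}, beta_OS = {bos:.3f}")
```

Because interactions are just sets of pairs, beta_WN is directly comparable to the set-based β-diversity of community composition, which is the small methodological step the abstract highlights.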

  16. Development of an audit method to assess the prevalence of the ACGME's general competencies in an undergraduate medical education curriculum.

    PubMed

    Mooney, Christopher J; Lurie, Stephen J; Lyness, Jeffrey M; Lambert, David R; Guzick, David S

    2010-10-01

    Despite the use of competency-based frameworks to evaluate physicians, the role of competency-based objectives in undergraduate medical education remains uncertain. By use of an audit methodology, we sought to determine how the six Accreditation Council for Graduate Medical Education (ACGME) competencies, conceptualized as educational domains, would map onto an undergraduate medical curriculum. Standardized audit forms listing required activities were provided to course directors, who were then asked to indicate which of the domains were represented in each activity. Descriptive statistics were calculated. Of 1,500 activities, there was a mean of 2.13 domains per activity. Medical Knowledge was the most prevalent (44%), followed by Patient Care (20%), Interpersonal and Communication Skills (12%), Professionalism (9%), Systems-Based Practice (8%), and Practice-Based Learning and Improvement (7%). There was considerable variation by year and course. The domains provide a useful framework for organizing didactic components. Faculty can also consider activities in light of the domains, providing a vocabulary for instituting curricular change and innovation.

  17. An inexact mixed risk-aversion two-stage stochastic programming model for water resources management under uncertainty.

    PubMed

    Li, W; Wang, B; Xie, Y L; Huang, G H; Liu, L

    2015-02-01

    Uncertainties exist in water resources systems, and traditional two-stage stochastic programming is risk-neutral, comparing decisions by the expected value of random variables (e.g., total benefit). To deal with risk issues, a risk-aversion inexact two-stage stochastic programming model is developed for water resources management under uncertainty. The model is a hybrid of interval-parameter programming, the conditional value-at-risk measure, and a general two-stage stochastic programming framework. The method extends traditional two-stage stochastic programming by enabling uncertainties presented as probability density functions and discrete intervals to be effectively incorporated within the optimization framework. It could not only provide information on the benefits of the allocation plan to the decision makers but also measure the extreme expected loss on the second-stage penalty cost. The developed model was applied to a hypothetical case of water resources management. Results showed that the model could help managers generate feasible and balanced risk-aversion allocation plans, and analyze the trade-offs between system stability and economy.
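
    A minimal, self-contained sketch of the kind of model described above: a first-stage allocation earns a benefit, scenario shortfalls incur a second-stage penalty, and a conditional value-at-risk (CVaR) term penalises extreme losses. All parameter values are hypothetical and interval parameters are omitted; this illustrates only the risk-aversion mechanism, not the authors' model.

```python
# Risk-aversion two-stage sketch: choose allocation x before the random
# water availability is known; shortfalls are penalised, and a CVaR term
# weighted by lam discourages plans with heavy worst-case losses.

def cvar(losses, probs, alpha):
    """Conditional value-at-risk: mean loss in the worst (1 - alpha) tail."""
    tail = 1.0 - alpha
    acc, total = 0.0, 0.0
    for loss, p in sorted(zip(losses, probs), reverse=True):
        take = min(p, tail - acc)
        total += take * loss
        acc += take
        if acc >= tail - 1e-12:
            break
    return total / tail

benefit, penalty = 100.0, 150.0   # per unit allocated / per unit short
water = [6.0, 10.0, 14.0]         # scenario water availability
probs = [0.2, 0.5, 0.3]
alpha = 0.9

def objective(x, lam):
    """Minimise: -benefit + expected shortfall penalty + lam * CVaR."""
    losses = [penalty * max(0.0, x - w) for w in water]
    expected = sum(p * l for p, l in zip(probs, losses))
    return -benefit * x + expected + lam * cvar(losses, probs, alpha)

grid = [i * 0.5 for i in range(31)]   # candidate allocations 0..15
for lam in (1.0, 0.2):
    best = min(grid, key=lambda x: objective(x, lam))
    print(f"lam={lam}: allocate {best}")
```

    A higher risk-aversion weight pushes the allocation down to the safest scenario (here 6.0), while a lower weight recovers the larger, nearly risk-neutral allocation (here 10.0), which is the stability/economy trade-off the abstract describes.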

  18. Scoping literature review on the Learning Organisation concept as applied to the health system.

    PubMed

    Akhnif, E; Macq, J; Idrissi Fakhreddine, M O; Meessen, B

    2017-03-01

    There is growing interest in the use of the management concept of a 'learning organisation'. The objective of this review is to explore work undertaken on the application of this concept to the health sector in general, and to reaching the goal of universal health coverage in particular. Of particular interest are evaluation frameworks and their application in health. We used a scoping literature review based on the York methodology. We conducted an online search using selected keywords in some of the main health science databases, selected websites and the main reference books on learning organisations. We restricted our search to sources in the English language. Inclusion and exclusion criteria were applied to arrive at a final list of articles, from which information was extracted and charted. We identified 263 articles and other documents from our search. From these, 50 articles were selected for full analysis and 27 articles were used for the summary. The majority of the articles concerned hospital settings (15 articles, 55%). Seven articles (25%) were related to the application of the concept to the health centre setting. Four articles discussed the application of the concept to the health system (14%). Most of the applications involved high-income countries (21 articles, 78%), with only one article relating to a low-income country. We found 13 different frameworks that had been applied to different health organisations. The scoping review allowed us to assess applications of the learning organisation concept to the health sector to date. Such applications are still rare but increasing. There is no uniform framework thus far, but convergence on the dimensions that matter is growing. Many methodological questions remain unanswered. We also identified a gap in the use of this concept in low- and middle-income countries and in the health system as a whole.

  19. Integrated Testing Strategy (ITS) - Opportunities to better use existing data and guide future testing in toxicology.

    PubMed

    Jaworska, Joanna; Hoffmann, Sebastian

    2010-01-01

    The topic of Integrated Testing Strategies (ITS) has attracted considerable attention, and not only because it is supposed to be a central element of REACH, the ambitious European chemical regulation effort. Although what ITS are supposed to do seems unambiguous, i.e. speeding up hazard and risk assessment while reducing testing costs, little has been said, beyond basic conceptual proposals, about the methodologies that would allow execution of these concepts. Although a pressing concern, the topic of ITS has drawn mostly general reviews, broad concepts, and the expression of a clear need for more research; published research in the field remains scarce. Solutions for ITS design emerge slowly, most likely due to the methodological challenges of the task, and perhaps also to its complexity and the need for multidisciplinary collaboration. Along with the challenge, ITS offer a unique opportunity to contribute to the toxicology of the 21st century by providing frameworks and tools to actually implement 21st century toxicology data in chemical management and decision-making processes. Further, ITS have the potential to contribute significantly to a modernization of the science of risk assessment. Therefore, to advance ITS research we propose a methodical approach to their design and discuss currently available approaches as well as challenges to overcome. To this end, we define a framework for ITS that will inform toxicological decisions in a systematic, transparent, and consistent way. We review conceptual requirements for ITS developed earlier and present a roadmap to an operational framework that should be probabilistic, hypothesis-driven, and adaptive. Furthermore, we define properties an ITS should have in order to meet the identified requirements and differentiate them from evidence synthesis. Using an ITS for skin sensitization, we demonstrate how the proposed ITS concepts can be implemented.

  20. Cost comparisons and methodological heterogeneity in cost-of-illness studies: the example of colorectal cancer.

    PubMed

    Ó Céilleachair, Alan J; Hanly, Paul; Skally, Máiréad; O'Neill, Ciaran; Fitzpatrick, Patricia; Kapur, Kanika; Staines, Anthony; Sharp, Linda

    2013-04-01

    Colorectal cancer (CRC) is the third most common cancer worldwide with over 1 million new cases diagnosed each year. Advances in treatment and survival are likely to have increased lifetime costs of managing the disease. Cost-of-illness (COI) studies are key building blocks in economic evaluations of interventions and comparative effectiveness research. We systematically reviewed and critiqued the COI literature on CRC. We searched several databases for CRC COI studies published in English, between January 2000 and February 2011. Information was abstracted on: setting, patient population, top-down/bottom-up costing, incident/prevalent approach, payer perspective, time horizon, costs included, cost source, and per-person costs. We developed a framework to compare study methodologies and assess homogeneity/heterogeneity. A total of 26 papers met the inclusion criteria. There was extensive methodological heterogeneity. Studies included case-control studies based on claims/reimbursement data (10), examinations of patient charts (5), and analysis of claims data (4). Epidemiological approaches varied (prevalent, 6; incident, 8; mixed, 10; unclear, 4). Time horizons ranged from 1 year postdiagnosis to lifetime. Seventeen studies used top-down costing. Twenty-five studies included healthcare-payer direct medical costs; 2 included indirect costs; 1 considered patient costs. There was broad agreement in how studies accounted for time, but few studies described costs in sufficient detail to allow replication. In general, costs were not comparable between studies. Methodological heterogeneity and lack of transparency made it almost impossible to compare CRC costs between studies or over time. For COI studies to be more useful and robust there is need for clear and rigorous guidelines around methodological and reporting "best practice."

  1. Effectiveness evaluation of the R&D projects in organizations financed by the budget expenses

    NASA Astrophysics Data System (ADS)

    Yakovlev, D.; Yushkov, E.; Pryakhin, A.; Bogatyreova, M.

    2017-01-01

    The performance and prospects of R&D projects are closely tied to knowledge management. In the initial stages of project development, it is the quality of the project evaluation that is crucial for the result and for the generation of future knowledge. Currently there is no common methodology for the evaluation of new R&D financed from the budget. Suffice it to say, the assessment of scientific and technical (ST) projects varies greatly depending on the type of customer - government or business structures. An extensive methodological groundwork has been formed with respect to orders placed by business structures, including “an internal administrative order” by company management for the results of scientific and technical activities (STA) intended for its own ST divisions. Regrettably, this is not the case with state orders in the field of STA, although the issue requires state regulation and official methodological support. The article is devoted to the methodological assessment of the scientific and technical effectiveness of studies performed at the expense of budget funds, and suggests a new concept based on the definition of a cost-effectiveness index. The study thus reveals the need to extend the previous approach to projects of different levels - micro-, meso-, and macro-level projects. The preliminary results of the research show that a common methodological approach must underpin the financing of projects under government contracts within the framework of budget financing and stock financing. This should be developed as general guidelines as well as recommendations that reflect specific sectors of the public sector, various project levels and forms of financing, and different stages of the project life cycle.

  2. Revisioning Curriculum in the Age of Transnational Mobility: Towards a Transnational and Transcultural Framework

    ERIC Educational Resources Information Center

    Guo, Shibao; Maitra, Srabani

    2017-01-01

    Under the new mobilities paradigm, migration is conceptualized as circulatory and transnational, moving us beyond the framework of methodological nationalism. Transnational mobility has called into question dominant notions of migrant acculturation or assimilation. Migrants no longer feel obligated to remain tied to or locatable in a…

  3. A Framework for Evaluating and Enhancing Alignment in Self-Regulated Learning Research

    ERIC Educational Resources Information Center

    Dent, Amy L.; Hoyle, Rick H.

    2015-01-01

    We discuss the articles of this special issue with reference to an important yet previously only implicit dimension of study quality: alignment across the theoretical and methodological decisions that collectively define an approach to self-regulated learning. Integrating and extending work by leaders in the field, we propose a framework for…

  4. Food Practices and School Connectedness: A Whole-School Approach

    ERIC Educational Resources Information Center

    Neely, Eva; Walton, Mat; Stephens, Christine

    2016-01-01

    Purpose: The health-promoting schools (HPSs) framework has emerged as a promising model for promoting school connectedness in the school setting. The purpose of this paper is to explore the potential for food practices to promote school connectedness within a HPSs framework. Design/methodology/approach: This study explores food practices within a…

  5. Developing and Managing University-Industry Research Collaborations through a Process Methodology/Industrial Sector Approach

    ERIC Educational Resources Information Center

    Philbin, Simon P.

    2010-01-01

    A management framework has been successfully utilized at Imperial College London in the United Kingdom to improve the process for developing and managing university-industry research collaborations. The framework has been part of a systematic approach to increase the level of research contracts from industrial sources, to strengthen the…

  6. SUPPLY CHAIN OPTIMIZATION FOR SUSTAINABILITY AND PROFITABILITY BY THE P-GRAPH FRAMEWORK

    EPA Science Inventory

    The proposed methodology is an outcome of the collaboration between the Office of Research and Development (ORD) of the U.S. EPA and the research group led by the founders of the P graph framework. U.S. EPA/ORD has substantial creditable experience with the development of indicat...

  7. Transnational Corporations and Strategic Challenges: An Analysis of Knowledge Flows and Competitive Advantage

    ERIC Educational Resources Information Center

    de Pablos, Patricia Ordonez

    2006-01-01

    Purpose: The purpose of this paper is to analyse knowledge transfers in transnational corporations. Design/methodology/approach: The paper develops a conceptual framework for the analysis of knowledge flow transfers in transnationals. Based on this theoretical framework, the paper proposes research hypotheses and builds a causal model that links…

  8. Towards Developing a Theoretical Framework for Measuring Public Sector Managers' Career Success

    ERIC Educational Resources Information Center

    Rasdi, Roziah Mohd; Ismail, Maimunah; Uli, Jegak; Noah, Sidek Mohd

    2009-01-01

    Purpose: The purpose of this paper is to develop a theoretical framework for measuring public sector managers' career success. Design/methodology/approach: The theoretical foundation used in this study is social cognitive career theory. To conduct a literature search, several keywords were identified, i.e. career success, objective and subjective…

  9. Culturally Responsive Positive Behavioral Interventions and Supports. WCER Working Paper No. 2015-9

    ERIC Educational Resources Information Center

    Bal, Aydin

    2015-01-01

    This report presents the underlying theory and methodology of the first framework to operationalize culture and culturally responsiveness in the context of Positive Behavioral Interventions and Supports. Created following a systematic review of literature, this framework was created as a cultural artifact to expand the conceptualization of the…

  10. An Analytic Framework to Support E.Learning Strategy Development

    ERIC Educational Resources Information Center

    Marshall, Stephen J.

    2012-01-01

    Purpose: The purpose of this paper is to discuss and demonstrate the relevance of a new conceptual framework for leading and managing the development of learning and teaching to e.learning strategy development. Design/methodology/approach: After reviewing and discussing the research literature on e.learning in higher education institutions from…

  11. Empowering Chicana/o and Latina: A Framework for High School Counselors

    ERIC Educational Resources Information Center

    Padilla, Alejandro

    2014-01-01

    Using Hipolito-Delgado and Lee's empowerment theory for the professional school counselor as a framework, this qualitative study explored the techniques employed by school counselors to facilitate the empowerment of Chicana/o and Latina/o students in large California urban high schools. The qualitative methodology included in-depth interviews…

  12. Mapping of Supply Chain Learning: A Framework for SMEs

    ERIC Educational Resources Information Center

    Thakkar, Jitesh; Kanda, Arun; Deshmukh, S. G.

    2011-01-01

    Purpose: The aim of this paper is to propose a mapping framework for evaluating supply chain learning potential for the context of small- to medium-sized enterprises (SMEs). Design/methodology/approach: The extracts of recently completed case based research for ten manufacturing SME units and facts reported in the previous research are utilized…

  13. Piecing the Puzzle: A Framework for Developing Intercultural Online Communication Projects in Business Education

    ERIC Educational Resources Information Center

    Crossman, Joanna; Bordia, Sarbari

    2012-01-01

    Purpose: The purpose of this paper is to present a framework based on lessons learnt from a recently completed project aimed at developing intercultural online communication competencies in business students. Design/methodology/approach: The project entailed collaboration between students and staff in business communication courses from an…

  14. The Importance of Theoretical Frameworks and Mathematical Constructs in Designing Digital Tools

    ERIC Educational Resources Information Center

    Trinter, Christine

    2016-01-01

    The increase in availability of educational technologies over the past few decades has not only led to new practice in teaching mathematics but also to new perspectives in research, methodologies, and theoretical frameworks within mathematics education. Hence, the amalgamation of theoretical and pragmatic considerations in digital tool design…

  15. A Unified Framework for the Infection Dynamics of Zoonotic Spillover and Spread.

    PubMed

    Lo Iacono, Giovanni; Cunningham, Andrew A; Fichet-Calvet, Elisabeth; Garry, Robert F; Grant, Donald S; Leach, Melissa; Moses, Lina M; Nichols, Gordon; Schieffelin, John S; Shaffer, Jeffrey G; Webb, Colleen T; Wood, James L N

    2016-09-01

    A considerable amount of disease is transmitted from animals to humans and many of these zoonoses are neglected tropical diseases. As outbreaks of SARS, avian influenza and Ebola have demonstrated, however, zoonotic diseases are serious threats to global public health and are not just problems confined to remote regions. There are two fundamental, and poorly studied, stages of zoonotic disease emergence: 'spillover', i.e. transmission of pathogens from animals to humans, and 'stuttering transmission', i.e. when limited human-to-human infections occur, leading to self-limiting chains of transmission. We developed a transparent, theoretical framework, based on a generalization of Poisson processes with memory of past human infections, that unifies these stages. Once we have quantified pathogen dynamics in the reservoir, with some knowledge of the mechanism of contact, the approach provides a tool to estimate the likelihood of spillover events. Comparisons with independent agent-based models demonstrate the ability of the framework to correctly estimate the relative contributions of human-to-human vs animal transmission. As an illustrative example, we applied our model to Lassa fever, a rodent-borne, viral haemorrhagic disease common in West Africa, for which data on human outbreaks were available. The approach developed here is general and applicable to a range of zoonoses. This kind of methodology is of crucial importance for the scientific, medical and public health communities working at the interface between animal and human diseases to assess the risk associated with the disease and to plan intervention and appropriate control measures. The Lassa case study revealed important knowledge gaps, and opportunities, arising from limited knowledge of the temporal patterns in reporting, abundance of, and infection prevalence in, the host reservoir.
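
    The two stages described above can be caricatured in a few lines: spillover events arrive as a Poisson process, and each primary case seeds a self-limiting chain of human-to-human transmission modelled as a subcritical branching process. The rates below are invented; this is a sketch of the general mechanism only, not the authors' framework (which adds memory of past human infections).

```python
# Simulate spillover (Poisson arrivals) plus stuttering transmission
# (subcritical branching, R0 < 1) and estimate the fraction of human
# cases due to human-to-human rather than animal transmission.
import math
import random

random.seed(1)

def poisson(lam):
    """Knuth's algorithm for a Poisson draw (avoids external dependencies)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

spillover_rate = 2.0   # expected animal-to-human spillovers per month
r0 = 0.4               # mean human-to-human offspring per case (< 1)
months = 10_000

primary = secondary = 0
for _ in range(months):
    for _ in range(poisson(spillover_rate)):   # spillover events this month
        primary += 1
        generation = 1                         # the primary case
        while generation:
            offspring = sum(poisson(r0) for _ in range(generation))
            secondary += offspring
            generation = offspring             # chain dies out when 0

frac_h2h = secondary / (primary + secondary)
print(f"fraction of cases from human-to-human transmission: {frac_h2h:.2f}")
# Theory check: expected chain size is 1/(1 - R0), so this fraction ~ R0.
```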

  17. Self organising hypothesis networks: a new approach for representing and structuring SAR knowledge

    PubMed Central

    2014-01-01

    Background: Combining different sources of knowledge to build improved structure-activity relationship models is not easy owing to the variety of knowledge formats and the absence of a common framework for interoperating between learning techniques. Most current approaches address this problem by using consensus models that operate at the prediction level. We explore the possibility of directly combining these sources at the knowledge level, with the aim of harvesting potentially increased synergy at an earlier stage. Our goal is to design a general methodology to facilitate knowledge discovery and produce accurate and interpretable models. Results: To combine models at the knowledge level, we propose to decouple the learning phase from the knowledge application phase using a pivot representation (lingua franca) based on the concept of a hypothesis. A hypothesis is a simple and interpretable knowledge unit. Regardless of its origin, knowledge is broken down into a collection of hypotheses. These hypotheses are subsequently organised into a hierarchical network. This unification makes it possible to combine different sources of knowledge within a common formalised framework. The approach allows us to create a synergistic system between different forms of knowledge, and new algorithms can be applied to leverage this unified model. This first article focuses on the general principle of the Self Organising Hypothesis Network (SOHN) approach in the context of binary classification problems, along with an illustrative application to the prediction of mutagenicity. Conclusion: It is possible to represent knowledge in the unified form of a hypothesis network, allowing interpretable predictions with performance comparable to mainstream machine learning techniques. This new approach offers the potential to combine knowledge from different sources into a common framework in which high-level reasoning and meta-learning can be applied; these latter perspectives will be explored in future work. PMID:24959206
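
    The hypothesis-network idea can be illustrated with a toy sketch in which a hypothesis is a set of structural features with an associated outcome, hypotheses are ordered by subsumption (a subset of features means a more general hypothesis), and a query is classified by its most specific matching hypothesis. The features and labels below are invented; the SOHN paper's actual representation and algorithms differ.

```python
# Toy hypothesis network: hypotheses ordered by feature-set subsumption,
# prediction by the most specific hypothesis whose features all match.

hypotheses = [
    (frozenset({"aromatic"}), "non-mutagenic"),
    (frozenset({"aromatic", "nitro"}), "mutagenic"),
    (frozenset({"aromatic", "nitro", "bulky-ortho"}), "non-mutagenic"),
]

def parents(h, pool):
    """Direct generalisations of h: subsets of h with nothing in between."""
    subs = [g for g, _ in pool if g < h]
    return [g for g in subs if not any(g < m < h for m in subs)]

def predict(features, pool):
    """Classify by the most specific hypothesis whose features all match."""
    matches = [(h, label) for h, label in pool if h <= features]
    if not matches:
        return "unknown"
    return max(matches, key=lambda m: len(m[0]))[1]

# Print the hierarchical network implied by subsumption:
for h, _ in hypotheses:
    print(sorted(h), "<-", [sorted(p) for p in parents(h, hypotheses)])

print(predict({"aromatic", "nitro"}, hypotheses))  # mutagenic
```

    The more specific hypothesis overrides the more general one, which is how the network stays interpretable: every prediction is traceable to one explicit knowledge unit.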

  18. The combination of an Environmental Management System and Life Cycle Assessment at the territorial level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mazzi, Anna; Toniolo, Sara; Catto, Stella

    A framework to include Life Cycle Assessment in the significance evaluation of the environmental aspects of an Environmental Management System has been studied for some industrial sectors, but there is a literature gap at the territorial level, where indirect impact assessment is crucial. To overcome this criticality, our research proposes Life Cycle Assessment as a framework to assess the environmental aspects of public administration within an Environmental Management System applied at the territorial level. This research is structured in two parts: the design of a new methodological framework and a pilot application for an Italian municipality. The methodological framework designed supports Initial Environmental Analysis at the territorial level thanks to the results derived from the impact assessment phase. The pilot application in an EMAS-registered Italian municipality demonstrates the applicability of the framework and its effectiveness in evaluating the environmental impact assessment for direct and indirect aspects. Through the discussion of the results, we underline the growing knowledge derived from this research in terms of the reproducibility and consistency of the criteria to define the significance of the direct and indirect environmental aspects for a local public administration. - Highlights: • The combination of an Environmental Management System and LCA is studied. • A methodological framework is elaborated and tested at the territorial level. • Life Cycle Impact Assessment supports the evaluation of aspect significance. • The framework assures consistency of evaluation criteria on the studied territory.

  19. Using Q Methodology in Quality Improvement Projects.

    PubMed

    Tiernon, Paige; Hensel, Desiree; Roy-Ehri, Leah

    Q methodology consists of a philosophical framework and procedures to identify subjective viewpoints that may not be well understood, but its use in nursing is still quite limited. We describe how Q methodology can be used in quality improvement projects to better understand local viewpoints that act as facilitators or barriers to the implementation of evidence-based practice. We describe the use of Q methodology to identify nurses' attitudes about the provision of skin-to-skin care after cesarean birth. Copyright © 2017 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses. Published by Elsevier Inc. All rights reserved.

  20. Increasing accuracy in the assessment of motion sickness: A construct methodology

    NASA Technical Reports Server (NTRS)

    Stout, Cynthia S.; Cowings, Patricia S.

    1993-01-01

    The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods used in the framework of a construct methodology are inadequate. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that, when applied, will probably resolve some of these problems are described in detail.
