Sample records for realistic evaluation framework

  1. Realist complex intervention science: Applying realist principles across all phases of the Medical Research Council framework for developing and evaluating complex interventions

    PubMed Central

    Fletcher, Adam; Jamal, Farah; Moore, Graham; Evans, Rhiannon E.; Murphy, Simon; Bonell, Chris

    2016-01-01

    The integration of realist evaluation principles within randomised controlled trials (‘realist RCTs’) enables evaluations of complex interventions to answer questions about what works, for whom and under what circumstances. This allows evaluators to better develop and refine mid-level programme theories. However, this is only one phase in the process of developing and evaluating complex interventions. We describe and exemplify how social scientists can integrate realist principles across all phases of the Medical Research Council framework. Intervention development, modelling, and feasibility and pilot studies need to theorise the contextual conditions necessary for intervention mechanisms to be activated. Where interventions are scaled up and translated into routine practice, realist principles also have much to offer in facilitating knowledge about longer-term sustainability, benefits and harms. Integrating a realist approach across all phases of complex intervention science is vital for considering the feasibility and likely effects of interventions for different localities and population subgroups. PMID:27478401

  2. Realist Evaluation: An Emerging Theory in Support of Practice.

    ERIC Educational Resources Information Center

    Henry, Gary T., Ed.; Julnes, George, Ed.; Mark, Melvin M., Ed.

    1998-01-01

    The five articles of this sourcebook, organized around the five-component framework for evaluation described by W. Shadish, T. Cook, and L. Leviton (1991), present a new theory of realist evaluation that captures the sensemaking contributions of postpositivism and the sensitivity to values from the constructivist traditions. (SLD)

  3. Evaluating impact of clinical guidelines using a realist evaluation framework.

    PubMed

    Reddy, Sandeep; Wakerman, John; Westhorp, Gill; Herring, Sally

    2015-12-01

    The Remote Primary Health Care Manuals (RPHCM) project team manages the development and publication of clinical protocols and procedures for primary care clinicians practicing in remote Australia. The Central Australian Rural Practitioners Association Standard Treatment Manual, the flagship manual of the RPHCM suite, has been evaluated for accessibility and acceptability in remote clinics three times in its 20-year history. These evaluations did not consider a theory-based framework or a programme theory, resulting in some limitations with the evaluation findings. With the RPHCM having an aim of enabling evidence-based practice in remote clinics and anecdotally reported to do so, testing this empirically for the full suite is vital for both stakeholders and future editions of the RPHCM. The project team utilized a realist evaluation framework to assess how, why and for what the RPHCM were being used by remote practitioners. A theory regarding the circumstances in which the manuals have and have not enabled evidence-based practice in the remote clinical context was tested. The project assessed this theory for all the manuals in the RPHCM suite, across government and Aboriginal community-controlled clinics, in three regions of Australia. Implementing a realist evaluation framework to generate robust findings in this context has required innovation in the evaluation design and adaptation by researchers. This article captures the RPHCM team's experience in designing this evaluation. © 2015 John Wiley & Sons, Ltd.

  4. A generic framework to simulate realistic lung, liver and renal pathologies in CT imaging

    NASA Astrophysics Data System (ADS)

    Solomon, Justin; Samei, Ehsan

    2014-11-01

    Realistic three-dimensional (3D) mathematical models of subtle lesions are essential for many computed tomography (CT) studies focused on performance evaluation and optimization. In this paper, we develop a generic mathematical framework that describes the 3D size, shape, contrast, and contrast-profile characteristics of a lesion, as well as a method to create lesion models based on CT data of real lesions. Further, we implemented a technique to insert the lesion models into CT images in order to create hybrid CT datasets. This framework was used to create a library of realistic lesion models and corresponding hybrid CT images. The goodness of fit of the models was assessed using the coefficient of determination (R²) and the visual appearance of the hybrid images was assessed with an observer study using images of both real and simulated lesions and receiver operating characteristic (ROC) analysis. The average R² of the lesion models was 0.80, implying that the models provide a good fit to real lesion data. The area under the ROC curve was 0.55, implying that the observers could not readily distinguish between real and simulated lesions. Therefore, we conclude that the lesion-modeling framework presented in this paper can be used to create realistic lesion models and hybrid CT images. These models could be instrumental in performance evaluation and optimization of novel CT systems.
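
    The abstract reports two summary statistics: the coefficient of determination (R²) of the lesion-model fits and the area under the ROC curve (AUC) from the real-versus-simulated observer study. The following is a minimal sketch of how such statistics are computed, not the authors' code; all arrays are invented for illustration.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    def r_squared(observed: np.ndarray, fitted: np.ndarray) -> float:
        """R^2 = 1 - SS_res / SS_tot for a fitted lesion model."""
        ss_res = np.sum((observed - fitted) ** 2)
        ss_tot = np.sum((observed - observed.mean()) ** 2)
        return 1.0 - ss_res / ss_tot

    # Hypothetical voxel intensities of a real lesion and the fitted model.
    observed = np.array([10.0, 12.5, 14.0, 13.2, 11.1, 9.8])
    fitted = np.array([10.4, 12.0, 13.8, 13.5, 10.7, 10.1])
    print(f"R^2 = {r_squared(observed, fitted):.2f}")

    # Observer study: label 1 = real lesion, 0 = simulated; scores are the
    # observers' confidence that a lesion is real. AUC near 0.5 means the
    # observers cannot tell real from simulated (the paper reports 0.55).
    labels = np.array([1, 0, 1, 0, 1, 0, 1, 0])
    scores = np.array([0.6, 0.5, 0.4, 0.7, 0.55, 0.45, 0.5, 0.6])
    print(f"AUC  = {roc_auc_score(labels, scores):.2f}")
    ```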

  5. The OSSE Framework at the NASA Global Modeling and Assimilation Office (GMAO)

    NASA Astrophysics Data System (ADS)

    Moradi, I.; Prive, N.; McCarty, W.; Errico, R. M.; Gelaro, R.

    2017-12-01

    This abstract summarizes the OSSE framework developed at the Global Modeling and Assimilation Office at the National Aeronautics and Space Administration (NASA/GMAO). Some of the OSSE techniques developed at GMAO, including the simulation of realistic observations (e.g., adding errors to simulated observations), are now widely used by the community to evaluate the impact of new observations on weather forecasts. This talk presents some of the recent progress and challenges in simulating realistic observations, radiative transfer modeling support for the GMAO OSSE activities, assimilation of OSSE observations into data assimilation systems, and evaluating the impact of simulated observations on forecast skill.
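
    A toy sketch of the "adding errors to simulated observations" step mentioned above, under the simplifying assumption of uncorrelated Gaussian errors plus a constant bias; real OSSE error models, including GMAO's, are considerably richer (correlated, channel-dependent errors). All values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def perturb_observations(truth: np.ndarray, sigma: float, bias: float = 0.0) -> np.ndarray:
        """Simulated obs = nature-run value + bias + random error."""
        return truth + bias + rng.normal(0.0, sigma, size=truth.shape)

    # Hypothetical brightness temperatures (K) interpolated from a nature run.
    nature_run_obs = np.array([250.1, 251.3, 249.8, 252.0, 250.6])
    simulated_obs = perturb_observations(nature_run_obs, sigma=0.5, bias=0.1)
    print(simulated_obs)
    ```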

  6. The OSSE Framework at the NASA Global Modeling and Assimilation Office (GMAO)

    NASA Technical Reports Server (NTRS)

    Moradi, Isaac; Prive, Nikki; McCarty, Will; Errico, Ronald M.; Gelaro, Ron

    2017-01-01

    This abstract summarizes the OSSE framework developed at the Global Modeling and Assimilation Office at the National Aeronautics and Space Administration (NASA/GMAO). Some of the OSSE techniques developed at GMAO, including the simulation of realistic observations (e.g., adding errors to simulated observations), are now widely used by the community to evaluate the impact of new observations on weather forecasts. This talk presents some of the recent progress and challenges in simulating realistic observations, radiative transfer modeling support for the GMAO OSSE activities, assimilation of OSSE observations into data assimilation systems, and evaluating the impact of simulated observations on forecast skill.

  7. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods.

    PubMed

    Pommier, Jeanine; Guével, Marie-Renée; Jourdan, Didier

    2010-01-28

    Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design is demonstrated. The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community.

  8. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods

    PubMed Central

    2010-01-01

    Background: Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design is demonstrated. Methods/Design: The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. Discussion: This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community. PMID:20109202

  9. The work is never ending: uncovering teamwork sustainability using realistic evaluation.

    PubMed

    Frykman, Mandus; von Thiele Schwarz, Ulrica; Muntlin Athlin, Åsa; Hasson, Henna; Mazzocato, Pamela

    2017-03-20

    Purpose: The purpose of this paper is to uncover the mechanisms influencing the sustainability of behavior changes following the implementation of teamwork. Design/methodology/approach: Realistic evaluation was combined with a framework (DCOM®) based on applied behavior analysis to study the sustainability of behavior changes two and a half years after the initial implementation of teamwork at an emergency department. The DCOM® framework was used to categorize the mechanisms of behavior change interventions (BCIs) into the four categories of direction, competence, opportunity, and motivation. Non-participant observation and interview data were used. Findings: The teamwork behaviors were not sustained. A substantial fallback in managerial activities in combination with a complex context contributed to reduced direction, opportunity, and motivation. Reduced direction made staff members unclear about how and why they should work in teams. Deterioration of opportunity was evident from the lack of problem-solving resources resulting in accumulated barriers to teamwork. Motivation in terms of management support and feedback was reduced. Practical implications: The implementation of complex organizational changes in complex healthcare contexts requires continuous adaptation and managerial activities well beyond the initial implementation period. Originality/value: By integrating the DCOM® framework with realistic evaluation, this study responds to the call for theoretically based research on behavioral mechanisms that can explain how BCIs interact with context and how this interaction influences sustainability.

  10. A unified framework for evaluating the risk of re-identification of text de-identification tools.

    PubMed

    Scaiano, Martin; Middleton, Grant; Arbuckle, Luk; Kolhatkar, Varada; Peyton, Liam; Dowling, Moira; Gipson, Debbie S; El Emam, Khaled

    2016-10-01

    It has become regular practice to de-identify unstructured medical text for use in research using automatic methods, the goal of which is to remove patient identifying information to minimize re-identification risk. The metrics commonly used to determine if these systems are performing well do not accurately reflect the risk of a patient being re-identified. We therefore developed a framework for measuring the risk of re-identification associated with textual data releases. We apply the proposed evaluation framework to a data set from the University of Michigan Medical School. Our risk assessment results are then compared with those that would be obtained using a typical contemporary micro-average evaluation of recall in order to illustrate the difference between the proposed evaluation framework and the current baseline method. We demonstrate how this framework compares against common measures of the re-identification risk associated with an automated text de-identification process. For the probability of re-identification using our evaluation framework we obtained a mean value for direct identifiers of 0.0074 and a mean value for quasi-identifiers of 0.0022. The 95% confidence intervals for these estimates were below the relevant thresholds. The threshold for direct identifier risk was based on previously used approaches in the literature. The threshold for quasi-identifiers was determined based on the context of the data release following commonly used de-identification criteria for structured data. Our framework attempts to correct for poorly distributed evaluation corpora, accounts for the data release context, and avoids the often optimistic assumptions that are made using the more traditional evaluation approach. It therefore provides a more realistic estimate of the true probability of re-identification. This framework should be used as a basis for computing re-identification risk in order to more realistically evaluate future text de-identification tools. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
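
    A minimal sketch of the kind of summary the abstract describes: a mean re-identification probability with a 95% confidence interval compared against a threshold, assuming a simple normal approximation. The per-document probabilities and the threshold below are invented, not the paper's values.

    ```python
    import numpy as np

    def risk_summary(name: str, p: np.ndarray, threshold: float) -> None:
        mean = p.mean()
        # Normal-approximation 95% CI half-width for the mean.
        half = 1.96 * p.std(ddof=1) / np.sqrt(len(p))
        verdict = "acceptable" if mean + half < threshold else "too risky"
        print(f"{name}: mean={mean:.4f}, CI upper={mean + half:.4f} -> {verdict}")

    rng = np.random.default_rng(0)
    direct = rng.beta(2, 260, size=500)  # hypothetical direct-identifier risks
    quasi = rng.beta(2, 900, size=500)   # hypothetical quasi-identifier risks
    risk_summary("direct", direct, threshold=0.05)  # 0.05 is an assumed threshold
    risk_summary("quasi", quasi, threshold=0.05)
    ```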

  11. Objectivity and reliability in qualitative analysis: realist, contextualist and radical constructionist epistemologies.

    PubMed

    Madill, A; Jordan, A; Shirley, C

    2000-02-01

    The effect of the individual analyst on research findings can create a credibility problem for qualitative approaches from the perspective of evaluative criteria utilized in quantitative psychology. This paper explicates the ways in which objectivity and reliability are understood in qualitative analysis conducted from within three distinct epistemological frameworks: realism, contextual constructionism, and radical constructionism. It is argued that quality criteria utilized in quantitative psychology are appropriate to the evaluation of qualitative analysis only to the extent that it is conducted within a naive or scientific realist framework. The discussion is illustrated with reference to the comparison of two independent grounded theory analyses of identical material. An implication of this illustration is to identify the potential to develop a radical constructionist strand of grounded theory.

  12. Adapting realist synthesis methodology: The case of workplace harassment interventions.

    PubMed

    Carr, Tracey; Quinlan, Elizabeth; Robertson, Susan; Gerrard, Angie

    2017-12-01

    Realist synthesis techniques can be used to assess complex interventions by extracting and synthesizing configurations of contexts, mechanisms, and outcomes found in the literature. Our novel and multi-pronged approach to the realist synthesis of workplace harassment interventions describes our pursuit of theory to link macro and program level theories. After discovering the limitations of a dogmatic approach to realist synthesis, we adapted our search strategy and focused our analysis on a subset of data. We tailored our realist synthesis to understand how, why, and under what circumstances workplace harassment interventions are effective. The result was a conceptual framework to test our theory-based interventions and provide the basis for subsequent realist evaluation. Our experience documented in this article contributes to an understanding of how, under what circumstances, and with what consequences realist synthesis principles can be customized. Copyright © 2017 John Wiley & Sons, Ltd.

  13. Developing a framework for a novel multi-disciplinary, multi-agency intervention(s), to improve medication management in community-dwelling older people on complex medication regimens (MEMORABLE) – a realist synthesis.

    PubMed

    Maidment, Ian; Booth, Andrew; Mullan, Judy; McKeown, Jane; Bailey, Sylvia; Wong, Geoffrey

    2017-07-03

    Medication-related adverse events have been estimated to be responsible for 5700 deaths and cost the UK £750 million annually. This burden falls disproportionately on older people. Outcomes from interventions to optimise medication management are caused by multiple context-sensitive mechanisms. The MEdication Management in Older people: REalist Approaches BAsed on Literature and Evaluation (MEMORABLE) project uses realist synthesis to understand how, why, for whom and in what context interventions to improve medication management in older people on complex medication regimens residing in the community work. This realist synthesis uses secondary data and primary data from interviews to develop the programme theory. A realist logic of analysis will synthesise data both within and across the two data sources to inform the design of a complex intervention(s) to help improve medication management in older people.

    1. Literature review: The review (using realist synthesis) contains five stages to develop an initial programme theory to understand why processes are more or less successful and under which situations: focussing of the research question; developing the initial programme theory; developing the search strategy; selection and appraisal based on relevance and rigour; and data analysis/synthesis to develop and refine the programme theory and context, intervention and mechanism configurations.

    2. Realist interviews: Realist interviews will explore and refine our understanding of the programme theory developed from the realist synthesis. Up to 30 older people and their informal carers (15 older people with multi-morbidity, 10 informal carers and 5 older people with dementia), and 20 care staff will be interviewed.

    3. Developing a framework for the intervention(s): Data from the realist synthesis and interviews will be used to refine the programme theory for the intervention(s) to identify: the mechanisms that need to be 'triggered', and the contexts related to these mechanisms. Intervention strategies that change the contexts so the mechanisms are triggered to produce desired outcomes will be developed. Feedback on these strategies will be obtained.

    This realist synthesis aims to develop a framework (underpinned by our programme theory) for a novel multi-disciplinary, multi-agency intervention(s), to improve medication management in community-dwelling older people on complex medication regimens. PROSPERO CRD42016043506.

  14. A Realist Evaluation Approach to Unpacking the Impacts of the Sentencing Guidelines

    ERIC Educational Resources Information Center

    Hunt, Kim Steven; Sridharan, Sanjeev

    2010-01-01

    Evaluations of complex interventions such as sentencing guidelines provide an opportunity to understand the mechanisms by which policies and programs can impact intermediate and long-term outcomes. There is limited previous discussion of the underlying frameworks by which sentencing guidelines can impact outcomes such as crime rates. Guided by a…

  15. On Connectivity of Wireless Sensor Networks with Directional Antennas

    PubMed Central

    Wang, Qiu; Dai, Hong-Ning; Zheng, Zibin; Imran, Muhammad; Vasilakos, Athanasios V.

    2017-01-01

    In this paper, we investigate the network connectivity of wireless sensor networks with directional antennas. In particular, we establish a general framework to analyze the network connectivity while considering various antenna models and the channel randomness. Since existing directional antenna models involve trade-offs between how accurately they reflect realistic antennas and their computational complexity, we propose a new analytical directional antenna model, called the iris model, to balance accuracy against complexity. We conduct extensive simulations to evaluate the analytical framework. Our results show that our proposed analytical model of network connectivity is accurate, and that our iris antenna model provides a better approximation to realistic directional antennas than other existing antenna models. PMID:28085081
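
    The iris model itself is not specified in the abstract, so the following Monte Carlo sketch substitutes a plain sector-antenna stand-in to illustrate how connectivity probability can be estimated for a network with directional antennas. All parameters (field size, range, beamwidth) are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def connected(pos, heading, r, beamwidth):
        """True if the sector-antenna graph on these nodes is connected."""
        n = len(pos)
        diff = pos[None, :, :] - pos[:, None, :]           # vector i -> j
        dist = np.linalg.norm(diff, axis=2)
        angle = np.arctan2(diff[..., 1], diff[..., 0])
        off = np.abs((angle - heading[:, None] + np.pi) % (2 * np.pi) - np.pi)
        link = (dist < r) & (off < beamwidth / 2)          # directed link i -> j
        edge = link & link.T                               # require both directions
        np.fill_diagonal(edge, False)
        # Breadth-first search from node 0.
        seen, stack = {0}, [0]
        while stack:
            i = stack.pop()
            for j in np.flatnonzero(edge[i]):
                if j not in seen:
                    seen.add(j)
                    stack.append(j)
        return len(seen) == n

    rng = np.random.default_rng(1)
    trials, n = 500, 30
    hits = sum(
        connected(rng.uniform(0, 100, (n, 2)),             # random positions
                  rng.uniform(0, 2 * np.pi, n),            # random orientations
                  r=35.0, beamwidth=np.pi / 2)
        for _ in range(trials)
    )
    print(f"estimated connectivity probability: {hits / trials:.2f}")
    ```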

  16. Efficient evaluation of wireless real-time control networks.

    PubMed

    Horvath, Peter; Yampolskiy, Mark; Koutsoukos, Xenofon

    2015-02-11

    In this paper, we present a system simulation framework for the design and performance evaluation of complex wireless cyber-physical systems. We describe the simulator architecture and the specific developments that are required to simulate cyber-physical systems relying on multi-channel, multi-hop mesh networks. We introduce realistic and efficient physical layer models and a system simulation methodology, which provides statistically significant performance evaluation results with low computational complexity. The capabilities of the proposed framework are illustrated with the example of WirelessHART, a centralized, real-time, multi-hop mesh network designed for industrial control and monitoring applications.

  17. A realist evaluation of the management of a well-performing regional hospital in Ghana

    PubMed Central

    2010-01-01

    Background: Realist evaluation offers an interesting approach to evaluation of interventions in complex settings, but has been little applied in health care. We report on a realist case study of a well-performing hospital in Ghana and show how such a realist evaluation design can help to overcome the limited external validity of a traditional case study. Methods: We developed a realist evaluation framework for hypothesis formulation, data collection, data analysis and synthesis of the findings. Focusing on the role of human resource management in hospital performance, we formulated our hypothesis around the high commitment management concept. Mixed methods were used in data collection, including individual and group interviews, observations and document reviews. Results: We found that the human resource management approach (the actual intervention) included induction of new staff, training and personal development, good communication and information sharing, and decentralised decision-making. We identified 3 additional practices: ensuring optimal physical working conditions, access to top managers and managers' involvement on the work floor. Teamwork, recognition and trust emerged as key elements of the organisational climate. Interviewees reported high levels of organisational commitment. The analysis unearthed perceived organisational support and reciprocity as underlying mechanisms that link the management practices with commitment. Methodologically, we found that realist evaluation can be fruitfully used to develop detailed case studies that analyse how management interventions work and in which conditions. Analysing the links between intervention, mechanism and outcome increases the explanatory power, while identification of essential context elements improves the usefulness of the findings for decision-makers in other settings (external validity). We also identified a number of practical difficulties and priorities for further methodological development. Conclusion: This case suggests that a well-balanced HRM bundle can stimulate organisational commitment of health workers. Such practices can be implemented even with narrow decision spaces. Realist evaluation provides an appropriate approach to increase the usefulness of case studies to managers and policymakers. PMID:20100330

  18. A realist evaluation of the management of a well-performing regional hospital in Ghana.

    PubMed

    Marchal, Bruno; Dedzo, McDamien; Kegels, Guy

    2010-01-25

    Realist evaluation offers an interesting approach to evaluation of interventions in complex settings, but has been little applied in health care. We report on a realist case study of a well-performing hospital in Ghana and show how such a realist evaluation design can help to overcome the limited external validity of a traditional case study. We developed a realist evaluation framework for hypothesis formulation, data collection, data analysis and synthesis of the findings. Focusing on the role of human resource management in hospital performance, we formulated our hypothesis around the high commitment management concept. Mixed methods were used in data collection, including individual and group interviews, observations and document reviews. We found that the human resource management approach (the actual intervention) included induction of new staff, training and personal development, good communication and information sharing, and decentralised decision-making. We identified 3 additional practices: ensuring optimal physical working conditions, access to top managers and managers' involvement on the work floor. Teamwork, recognition and trust emerged as key elements of the organisational climate. Interviewees reported high levels of organisational commitment. The analysis unearthed perceived organisational support and reciprocity as underlying mechanisms that link the management practices with commitment. Methodologically, we found that realist evaluation can be fruitfully used to develop detailed case studies that analyse how management interventions work and in which conditions. Analysing the links between intervention, mechanism and outcome increases the explanatory power, while identification of essential context elements improves the usefulness of the findings for decision-makers in other settings (external validity). We also identified a number of practical difficulties and priorities for further methodological development. This case suggests that a well-balanced HRM bundle can stimulate organisational commitment of health workers. Such practices can be implemented even with narrow decision spaces. Realist evaluation provides an appropriate approach to increase the usefulness of case studies to managers and policymakers.

  19. Realistic evaluation of an emergency department-based mental health nurse practitioner outpatient service in Australia.

    PubMed

    Wand, Timothy; White, Kathryn; Patching, Joanna

    2011-06-01

    Evaluation of new models of care requires consideration of the complexity inherent within health care programs and their sensitivity to local contextual factors as well as broader community, social and political influences. Evaluation frameworks that are flexible and responsive while maintaining research rigor are therefore required. Realistic evaluation was adopted as the methodology for the implementation and evaluation of an emergency department-based mental health nurse practitioner outpatient service in Sydney, Australia. The aim of realistic evaluation is to generate, test and refine theories of how programs work within a given context. This paper represents the final methodological step from the completed evaluation. A summary of quantitative and qualitative findings from the mixed-methods evaluation is presented, which is transformed into a set of overarching statements or "middle range theories". Middle range theory statements seek to explain the success of a program and provide transferable lessons for practitioners wishing to implement similar programs elsewhere. For example, the research team consider that early consultation with key local stakeholders and emergency department ownership of the project was pivotal to the implementation process. © 2011 Blackwell Publishing Asia Pty Ltd.

  20. Assessing Vocational Interests in the Basque Country Using Paired Comparison Design

    ERIC Educational Resources Information Center

    Elosua, Paula

    2007-01-01

    This article proposes the Thurstonian paired comparison model to assess vocational preferences and uses this approach to evaluate the Realistic, Investigative, Artistic, Social, Enterprising, and Conventional (RIASEC) model in the Basque Country (Spain). First, one unrestricted model is estimated in the Structural Equation Modelling framework using…

  21. An Applied Ecological Framework for Evaluating Infrastructure to Promote Walking and Cycling: The iConnect Study

    PubMed Central

    Bull, Fiona; Powell, Jane; Cooper, Ashley R.; Brand, Christian; Mutrie, Nanette; Preston, John; Rutter, Harry

    2011-01-01

    Improving infrastructure for walking and cycling is increasingly recommended as a means to promote physical activity, prevent obesity, and reduce traffic congestion and carbon emissions. However, limited evidence from intervention studies exists to support this approach. Drawing on classic epidemiological methods, psychological and ecological models of behavior change, and the principles of realistic evaluation, we have developed an applied ecological framework by which current theories about the behavioral effects of environmental change may be tested in heterogeneous and complex intervention settings. Our framework guides study design and analysis by specifying the most important data to be collected and relations to be tested to confirm or refute specific hypotheses and thereby refine the underlying theories. PMID:21233429

  22. Protocol: developing a conceptual framework of patient mediated knowledge translation, systematic review using a realist approach.

    PubMed

    Gagliardi, Anna R; Légaré, France; Brouwers, Melissa C; Webster, Fiona; Wiljer, David; Badley, Elizabeth; Straus, Sharon

    2011-03-22

    Patient involvement in healthcare represents the means by which to achieve a healthcare system that is responsive to patient needs and values. Characterization and evaluation of strategies for involving patients in their healthcare may benefit from a knowledge translation (KT) approach. The purpose of this knowledge synthesis is to develop a conceptual framework for patient-mediated KT interventions. A preliminary conceptual framework for patient-mediated KT interventions was compiled to describe intended purpose, recipients, delivery context, intervention, and outcomes. A realist review will be conducted in consultation with stakeholders from the arthritis and cancer fields to explore how these interventions work, for whom, and in what contexts. To identify patient-mediated KT interventions in these fields, we will search MEDLINE, the Cochrane Library, and EMBASE from 1995 to 2010; scan references of all eligible studies; and examine five years of tables of contents for journals likely to publish quantitative or qualitative studies that focus on developing, implementing, or evaluating patient-mediated KT interventions. Screening and data collection will be performed independently by two individuals. The conceptual framework of patient-mediated KT options and outcomes could be used by healthcare providers, managers, educationalists, patient advocates, and policy makers to guide program planning, service delivery, and quality improvement and by us and other researchers to evaluate existing interventions or develop new interventions. By raising awareness of options for involving patients in improving their own care, outcomes based on using a KT approach may lead to greater patient-centred care delivery and improved healthcare outcomes.

  23. Protocol: developing a conceptual framework of patient mediated knowledge translation, systematic review using a realist approach

    PubMed Central

    2011-01-01

    Background: Patient involvement in healthcare represents the means by which to achieve a healthcare system that is responsive to patient needs and values. Characterization and evaluation of strategies for involving patients in their healthcare may benefit from a knowledge translation (KT) approach. The purpose of this knowledge synthesis is to develop a conceptual framework for patient-mediated KT interventions. Methods: A preliminary conceptual framework for patient-mediated KT interventions was compiled to describe intended purpose, recipients, delivery context, intervention, and outcomes. A realist review will be conducted in consultation with stakeholders from the arthritis and cancer fields to explore how these interventions work, for whom, and in what contexts. To identify patient-mediated KT interventions in these fields, we will search MEDLINE, the Cochrane Library, and EMBASE from 1995 to 2010; scan references of all eligible studies; and examine five years of tables of contents for journals likely to publish quantitative or qualitative studies that focus on developing, implementing, or evaluating patient-mediated KT interventions. Screening and data collection will be performed independently by two individuals. Conclusions: The conceptual framework of patient-mediated KT options and outcomes could be used by healthcare providers, managers, educationalists, patient advocates, and policy makers to guide program planning, service delivery, and quality improvement and by us and other researchers to evaluate existing interventions or develop new interventions. By raising awareness of options for involving patients in improving their own care, outcomes based on using a KT approach may lead to greater patient-centred care delivery and improved healthcare outcomes. PMID:21426573

  24. A framework for selecting indicators of bioenergy sustainability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dale, Virginia H.; Efroymson, Rebecca Ann; Kline, Keith L.

    A framework for selecting and evaluating indicators of bioenergy sustainability is presented. This framework is designed to facilitate decision-making about which indicators are useful for assessing sustainability of bioenergy systems and supporting their deployment. Efforts to develop sustainability indicators in the United States and Europe are reviewed. The first steps of the framework for indicator selection are defining the sustainability goals and other goals for a bioenergy project or program, gaining an understanding of the context, and identifying the values of stakeholders. From the goals, context, and stakeholders, the objectives for analysis and criteria for indicator selection can be developed. The user of the framework identifies and ranks indicators, applies them in an assessment, and then evaluates their effectiveness, while identifying gaps that prevent goals from being met, assessing lessons learned, and moving toward best practices. The framework approach emphasizes that the selection of appropriate criteria and indicators is driven by the specific purpose of an analysis. Realistic goals and measures of bioenergy sustainability can be developed systematically with the help of the framework presented here.

  25. Interprofessional education in a student-led emergency department: A realist evaluation.

    PubMed

    Ericson, Anne; Löfgren, Susanne; Bolinder, Gunilla; Reeves, Scott; Kitto, Simon; Masiello, Italo

    2017-03-01

    This article reports a realist evaluation undertaken to identify factors that facilitated or hindered the successful implementation of interprofessional clinical training for undergraduate students in an emergency department. A realist evaluation provides a framework for understanding how the context and underlying mechanisms affect the outcome patterns of an intervention. The researchers gathered both qualitative and quantitative data from internal documents, semi-structured interviews, observations, and questionnaires to study what worked, for whom, and under what circumstances in this specific interprofessional setting. The study participants were medical, nursing, and physiotherapy students, their supervisors, and two members of the emergency department's management staff. The data analysis indicated that the emergency ward provided an excellent environment for interprofessional education (IPE), as attested by the students, supervisors, and the clinical managers. An essential prerequisite is that the students have obtained adequate skills to work independently. Exemplary conditions for IPE to work well in an emergency department demand the continuity of effective and encouraging supervision throughout the training period and supervisors who are knowledgeable about developing a team.

  26. MRXCAT: Realistic numerical phantoms for cardiovascular magnetic resonance

    PubMed Central

    2014-01-01

    Background: Computer simulations are important for validating novel image acquisition and reconstruction strategies. In cardiovascular magnetic resonance (CMR), numerical simulations need to combine anatomical information and the effects of cardiac and/or respiratory motion. To this end, a framework for realistic CMR simulations is proposed and its use for image reconstruction from undersampled data is demonstrated. Methods: The extended Cardiac-Torso (XCAT) anatomical phantom framework with various motion options was used as a basis for the numerical phantoms. Different tissue, dynamic contrast and signal models, multiple receiver coils and noise are simulated. Arbitrary trajectories and undersampled acquisition can be selected. The utility of the framework is demonstrated for accelerated cine and first-pass myocardial perfusion imaging using k-t PCA and k-t SPARSE. Results: MRXCAT phantoms allow for realistic simulation of CMR including optional cardiac and respiratory motion. Example reconstructions from simulated undersampled k-t parallel imaging demonstrate the feasibility of simulated acquisition and reconstruction using the presented framework. Myocardial blood flow assessment from simulated myocardial perfusion images highlights the suitability of MRXCAT for quantitative post-processing simulation. Conclusion: The proposed MRXCAT phantom framework enables versatile and realistic simulations of CMR including breathhold and free-breathing acquisitions. PMID:25204441
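
    MRXCAT itself is a MATLAB framework; as a language-neutral illustration of the undersampled-acquisition idea described above, the following sketch simulates Cartesian k-space undersampling of a toy numerical phantom with a naive zero-filled reconstruction. A real pipeline would add coil sensitivities, noise, and a k-t reconstruction such as k-t PCA; everything here is an invented stand-in.

    ```python
    import numpy as np

    def undersample(image: np.ndarray, acceleration: int) -> np.ndarray:
        """Keep every `acceleration`-th phase-encode line of k-space."""
        kspace = np.fft.fftshift(np.fft.fft2(image))
        mask = np.zeros_like(kspace, dtype=bool)
        mask[::acceleration, :] = True                 # retained k-space lines
        zero_filled = np.fft.ifft2(np.fft.ifftshift(kspace * mask))
        return np.abs(zero_filled)

    # Hypothetical phantom: a bright disc on a dark background.
    y, x = np.mgrid[:128, :128]
    phantom = ((x - 64) ** 2 + (y - 64) ** 2 < 30 ** 2).astype(float)
    recon = undersample(phantom, acceleration=4)
    print(f"zero-filled reconstruction error: {np.abs(recon - phantom).mean():.3f}")
    ```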

  27. Applying air pollution modelling within a multi-criteria decision analysis framework to evaluate UK air quality policies

    NASA Astrophysics Data System (ADS)

    Chalabi, Zaid; Milojevic, Ai; Doherty, Ruth M.; Stevenson, David S.; MacKenzie, Ian A.; Milner, James; Vieno, Massimo; Williams, Martin; Wilkinson, Paul

    2017-10-01

    A decision support system for evaluating UK air quality policies is presented. It combines the output from a chemistry transport model, a health impact model and other impact models within a multi-criteria decision analysis (MCDA) framework. As a proof-of-concept, the MCDA framework is used to evaluate and compare idealized emission reduction policies in four sectors (combustion in energy and transformation industries, non-industrial combustion plants, road transport and agriculture) and across six outcomes or criteria (mortality, health inequality, greenhouse gas emissions, biodiversity, crop yield and air quality legal compliance). To illustrate a realistic use of the MCDA framework, the relative importance of the criteria was elicited from a number of stakeholders acting as proxy policy makers. In the prototype decision problem, we show that reducing emissions from industrial combustion (followed very closely by road transport and agriculture) is more advantageous than equivalent reductions from the other sectors when all the criteria are taken into account. Extensions of the MCDA framework to support policy makers in practice are discussed.
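
    A minimal sketch of one plausible reading of this setup, assuming a linear additive (weighted-sum) MCDA model, which the abstract does not specify. The sector scores and criterion weights below are invented for illustration; in the study, weights were elicited from stakeholders acting as proxy policy makers.

    ```python
    import numpy as np

    sectors = ["industrial combustion", "non-industrial combustion",
               "road transport", "agriculture"]
    criteria = ["mortality", "inequality", "GHG", "biodiversity",
                "crop yield", "legal compliance"]

    # Rows: sectors; columns: criteria. Scores normalised to [0, 1],
    # higher = better outcome of the emission-reduction policy.
    scores = np.array([
        [0.9, 0.7, 0.8, 0.6, 0.7, 0.9],
        [0.6, 0.5, 0.6, 0.5, 0.5, 0.7],
        [0.8, 0.8, 0.7, 0.5, 0.6, 0.9],
        [0.7, 0.6, 0.6, 0.9, 0.8, 0.6],
    ])
    weights = np.array([0.3, 0.15, 0.2, 0.1, 0.1, 0.15])  # sums to 1

    overall = scores @ weights                             # weighted sum
    for sector, value in sorted(zip(sectors, overall), key=lambda t: -t[1]):
        print(f"{sector:28s} {value:.3f}")
    ```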

  28. Fiberfox: facilitating the creation of realistic white matter software phantoms.

    PubMed

    Neher, Peter F; Laun, Frederik B; Stieltjes, Bram; Maier-Hein, Klaus H

    2014-11-01

    Phantom-based validation of diffusion-weighted image processing techniques is an important key to innovation in the field and is widely used. Openly available and user-friendly tools for the flexible generation of tailor-made datasets for the specific tasks at hand can greatly facilitate the work of researchers around the world. We present an open-source framework, Fiberfox, that enables (1) the intuitive definition of arbitrary artificial white matter fiber tracts, (2) signal generation from those fibers by means of the most recent multi-compartment modeling techniques, and (3) simulation of the actual MR acquisition that allows for the introduction of realistic MRI-related effects into the final image. We show that real acquisitions can be closely approximated by simulating the acquisition of the well-known FiberCup phantom. We further demonstrate the advantages of our framework by evaluating the effects of imaging artifacts and acquisition settings on the outcome of 12 tractography algorithms. Our findings suggest that experiments on a realistic software phantom might change the conclusions drawn from earlier hardware phantom experiments. Fiberfox may find application in validating and further developing methods such as tractography, super-resolution, diffusion modeling or artifact correction. Copyright © 2013 Wiley Periodicals, Inc.

  29. A framework for selecting indicators of bioenergy sustainability

    DOE PAGES

    Dale, Virginia H.; Efroymson, Rebecca Ann; Kline, Keith L.; ...

    2015-05-11

    A framework for selecting and evaluating indicators of bioenergy sustainability is presented. This framework is designed to facilitate decision-making about which indicators are useful for assessing sustainability of bioenergy systems and supporting their deployment. Efforts to develop sustainability indicators in the United States and Europe are reviewed. The first steps of the framework for indicator selection are defining the sustainability goals and other goals for a bioenergy project or program, gaining an understanding of the context, and identifying the values of stakeholders. From the goals, context, and stakeholders, the objectives for analysis and criteria for indicator selection can be developed. The user of the framework identifies and ranks indicators, applies them in an assessment, and then evaluates their effectiveness, while identifying gaps that prevent goals from being met, assessing lessons learned, and moving toward best practices. The framework approach emphasizes that the selection of appropriate criteria and indicators is driven by the specific purpose of an analysis. Realistic goals and measures of bioenergy sustainability can be developed systematically with the help of the framework presented here.

  30. Framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms in conjunction with 3D landmark localization and registration

    NASA Astrophysics Data System (ADS)

    Wörz, Stefan; Hoegen, Philipp; Liao, Wei; Müller-Eschner, Matthias; Kauczor, Hans-Ulrich; von Tengg-Kobligk, Hendrik; Rohr, Karl

    2016-03-01

    We introduce a framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms. Phantoms are designed using a CAD system and created with a 3D printer, and comprise realistic shapes including branches and pathologies such as abdominal aortic aneurysms (AAA). To transfer ground truth information to the 3D image coordinate system, we use a landmark-based registration scheme utilizing fiducial markers integrated in the phantom design. For accurate 3D localization of the markers we developed a novel 3D parametric intensity model that is directly fitted to the markers in the images. We also performed a quantitative evaluation of different vessel segmentation approaches for a phantom of an AAA.
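
    The abstract describes landmark-based registration of fiducial markers from the phantom design to their localized positions in the 3D image. A minimal sketch of the standard closed-form solution for this step (the Kabsch/Umeyama algorithm), not the authors' implementation, with invented marker coordinates:

    ```python
    import numpy as np

    def rigid_register(src: np.ndarray, dst: np.ndarray):
        """Return rotation R and translation t minimising ||R @ src_i + t - dst_i||."""
        src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
        u, _, vt = np.linalg.svd(src_c.T @ dst_c)
        d = np.sign(np.linalg.det(vt.T @ u.T))   # guard against reflections
        r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
        t = dst.mean(0) - r @ src.mean(0)
        return r, t

    rng = np.random.default_rng(3)
    # Hypothetical marker positions (mm): CAD ground truth vs. image localization.
    cad = np.array([[0., 0., 0.], [40., 0., 0.], [0., 40., 0.], [0., 0., 40.]])
    true_r, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    true_r *= np.sign(np.linalg.det(true_r))     # force a proper rotation
    image = cad @ true_r.T + np.array([5., -2., 10.]) + rng.normal(0, 0.1, cad.shape)

    r, t = rigid_register(cad, image)
    fre = np.linalg.norm(cad @ r.T + t - image, axis=1).mean()
    print(f"mean fiducial registration error: {fre:.3f} mm")
    ```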

  31. A Framework for Evaluation and Optimization of Relevance and Novelty-Based Retrieval

    ERIC Educational Resources Information Center

    Lad, Abhimanyu

    2011-01-01

    There has been growing interest in building and optimizing retrieval systems with respect to relevance and novelty of information, which together more realistically reflect the usefulness of a system as perceived by the user. How to combine these criteria into a single metric that can be used to measure as well as optimize retrieval systems is an…

  32. Beyond the realist turn: a socio-material analysis of heart failure self-care.

    PubMed

    McDougall, Allan; Kinsella, Elizabeth Anne; Goldszmidt, Mark; Harkness, Karen; Strachan, Patricia; Lingard, Lorelei

    2018-01-01

    For patients living with chronic illnesses, self-care has been linked with positive outcomes such as decreased hospitalisation, longer lifespan, and improved quality of life. However, despite calls for more and better self-care interventions, behaviour change trials have repeatedly fallen short on demonstrating effectiveness. The literature on heart failure (HF) stands as a case in point, and a growing body of HF studies advocate realist approaches to self-care research and policymaking. We label this trend the 'realist turn' in HF self-care. Realist evaluation and realist interventions emphasise that the relationship between self-care interventions and positive health outcomes is not fixed, but contingent on social context. This paper argues socio-materiality offers a productive framework to expand on the idea of social context in realist accounts of HF self-care. This study draws on 10 interviews as well as researcher reflections from a larger study exploring health care teams for patients with advanced HF. Leveraging insights from actor-network theory (ANT), this study provides two rich narratives about the contextual factors that influence HF self-care. These descriptions portray not self-care contexts but self-care assemblages, which we discuss in light of socio-materiality. © 2018 Foundation for the Sociology of Health & Illness.

  33. A realistic evaluation: the case of protocol-based care

    PubMed Central

    2010-01-01

    Background: 'Protocol based care' was envisioned by policy makers as a mechanism for delivering on the service improvement agenda in England. Realistic evaluation is an increasingly popular approach, but few published examples exist, particularly in implementation research. To fill this gap, within this paper we describe the application of a realistic evaluation approach to the study of protocol-based care, whilst sharing findings of relevance about standardising care through the use of protocols, guidelines, and pathways. Methods: Situated between positivism and relativism, realistic evaluation is concerned with the identification of underlying causal mechanisms, how they work, and under what conditions. Fundamentally it focuses attention on finding out what works, for whom, how, and in what circumstances. Results: In this research, we were interested in understanding the relationships between the type and nature of particular approaches to protocol-based care (mechanisms), within different clinical settings (context), and what impacts this resulted in (outcomes). An evidence review using the principles of realist synthesis resulted in a number of propositions, i.e., context, mechanism, and outcome threads (CMOs). These propositions were then 'tested' through multiple case studies, using multiple methods including non-participant observation, interviews, and document analysis through an iterative analysis process. The initial propositions (conjectured CMOs) only partially corresponded to the findings that emerged during analysis. From the iterative analysis process of scrutinising mechanisms, context, and outcomes we were able to draw out some theoretically generalisable features about what works, for whom, how, and in what circumstances in relation to the use of standardised care approaches (refined CMOs). Conclusions: As one of the first studies to apply realistic evaluation in implementation research, it was a good fit, particularly given the growing emphasis on understanding how context influences evidence-based practice. The strengths and limitations of the approach are considered, including how to operationalise it and some of the challenges. This approach provided a useful interpretive framework with which to make sense of the multiple factors that were simultaneously at play and being observed through various data sources, and for developing explanatory theory about using standardised care approaches in practice. PMID:20504293
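
    A minimal sketch of the CMO bookkeeping such a study implies: conjectured context-mechanism-outcome propositions held as data and marked as supported or not when compared against case-study findings. The CMO content below is invented for illustration, not drawn from the study.

    ```python
    from dataclasses import dataclass
    from typing import Optional, Set

    @dataclass
    class CMO:
        context: str
        mechanism: str
        outcome: str
        supported: Optional[bool] = None  # None = not yet tested against a case

    conjectured = [
        CMO("strong local clinical leadership",
            "protocols legitimise nurse-led decisions",
            "more consistent care"),
        CMO("protocol imposed top-down",
            "staff perceive loss of autonomy",
            "low adherence"),
    ]

    def test_against_case(cmo: CMO, case_findings: Set[str]) -> CMO:
        """Mark a conjectured CMO as supported when the case shows its outcome."""
        cmo.supported = cmo.outcome in case_findings
        return cmo

    observed = {"more consistent care", "documentation burden"}
    for cmo in (test_against_case(c, observed) for c in conjectured):
        status = "supported" if cmo.supported else "refuted/unclear"
        print(f"[{status}] {cmo.context} -> {cmo.mechanism} -> {cmo.outcome}")
    ```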

  34. Depicting the logic of three evaluation theories.

    PubMed

    Hansen, Mark; Alkin, Marvin C; Wallace, Tanner Lebaron

    2013-06-01

    Here, we describe the development of logic models depicting three theories of evaluation practice: Practical Participatory (Cousins & Whitmore, 1998), Values-engaged (Greene, 2005a, 2005b), and Emergent Realist (Mark et al., 1998). We begin with a discussion of evaluation theory and the particular theories that were chosen for our analysis. We then outline the steps involved in constructing the models. The theoretical prescriptions and claims represented here follow a logic model template developed at the University of Wisconsin-Extension (Taylor-Powell & Henert, 2008), which also closely aligns with Mark's (2008) framework for research on evaluation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  35. On the nature of data collection for soft-tissue image-to-physical organ registration: a noise characterization study

    NASA Astrophysics Data System (ADS)

    Collins, Jarrod A.; Heiselman, Jon S.; Weis, Jared A.; Clements, Logan W.; Simpson, Amber L.; Jarnagin, William R.; Miga, Michael I.

    2017-03-01

    In image-guided liver surgery (IGLS), sparse representations of the anterior organ surface may be collected intraoperatively to drive image-to-physical space registration. Soft tissue deformation represents a significant source of error for IGLS techniques. This work investigates the impact of surface data quality on current surface based IGLS registration methods. In this work, we characterize the robustness of our IGLS registration methods to noise in organ surface digitization. We study this within a novel human-to-phantom data framework that allows a rapid evaluation of clinically realistic data and noise patterns on a fully characterized hepatic deformation phantom. Additionally, we implement a surface data resampling strategy that is designed to decrease the impact of differences in surface acquisition. For this analysis, n=5 cases of clinical intraoperative data consisting of organ surface and salient feature digitizations from open liver resection were collected and analyzed within our human-to-phantom validation framework. As expected, results indicate that increasing levels of noise in surface acquisition cause registration fidelity to deteriorate. With respect to rigid registration using the raw and resampled data at clinically realistic levels of noise (i.e. a magnitude of 1.5 mm), resampling improved TRE by 21%. In terms of nonrigid registration, registrations using resampled data outperformed the raw data result by 14% at clinically realistic levels and were less susceptible to noise across the range of noise investigated. These results demonstrate the types of analyses our novel human-to-phantom validation framework can provide and indicate the considerable benefits of resampling strategies.

  36. Simulation of bright-field microscopy images depicting pap-smear specimen

    PubMed Central

    Malm, Patrik; Brun, Anders; Bengtsson, Ewert

    2015-01-01

    As digital imaging is becoming a fundamental part of medical and biomedical research, the demand for computer-based evaluation using advanced image analysis is becoming an integral part of many research projects. A common problem when developing new image analysis algorithms is the need of large datasets with ground truth on which the algorithms can be tested and optimized. Generating such datasets is often tedious and introduces subjectivity and interindividual and intraindividual variations. An alternative to manually created ground-truth data is to generate synthetic images where the ground truth is known. The challenge then is to make the images sufficiently similar to the real ones to be useful in algorithm development. One of the first and most widely studied medical image analysis tasks is to automate screening for cervical cancer through Pap-smear analysis. As part of an effort to develop a new generation cervical cancer screening system, we have developed a framework for the creation of realistic synthetic bright-field microscopy images that can be used for algorithm development and benchmarking. The resulting framework has been assessed through a visual evaluation by experts with extensive experience of Pap-smear images. The results show that images produced using our described methods are realistic enough to be mistaken for real microscopy images. The developed simulation framework is very flexible and can be modified to mimic many other types of bright-field microscopy images. © 2015 The Authors. Published by Wiley Periodicals, Inc. on behalf of ISAC PMID:25573002

  37. Flexible Residential Smart Grid Simulation Framework

    NASA Astrophysics Data System (ADS)

    Xiang, Wang

    Different scheduling and coordination algorithms controlling household appliances' operations can potentially lead to energy consumption reduction and/or load balancing in conjunction with different electricity pricing methods used in smart grid programs. In order to easily implement different algorithms and evaluate their efficiency against other ideas, a flexible simulation framework is desirable in both research and business fields. However, such a platform is currently lacking or underdeveloped. In this thesis, we provide a simulation framework to focus on demand side residential energy consumption coordination in response to different pricing methods. This simulation framework, equipped with an appliance consumption library using realistic values, aims to closely represent the average usage of different types of appliances. The simulation results of traditional usage yield close matching values compared to surveyed real life consumption records. Several sample coordination algorithms, pricing schemes, and communication scenarios are also implemented to illustrate the use of the simulation framework.
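
    As a minimal sketch of the kind of comparison such a framework supports, the following computes the daily cost of an appliance schedule under an assumed time-of-use tariff, so that alternative scheduling algorithms can be ranked. Loads and prices are illustrative assumptions, not values from the thesis.

    ```python
    import numpy as np

    HOURS = np.arange(24)
    # Assumed time-of-use tariff in $/kWh: cheap overnight, peak 17:00-21:00.
    price = np.where((HOURS >= 17) & (HOURS < 21), 0.30,
                     np.where(HOURS < 7, 0.08, 0.15))

    def schedule_cost(hourly_load_kw: np.ndarray) -> float:
        """Cost = sum over hours of load (kW, 1 h slots) times price ($/kWh)."""
        return float(hourly_load_kw @ price)

    # Hypothetical 1.2 kW dishwasher run: evening peak vs. shifted overnight.
    evening = np.zeros(24); evening[19:21] = 1.2
    shifted = np.zeros(24); shifted[2:4] = 1.2

    print(f"evening run: ${schedule_cost(evening):.2f}")   # peak-priced
    print(f"shifted run: ${schedule_cost(shifted):.2f}")   # off-peak-priced
    ```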

  38. Exposing the impact of Citizens Advice Bureau services on health: a realist evaluation protocol.

    PubMed

    Forster, N; Dalkin, S M; Lhussier, M; Hodgson, P; Carr, S M

    2016-01-20

    Welfare advice services can be used to address health inequalities, for example, through Citizens Advice Bureau (CAB). Recent reviews highlight evidence for the impact of advice services in improving people's financial position and improving mental health and well-being, daily living and social relationships. There is also some evidence for the impact of advice services in increasing accessibility of health services, and reducing general practitioner appointments and prescriptions. However, direct evidence for the impact of advice services on lifestyle behaviour and physical health is currently much less well established. There is a need for greater empirical testing of theories around the specific mechanisms through which advice services and associated financial or non-financial benefits may generate health improvements. A realist evaluation will be conducted, operationalised in 5 phases: building the explanatory framework; refining the explanatory framework; testing the explanatory framework through empirical data (mixed methods); development of a bespoke data recording template to capture longer term impact; and verification of findings with a range of CAB services. This research will therefore aim to build, refine and test an explanatory framework about how CAB services can be optimally implemented to achieve health improvement. The study was approved by the ethics committee at Northumbria University, UK. Project-related ethical issues are described and quality control aspects of the study are considered. A stakeholder mapping exercise will inform the dissemination of results in order to ensure all relevant institutions and organisations are targeted. Published by the BMJ Publishing Group Limited.

  19. Realistic Simulations of Coronagraphic Observations with Future Space Telescopes

    NASA Astrophysics Data System (ADS)

    Rizzo, M. J.; Roberge, A.; Lincowski, A. P.; Zimmerman, N. T.; Juanola-Parramon, R.; Pueyo, L.; Hu, M.; Harness, A.

    2017-11-01

    We present a framework to simulate realistic observations of future space-based coronagraphic instruments. The framework gathers state-of-the-art scientific and instrumental expertise, allowing robust characterization of future instrument concepts.

  20. Towards a Holistic Framework for the Evaluation of Emergency Plans in Indoor Environments

    PubMed Central

    Serrano, Emilio; Poveda, Geovanny; Garijo, Mercedes

    2014-01-01

    One of the most promising fields for ambient intelligence is the implementation of intelligent emergency plans. Because the use of drills and living labs cannot reproduce social behaviors, such as panic attacks, that strongly affect these plans, the use of agent-based social simulation provides an approach to evaluate these plans more thoroughly. (1) The hypothesis presented in this paper is that there has been little interest in describing the key modules that these simulators must include, such as formally represented knowledge and a realistic simulated sensor model, and especially in providing researchers with tools to reuse, extend and interconnect modules from different works. This lack of interest hinders researchers from achieving a holistic framework for evaluating emergency plans and forces them to reconsider and to implement the same components from scratch over and over. In addition to supporting this hypothesis by considering over 150 simulators, this paper: (2) defines the main modules identified and proposes the use of semantic web technologies as a cornerstone for the aforementioned holistic framework; (3) provides a basic methodology to achieve the framework; (4) identifies the main challenges; and (5) presents an open and free software tool to hint at the potential of such a holistic view of emergency plan evaluation in indoor environments. PMID:24662453

  1. MEG source localization of spatially extended generators of epileptic activity: comparing entropic and hierarchical bayesian approaches.

    PubMed

    Chowdhury, Rasheda Arman; Lina, Jean Marc; Kobayashi, Eliane; Grova, Christophe

    2013-01-01

    Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detectable from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources, ranging from 3 cm² to 30 cm², whatever the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.
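
    Detection accuracy in such simulation studies is commonly summarized by the area under the ROC curve. A minimal numpy sketch with a toy ground-truth patch and a stand-in "inverse solution" might look as follows; none of the numbers reflect the study's actual data.

```python
import numpy as np

def roc_auc(truth_active, est_amplitude):
    """ROC analysis of an estimated source map against simulation ground truth.

    truth_active : boolean array over cortical vertices (True inside the patch)
    est_amplitude: |estimated current| per vertex from some inverse method
    """
    order = np.argsort(-est_amplitude)  # sweep threshold from high to low
    truth = truth_active[order].astype(float)
    tpr = np.concatenate([[0.0], np.cumsum(truth) / truth.sum()])
    fpr = np.concatenate([[0.0], np.cumsum(1 - truth) / (truth.size - truth.sum())])
    return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0))  # trapezoid AUC

rng = np.random.default_rng(1)
truth = np.zeros(8000, dtype=bool)
truth[:200] = True                               # simulated extended patch
estimate = rng.normal(0, 1, 8000) + 2.0 * truth  # toy inverse solution
print(f"AUC = {roc_auc(truth, estimate):.3f}")
```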

  2. MEG Source Localization of Spatially Extended Generators of Epileptic Activity: Comparing Entropic and Hierarchical Bayesian Approaches

    PubMed Central

    Chowdhury, Rasheda Arman; Lina, Jean Marc; Kobayashi, Eliane; Grova, Christophe

    2013-01-01

    Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detectable from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources, ranging from 3 cm² to 30 cm², whatever the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered. PMID:23418485

  3. Development of an OSSE Framework for a Global Atmospheric Data Assimilation System

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald; Errico, Ronald M.; Prive, N.

    2012-01-01

    Observing system simulation experiments (OSSEs) are powerful tools for estimating the usefulness of various configurations of envisioned observing systems and data assimilation techniques. Their utility stems from their being conducted in an entirely simulated context, utilizing simulated observations having simulated errors and drawn from a simulation of the earth's environment. Observations are generated by applying physically based algorithms to the simulated state, as is done during data assimilation, or using other appropriate algorithms. Adding realistic instrument plus representativeness errors, including their biases and correlations, can be critical for obtaining realistic assessments of the impact of a proposed observing system or analysis technique. If estimates of the expected accuracy of proposed observations are realistic, then the OSSE can also be used to learn how best to utilize the new information, accelerating its transition to operations once the real data are available. As with any inferences from simulations, however, it is first imperative that some baseline OSSEs are performed and well validated against corresponding results obtained with a real observing system. This talk provides an overview of, and highlights critical issues related to, the development of an OSSE framework for the tropospheric weather prediction component of the NASA GEOS-5 global atmospheric data assimilation system. The framework includes all existing observations having significant impact on short-term forecast skill. Its validity has been carefully assessed using a range of metrics that can be evaluated in both the OSSE and real contexts, including adjoint-based estimates of observation impact. A preliminary application to the Aeolus Doppler wind lidar mission, scheduled for launch by the European Space Agency in 2014, has also been investigated.
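
    A toy observation generator conveys the mechanics: apply an observation operator to the nature-run state, then add bias plus instrument and correlated representativeness error. Everything below (the operator, error magnitudes, and AR(1) correlation) is an illustrative assumption, not the GEOS-5 implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_obs(nature_state, H, bias=0.1, sd_instr=0.5, sd_repr=0.3, rho=0.4):
    """Toy OSSE observation generator.

    nature_state : true state vector from the nature run
    H            : linear observation operator (matrix)
    Adds a constant bias, white instrument noise, and AR(1)-correlated
    representativeness error along the observation sequence.
    """
    n = H.shape[0]
    e_instr = rng.normal(0, sd_instr, n)
    e_repr = np.empty(n)
    e_repr[0] = rng.normal(0, sd_repr)
    for i in range(1, n):
        e_repr[i] = rho * e_repr[i - 1] + rng.normal(0, sd_repr * np.sqrt(1 - rho**2))
    return H @ nature_state + bias + e_instr + e_repr

truth = rng.normal(0, 1, 100)   # stand-in nature-run state
H = np.eye(40, 100)             # observe the first 40 components
obs = simulate_obs(truth, H)
```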

  4. A simulation framework for the CMS Track Trigger electronics

    NASA Astrophysics Data System (ADS)

    Amstutz, C.; Magazzù, G.; Weber, M.; Palla, F.

    2015-03-01

    A simulation framework has been developed to test and characterize algorithms, architectures and hardware implementations of the vastly complex CMS Track Trigger for the high luminosity upgrade of the CMS experiment at the Large Hadron Collider in Geneva. High-level SystemC models of all system components have been developed to simulate a portion of the track trigger. Simulating the system components together with input data from physics simulations allows evaluation of figures of merit, such as delays or bandwidths, under realistic conditions. The use of SystemC for high-level modelling allows co-simulation with models developed in Hardware Description Languages, e.g. VHDL or Verilog. Therefore, the simulation framework can also be used as a test bench for digital modules developed for the final system.

  5. An evidence-based framework to measure quality of allied health care.

    PubMed

    Grimmer, Karen; Lizarondo, Lucylynn; Kumar, Saravana; Bell, Erica; Buist, Michael; Weinstein, Philip

    2014-02-26

    There is no standard way of describing the complexities of allied health (AH) care, or its quality. AH is an umbrella term which excludes medicine and nursing, and variably includes disciplines which provide therapy, diagnostic, or scientific services. This paper outlines a framework for a standard approach to evaluate the quality of AH therapy services. A realist synthesis framework describing what AH does, how it does it, and what is achieved, was developed. This was populated by the findings of a systematic review of literature published since 1980 reporting concepts of quality relevant to AH. Articles were included on quality measurement concepts, theories, debates, and/or hypothetical frameworks. Of 139 included articles, 21 reported on descriptions of quality potentially relevant to AH. From these, 24 measures of quality were identified, with 15 potentially relating to what AH does, 17 to how AH delivers care, 8 relating to short term functional outcomes, and 9 relating to longer term functional and health system outcomes. A novel evidence-based quality framework was proposed to address the complexity of AH therapies. This should assist in better evaluation of AH processes and outcomes, costs, and evidence-based engagement of AH providers in healthcare teams.

  6. Steps towards incorporating heterogeneities into program theory: A case study of a data-driven approach.

    PubMed

    Sridharan, Sanjeev; Jones, Bobby; Caudill, Barry; Nakaima, April

    2016-10-01

    This paper describes a framework that can help refine program theory through data explorations and stakeholder dialogue. The framework incorporates the following steps: a recognition that program implementation might need to be multi-phased for a number of interventions, the need to take stock of program theory, the application of pattern recognition methods to help identify heterogeneous program mechanisms, and stakeholder dialogue to refine the program. As part of the data exploration, a method known as developmental trajectories is implemented to learn about heterogeneous trajectories of outcomes in longitudinal evaluations. This method identifies trajectory clusters and can also estimate different treatment impacts for the various groups. The framework is highlighted with data collected in an evaluation of an alcohol risk-reduction program delivered in a college fraternity setting. The framework discussed in the paper is informed by a realist focus on "what works for whom under what contexts." The utility of the framework in contributing to a dialogue on heterogeneous mechanisms and subsequent implementation is described. The connection of the ideas in the paper to a 'learning through principled discovery' approach is also described. Copyright © 2016. Published by Elsevier Ltd.
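
    Group-based trajectory modeling is normally fit with specialized mixture-model software; as a rough, hypothetical stand-in, one can cluster raw outcome trajectories and compare treatment effects per cluster, as sketched below with simulated data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)

# Hypothetical longitudinal outcomes: n subjects x 5 waves (e.g., risk scores)
n = 300
t = np.arange(5)
groups = rng.choice(3, n)                    # latent trajectory class
slopes = np.array([-0.8, 0.0, 0.9])[groups]  # declining / flat / rising
y = 5 + slopes[:, None] * t + rng.normal(0, 0.7, (n, 5))
treated = rng.choice([0, 1], n)
y[treated == 1] -= 0.5                       # toy uniform treatment effect

# Crude stand-in for developmental-trajectory estimation: cluster raw curves
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(y)
for k in range(3):
    sel = labels == k
    effect = y[sel & (treated == 1), -1].mean() - y[sel & (treated == 0), -1].mean()
    print(f"cluster {k}: n={sel.sum():3d}, final-wave treated-control diff = {effect:+.2f}")
```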

  7. Implementing accountability for reasonableness framework at district level in Tanzania: a realist evaluation.

    PubMed

    Maluka, Stephen; Kamuzora, Peter; Sansebastián, Miguel; Byskov, Jens; Ndawi, Benedict; Olsen, Øystein E; Hurtig, Anna-Karin

    2011-02-10

    Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management processes and outcomes. As a result, the ability to draw scientifically sound lessons for the application of the framework to services and interventions is limited. This paper evaluates the experiences of implementing the A4R approach in Mbarali District, Tanzania, in order to find out how the innovation was shaped, enabled, and constrained by the interaction between contexts, mechanisms and outcomes. This study draws on the principles of realist evaluation, a largely qualitative approach chiefly concerned with testing and refining programme theories by exploring the complex interactions of contexts, mechanisms, and outcomes. Mixed methods were used in data collection, including individual interviews, non-participant observation, and document reviews. A thematic framework approach was adopted for the data analysis. The study found that while the A4R approach to priority setting was helpful in strengthening transparency, accountability, stakeholder engagement, and fairness, the efforts at integrating it into the current district health system were challenging. Participatory structures under the decentralisation framework, central government's call for partnership in district-level planning and priority setting, perceived needs of stakeholders, as well as active engagement between researchers and decision makers all facilitated the adoption and implementation of the innovation. In contrast, however, limited local autonomy, low level of public awareness, unreliable and untimely funding, inadequate accountability mechanisms, and limited local resources were the major contextual factors that hampered the full implementation. This study documents an important first step in the effort to introduce the ethical framework A4R into district planning processes. This study supports the idea that a greater involvement and accountability among local actors through the A4R process may increase the legitimacy and fairness of priority-setting decisions. Support from researchers in providing a broader and more detailed analysis of health system elements, and the socio-cultural context, could lead to better prediction of the effects of the innovation and pinpoint stakeholders' concerns, thereby illuminating areas that require special attention to promote sustainability.

  8. Trajectories of self-evaluation bias in primary and secondary school: Parental antecedents and academic consequences.

    PubMed

    Bonneville-Roussy, Arielle; Bouffard, Thérèse; Vezeau, Carole

    2017-08-01

    Using a longitudinal approach spanning nine years of children's formal education, this study investigated the developmental trajectories of self-evaluation bias of academic competence. The study also examined how parenting styles were associated with the trajectories of bias in mid-primary school, and how those trajectories predicted academic outcomes at the end of secondary school and the beginning of college. A total of 711 children in 4th and 5th grades (mean age = 10.71 years; 358 girls) participated in this study. Using a latent class growth modeling framework, results indicated that children can be classified into three latent growth trajectories of self-evaluation bias: the optimistic, realistic and pessimistic trajectories. These trajectories differed in their initial status of bias and also in their development over time. Children's adherence to a specific trajectory was associated with parenting variables in childhood. Finally, the optimistic, realistic, or pessimistic trajectories distinctively predicted achievement and persistence. Copyright © 2017 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  9. Evaluation of Capacity-Building Program of District Health Managers in India: A Contextualized Theoretical Framework

    PubMed Central

    Prashanth, N. S.; Marchal, Bruno; Kegels, Guy; Criel, Bart

    2014-01-01

    Performance of local health services managers at district level is crucial to ensure that health services are of good quality and cater to the health needs of the population in the area. In many low- and middle-income countries, health services managers are poorly equipped with public health management capacities needed for planning and managing their local health system. In the south Indian Tumkur district, a consortium of five non-governmental organizations partnered with the state government to organize a capacity-building program for health managers. The program consisted of a mix of periodic contact classes, mentoring and assignments and was spread over 30 months. In this paper, we develop a theoretical framework in the form of a refined program theory to understand how such a capacity-building program could bring about organizational change. A well-formulated program theory enables an understanding of how interventions could bring about improvements and an evaluation of the intervention. In the refined program theory of the intervention, we identified various factors at individual, institutional, and environmental levels that could interact with the hypothesized mechanisms of organizational change, such as staff’s perceived self-efficacy and commitment to their organizations. Based on this program theory, we formulated context–mechanism–outcome configurations that can be used to evaluate the intervention and, more specifically, to understand what worked, for whom and under what conditions. We discuss the application of program theory development in conducting a realist evaluation. Realist evaluation embraces principles of systems thinking by providing a method for understanding how elements of the system interact with one another in producing a given outcome. PMID:25121081

  10. Evaluation of capacity-building program of district health managers in India: a contextualized theoretical framework.

    PubMed

    Prashanth, N S; Marchal, Bruno; Kegels, Guy; Criel, Bart

    2014-01-01

    Performance of local health services managers at district level is crucial to ensure that health services are of good quality and cater to the health needs of the population in the area. In many low- and middle-income countries, health services managers are poorly equipped with public health management capacities needed for planning and managing their local health system. In the south Indian Tumkur district, a consortium of five non-governmental organizations partnered with the state government to organize a capacity-building program for health managers. The program consisted of a mix of periodic contact classes, mentoring and assignments and was spread over 30 months. In this paper, we develop a theoretical framework in the form of a refined program theory to understand how such a capacity-building program could bring about organizational change. A well-formulated program theory enables an understanding of how interventions could bring about improvements and an evaluation of the intervention. In the refined program theory of the intervention, we identified various factors at individual, institutional, and environmental levels that could interact with the hypothesized mechanisms of organizational change, such as staff's perceived self-efficacy and commitment to their organizations. Based on this program theory, we formulated context-mechanism-outcome configurations that can be used to evaluate the intervention and, more specifically, to understand what worked, for whom and under what conditions. We discuss the application of program theory development in conducting a realist evaluation. Realist evaluation embraces principles of systems thinking by providing a method for understanding how elements of the system interact with one another in producing a given outcome.

  11. Using frameworks to diagram value in complex policy and environmental interventions to prevent childhood obesity.

    PubMed

    Swank, Melissa Farrell; Brennan, Laura K; Gentry, Daniel; Kemner, Allison L

    2015-01-01

    To date, few tools assist policy makers and practitioners in understanding and conveying the implementation costs, potential impacts, and value of policy and environmental changes to address healthy eating, active living, and childhood obesity. For the Evaluation of Healthy Kids, Healthy Communities (HKHC), evaluators considered inputs (resources and investments) that generate costs and savings as well as benefits and harms related to social, economic, environmental, and health-related outcomes in their assessment of 49 HKHC community partnerships funded from 2009 to 2014. Using data collected through individual and group interviews and an online performance monitoring system, evaluators created a socioecological framework to assess investments, resources, costs, savings, benefits, and harms at the individual, organizational, community, and societal levels. Evaluators customized frameworks for 6 focal strategies: active transportation, parks and play spaces, child care physical activity standards, corner stores, farmers' markets, and child care nutrition standards. To illustrate the Value Frameworks, this brief highlights the 38 HKHC communities implementing at least 1 active transportation strategy. Evaluators populated this conceptual Value Framework with themes from the strategy-specific inputs and outputs. The range of factors corresponding to the implementation and impact of the HKHC community partnerships are highlighted along with the inputs and outputs. The Value Frameworks helped evaluators identify gaps in current analysis models (ie, benefit-cost analysis, cost-effectiveness analysis) as well as paint a more complete picture of value for potential obesity prevention strategies. These frameworks provide a comprehensive understanding of investments needed, proposed costs and savings, and potential benefits and harms associated with economic, social, environmental, and health outcomes. This framing also allowed evaluators to demonstrate the interdependence of each socioecological level on the others in these multicomponent interventions. This model can be used by practitioners and community leaders to assess realistic and sustainable strategies to combat childhood obesity.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieder, William R.; Allison, Steven D.; Davidson, Eric A.

    Microbes influence soil organic matter (SOM) decomposition and the long-term stabilization of carbon (C) in soils. We contend that by revising the representation of microbial processes and their interactions with the physicochemical soil environment, Earth system models (ESMs) may make more realistic global C cycle projections. Explicit representation of microbial processes presents considerable challenges due to the scale at which these processes occur. Thus, applying microbial theory in ESMs requires a framework to link micro-scale process-level understanding and measurements to macro-scale models used to make decadal- to century-long projections. Here, we review the diversity, advantages, and pitfalls of simulating soil biogeochemical cycles using microbial-explicit modeling approaches. We present a roadmap for how to begin building, applying, and evaluating reliable microbial-explicit model formulations that can be applied in ESMs. Drawing from experience with traditional decomposition models, we suggest: (1) guidelines for common model parameters and output that can facilitate future model intercomparisons; (2) development of benchmarking and model-data integration frameworks that can be used to effectively guide, inform, and evaluate model parameterizations with data from well-curated repositories; and (3) the application of scaling methods to integrate microbial-explicit soil biogeochemistry modules within ESMs. With contributions across scientific disciplines, we feel this roadmap can advance our fundamental understanding of soil biogeochemical dynamics and more realistically project likely soil C response to environmental change at global scales.
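
    A minimal microbial-explicit formulation, two pools with decomposition catalyzed by microbial biomass, illustrates the model class under discussion; parameter values below are arbitrary, and the scheme is not any specific ESM module.

```python
def microbial_soil_model(years=50, dt=0.01, I=1.0, Vmax=8.0, Km=50.0,
                         eps=0.4, death=0.2, S0=100.0, B0=2.0):
    """Two-pool microbial-explicit soil C sketch (illustrative parameters).

    S: soil organic C; B: microbial biomass C. Decomposition follows
    Michaelis-Menten kinetics scaled by microbial biomass.
    """
    S, B = S0, B0
    for _ in range(int(years / dt)):
        decomp = Vmax * B * S / (Km + S)  # microbe-catalyzed decomposition
        dS = I - decomp                   # litter input minus decomposition
        dB = eps * decomp - death * B     # growth (CUE = eps) minus turnover
        S += dS * dt
        B += dB * dt
    return S, B

S_eq, B_eq = microbial_soil_model()
print(f"~equilibrium: SOC = {S_eq:.1f}, microbial biomass = {B_eq:.1f}")
```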

  13. An Investigation of the Impact of Guessing on Coefficient α and Reliability

    PubMed Central

    2014-01-01

    Guessing is known to influence the test reliability of multiple-choice tests. Although there are many studies that have examined the impact of guessing, they used rather restrictive assumptions (e.g., parallel test assumptions, homogeneous inter-item correlations, homogeneous item difficulty, and homogeneous guessing levels across items) to evaluate the relation between guessing and test reliability. Based on the item response theory (IRT) framework, this study investigated the extent of the impact of guessing on reliability under more realistic conditions where item difficulty, item discrimination, and guessing levels vary across items, with three different test lengths (TL). By accommodating multiple item characteristics simultaneously, this study also examined interaction effects between guessing and the other variables entered into the simulation. The simulation of the more realistic conditions and calculations of reliability and classical test theory (CTT) item statistics were facilitated by expressing CTT item statistics, coefficient α, and reliability in terms of IRT model parameters. In addition to the general negative impact of guessing on reliability, results showed interaction effects between TL and guessing and between guessing and test difficulty.
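
    The simulation logic is straightforward to sketch: generate responses under a 3PL model whose discrimination, difficulty, and guessing parameters vary across items, then compute coefficient α from the classical formula. Parameter ranges below are illustrative, not the study's design values.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_3pl(n_persons=2000, n_items=30):
    """Simulate responses under a 3PL IRT model with varying a, b, c."""
    theta = rng.normal(0, 1, n_persons)
    a = rng.uniform(0.8, 2.0, n_items)   # discrimination
    b = rng.normal(0, 1, n_items)        # difficulty
    c = rng.uniform(0.0, 0.3, n_items)   # lower asymptote (guessing)
    p = c + (1 - c) / (1 + np.exp(-a * (theta[:, None] - b)))
    return (rng.random((n_persons, n_items)) < p).astype(int)

def coefficient_alpha(X):
    """Cronbach's alpha from an item-response matrix (persons x items)."""
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

X = simulate_3pl()
print(f"alpha = {coefficient_alpha(X):.3f}")
```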

  14. Microgravity isolation system design: A modern control analysis framework

    NASA Technical Reports Server (NTRS)

    Hampton, R. D.; Knospe, C. R.; Allaire, P. E.; Grodsinsky, C. M.

    1994-01-01

    Many acceleration-sensitive, microgravity science experiments will require active vibration isolation from the manned orbiters on which they will be mounted. The isolation problem, especially in the case of a tethered payload, is a complex three-dimensional one that is best suited to modern-control design methods. These methods, although more powerful than their classical counterparts, can nonetheless go only so far in meeting the design requirements for practical systems. Once a tentative controller design is available, it must still be evaluated to determine whether or not it is fully acceptable, and to compare it with other possible design candidates. Realistically, such evaluation will be an inherent part of a necessary iterative design process. In this paper, an approach is presented for applying complex mu-analysis methods to a closed-loop vibration isolation system (experiment plus controller). An analysis framework is presented for evaluating nominal stability, nominal performance, robust stability, and robust performance of active microgravity isolation systems, with emphasis on the effective use of mu-analysis methods.

  15. GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments.

    PubMed

    Monroy, Javier; Hernandez-Bennetts, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier

    2017-06-23

    This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment.
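
    The filament dispersion idea can be conveyed with a free-space toy (GADEN itself adds CFD wind fields, walls, and furniture): filaments are advected by wind plus a turbulent random walk, and each carries a Gaussian puff whose width grows over time. All constants below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

def disperse_filaments(n_steps=200, dt=0.1, wind=(0.5, 0.0, 0.0),
                       release=(0.0, 0.0, 1.0), D=0.05, growth=0.001):
    """Toy filament-based gas dispersion in free space (one release per step)."""
    pos, sig2 = [], []
    wind = np.asarray(wind)
    for _ in range(n_steps):
        pos.append(np.array(release, dtype=float))  # new filament at the source
        sig2.append(0.01)                           # initial squared puff width
        for i in range(len(pos)):
            pos[i] += wind * dt + rng.normal(0, np.sqrt(2 * D * dt), 3)
            sig2[i] += growth * dt                  # puff spreads over time
    return np.array(pos), np.array(sig2)

def concentration(point, pos, sig2, q=1.0):
    """Concentration at a point: superpose 3D Gaussian puffs of mass q."""
    r2 = ((pos - point) ** 2).sum(axis=1)
    return (q / (2 * np.pi * sig2) ** 1.5 * np.exp(-r2 / (2 * sig2))).sum()

pos, sig2 = disperse_filaments()
print(f"c at sensor = {concentration(np.array([5.0, 0.0, 1.0]), pos, sig2):.3g}")
```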

  16. GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments

    PubMed Central

    Hernandez-Bennetts, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier

    2017-01-01

    This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment. PMID:28644375

  17. A discrete mechanics framework for real time virtual surgical simulations with application to virtual laparoscopic nephrectomy.

    PubMed

    Zhou, Xiangmin; Zhang, Nan; Sha, Desong; Shen, Yunhe; Tamma, Kumar K; Sweet, Robert

    2009-01-01

    The inability to render realistic soft-tissue behavior in real time has remained a barrier to face and content aspects of validity for many virtual reality surgical training systems. Biophysically based models are not only suitable for training purposes but also for patient-specific clinical applications, physiological modeling and surgical planning. When considering the existing approaches for modeling soft tissue for virtual reality surgical simulation, the computer graphics-based approach lacks predictive capability; the mass-spring model (MSM) based approach lacks biophysically realistic soft-tissue dynamic behavior; and the finite element method (FEM) approaches fail to meet the real-time requirement. The present development stems from the first law of thermodynamics: for a space-discrete dynamic system, it directly formulates the space-discrete but time-continuous governing equation with an embedded material constitutive relation, resulting in a discrete mechanics framework that possesses a unique balance between computational effort and physically realistic soft-tissue dynamic behavior. We describe the development of the discrete mechanics framework with focused attention towards a virtual laparoscopic nephrectomy application.

  18. Cognitive simulators for medical education and training.

    PubMed

    Kahol, Kanav; Vankipuram, Mithra; Smith, Marshall L

    2009-08-01

    Simulators for honing procedural skills (such as surgical skills and central venous catheter placement) have proven to be valuable tools for medical educators and students. While such simulations represent an effective paradigm in surgical education, there is an opportunity to add a layer of cognitive exercises to these basic simulations that can facilitate robust skill learning in residents. This paper describes a controlled methodology, inspired by neuropsychological assessment tasks and embodied cognition, to develop cognitive simulators for laparoscopic surgery. These simulators provide psychomotor skill training and offer the additional challenge of accomplishing cognitive tasks in realistic environments. A generic framework for design, development and evaluation of such simulators is described. The presented framework is generalizable and can be applied to different task domains. It is independent of the types of sensors, simulation environment and feedback mechanisms that the simulators use. A proof of concept of the framework is provided through developing a simulator that includes cognitive variations to a basic psychomotor task. The results of two pilot studies are presented that show the validity of the methodology in providing an effective evaluation and learning environments for surgeons.

  19. Exposure Render: An Interactive Photo-Realistic Volume Rendering Framework

    PubMed Central

    Kroes, Thomas; Post, Frits H.; Botha, Charl P.

    2012-01-01

    The field of volume visualization has undergone rapid development during the past years, both due to advances in suitable computing hardware and due to the increasing availability of large volume datasets. Recent work has focused on increasing the visual realism in Direct Volume Rendering (DVR) by integrating a number of visually plausible but often effect-specific rendering techniques, for instance modeling of light occlusion and depth of field. Besides yielding more attractive renderings, especially the more realistic lighting has a positive effect on perceptual tasks. Although these new rendering techniques yield impressive results, they exhibit limitations in terms of their flexibility and their performance. Monte Carlo ray tracing (MCRT), coupled with physically based light transport, is the de facto standard for synthesizing highly realistic images in the graphics domain, although usually not from volumetric data. Due to the stochastic sampling of MCRT algorithms, numerous effects can be achieved in a relatively straightforward fashion. For this reason, we have developed a practical framework that applies MCRT techniques to DVR. With this work, we demonstrate that a host of realistic effects, including physically based lighting, can be simulated in a generic and flexible fashion, leading to interactive DVR with improved realism. In the hope that this improved approach to DVR will see more use in practice, we have made available our framework under a permissive open source license. PMID:22768292
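
    The stochastic sampling at the heart of MCRT-style volume rendering can be illustrated with Woodcock (delta) tracking, which samples free-flight distances in a heterogeneous volume; the density field and majorant below are toy assumptions, not Exposure Render's implementation.

```python
import numpy as np

rng = np.random.default_rng(6)

def density(p):
    """Toy heterogeneous volume: a soft sphere of extinction at the origin."""
    return 2.0 * np.exp(-np.dot(p, p))

SIGMA_MAX = 2.0  # majorant extinction for Woodcock (delta) tracking

def free_flight(origin, direction, t_max=10.0):
    """Sample the distance to a real interaction via Woodcock tracking.

    Tentative collisions are drawn from the homogeneous majorant and
    accepted with probability density/SIGMA_MAX; rejections are 'null'
    collisions. Returns None if the ray leaves the medium.
    """
    t = 0.0
    while True:
        t -= np.log(rng.random()) / SIGMA_MAX  # exponential step
        if t > t_max:
            return None
        if rng.random() < density(origin + t * direction) / SIGMA_MAX:
            return t

# Transmittance estimate along one ray: fraction of samples with no interaction
ray_o, ray_d = np.array([-5.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
hits = sum(free_flight(ray_o, ray_d) is not None for _ in range(20000))
print(f"estimated transmittance = {1 - hits / 20000:.3f}")
```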

  20. Market-Based Coordination of Thermostatically Controlled Loads—Part I: A Mechanism Design Formulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Sen; Zhang, Wei; Lian, Jianming

    This paper focuses on the coordination of a population of Thermostatically Controlled Loads (TCLs) with unknown parameters to achieve group objectives. The problem involves designing the bidding and market clearing strategy to motivate self-interested users to realize efficient energy allocation subject to a peak power constraint. Using the mechanism design approach, we propose a market-based coordination framework, which can effectively incorporate heterogeneous load dynamics, systematically deal with user preferences, account for the unknown load model parameters, and enable the real-world implementation with limited communication resources. This paper is divided into two parts. Part I presents a mathematical formulation of the problem and develops a coordination framework using the mechanism design approach. Part II presents a learning scheme to account for the unknown load model parameters, and evaluates the proposed framework through realistic simulations.
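
    A toy uniform-price clearing routine conveys the flavor of such market-based coordination; the bidding rule, loads, and feeder capacity below are invented for illustration and deliberately ignore the load dynamics and learning treated in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def clear_market(bids, loads_kw, capacity_kw):
    """Toy uniform-price clearing under a feeder peak constraint.

    bids     : each TCL's willingness-to-pay ($/kWh), e.g. rising with how
               far its temperature has drifted from setpoint
    loads_kw : power drawn if the TCL switches on
    Admits highest bids first until the peak constraint binds; all admitted
    loads pay the lowest admitted bid (the clearing price).
    """
    order = np.argsort(-bids)
    used, admitted = 0.0, []
    for i in order:
        if used + loads_kw[i] > capacity_kw:
            break
        used += loads_kw[i]
        admitted.append(i)
    price = bids[admitted[-1]] if admitted else None
    return admitted, price

temp_dev = rng.uniform(0, 2.0, 50)  # K above cooling setpoint
bids = 0.05 + 0.10 * temp_dev       # more urgent -> bid higher
loads = np.full(50, 4.0)            # 4 kW AC units
on, price = clear_market(bids, loads, capacity_kw=120.0)
print(f"{len(on)} of 50 units run; clearing price = ${price:.3f}/kWh")
```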

  1. Use of clinical guidelines in remote Australia: A realist evaluation.

    PubMed

    Reddy, Sandeep; Orpin, Victoria; Herring, Sally; Mackie-Schneider, Stephanie; Struber, Janet

    2018-02-01

    The aim of this evaluation was to assess the acceptability, accessibility, and compliance with the 2014 editions of the Remote Primary Health Care Manuals (RPHCM) in health care centres across remote areas of Northern and Central Australia. To undertake a comprehensive evaluation that considered context, a realist evaluation framework was used. The evaluation used a variety of methods, including interviews and a survey, to develop and test a programme theory. Many remote health practitioners have adopted standardized, evidence-based practice because of the use of the RPHCM. The mechanisms that led to the use of the manuals include acceptance of the worth of the protocols to their clinical practice, reliance on manual content to guide their practice, the perception of credibility, the applicability of RPHCM content to the context, and a fear of the consequences of not using the RPHCM. Some remote health practitioners are less inclined to use the RPHCM regularly because of a perception that the content is less suited to their needs and daily practice, or that it is hard to navigate or understand. The evaluation concluded that there is work to be done to widen the RPHCM user base, and organizations need to increase support for their staff to use the RPHCM protocols better. These measures are expected to enable standardized clinical practice in the remote context. © 2017 John Wiley & Sons, Ltd.

  2. Dealing with Complex Causality in Realist Synthesis: The Promise of Qualitative Comparative Analysis

    ERIC Educational Resources Information Center

    Sager, Fritz; Andereggen, Celine

    2012-01-01

    In this article, the authors state two arguments: first, that the four categories of context, politics, polity, and policy make an adequate framework for systematic review being both exhaustive and parsimonious; second, that the method of qualitative comparative analysis (QCA) is an appropriate methodical approach for gaining realistic results…

  3. SPH simulations of WBC adhesion to the endothelium: the role of haemodynamics and endothelial binding kinetics.

    PubMed

    Gholami, Babak; Comerford, Andrew; Ellero, Marco

    2015-11-01

    A multiscale Lagrangian particle solver introduced in our previous work is extended to model physiologically realistic near-wall cell dynamics. Three-dimensional simulation of particle trajectories is combined with realistic receptor-ligand adhesion behaviour to cover full cell interactions in the vicinity of the endothelium. The selected stochastic adhesion model, which is based on a Monte Carlo acceptance-rejection method, fits in our Lagrangian framework and does not compromise performance. Additionally, appropriate inflow/outflow boundary conditions are implemented for our SPH solver to enable realistic pulsatile flow simulation. The model is tested against in-vitro data from a 3D geometry with a stenosis and sudden expansion. In both steady and pulsatile flow conditions, results agree closely with the experimental ones. Furthermore, we demonstrate, in agreement with experimental observations, that haemodynamics alone does not account for adhesion of white blood cells, in this case U937 monocytic human cells. Our findings suggest that the current framework is fully capable of modelling cell dynamics in large arteries in a realistic and efficient manner.
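
    The stochastic adhesion component can be sketched as a Monte Carlo acceptance-rejection update over receptor sites, with a Bell-type force-accelerated rupture rate; the rate constants below are illustrative, not fitted to the U937 experiments.

```python
import numpy as np

rng = np.random.default_rng(8)

def adhesion_step(bound, k_on, k_off, force, dt, f_scale=1.0):
    """One Monte Carlo acceptance-rejection step for receptor-ligand bonds.

    Formation and rupture are treated as Poisson events over dt; rupture is
    force-accelerated via a Bell-type factor exp(force/f_scale).
    """
    p_on = 1.0 - np.exp(-k_on * dt)                              # bond formation
    p_off = 1.0 - np.exp(-k_off * np.exp(force / f_scale) * dt)  # bond rupture
    u = rng.random(bound.size)
    # bound receptors survive rupture; free receptors may form a bond
    return np.where(bound, u >= p_off, u < p_on)

bound = np.zeros(200, dtype=bool)  # 200 receptor sites on one cell
for _ in range(5000):
    bound = adhesion_step(bound, k_on=5.0, k_off=1.0, force=0.5, dt=0.001)
print(f"bonds at quasi-steady state: {int(bound.sum())}")
```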

  4. Exploring knowledge exchange: a useful framework for practice and policy.

    PubMed

    Ward, Vicky; Smith, Simon; House, Allan; Hamer, Susan

    2012-02-01

    Knowledge translation is underpinned by a dynamic and social knowledge exchange process but there are few descriptions of how this unfolds in practice settings. This has hampered attempts to produce realistic and useful models to help policymakers and researchers understand how knowledge exchange works. This paper reports the results of research which investigated the nature of knowledge exchange. We aimed to understand whether dynamic and fluid definitions of knowledge exchange are valid and to produce a realistic, descriptive framework of knowledge exchange. Our research was informed by a realist approach. We embedded a knowledge broker within three service delivery teams across a mental health organisation in the UK, each of whom was grappling with specific challenges. The knowledge broker participated in the team's problem-solving process and collected observational fieldnotes. We also interviewed the team members. Observational and interview data were analysed quantitatively and qualitatively in order to determine and describe the nature of the knowledge exchange process in more detail. This enabled us to refine our conceptual framework of knowledge exchange. We found that knowledge exchange can be understood as a dynamic and fluid process which incorporates distinct forms of knowledge from multiple sources. Quantitative analysis illustrated that five broadly-defined components of knowledge exchange (problem, context, knowledge, activities, use) can all be in play at any one time and do not occur in a set order. Qualitative analysis revealed a number of distinct themes which better described the nature of knowledge exchange. By shedding light on the nature of knowledge exchange, our findings problematise some of the linear, technicist approaches to knowledge translation. The revised model of knowledge exchange which we propose here could therefore help to reorient thinking about knowledge exchange and act as a starting point for further exploration and evaluation of the knowledge exchange process. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Community engagement to enhance trust between Gypsy/Travellers, and maternity, early years' and child dental health services: protocol for a multi-method exploratory study.

    PubMed

    McFadden, Alison; Atkin, Karl; Bell, Kerry; Innes, Nicola; Jackson, Cath; Jones, Helen; MacGillivray, Steve; Siebelt, Lindsay

    2016-11-14

    Gypsy/Travellers have poor health and experience discrimination alongside structural and cultural barriers when accessing health services and consequently may mistrust those services. Our study aims to investigate which approaches to community engagement are most likely to be effective at enhancing trust between Gypsy/Travellers and mainstream health services. This multi-method 30-month study, commenced in June 2015, and comprises four stages. 1. Three related reviews: a) systematic review of Gypsy/Travellers' access to health services; b) systematic review of reviews of how trust has been conceptualised within healthcare; c) realist synthesis of community engagement approaches to enhance trust and increase Gypsy/Travellers' participation in health services. The reviews will consider any economic literature; 2. Online consultation with health and social care practitioners, and civil society organisations on existing engagement activities, including perceptions of barriers and good practice; 3. Four in-depth case studies of different Gypsy/Traveller communities, focusing on maternity, early years and child dental health services. The case studies include the views of 32-48 mothers of pre-school children, 32-40 healthcare providers and 8-12 informants from third sector organisations. 4. Two stakeholder workshops exploring whether policy options are realistic, sustainable and replicable. Case study data will be analysed thematically informed by the evaluative framework derived from the realist synthesis in stage one. The main outputs will be: a) an evaluative framework of Gypsy/Travellers' engagement with health services; b) recommendations for policy and practice; c) evidence on which to base future implementation strategies including estimation of costs. Our novel multi-method study seeks to provide recommendations for policy and practice that have potential to improve uptake and delivery of health services, and to reduce lifetime health inequalities for Gypsy/Travellers. The findings may have wider resonance for other marginalised populations. Strengths and limitations of the study are discussed. Prospero registration for literature reviews: CRD42015021955 and CRD42015021950 UKCRN reference: 20036.

  6. Meshless Modeling of Deformable Shapes and their Motion

    PubMed Central

    Adams, Bart; Ovsjanikov, Maks; Wand, Michael; Seidel, Hans-Peter; Guibas, Leonidas J.

    2010-01-01

    We present a new framework for interactive shape deformation modeling and key frame interpolation based on a meshless finite element formulation. Starting from a coarse nodal sampling of an object’s volume, we formulate rigidity and volume preservation constraints that are enforced to yield realistic shape deformations at interactive frame rates. Additionally, by specifying key frame poses of the deforming shape and optimizing the nodal displacements while targeting smooth interpolated motion, our algorithm extends to a motion planning framework for deformable objects. This allows reconstructing smooth and plausible deformable shape trajectories in the presence of possibly moving obstacles. The presented results illustrate that our framework can handle complex shapes at interactive rates and hence is a valuable tool for animators to realistically and efficiently model and interpolate deforming 3D shapes. PMID:24839614

  7. Performance evaluation of an automatic MGRF-based lung segmentation approach

    NASA Astrophysics Data System (ADS)

    Soliman, Ahmed; Khalifa, Fahmi; Alansary, Amir; Gimel'farb, Georgy; El-Baz, Ayman

    2013-10-01

    The segmentation of the lung tissues in chest Computed Tomography (CT) images is an important step for developing any Computer-Aided Diagnostic (CAD) system for lung cancer and other pulmonary diseases. In this paper, we introduce a new framework for validating the accuracy of our developed Joint Markov-Gibbs based lung segmentation approach using 3D realistic synthetic phantoms. These phantoms are created using a 3D Generalized Gauss-Markov Random Field (GGMRF) model of voxel intensities with pairwise interaction to model the 3D appearance of the lung tissues. Then, the appearance of the generated 3D phantoms is simulated based on iterative minimization of an energy function that is based on the learned 3D-GGMRF image model. These 3D realistic phantoms can be used to evaluate the performance of any lung segmentation approach. The performance of our segmentation approach is evaluated using three metrics, namely, the Dice Similarity Coefficient (DSC), the modified Hausdorff distance, and the Average Volume Difference (AVD) between our segmentation and the ground truth. Our approach achieves mean values of 0.994±0.003, 8.844±2.495 mm, and 0.784±0.912 mm³, for the DSC, Hausdorff distance, and the AVD, respectively.
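
    The evaluation metrics are easy to reproduce; a minimal sketch of the Dice Similarity Coefficient and one common definition of volume difference, applied to a toy spherical phantom, follows.

```python
import numpy as np

def dice(seg, gt):
    """Dice Similarity Coefficient between binary segmentation and ground truth."""
    seg, gt = seg.astype(bool), gt.astype(bool)
    inter = np.logical_and(seg, gt).sum()
    return 2.0 * inter / (seg.sum() + gt.sum())

def volume_difference(seg, gt, voxel_mm3=1.0):
    """Absolute volume difference in mm^3 (one common AVD definition)."""
    return abs(int(seg.sum()) - int(gt.sum())) * voxel_mm3

# Toy 3D phantom check: ground-truth sphere vs. slightly eroded segmentation
z, y, x = np.mgrid[0:64, 0:64, 0:64]
gt = (z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 <= 20 ** 2
seg = (z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 <= 19 ** 2
print(f"DSC = {dice(seg, gt):.4f}, AVD = {volume_difference(seg, gt):.0f} mm^3")
```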

  8. Method for decreasing CT simulation time of complex phantoms and systems through separation of material specific projection data

    NASA Astrophysics Data System (ADS)

    Divel, Sarah E.; Christensen, Soren; Wintermark, Max; Lansberg, Maarten G.; Pelc, Norbert J.

    2017-03-01

    Computer simulation is a powerful tool in CT; however, long simulation times of complex phantoms and systems, especially when modeling many physical aspects (e.g., spectrum, finite detector and source size), hinder the ability to realistically and efficiently evaluate and optimize CT techniques. Long simulation times primarily result from the tracing of hundreds of line integrals through each of the hundreds of geometrical shapes defined within the phantom. However, when the goal is to perform dynamic simulations or test many scan protocols using a particular phantom, traditional simulation methods inefficiently and repeatedly calculate line integrals through the same set of structures although only a few parameters change in each new case. In this work, we have developed a new simulation framework that overcomes such inefficiencies by dividing the phantom into material-specific regions with the same time attenuation profiles, acquiring and storing monoenergetic projections of the regions, and subsequently scaling and combining the projections to create equivalent polyenergetic sinograms. The simulation framework is especially efficient for the validation and optimization of CT perfusion, which requires analysis of many stroke cases and testing hundreds of scan protocols on a realistic and complex numerical brain phantom. Using this updated framework to conduct a 31-time-point simulation with 80 mm of z-coverage of a brain phantom on two 16-core Linux servers, we have reduced the simulation time from 62 hours to under 2.6 hours, a 95% reduction.
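
    The reuse step, scaling stored monoenergetic material projections and combining them into a polyenergetic sinogram, can be sketched directly from the Beer-Lambert relation; the materials, attenuation values, and spectrum below are placeholders.

```python
import numpy as np

def polyenergetic_sinogram(material_lines, mu_tables, spectrum):
    """Combine stored per-material line integrals into a polyenergetic sinogram.

    material_lines : dict name -> array of path lengths (any sinogram shape)
    mu_tables      : dict name -> attenuation mu(E) sampled on the spectrum bins
    spectrum       : photon counts per energy bin
    Detected intensity: I = sum_E S(E) * exp(-sum_m mu_m(E) * L_m); the
    geometry (the L_m arrays) only ever needs to be traced once.
    """
    shape = next(iter(material_lines.values())).shape
    intensity = np.zeros(shape)
    for e, s_e in enumerate(spectrum):
        tau = np.zeros(shape)
        for name, L in material_lines.items():
            tau += mu_tables[name][e] * L      # scale stored projections
        intensity += s_e * np.exp(-tau)        # combine across the spectrum
    return -np.log(intensity / spectrum.sum())  # log-normalized sinogram

# Toy example with hypothetical numbers: two materials, three energy bins
lines = {"tissue": np.full((4, 8), 20.0), "bone": np.full((4, 8), 2.0)}  # cm
mus = {"tissue": np.array([0.25, 0.20, 0.18]),
       "bone": np.array([0.90, 0.55, 0.40])}                            # 1/cm
sino = polyenergetic_sinogram(lines, mus, spectrum=np.array([1e5, 2e5, 1e5]))
```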

  9. Review: evaluating information systems in nursing.

    PubMed

    Oroviogoicoechea, Cristina; Elliott, Barbara; Watson, Roger

    2008-03-01

    To review existing nursing research on inpatient hospitals' information technology (IT) systems in order to explore new approaches for evaluation research on nursing informatics to guide further design and implementation of effective IT systems. There has been an increase in the use of IT and information systems in nursing in recent years. However, there has been little evaluation of these systems and little guidance on how they might be evaluated. A literature review was conducted between 1995 and 2005 inclusive using CINAHL and Medline and the search terms 'nursing information systems', 'clinical information systems', 'hospital information systems', 'documentation', 'nursing records', 'charting'. Research in nursing information systems was analysed, and deficiencies and contradictory results that impede a comprehensive understanding of effective implementation were identified. IT systems need to be understood from a wider perspective that includes aspects of the context where they are implemented. Social and organizational aspects need to be considered in evaluation studies, and realistic evaluation can provide a framework for the evaluation of information systems in nursing. The rapid introduction of IT systems into clinical practice calls for evaluation of already implemented systems, examining how and in what circumstances they work, to guide effective further development and implementation of IT systems that enhance clinical practice. Evaluation involves more than just the technology: it encompasses changing attitudes, cultures and healthcare practices. Realistic evaluation could provide configurations of context-mechanism-outcomes that explain the underlying relationships, helping to understand why and how a programme or intervention works.

  10. Visualizing context through theory deconstruction: a content analysis of three bodies of evaluation theory literature.

    PubMed

    Vo, Anne T

    2013-06-01

    While the evaluation field collectively agrees that contextual factors bear on evaluation practice and related scholarly endeavors, the discipline does not yet have an explicit framework for understanding evaluation context. To address this gap in the knowledge base, this paper explores the ways in which evaluation context has been addressed in the practical-participatory, values-engaged, and emergent realist evaluation literatures. Five primary dimensions that constitute evaluation context were identified for this purpose: (1) stakeholder; (2) program; (3) organization; (4) historical/political; and (5) evaluator. Journal articles, book chapters, and conference papers rooted in the selected evaluation approaches were compared along these dimensions in order to explore points of convergence and divergence in the theories. Study results suggest that the selected prescriptive theories most clearly explicate stakeholder and evaluator contexts. Programmatic, organizational, and historical/political contexts, on the other hand, require further clarification. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Assessing public health policy approaches to level-up the gradient in health inequalities: the Gradient Evaluation Framework.

    PubMed

    Davies, J K; Sherriff, N S

    2014-03-01

    This paper seeks to introduce and analyse the development of the Gradient Evaluation Framework (GEF) to facilitate evaluation of policy actions for their current or future use in terms of their 'gradient friendliness'. In particular, this means their potential to level-up the gradient in health inequalities by addressing the social determinants of health and thereby reducing decision-makers' chances of error when developing such policy actions. A qualitative developmental study to produce a policy-based evaluation framework. The scientific basis of GEF was developed using a comprehensive consensus-building process. This process followed an initial narrative review, based on realist review principles, which highlighted the need for production of a dedicated evaluation framework. The consensus-building process included expert workshops, a pretesting phase, and external peer review, together with support from the Gradient project Scientific Advisory Group and all Gradient project partners, including its Project Steering Committee. GEF is presented as a flexible policy tool resulting from a consensus-building process involving experts from 13 European countries. The theoretical foundations which underpin GEF are discussed, together with a range of practical challenges. The importance of systematic evaluation at each stage of the policy development and implementation cycle is highlighted, as well as the socio-political context in which policy actions are located. GEF offers potentially a major contribution to the public health field in the form of a practical, policy-relevant and common frame of reference for the evaluation of public health interventions that aim to level-up the social gradient in health inequalities. Further research, including the need for practical field testing of GEF and the exploration of alternative presentational formats, is recommended. Copyright © 2013 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  12. Evaluation of counterfactuality in counterfactual communication protocols

    NASA Astrophysics Data System (ADS)

    Arvidsson-Shukur, D. R. M.; Barnes, C. H. W.; Gottfries, A. N. O.

    2017-12-01

    We provide an in-depth investigation of parameter estimation in nested Mach-Zehnder interferometers (NMZIs) using two information measures: the Fisher information and the Shannon mutual information. Protocols for counterfactual communication have, so far, been based on two different definitions of counterfactuality. In particular, some schemes have been based on NMZI devices, and have recently been subject to criticism. We provide a methodology for evaluating the counterfactuality of these protocols, based on an information-theoretical framework. More specifically, we make the assumption that any realistic quantum channel in MZI structures will have some weak uncontrolled interaction. We then use the Fisher information of this interaction to measure counterfactual violations. The measure is used to evaluate the suggested counterfactual communication protocol of H. Salih et al. [Phys. Rev. Lett. 110, 170502 (2013), 10.1103/PhysRevLett.110.170502]. The protocol of D. R. M. Arvidsson-Shukur and C. H. W. Barnes [Phys. Rev. A 94, 062303 (2016), 10.1103/PhysRevA.94.062303], based on a different definition, is evaluated with a probability measure. Our results show that the definition of Arvidsson-Shukur and Barnes is satisfied by their scheme, while that of Salih et al. is only satisfied by perfect quantum channels. For realistic devices the latter protocol does not achieve its objective.
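
    The evaluation logic can be illustrated numerically: for a binary (click/no-click) outcome with probability p(θ), the Fisher information about a weak interaction parameter θ is I(θ) = p′(θ)² / (p(1−p)), and a nonzero value signals that information about the channel traversed the supposedly counterfactual link. The port-probability function below is a made-up toy, not either protocol's actual statistics.

```python
import numpy as np

def fisher_information(p, theta, h=1e-5):
    """Fisher information of a binary (click / no-click) outcome about theta.

    For an outcome with probability p(theta): I = p'(theta)^2 / (p (1 - p)).
    The derivative is estimated by a central finite difference.
    """
    dp = (p(theta + h) - p(theta - h)) / (2 * h)
    pt = p(theta)
    return dp ** 2 / (pt * (1.0 - pt))

# Toy interferometer port probability with a weak interaction phase theta
p_click = lambda theta: 0.5 * (1.0 + 0.9 * np.cos(theta))
print(f"I(0.3) = {fisher_information(p_click, 0.3):.4f}")
```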

  13. How to use programme theory to evaluate the effectiveness of schemes designed to improve the work environment in small businesses.

    PubMed

    Olsen, Kirsten; Legg, Stephen; Hasle, Peter

    2012-01-01

    Due to the many constraints that small businesses (SBs) face in meeting legislative requirements, occupational health and safety (OHS) regulatory authorities and other OHS actors have developed programmes which can reach out to SBs and motivate and assist them in improving the work environment. A number of conceptual models help to enhance our understanding of OHS interventions in SBs and their effectiveness. However, they have mainly been evaluated on output rather than on the process relating to the change theory underlying the intervention, and hence have seldom been rigorously evaluated. Thus little is known about how particular features of SBs can be taken into account when designing and implementing national programmes. This paper shows how realist analysis and programme theory may be used as a framework for evaluating, developing and improving national intervention programmes for improving the work environment and reducing injuries in SBs. It illustrates this for a specific New Zealand intervention: the Workplace Safety Discount scheme and its implementation in the agriculture sector. In practice, realist analysis should be performed during the planning, implementation and management stages so that ongoing findings can be fed back to the participant social actors to help them make appropriate changes to enhance the likelihood of success.

  14. Virtual Reality Simulation Training for Ebola Deployment.

    PubMed

    Ragazzoni, Luca; Ingrassia, Pier Luigi; Echeverri, Lina; Maccapani, Fabio; Berryman, Lizzy; Burkle, Frederick M; Della Corte, Francesco

    2015-10-01

    Both virtual and hybrid simulation training offer a realistic and effective educational framework and opportunity to provide virtual exposure to operational public health skills that are essential for infection control and Ebola treatment management. This training is designed to increase staff safety and create a safe and realistic environment where trainees can gain essential basic and advanced skills.

  15. Compatible quantum theory

    NASA Astrophysics Data System (ADS)

    Friedberg, R.; Hohenberg, P. C.

    2014-09-01

    Formulations of quantum mechanics (QM) can be characterized as realistic, operationalist, or a combination of the two. In this paper a realistic theory is defined as describing a closed system entirely by means of entities and concepts pertaining to the system. An operationalist theory, on the other hand, requires in addition entities external to the system. A realistic formulation comprises an ontology, the set of (mathematical) entities that describe the system, and assertions, the set of correct statements (predictions) the theory makes about the objects in the ontology. Classical mechanics is the prime example of a realistic physical theory. A straightforward generalization of classical mechanics to QM is hampered by the inconsistency of quantum properties with classical logic, a circumstance that was noted many years ago by Birkhoff and von Neumann. The present realistic formulation of the histories approach originally introduced by Griffiths, which we call ‘compatible quantum theory (CQT)’, consists of a ‘microscopic’ part (MIQM), which applies to a closed quantum system of any size, and a ‘macroscopic’ part (MAQM), which requires the participation of a large (ideally, an infinite) system. The first (MIQM) can be fully formulated based solely on the assumption of a Hilbert space ontology and the noncontextuality of probability values, relying in an essential way on Gleason's theorem and on an application to dynamics due in large part to Nistico. Thus, the present formulation, in contrast to earlier ones, derives the Born probability formulas and the consistency (decoherence) conditions for frameworks. The microscopic theory does not, however, possess a unique corpus of assertions, but rather a multiplicity of contextual truths (‘c-truths’), each one associated with a different framework. This circumstance leads us to consider the microscopic theory to be physically indeterminate and therefore incomplete, though logically coherent. The completion of the theory requires a macroscopic mechanism for selecting a physical framework, which is part of the macroscopic theory (MAQM). The selection of a physical framework involves the breaking of the microscopic ‘framework symmetry’, which can proceed either phenomenologically as in the standard quantum measurement theory, or more fundamentally by considering the quantum system under study to be a subsystem of a macroscopic quantum system. The decoherent histories formulation of Gell-Mann and Hartle, as well as that of Omnès, are theories of this fundamental type, where the physical framework is selected by a coarse-graining procedure in which the physical phenomenon of decoherence plays an essential role. Various well-known interpretations of QM are described from the perspective of CQT. Detailed definitions and proofs are presented in the appendices.

  16. What works in 'real life' to facilitate home deaths and fewer hospital admissions for those at end of life?: results from a realist evaluation of new palliative care services in two English counties.

    PubMed

    Wye, Lesley; Lasseter, Gemma; Percival, John; Duncan, Lorna; Simmonds, Bethany; Purdy, Sarah

    2014-01-01

    We evaluated end of life care services in two English counties, including coordination centres, a telephone advice line, 'Discharge in Reach' nurses, a specialist community personal care team and community nurse educators. Elsewhere, we published findings detailing high family carer satisfaction and fewer hospital admissions, Accident and Emergency attendances and hospital deaths for service users compared to controls. The aim of this paper is to discuss what contributed to those outcomes. Using realist evaluation, data collection included documentation (e.g. referral databases), 15 observations of services and interviews with 43 family carers and 105 professionals. Data were analysed using framework analysis, applying realist evaluation concepts. Findings were discussed at successive team meetings and further data were collected until team consensus was reached. Services 'worked' primarily for those with cancer with 'fast track' funding who were close to death. Factors contributing to success included services staffed with experienced palliative care professionals with dedicated (and sufficient) time for difficult conversations with family carers, patients and/or clinical colleagues about death and the practicalities of caring for the dying. Using their formal and informal knowledge of the local healthcare system, they accessed community resources to support homecare and delivered excellent services. This engendered confidence and reassurance for staff, family carers and patients, possibly contributing to fewer hospital admissions and A&E attendances and more home deaths. With demand for 24-hour end of life care growing and care provision fragmented across health and social care boundaries, services like these that cut across organisational sectors may become more important. They offer an overview to help navigate those desiring a home death through the system.

  17. SMSynth: An Imagery Synthesis System for Soil Moisture Retrieval

    NASA Astrophysics Data System (ADS)

    Cao, Y.; Xu, L.; Peng, J.

    2018-04-01

    Soil moisture (SM) is an important variable in various research areas, such as weather and climate forecasting, agriculture, drought and flood monitoring and prediction, and human health. An ongoing challenge in estimating SM via synthetic aperture radar (SAR) is the development of SM retrieval methods; in particular, the empirical models need large numbers of measurements of SM and soil roughness parameters as training samples, and these are very difficult to acquire. It is therefore difficult to develop empirical models from real SAR imagery alone, and methods to synthesize SAR imagery are needed. To tackle this issue, an SM-based SAR imagery synthesis system named SMSynth is presented, which can simulate radar signals that are as realistic as possible with respect to real SAR imagery. In SMSynth, SAR backscatter coefficients for each soil type are simulated via the Oh model under a Bayesian framework, where spatial correlation is modeled by a Markov random field (MRF) model. The backscattering coefficients, simulated from the designed soil and sensor parameters, enter the Bayesian framework through the data likelihood; the soil and sensor parameters are set as close as possible to conditions on the ground and within the validity range of the Oh model. In this way, a complete and coherent Bayesian probabilistic framework is established. Experimental results show that SMSynth is capable of generating realistic SAR images that meet the need for large numbers of training samples for empirical models.
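
    The forward model named here is standard enough to sketch: below is a minimal stand-alone implementation of the Oh et al. (1992) semi-empirical bare-soil backscatter equations, written from the published formulas rather than from SMSynth itself. The dielectric constant, roughness and viewing geometry are invented example values, and the Bayesian/MRF spatial-correlation machinery of the paper is deliberately omitted.

    ```python
    import numpy as np

    def oh92_backscatter(eps_r, ks, theta_deg):
        """Oh et al. (1992) semi-empirical bare-soil backscatter model.

        eps_r     : complex relative dielectric constant of the soil
        ks        : RMS surface height times radar wavenumber (roughness)
        theta_deg : incidence angle in degrees
        Returns (sigma0_vv, sigma0_hh, sigma0_hv) in linear power units.
        """
        theta = np.deg2rad(theta_deg)
        root = np.sqrt(eps_r - np.sin(theta) ** 2)

        # Fresnel reflectivities at incidence theta and at nadir
        gamma_h = np.abs((np.cos(theta) - root) / (np.cos(theta) + root)) ** 2
        gamma_v = np.abs((eps_r * np.cos(theta) - root) /
                         (eps_r * np.cos(theta) + root)) ** 2
        gamma_0 = np.abs((1 - np.sqrt(eps_r)) / (1 + np.sqrt(eps_r))) ** 2

        # Co- and cross-polarized ratios p = hh/vv and q = hv/vv
        p = (1 - (2 * theta / np.pi) ** (1 / (3 * gamma_0)) * np.exp(-ks)) ** 2
        q = 0.23 * np.sqrt(gamma_0) * (1 - np.exp(-ks))

        g = 0.7 * (1 - np.exp(-0.65 * ks ** 1.8))
        sigma_vv = g * np.cos(theta) ** 3 * (gamma_v + gamma_h) / np.sqrt(p)
        return sigma_vv, p * sigma_vv, q * sigma_vv

    # e.g. moist soil (eps_r ~ 15 - 5j), moderate roughness, C-band geometry
    vv, hh, hv = oh92_backscatter(eps_r=15 - 5j, ks=1.0, theta_deg=40)
    print(["%.1f dB" % (10 * np.log10(s)) for s in (vv, hh, hv)])
    ```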

  18. On Maximizing the Lifetime of Wireless Sensor Networks by Optimally Assigning Energy Supplies

    PubMed Central

    Asorey-Cacheda, Rafael; García-Sánchez, Antonio Javier; García-Sánchez, Felipe; García-Haro, Joan; Gonzalez-Castaño, Francisco Javier

    2013-01-01

    The extension of the network lifetime of Wireless Sensor Networks (WSN) is an important issue that has not been appropriately solved yet. This paper addresses this concern and proposes some techniques to plan an arbitrary WSN. To this end, we suggest a hierarchical network architecture, similar to realistic scenarios, where nodes with renewable energy sources (denoted as primary nodes) carry out most message delivery tasks, and nodes equipped with conventional chemical batteries (denoted as secondary nodes) are those with less communication demands. The key design issue of this network architecture is the development of a new optimization framework to calculate the optimal assignment of renewable energy supplies (primary node assignment) to maximize network lifetime, obtaining the minimum number of energy supplies and their node assignment. We also conduct a second optimization step to additionally minimize the number of packet hops between the source and the sink. In this work, we present an algorithm that approaches the results of the optimization framework, but with much faster execution speed, which is a good alternative for large-scale WSNs. Finally, the network model, the optimization process and the designed algorithm are further evaluated and validated by means of computer simulation under realistic conditions. The results obtained are discussed comparatively. PMID:23939582
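
    The paper's optimization framework is exact; in the spirit of the faster algorithm the authors also present, here is a deliberately simplified greedy sketch that promotes the current bottleneck nodes to renewable-powered primary status until every remaining battery node meets a target lifetime. All drains, capacities and the target are hypothetical, and a real design would recompute relay loads after each promotion as routing shifts.

    ```python
    import heapq

    def assign_primary_nodes(loads_mw, battery_mwh, target_hours):
        """Greedy sketch: promote the nodes that currently limit network
        lifetime to primary (renewable) status until every remaining
        battery-powered node survives `target_hours`.

        loads_mw[i] : average drain of node i in mW (relaying included);
                      held fixed here, a simplification of real routing.
        battery_mwh : battery capacity of a secondary node in mWh.
        Returns the set of primary-node indices.
        """
        # min-heap keyed on lifetime: the worst (shortest-lived) node pops first
        heap = [(battery_mwh / load, i) for i, load in enumerate(loads_mw)]
        heapq.heapify(heap)
        primary = set()
        while heap:
            life, node = heapq.heappop(heap)
            if life >= target_hours:   # shortest lifetime is acceptable: done
                break
            primary.add(node)          # promote the bottleneck node
        return primary

    loads = [5.0, 1.2, 9.5, 0.8, 3.3]  # hypothetical per-node drains in mW
    print(assign_primary_nodes(loads, battery_mwh=2000, target_hours=500))
    ```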

  19. On maximizing the lifetime of Wireless Sensor Networks by optimally assigning energy supplies.

    PubMed

    Asorey-Cacheda, Rafael; García-Sánchez, Antonio Javier; García-Sánchez, Felipe; García-Haro, Joan; González-Castano, Francisco Javier

    2013-08-09

    The extension of the network lifetime of Wireless Sensor Networks (WSN) is an important issue that has not been appropriately solved yet. This paper addresses this concern and proposes some techniques to plan an arbitrary WSN. To this end, we suggest a hierarchical network architecture, similar to realistic scenarios, where nodes with renewable energy sources (denoted as primary nodes) carry out most message delivery tasks, and nodes equipped with conventional chemical batteries (denoted as secondary nodes) are those with less communication demands. The key design issue of this network architecture is the development of a new optimization framework to calculate the optimal assignment of renewable energy supplies (primary node assignment) to maximize network lifetime, obtaining the minimum number of energy supplies and their node assignment. We also conduct a second optimization step to additionally minimize the number of packet hops between the source and the sink. In this work, we present an algorithm that approaches the results of the optimization framework, but with much faster execution speed, which is a good alternative for large-scale WSNs. Finally, the network model, the optimization process and the designed algorithm are further evaluated and validated by means of computer simulation under realistic conditions. The results obtained are discussed comparatively.

  20. Teaching Poor Ethnic Minority Students: A Critical Realist Interpretation of Disempowerment

    ERIC Educational Resources Information Center

    Stylianou, Areti; Scott, David

    2018-01-01

    This article aims to supplement the literature on the role of school context with regards to the disempowerment of teachers in their work with poor ethnic minority students. We use a critical realist framework to analyse the empirical data collected for an in-depth school case study and we suggest the existence of real, interrelated, emergent and…

  1. Digital evaluation of sitting posture comfort in human-vehicle system under Industry 4.0 framework

    NASA Astrophysics Data System (ADS)

    Tao, Qing; Kang, Jinsheng; Sun, Wenlei; Li, Zhaobo; Huo, Xiao

    2016-09-01

    Most of the previous studies on the vibration ride comfort of the human-vehicle system have focused on only one or two aspects of the investigation. A hybrid approach which integrates all kinds of investigation methods in a real environment and a virtual environment is described. The real experimental environment includes the WBV (whole-body vibration) test, questionnaires for human subjective sensation and motion capture. The virtual experimental environment includes the theoretical calculation on a simplified 5-DOF human body vibration model, the vibration simulation and analysis within the ADAMS/Vibration™ module, and the digital human biomechanics and occupational health analysis in Jack software. While the real experimental environment provides realistic and accurate test results, it also serves as the core and validation of the virtual experimental environment. The virtual experimental environment takes full advantage of currently available vibration simulation and digital human modelling software, and makes it possible to evaluate the sitting posture comfort of a human-vehicle system with various human anthropometric parameters. How this digital evaluation system for car seat comfort design fits into the Industry 4.0 framework is also proposed.
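
    To make the modelling side concrete, a lumped-parameter seat-occupant model of the kind referred to above (reduced here from 5 DOF to 2 DOF for brevity) can be simulated in a few lines of SciPy. All parameter values are illustrative guesses rather than the paper's, and a practical comfort assessment would apply frequency weighting to the RMS acceleration along the lines of ISO 2631.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical lumped parameters: seat frame (m1) and occupant (m2)
    m1, m2 = 15.0, 60.0        # kg
    k1, k2 = 3.0e4, 6.0e4      # N/m  (suspension, cushion/body)
    c1, c2 = 8.0e2, 1.2e3      # N*s/m

    def base(t, amp=0.005, f=4.0):
        """Sinusoidal floor excitation: displacement and velocity (5 mm, 4 Hz)."""
        w = 2 * np.pi * f
        return amp * np.sin(w * t), amp * w * np.cos(w * t)

    def rhs(t, y):
        x1, v1, x2, v2 = y
        xb, vb = base(t)
        a1 = (-k1 * (x1 - xb) - c1 * (v1 - vb)
              + k2 * (x2 - x1) + c2 * (v2 - v1)) / m1
        a2 = (-k2 * (x2 - x1) - c2 * (v2 - v1)) / m2
        return [v1, a1, v2, a2]

    sol = solve_ivp(rhs, (0.0, 10.0), [0, 0, 0, 0], max_step=1e-3)
    acc = np.array([rhs(t, y)[3] for t, y in zip(sol.t, sol.y.T)])
    steady = sol.t > 5.0       # discard the start-up transient
    print("occupant RMS acceleration: %.3f m/s^2"
          % np.sqrt(np.mean(acc[steady] ** 2)))
    ```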

  2. Formulating a Theoretical Framework for Assessing Network Loads for Effective Deployment in Network-Centric Operations and Warfare

    DTIC Science & Technology

    2008-11-01

    is particularly important in order to design a network that is realistically deployable. The goal of this project is the design of a theoretical ... framework to assess and predict the effectiveness and performance of networks and their loads.

  3. Modes of Hoping: Understanding hope and expectation in the context of a clinical trial of complementary and alternative medicine for chronic pain

    PubMed Central

    Eaves, Emery R; Ritenbaugh, Cheryl; Nichter, Mark; Hopkins, Allison L.; Sherman, Karen J

    2014-01-01

    This article explores the role of hope in participants’ assessments of their expectations, experiences and treatment outcomes. Data analysis focused on semi-structured, open-ended interviews with 44 participants, interviewed 3-5 times each over the course of a study evaluating Traditional Chinese Medicine (TCM) for Temporomandibular Disorders (TMD, a form of chronic orofacial pain). Transcripts were coded and analyzed using qualitative and ethnographic methods. A “Modes of Hoping” framework (Webb, 2007) informed our analysis. Five modes of hoping emerged from participant narratives: Realistic Hope; Wishful Hope; Utopian Hope; Technoscience Hope; and Transcendent Hope. Using this framework, hope is demonstrated as exerting a profound influence over how participants assess and report their expectations. This suggests that researchers interested in measuring expectations and understanding their role in treatment outcomes should consider hope as exercising a multifaceted and dynamic influence on participants’ reporting of expectations and their experience and evaluation of treatment. PMID:25037665

  4. Modes of hoping: understanding hope and expectation in the context of a clinical trial of complementary and alternative medicine for chronic pain.

    PubMed

    Eaves, Emery R; Ritenbaugh, Cheryl; Nichter, Mark; Hopkins, Allison L; Sherman, Karen J

    2014-01-01

    This article explores the role of hope in participants' assessments of their expectations, experiences and treatment outcomes. Data analysis focused on semi-structured, open-ended interviews with 44 participants, interviewed 3-5 times each over the course of a study evaluating Traditional Chinese Medicine (TCM) for temporomandibular disorders (TMD), a form of chronic orofacial pain. Transcripts were coded and analyzed using qualitative and ethnographic methods. A "Modes of Hoping" (Webb, 2007) framework informed our analysis. Five modes of hoping emerged from participant narratives: Realistic Hope, Wishful Hope, Utopian Hope, Technoscience Hope, and Transcendent Hope. Using this framework, hope is demonstrated as exerting a profound influence over how participants assess and report their expectations. This suggests that researchers interested in measuring expectations and understanding their role in treatment outcomes should consider hope as exercising a multi-faceted and dynamic influence on participants' reporting of expectations and their experience and evaluation of treatment. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Multilayer network decoding versatility and trust

    NASA Astrophysics Data System (ADS)

    Sarkar, Camellia; Yadav, Alok; Jalan, Sarika

    2016-01-01

    In recent years, multilayer networks have increasingly been recognized as a more realistic framework for understanding emergent physical phenomena in complex real-world systems. We analyze massive time-varying social data drawn from the largest film industry of the world under a multilayer network framework. The framework enables us to evaluate the versatility of actors, which turns out to be an intrinsic property of lead actors. Versatility in dimers suggests that working with different types of nodes is more beneficial than working with similar ones. However, the triangles yield a different relation between the type of co-actor and the success of lead nodes, indicating the importance of higher-order motifs in understanding the properties of the underlying system. Furthermore, despite the degree-degree correlations of the entire networks being neutral, multilayering picks up different values of correlation, indicating positive connotations like trust in recent years. The analysis of weak ties of the industry uncovers nodes from a lower-degree regime being important in linking Bollywood clusters. The framework and the tools used herein may be used for unraveling the complexity of other real-world systems.
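
    For readers who want to experiment with such measures, the sketch below computes two of the quantities mentioned, a per-node versatility score and the inter-layer degree correlation, on a random two-layer network. The versatility proxy used here is the multiplex participation coefficient of Battiston et al. (2014), which is our assumption rather than the paper's exact measure, and the random layers merely stand in for real co-acting data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, M = 50, 2
    layers = [(rng.random((n, n)) < 0.08).astype(int) for _ in range(M)]
    layers = [np.triu(a, 1) + np.triu(a, 1).T for a in layers]  # undirected

    deg = np.array([a.sum(axis=1) for a in layers])     # M x n degree matrix
    overlap = deg.sum(axis=0)                           # total overlapping degree

    # Multiplex participation coefficient: 1 = equally active in every layer,
    # 0 = active in a single layer only (NaN for isolated nodes).
    with np.errstate(invalid="ignore"):
        versatility = (M / (M - 1)) * (1 - ((deg / overlap) ** 2).sum(axis=0))
    print("mean versatility:", np.nanmean(versatility))

    # Inter-layer degree correlation: positive values mean hubs in one layer
    # tend to be hubs in the other.
    print("degree correlation:", np.corrcoef(deg[0], deg[1])[0, 1])
    ```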

  6. Simulations and Evaluation of Mesoscale Convective Systems in a Multi-scale Modeling Framework (MMF)

    NASA Astrophysics Data System (ADS)

    Chern, J. D.; Tao, W. K.

    2017-12-01

    It is well known that mesoscale convective systems (MCS) produce more than 50% of the rainfall in most tropical regions and play important roles in regional and global water cycles. Simulation of MCSs in global and climate models is a very challenging problem. Typical MCSs have a horizontal scale of a few hundred kilometers. Models with a domain of several hundred kilometers and fine enough resolution to properly simulate individual clouds are required to realistically simulate MCSs. The multiscale modeling framework (MMF), which replaces traditional cloud parameterizations with cloud-resolving models (CRMs) within a host atmospheric general circulation model (GCM), has shown some capability of simulating organized MCS-like storm signals and propagation. However, its embedded CRMs typically have a small domain (less than 128 km) and coarse resolution (~4 km) that cannot realistically simulate MCSs and individual clouds. In this study, a series of simulations were performed using the Goddard MMF. The impacts of the domain size and grid resolution of the embedded CRMs on simulating MCSs are examined. Changes in cloud structure, occurrence, and properties such as cloud types, updraft and downdraft, latent heating profile, and cold pool strength in the embedded CRMs are examined in detail. The simulated MCS characteristics are evaluated against satellite measurements using the Goddard Satellite Data Simulator Unit. The results indicate that embedded CRMs with a large domain and fine resolution tend to produce better simulations than those with the typical MMF configuration (128 km domain size and 4 km grid spacing).

  7. Chained Bell Inequality Experiment with High-Efficiency Measurements

    NASA Astrophysics Data System (ADS)

    Tan, T. R.; Wan, Y.; Erickson, S.; Bierhorst, P.; Kienzler, D.; Glancy, S.; Knill, E.; Leibfried, D.; Wineland, D. J.

    2017-03-01

    We report correlation measurements on two 9Be+ ions that violate a chained Bell inequality obeyed by any local-realistic theory. The correlations can be modeled as derived from a mixture of a local-realistic probability distribution and a distribution that violates the inequality. A statistical framework is formulated to quantify the local-realistic fraction allowable in the observed distribution without the fair-sampling or independent-and-identical-distributions assumptions. We exclude models of our experiment whose local-realistic fraction is above 0.327 at the 95% confidence level. This bound is significantly lower than 0.586, the minimum fraction derived from a perfect Clauser-Horne-Shimony-Holt inequality experiment. Furthermore, our data provide a device-independent certification of the deterministically created Bell states.
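
    The relation between a measured Bell quantity and the allowable local-realistic fraction can be made explicit with a little arithmetic. Under the standard decomposition argument, if a fraction p of trials is governed by a local-realistic model (chained-Bell value at most 2N - 2 for N settings per side) and the remainder may reach the algebraic maximum 2N, then S <= p(2N - 2) + (1 - p)(2N) = 2N - 2p, so p <= (2N - S)/2. The sketch below evaluates this bound at the ideal quantum value 2N cos(pi/2N), reproducing the 0.586 figure quoted above for N = 2; the reported experimental bound of 0.327 additionally depends on the measured correlations and the authors' statistical analysis.

    ```python
    import numpy as np

    def local_fraction_bound(N, S=None):
        """Upper bound on the local-realistic fraction from an N-setting
        chained Bell experiment: p <= (2N - S) / 2. With the ideal quantum
        value S = 2N*cos(pi/2N) this gives 2 - sqrt(2) ~ 0.586 for N = 2
        (CHSH) and decreases towards zero as N grows."""
        if S is None:
            S = 2 * N * np.cos(np.pi / (2 * N))  # ideal quantum value
        return (2 * N - S) / 2

    for N in (2, 3, 4, 6, 10):
        print(N, round(local_fraction_bound(N), 3))
    ```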

  8. Game Theoretic Modeling of Water Resources Allocation Under Hydro-Climatic Uncertainty

    NASA Astrophysics Data System (ADS)

    Brown, C.; Lall, U.; Siegfried, T.

    2005-12-01

    Typical hydrologic and economic modeling approaches rely on assumptions of climate stationarity and economic conditions of ideal markets and rational decision-makers. In this study, we incorporate hydroclimatic variability with a game theoretic approach to simulate and evaluate common water allocation paradigms. Game theory may be particularly appropriate for modeling water allocation decisions. First, a game theoretic approach allows economic analysis in situations where price theory does not apply, which is typically the case in water resources where markets are thin, players are few, and rules of exchange are highly constrained by legal or cultural traditions. Previous studies confirm that game theory is applicable to water resources decision problems, yet applications and modeling based on these principles are only rarely observed in the literature. Second, there are numerous existing theoretical and empirical studies of specific games and human behavior that may be applied in the development of predictive water allocation models. With this framework, one can evaluate alternative orderings and rules regarding the fraction of available water that one is allowed to appropriate. Specific attributes of the players involved in water resources management complicate the determination of solutions to game theory models. While an analytical approach will be useful for providing general insights, the variety of preference structures of individual players in a realistic water scenario will likely require a simulation approach. We propose a simulation approach incorporating the rationality, self-interest and equilibrium concepts of game theory with an agent-based modeling framework that allows the distinct properties of each player to be expressed and allows the performance of the system to manifest the integrative effect of these factors. Underlying this framework, we apply a realistic representation of spatio-temporal hydrologic variability and incorporate the impact of decisions made a priori to hydrologic realizations and those made a posteriori on alternative allocation mechanisms. Outcomes are evaluated in terms of water productivity, net social benefit and equity. The performance of hydro-climate prediction modeling in each allocation mechanism will be assessed. Finally, year-to-year system performance and feedback pathways are explored. In this way, the system can be adaptively managed toward equitable and efficient water use.
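
    As a toy of the simulation approach proposed above, the following best-response iteration finds the Nash equilibrium of a two-user water appropriation game with quadratic benefits and a simple scarcity externality. The payoff shape and every parameter value are invented for illustration; the study's agent-based framework is far richer than this sketch.

    ```python
    def best_response(w_other, a=10.0, c=4.0, Q=20.0):
        """Maximize a*w - w**2/2 - (c/Q)*w*(w + w_other) over own withdrawal w:
        setting marginal benefit a - w equal to the marginal scarcity cost
        (c/Q)*(2*w + w_other) gives the closed-form best response below."""
        return (a - c * w_other / Q) / (1 + 2 * c / Q)

    w1 = w2 = 0.0
    for _ in range(100):                    # fixed-point (Nash) iteration
        w1, w2 = best_response(w2), best_response(w1)
    print("Nash withdrawals:", round(w1, 3), round(w2, 3))
    ```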

  9. Effect of stigma reduction intervention strategies on HIV test uptake in low- and middle-income countries: a realist review protocol.

    PubMed

    Thapa, Subash; Hannes, Karin; Cargo, Margaret; Buve, Anne; Mathei, Catharina

    2015-11-02

    Several stigma reduction intervention strategies have been developed and tested for effectiveness in terms of increasing human immunodeficiency virus (HIV) test uptake. These strategies have been more effective in some contexts and less effective in others. Individual factors, such as lack of knowledge and fear of disclosure, and social-contextual factors, such as poverty and illiteracy, might influence the effect of stigma reduction intervention strategies on HIV test uptake in low- and middle-income countries. So far, it is not clearly known how the stigma reduction intervention strategies interact with these contextual factors to increase HIV test uptake. Therefore, we will conduct a review that will synthesize existing studies on stigma reduction intervention strategies to increase HIV test uptake to better understand the mechanisms underlying this process in low- and middle-income countries. A realist review will be conducted to unpack context-mechanism-outcome configurations of the effect of stigma reduction intervention strategies on HIV test uptake. Based on a scoping review, we developed a preliminary theoretical framework outlining a potential mechanism of how the intervention strategies influence HIV test uptake. Our realist synthesis will be used to refine the preliminary theoretical framework to better reflect mechanisms that are supported by existing evidence. Journal articles and grey literature will be searched following a purposeful sampling strategy. Data will be extracted and tested against the preliminary theoretical framework. Data synthesis and analysis will be performed in five steps: organizing extracted data into evidence tables, theming, formulating chains of inference from the identified themes, linking the chains of inference and developing generative mechanisms, and refining the framework. This will be the first realist review that offers both a quantitative and a qualitative exploration of the available evidence to develop and propose a theoretical framework that explains why and how HIV stigma reduction intervention strategies influence HIV test uptake in low- and middle-income countries. Our theoretical framework is meant to provide guidance to program managers on identifying the most effective stigma reduction intervention strategies to increase HIV test uptake. We also include advice on how to effectively implement these strategies to reduce the rate of HIV transmission. PROSPERO CRD42015023687.

  10. Using Historical Precipitation, Temperature, and Runoff Observations to Evaluate Evaporation Formulations in Land Surface Models

    NASA Technical Reports Server (NTRS)

    Koster, Randal D.; Mahanama, P. P.

    2012-01-01

    Key to translating soil moisture memory into subseasonal precipitation and air temperature forecast skill is a realistic treatment of evaporation in the forecast system used - in particular, a realistic treatment of how evaporation responds to variations in soil moisture. The inherent soil moisture-evaporation relationships used in today's land surface models (LSMs), however, arguably reflect little more than guesswork given the lack of evaporation and soil moisture data at the spatial scales represented by regional and global models. Here we present a new approach for evaluating this critical aspect of LSMs. Seasonally averaged precipitation is used as a proxy for seasonally averaged soil moisture, and seasonally averaged air temperature is used as a proxy for seasonally averaged evaporation (e.g., more evaporative cooling leads to cooler temperatures); the relationship between historical precipitation and temperature measurements accordingly mimics, in certain important ways, nature's relationship between soil moisture and evaporation. Additional information on the relationship is gleaned from joint analysis of precipitation and streamflow measurements. An experimental framework that utilizes these ideas to guide the development of an improved soil moisture-evaporation relationship is described and demonstrated.
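
    A minimal numerical sketch of the proxy idea, with synthetic data standing in for station observations: regressing seasonal temperature anomalies on precipitation anomalies yields a slope whose sign and magnitude mimic the soil moisture-evaporation coupling that an LSM should reproduce. The synthetic wet-summers-are-cool relationship below is an assumption for demonstration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    years = 30
    # Synthetic seasonal (e.g. JJA) means for one location
    precip = rng.gamma(shape=4.0, scale=50.0, size=years)    # mm
    temp = 28.0 - 0.02 * precip + rng.normal(0, 0.8, years)  # deg C

    p_anom = (precip - precip.mean()) / precip.std()
    t_anom = (temp - temp.mean()) / temp.std()

    # A strongly negative slope suggests moisture-limited evaporation: more
    # rain -> more soil moisture -> more evaporative cooling -> cooler summers.
    slope = np.polyfit(p_anom, t_anom, 1)[0]
    print("dT'/dP' =", round(slope, 2))
    ```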

  11. Teaching and Learning in Preschool: Using Individually Appropriate Practices in Early Childhood Literacy Instruction.

    ERIC Educational Resources Information Center

    Venn, Elizabeth Claire; Jahn, Monica Dacy

    This book presents a preschool framework that integrates literacy activities into content area lessons while embedding instruction within adult-child social interactions and realistic, playful activities tailored to each child's individual needs. Chapter 1 of the book delineates the theory and rationale behind the framework, and outlines essential…

  12. Comparing the Performance of Indoor Localization Systems through the EvAAL Framework.

    PubMed

    Potortì, Francesco; Park, Sangjoon; Jiménez Ruiz, Antonio Ramón; Barsocchi, Paolo; Girolami, Michele; Crivello, Antonino; Lee, So Yeon; Lim, Jae Hyun; Torres-Sospedra, Joaquín; Seco, Fernando; Montoliu, Raul; Mendoza-Silva, Germán Martin; Pérez Rubio, Maria Del Carmen; Losada-Gutiérrez, Cristina; Espinosa, Felipe; Macias-Guarasa, Javier

    2017-10-13

    In recent years, indoor localization systems have been the object of significant research activity and of growing interest for their great expected social impact and their impressive business potential. Application areas include tracking and navigation, activity monitoring, personalized advertising, Active and Assisted Living (AAL), traceability, Internet of Things (IoT) networks, and Homeland Security. In spite of the numerous research advances and the great industrial interest, no canned solutions have yet been defined. The diversity and heterogeneity of applications, scenarios, and sensor and user requirements make it difficult to create uniform solutions. From that diverse reality arises a main problem: the lack of consensus both in the metrics and in the procedures used to measure the performance of the different indoor localization and navigation proposals. This paper introduces the general lines of the EvAAL benchmarking framework, which is aimed at a fair comparison of indoor positioning systems through a challenging competition under complex, realistic conditions. To evaluate the framework capabilities, we show how it was used in the 2016 Indoor Positioning and Indoor Navigation (IPIN) Competition. The 2016 IPIN competition considered three different scenario dimensions, with a variety of use cases: (1) pedestrian versus robotic navigation, (2) smartphones versus custom hardware usage and (3) real-time positioning versus off-line post-processing. A total of four competition tracks were evaluated under the same EvAAL benchmark framework in order to validate its potential to become a standard for evaluating indoor localization solutions. The experience gained during the competition and feedback from track organizers and competitors showed that the EvAAL framework is flexible enough to successfully fit the very different tracks and appears adequate to compare indoor positioning systems.
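
    To give a feel for the kind of metric such a benchmark standardises, here is a toy scoring function in the spirit of the EvAAL competitions: horizontal Euclidean error with a fixed per-point penalty for a wrongly detected floor, summarised by a high percentile over the evaluation points. The penalty value and the aggregation details are assumptions for illustration, not the official competition rules.

    ```python
    import numpy as np

    def evaal_style_score(est_xy, true_xy, est_floor, true_floor,
                          floor_penalty=15.0, quantile=75):
        """Toy EvAAL-style point score: horizontal error in metres plus an
        assumed fixed penalty per wrongly detected floor, summarised by the
        75th percentile over all evaluation points (lower is better)."""
        err = np.linalg.norm(np.asarray(est_xy) - np.asarray(true_xy), axis=1)
        err = err + floor_penalty * (np.asarray(est_floor)
                                     != np.asarray(true_floor))
        return np.percentile(err, quantile)

    est = [(1.0, 2.0), (10.5, 3.2), (4.0, 4.0)]
    truth = [(1.5, 2.0), (9.0, 3.0), (4.0, 5.5)]
    print(evaal_style_score(est, truth, [0, 1, 0], [0, 0, 0]))
    ```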

  13. Comparing the Performance of Indoor Localization Systems through the EvAAL Framework

    PubMed Central

    2017-01-01

    In recent years, indoor localization systems have been the object of significant research activity and of growing interest for their great expected social impact and their impressive business potential. Application areas include tracking and navigation, activity monitoring, personalized advertising, Active and Assisted Living (AAL), traceability, Internet of Things (IoT) networks, and Homeland Security. In spite of the numerous research advances and the great industrial interest, no canned solutions have yet been defined. The diversity and heterogeneity of applications, scenarios, and sensor and user requirements make it difficult to create uniform solutions. From that diverse reality arises a main problem: the lack of consensus both in the metrics and in the procedures used to measure the performance of the different indoor localization and navigation proposals. This paper introduces the general lines of the EvAAL benchmarking framework, which is aimed at a fair comparison of indoor positioning systems through a challenging competition under complex, realistic conditions. To evaluate the framework capabilities, we show how it was used in the 2016 Indoor Positioning and Indoor Navigation (IPIN) Competition. The 2016 IPIN competition considered three different scenario dimensions, with a variety of use cases: (1) pedestrian versus robotic navigation, (2) smartphones versus custom hardware usage and (3) real-time positioning versus off-line post-processing. A total of four competition tracks were evaluated under the same EvAAL benchmark framework in order to validate its potential to become a standard for evaluating indoor localization solutions. The experience gained during the competition and feedback from track organizers and competitors showed that the EvAAL framework is flexible enough to successfully fit the very different tracks and appears adequate to compare indoor positioning systems. PMID:29027948

  14. Implementing health research through academic and clinical partnerships: a realistic evaluation of the Collaborations for Leadership in Applied Health Research and Care (CLAHRC).

    PubMed

    Rycroft-Malone, Jo; Wilkinson, Joyce E; Burton, Christopher R; Andrews, Gavin; Ariss, Steven; Baker, Richard; Dopson, Sue; Graham, Ian; Harvey, Gill; Martin, Graham; McCormack, Brendan G; Staniszewska, Sophie; Thompson, Carl

    2011-07-19

    The English National Health Service has made a major investment in nine partnerships between higher education institutions and local health services called Collaborations for Leadership in Applied Health Research and Care (CLAHRC). They have been funded to increase capacity and capability to produce and implement research through sustained interactions between academics and health services. CLAHRCs provide a natural 'test bed' for exploring questions about research implementation within a partnership model of delivery. This protocol describes an externally funded evaluation that focuses on implementation mechanisms and processes within three CLAHRCs. It seeks to uncover what works, for whom, how, and in what circumstances. This study is a longitudinal three-phase, multi-method realistic evaluation, which deliberately aims to explore the boundaries around knowledge use in context. The evaluation funder wishes to see it conducted for the process of learning, not for judging performance. The study is underpinned by a conceptual framework that combines the Promoting Action on Research Implementation in Health Services and Knowledge to Action frameworks to reflect the complexities of implementation. Three participating CLAHRCs will provide in-depth comparative case studies of research implementation using multiple data collection methods including interviews, observation, documents, and publicly available data to test and refine hypotheses over four rounds of data collection. We will test the wider applicability of emerging findings with a wider community using an interpretative forum. The idea that collaboration between academics and services might lead to more applicable health research that is actually used in practice is theoretically and intuitively appealing; however the evidence for it is limited. Our evaluation is designed to capture the processes and impacts of collaborative approaches for implementing research, and therefore should contribute to the evidence base about an increasingly popular (e.g., Mode two, integrated knowledge transfer, interactive research), but poorly understood approach to knowledge translation. Additionally we hope to develop approaches for evaluating implementation processes and impacts particularly with respect to integrated stakeholder involvement.

  15. Implementing health research through academic and clinical partnerships: a realistic evaluation of the Collaborations for Leadership in Applied Health Research and Care (CLAHRC)

    PubMed Central

    2011-01-01

    Background The English National Health Service has made a major investment in nine partnerships between higher education institutions and local health services called Collaborations for Leadership in Applied Health Research and Care (CLAHRC). They have been funded to increase capacity and capability to produce and implement research through sustained interactions between academics and health services. CLAHRCs provide a natural 'test bed' for exploring questions about research implementation within a partnership model of delivery. This protocol describes an externally funded evaluation that focuses on implementation mechanisms and processes within three CLAHRCs. It seeks to uncover what works, for whom, how, and in what circumstances. Design and methods This study is a longitudinal three-phase, multi-method realistic evaluation, which deliberately aims to explore the boundaries around knowledge use in context. The evaluation funder wishes to see it conducted for the process of learning, not for judging performance. The study is underpinned by a conceptual framework that combines the Promoting Action on Research Implementation in Health Services and Knowledge to Action frameworks to reflect the complexities of implementation. Three participating CLAHRCs will provide in-depth comparative case studies of research implementation using multiple data collection methods including interviews, observation, documents, and publicly available data to test and refine hypotheses over four rounds of data collection. We will test the wider applicability of emerging findings with a wider community using an interpretative forum. Discussion The idea that collaboration between academics and services might lead to more applicable health research that is actually used in practice is theoretically and intuitively appealing; however the evidence for it is limited. Our evaluation is designed to capture the processes and impacts of collaborative approaches for implementing research, and therefore should contribute to the evidence base about an increasingly popular (e.g., Mode two, integrated knowledge transfer, interactive research), but poorly understood approach to knowledge translation. Additionally we hope to develop approaches for evaluating implementation processes and impacts particularly with respect to integrated stakeholder involvement. PMID:21771329

  16. A framework for improving the quality of health information on the world-wide-web and bettering public (e-)health: the MedCERTAIN approach.

    PubMed

    Eysenbach, G; Köhler, C; Yihune, G; Lampe, K; Cross, P; Brickley, D

    2001-01-01

    There has been considerable debate about the variable quality of health information on the world-wide-web and its impact on public health. While central authorities that regulate, control, censor, or centrally approve information, information providers, or websites are neither realistic nor desirable, public health professionals are interested in making systems available that direct patient streams to the best available information sources. National governments and medical societies have also recognized their responsibility to help users to identify "good quality" information sources. But what constitutes good quality, and how can such a system be implemented in a decentralized and democratic manner? This paper presents a model which combines aspects of consumer education, encouragement of best practices among information providers, self-labeling and external evaluations. The model is currently being implemented and evaluated in the MedCERTAIN project, funded by the European Union under the Action Plan for Safer Use of the Internet. The aim is to develop a technical and organisational infrastructure for a pilot system that allows consumers to access metainformation about websites and health information providers, including disclosure information from health providers and opinions of external evaluators. The paper explains the general conceptual framework of the model and presents preliminary experiences, including results from an expert consensus meeting where the framework was discussed.

  17. Driving simulator validation of driver behavior with limited safe vantage points for data collection in work zones.

    PubMed

    Bham, Ghulam H; Leu, Ming C; Vallati, Manoj; Mathur, Durga R

    2014-06-01

    This study is aimed at validating a driving simulator (DS) for the study of driver behavior in work zones. A validation study requires field data collection. For studies conducted in highway work zones, the availability of safe vantage points for data collection at critical locations can be a significant challenge. A validation framework is therefore proposed in this paper, demonstrated using a fixed-base DS, that addresses the issue by using a global positioning system (GPS). The validation of the DS was conducted using objective and subjective evaluations. The objective validation was divided into qualitative and quantitative evaluations. The DS was validated by comparing the results of simulation with the field data, which were collected using a GPS along the highway and video recordings at specific locations in a work zone. The constructed work zone scenario in the DS was subjectively evaluated with 46 participants. The objective evaluation established the absolute and relative validity of the DS. The mean speeds from the DS data showed excellent agreement with the field data. The subjective evaluation indicated realistic driving experience by the participants. The use of GPS showed that continuous data collected along the highway can overcome the challenges of unavailability of safe vantage points, especially at critical locations. Further, a validated DS can be used for examining driver behavior in complex situations by replicating realistic scenarios. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Python scripting in the nengo simulator.

    PubMed

    Stewart, Terrence C; Tripp, Bryan; Eliasmith, Chris

    2009-01-01

    Nengo (http://nengo.ca) is an open-source neural simulator that has been greatly enhanced by the recent addition of a Python script interface. Nengo provides a wide range of features that are useful for physiological simulations, including unique features that facilitate development of population-coding models using the neural engineering framework (NEF). This framework uses information theory, signal processing, and control theory to formalize the development of large-scale neural circuit models. Notably, it can also be used to determine the synaptic weights that underlie observed network dynamics and transformations of represented variables. Nengo provides rich NEF support, and includes customizable models of spike generation, muscle dynamics, synaptic plasticity, and synaptic integration, as well as an intuitive graphical user interface. All aspects of Nengo models are accessible via the Python interface, allowing for programmatic creation of models, inspection and modification of neural parameters, and automation of model evaluation. Since Nengo combines Python and Java, it can also be integrated with any existing Java or 100% Python code libraries. Current work includes connecting neural models in Nengo with existing symbolic cognitive models, creating hybrid systems that combine detailed neural models of specific brain regions with higher-level models of remaining brain areas. Such hybrid models can provide (1) more realistic boundary conditions for the neural components, and (2) more realistic sub-components for the larger cognitive models.
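
    A brief example of the scripting style the paper describes, written against the current nengo Python package (whose API differs from the Java-backed interface discussed in the abstract, so treat it as indicative rather than the paper's exact code): an input signal is represented by one neural population and a nonlinear function of it is decoded into another, with the NEF solving for the connection weights.

    ```python
    import numpy as np
    import nengo

    with nengo.Network() as model:
        stim = nengo.Node(lambda t: np.sin(2 * np.pi * t))    # 1 Hz test input
        a = nengo.Ensemble(n_neurons=100, dimensions=1)       # represents x
        b = nengo.Ensemble(n_neurons=100, dimensions=1)       # represents x**2
        nengo.Connection(stim, a)
        nengo.Connection(a, b, function=lambda x: x ** 2)     # NEF solves weights
        probe = nengo.Probe(b, synapse=0.01)                  # filtered decode

    with nengo.Simulator(model) as sim:                       # programmatic run
        sim.run(1.0)
    print(sim.data[probe][-5:].ravel())                       # decoded estimate
    ```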

  19. Python Scripting in the Nengo Simulator

    PubMed Central

    Stewart, Terrence C.; Tripp, Bryan; Eliasmith, Chris

    2008-01-01

    Nengo (http://nengo.ca) is an open-source neural simulator that has been greatly enhanced by the recent addition of a Python script interface. Nengo provides a wide range of features that are useful for physiological simulations, including unique features that facilitate development of population-coding models using the neural engineering framework (NEF). This framework uses information theory, signal processing, and control theory to formalize the development of large-scale neural circuit models. Notably, it can also be used to determine the synaptic weights that underlie observed network dynamics and transformations of represented variables. Nengo provides rich NEF support, and includes customizable models of spike generation, muscle dynamics, synaptic plasticity, and synaptic integration, as well as an intuitive graphical user interface. All aspects of Nengo models are accessible via the Python interface, allowing for programmatic creation of models, inspection and modification of neural parameters, and automation of model evaluation. Since Nengo combines Python and Java, it can also be integrated with any existing Java or 100% Python code libraries. Current work includes connecting neural models in Nengo with existing symbolic cognitive models, creating hybrid systems that combine detailed neural models of specific brain regions with higher-level models of remaining brain areas. Such hybrid models can provide (1) more realistic boundary conditions for the neural components, and (2) more realistic sub-components for the larger cognitive models. PMID:19352442

  20. Analysing an Audit Cycle: A Critical Realist Account

    ERIC Educational Resources Information Center

    Boughey, Chrissie; McKenna, Sioux

    2017-01-01

    This paper reports on the use of a framework developed from Bhaskar's critical realism and Archer's social realism to analyse teaching- and learning-related data produced as a result of the first cycle of institutional audits in the South African higher education system. The use of the framework allows us to see what this cycle of audits did…

  1. Sensing, Measuring and Modelling the Mechanical Properties of Sandstone

    NASA Astrophysics Data System (ADS)

    Antony, S. J.; Olugbenga, A.; Ozerkan, N. G.

    2018-02-01

    We present a hybrid framework for simulating the strength and dilation characteristics of sandstone. Where possible, the grain-scale properties of sandstone are evaluated experimentally in detail. Also, using photo-stress analysis, we sense the deviator stress (/strain) distribution at the micro-scale and its components along the orthogonal directions on the surface of a V-notch sandstone sample under mechanical loading. Based on this measurement and applying a grain-scale model, the optical anisotropy index K0 is inferred at the grain scale. This correlated well with the grain contact stiffness ratio K evaluated using ultrasound sensors independently. Thereafter, in addition to other experimentally characterised structural and grain-scale properties of sandstone, K is fed as an input into the discrete element modelling of the fracture strength and dilation of the sandstone samples. Physical bulk-scale experiments are also conducted to evaluate the load-displacement relation, dilation and bulk fracture strength characteristics of sandstone samples under compression and shear. A good level of agreement is obtained between the results of the simulations and experiments. The current generic framework could be applied to understand the internal and bulk mechanical properties of such complex opaque and heterogeneous materials more realistically in future.

  2. Improving the accuracy of SO2 column densities and emission rates obtained from upward-looking UV-spectroscopic measurements of volcanic plumes by taking realistic radiative transfer into account

    USGS Publications Warehouse

    Kern, Christoph; Deutschmann, Tim; Werner, Cynthia; Sutton, A. Jeff; Elias, Tamar; Kelly, Peter J.

    2012-01-01

    Sulfur dioxide (SO2) is monitored using ultraviolet (UV) absorption spectroscopy at numerous volcanoes around the world due to its importance as a measure of volcanic activity and a tracer for other gaseous species. Recent studies have shown that failure to take realistic radiative transfer into account during the spectral retrieval of the collected data often leads to large errors in the calculated emission rates. Here, the framework for a new evaluation method which couples a radiative transfer model to the spectral retrieval is described. In it, absorption spectra are simulated, and atmospheric parameters are iteratively updated in the model until a best match to the measurement data is achieved. The evaluation algorithm is applied to two example Differential Optical Absorption Spectroscopy (DOAS) measurements conducted at Kilauea volcano (Hawaii). The resulting emission rates were 20 and 90% higher than those obtained with a conventional DOAS retrieval performed between 305 and 315 nm, respectively, depending on the different SO2 and aerosol loads present in the volcanic plume. The internal consistency of the method was validated by measuring and modeling SO2 absorption features in a separate wavelength region around 375 nm and comparing the results. Although additional information about the measurement geometry and atmospheric conditions is needed in addition to the acquired spectral data, this method for the first time provides a means of taking realistic three-dimensional radiative transfer into account when analyzing UV-spectral absorption measurements of volcanic SO2 plumes.
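
    The conventional retrieval that this work improves upon is easy to sketch: DOAS fits an absorption cross section plus a low-order polynomial (absorbing the smooth Rayleigh/Mie background) to the measured optical depth by least squares. Everything below is synthetic, including the invented cross section; the paper's contribution is precisely to replace the fixed-geometry assumption implicit in such fits with iterated radiative-transfer modelling.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    wl = np.linspace(305, 315, 200)                 # wavelength grid [nm]
    # Fake, structured SO2 cross section [cm^2] for demonstration only
    xsec = 1e-19 * (1 + np.sin(2 * np.pi * (wl - 305) / 1.2))
    true_scd = 5e17                                 # slant column [cm^-2]

    # Measured optical depth: absorber + smooth background + noise
    tau = true_scd * xsec + 0.01 * (wl - 310) + 0.3 + rng.normal(0, 5e-3, wl.size)

    # Classic DOAS: fit cross section plus a low-order polynomial; the cross
    # section column is rescaled to keep the least-squares problem well
    # conditioned.
    scale = xsec.max()
    A = np.column_stack([xsec / scale, np.ones_like(wl),
                         wl - 310, (wl - 310) ** 2])
    coef, *_ = np.linalg.lstsq(A, tau, rcond=None)
    print("retrieved SCD: %.2e cm^-2" % (coef[0] / scale))
    ```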

  3. Patient-Specific Simulation of Cardiac Blood Flow From High-Resolution Computed Tomography.

    PubMed

    Lantz, Jonas; Henriksson, Lilian; Persson, Anders; Karlsson, Matts; Ebbers, Tino

    2016-12-01

    Cardiac hemodynamics can be computed from medical imaging data, and results could potentially aid in cardiac diagnosis and treatment optimization. However, simulations are often based on simplified geometries, ignoring features such as papillary muscles and trabeculae due to their complex shape, limitations in image acquisitions, and challenges in computational modeling. This severely hampers the use of computational fluid dynamics in clinical practice. The overall aim of this study was to develop a novel numerical framework that incorporated these geometrical features. The model included the left atrium, ventricle, ascending aorta, and heart valves. The framework used image registration to obtain patient-specific wall motion, automatic remeshing to handle topological changes due to the complex trabeculae motion, and a fast interpolation routine to obtain intermediate meshes during the simulations. Velocity fields and residence time were evaluated, and they indicated that papillary muscles and trabeculae strongly interacted with the blood, which could not be observed in a simplified model. The framework resulted in a model with outstanding geometrical detail, demonstrating the feasibility as well as the importance of a framework that is capable of simulating blood flow in physiologically realistic hearts.

  4. Safety risk assessment using analytic hierarchy process (AHP) during planning and budgeting of construction projects.

    PubMed

    Aminbakhsh, Saman; Gunduz, Murat; Sonmez, Rifat

    2013-09-01

    The inherent and unique risks on construction projects quite often present key challenges to contractors. Health and safety risks are among the most significant risks in construction projects since the construction industry is characterized by a relatively high injury and death rate compared to other industries. In construction project management, safety risk assessment is an important step toward identifying potential hazards and evaluating the risks associated with the hazards. Adequate prioritization of safety risks during risk assessment is crucial for planning, budgeting, and management of safety related risks. In this paper, a safety risk assessment framework is presented based on the theory of cost of safety (COS) model and the analytic hierarchy process (AHP). The main contribution of the proposed framework is that it presents a robust method for prioritization of safety risks in construction projects to create a rational budget and to set realistic goals without compromising safety. The framework provides a decision tool for the decision makers to determine the adequate accident/injury prevention investments while considering the funding limits. The proposed safety risk framework is illustrated using a real-life construction project and the advantages and limitations of the framework are discussed. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
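
    The AHP step at the core of such a framework reduces to an eigenvector computation on a pairwise comparison matrix. The sketch below prioritises three hypothetical construction-safety hazards on Saaty's 1-9 scale and checks judgement consistency; the matrix entries are invented for illustration and are not taken from the paper's case project.

    ```python
    import numpy as np

    # Pairwise comparisons of three hypothetical hazards (Saaty 1-9 scale):
    # falls vs struck-by vs electrocution. A[i, j] = importance of i over j.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                      # priority vector (e.g. budget weights)

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)
    cr = ci / 0.58                    # Saaty's random index RI = 0.58 for n = 3
    print("priorities:", np.round(w, 3), " consistency ratio: %.3f" % cr)
    ```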

  5. DyKOSMap: A framework for mapping adaptation between biomedical knowledge organization systems.

    PubMed

    Dos Reis, Julio Cesar; Pruski, Cédric; Da Silveira, Marcos; Reynaud-Delaître, Chantal

    2015-06-01

    Knowledge Organization Systems (KOS) and their associated mappings play a central role in several decision support systems. However, by virtue of knowledge evolution, KOS entities are modified over time, impacting mappings and potentially rendering them invalid. This requires semi-automatic methods to keep such semantic correspondences up-to-date as the KOSs evolve. We define a complete and original framework based on formal heuristics that drives the adaptation of KOS mappings. Our approach takes into account the definition of established mappings, the evolution of KOS and the possible changes that can be applied to mappings. This study experimentally evaluates the proposed heuristics and the entire framework on realistic case studies borrowed from the biomedical domain, using official mappings between several biomedical KOSs. We demonstrate the overall performance of the approach over biomedical datasets of different characteristics and sizes. Our findings reveal the effectiveness in terms of precision, recall and F-measure of the suggested heuristics and methods defining the framework to adapt mappings affected by KOS evolution. The obtained results contribute to and improve the quality of mappings over time. The proposed framework can adapt mappings largely automatically, thus facilitating the maintenance task. The implemented algorithms and tools support and minimize the work of users in charge of KOS mapping maintenance. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Performance Evaluation of a SLA Negotiation Control Protocol for Grid Networks

    NASA Astrophysics Data System (ADS)

    Cergol, Igor; Mirchandani, Vinod; Verchere, Dominique

    A framework for an autonomous negotiation control protocol for service delivery is crucial to enable support for the heterogeneous service level agreements (SLAs) that will exist in distributed environments. We first give a gist of our augmented service negotiation protocol, which supports distinct service elements. The augmentations also encompass the related composition of services and negotiation with several service providers simultaneously. The incorporated augmentations will enable consolidation of service negotiation operations for telecom networks, which are evolving towards Grid networks. Furthermore, our autonomous negotiation protocol is based on a distributed multi-agent framework to create an open market for Grid services. Second, we concisely present key simulation results of our work in progress. The results exhibit the usefulness of our negotiation protocol for realistic scenarios that involve different background traffic loading, message sizes and traffic flow asymmetry between background and negotiation traffic.

  7. A framework for risk assessment and decision-making strategies in dangerous good transportation.

    PubMed

    Fabiano, B; Currò, F; Palazzi, E; Pastorino, R

    2002-07-01

    The risk from dangerous goods transport by road, and strategies for selecting road loads/routes, are addressed in this paper by developing an original site-oriented framework of general applicability at the local level. A realistic evaluation of the frequency must take into account, on one side, inherent factors (e.g. tunnels, rail bridges, bend radii, slope, characteristics of the neighborhood, etc.) and, on the other side, factors correlated with the traffic conditions (e.g. dangerous goods trucks, etc.). Field data were collected on the selected highway by systematic investigation, providing input data for a database reporting tendencies and intrinsic parameter/site-oriented statistics. The developed technique was applied to a pilot area, considering both individual risk and societal risk and making reference to flammable and explosive scenarios. In this way, a risk assessment sensitive to route features and the population exposed is proposed, so that the overall uncertainties in risk analysis can be lowered.
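
    A deliberately simplified numerical sketch of the site-oriented idea: per-segment accident frequency is modulated by inherent factors (here only a tunnel multiplier) and combined with exposure to compare candidate routes. Every number below, including the event-tree probabilities, is a placeholder assumption rather than a value from the paper.

    ```python
    # Toy route comparison: societal risk ~ sum over segments of
    # accident frequency x conditional event probabilities x exposed population.
    routes = {
        "highway": [  # (km, base accident rate /veh-km, tunnel?, people nearby)
            (12.0, 3e-7, False, 40),
            (2.5,  3e-7, True,  10),
        ],
        "urban": [
            (7.0, 8e-7, False, 900),
        ],
    }

    P_RELEASE, P_IGNITION, LETHALITY = 0.1, 0.2, 0.05  # assumed event tree

    def societal_risk(segments, trucks_per_year=2000):
        risk = 0.0
        for km, rate, tunnel, exposed in segments:
            freq = rate * km * trucks_per_year * (3.0 if tunnel else 1.0)
            risk += freq * P_RELEASE * P_IGNITION * LETHALITY * exposed
        return risk   # expected fatalities per year

    for name, segs in routes.items():
        print(name, "%.2e fatalities/yr" % societal_risk(segs))
    ```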

  8. Fuzzy Traffic Control with Vehicle-to-Everything Communication.

    PubMed

    Salman, Muntaser A; Ozdemir, Suat; Celebi, Fatih V

    2018-01-27

    Traffic signal control (TSC) with vehicle-to-everything (V2X) communication can be a very efficient solution to the traffic congestion problem. The ratio of vehicles equipped with V2X communication capability to the total number of vehicles in the traffic (called the penetration rate, PR) is still low, so V2X-based TSC systems need to be supported by some other mechanisms. PR is the major factor that affects the quality of the TSC process, along with the evaluation interval. The quality of the TSC in each direction is a function of the overall TSC quality of an intersection. Hence, quality evaluation of each direction should follow the evaluation of the overall intersection. Computational intelligence, more specifically a swarm algorithm, has recently been used in this field in a European Framework Programme FP7 supported project called COLOMBO. In this paper, using the COLOMBO framework, further investigations have been carried out and two new methodologies using simple and fuzzy logic have been proposed. To evaluate the performance of our proposed methods, a comparison with COLOMBO's approach has been realized. The results reveal that the TSC problem can be solved as a logical problem rather than an optimization problem. The performance of the proposed approaches is good enough for them to be suggested for future work under realistic scenarios, even under low PR.
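
    To indicate what treating TSC 'as a logical problem' can look like, here is a minimal Mamdani-style fuzzy sketch mapping V2X-derived queue and arrival estimates to a green-phase extension. The membership breakpoints, rule set and output values are invented for illustration and are not the methodologies evaluated in the paper.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        if x < a or x > c:
            return 0.0
        if x == b:
            return 1.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def green_extension(queue, arrivals):
        """Toy controller: extend the green phase when the queue on the served
        approach is long and vehicles keep arriving. Both inputs would come
        from V2X messages of equipped vehicles in a real deployment."""
        rules = [  # (rule activation, green extension in seconds)
            (min(tri(queue, 0, 0, 8),   tri(arrivals, 0, 0, 4)),   0.0),
            (min(tri(queue, 4, 10, 16), tri(arrivals, 2, 5, 8)),   5.0),
            (min(tri(queue, 12, 25, 25), tri(arrivals, 6, 12, 12)), 12.0),
        ]
        num = sum(w * out for w, out in rules)
        den = sum(w for w, _ in rules)
        return num / den if den else 0.0   # weighted-average defuzzification

    print("extend green by", round(green_extension(queue=14, arrivals=7), 1), "s")
    ```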

  9. Fuzzy Traffic Control with Vehicle-to-Everything Communication

    PubMed Central

    Ozdemir, Suat; Celebi, Fatih V.

    2018-01-01

    Traffic signal control (TSC) with vehicle-to-everything (V2X) communication can be a very efficient solution to the traffic congestion problem. The ratio of vehicles equipped with V2X communication capability to the total number of vehicles in the traffic (the penetration rate, PR) is still low, so V2X-based TSC systems need to be supported by other mechanisms. PR is the major factor affecting the quality of the TSC process, along with the evaluation interval. The quality of TSC in each direction is a function of the overall TSC quality of the intersection; hence, quality evaluation of each direction should follow the evaluation of the overall intersection. Computational intelligence, more specifically a swarm algorithm, has recently been used in this field in a European Framework Programme (FP7) supported project called COLOMBO. In this paper, using the COLOMBO framework, further investigations have been carried out and two new methodologies, using simple and fuzzy logic, are proposed. To evaluate the performance of the proposed methods, a comparison with COLOMBO's approach was performed. The results reveal that the TSC problem can be solved as a logical problem rather than an optimization problem. The performance of the proposed approaches is good enough for them to be suggested for future work under realistic scenarios, even at low PR. PMID:29382053

  10. A qualitative evaluation of the Scottish Staff and Associate Specialist Development Programme.

    PubMed

    Cleland, Jennifer; Burr, Jacqueline; Johnston, Peter

    2016-05-01

    The continued professional development of staff and associate specialist doctors in the UK was ill served prior to the introduction of the new staff and associate specialist doctors' contract in 2008. The aim of this study was to independently evaluate NHS Education for Scotland's approach to improving professional development for staff and associate specialist doctors, the staff and associate specialist Professional Development Fund. Semi-structured telephone interviews were held with key stakeholders, framed by a realist approach to evaluate what works, for whom, how and under what circumstances. An inductive, data-driven thematic analysis was carried out and the realist framework was then applied to the data. We interviewed 22 key stakeholders (staff and associate specialist doctors, staff and associate specialist educational advisors, programme architects and clinical directors) between the end of February and May 2014. The resultant data indicated five broad themes: organisational barriers to continued professional development for staff and associate specialist doctors, the purpose of funding, gains from funding, the need for better communication about the staff and associate specialist Programme Development Fund, and the interplay between individual and systems factors. The staff and associate specialist Programme Development Fund has changed the opportunities available to staff and associate specialist doctors in Scotland and, in that sense, has changed the context for this group - or at least for those who have realised the opportunities. © The Author(s) 2016.

  11. Monitoring and evaluation of disaster response efforts undertaken by local health departments: a rapid realist review.

    PubMed

    Gossip, Kate; Gouda, Hebe; Lee, Yong Yi; Firth, Sonja; Bermejo, Raoul; Zeck, Willibald; Jimenez Soto, Eliana

    2017-06-29

    Local health departments are often at the forefront of a disaster response, attending to the immediate trauma inflicted by the disaster and also to the long-term health consequences. As the frequency and severity of disasters are projected to rise, monitoring and evaluation (M&E) efforts are critical to help local health departments consolidate past experiences and improve future response efforts. Local health departments often conduct M&E work post-disaster; however, many of these efforts fail to improve response procedures. We undertook a rapid realist review (RRR) to examine why M&E efforts undertaken by local health departments do not always result in improved disaster response efforts. We aimed to complement existing frameworks by focusing on the most basic and pragmatic steps of an M&E cycle targeted towards continuous system improvements. For these purposes, we developed a theoretical framework that draws on the quality improvement literature to 'frame' the steps in the M&E cycle. This framework encompassed an M&E cycle involving three stages (document and assess, disseminate, and implement) that must be sequentially completed to learn from past experiences and improve future disaster response efforts. We used this framework to guide our examination of the literature and to identify any context-mechanism-outcome (CMO) configurations which describe how M&E may be constrained or enabled at each stage of the M&E cycle. This RRR found a number of explanatory CMO configurations that provide valuable insights into some of the considerations that should be made when using M&E to improve future disaster response efforts. Firstly, to support the accurate documentation and assessment of a disaster response, local health departments should consider how they can: establish a culture of learning within health departments; use embedded training methods; or facilitate external partnerships. Secondly, to enhance the widespread dissemination of lessons learned and facilitate inter-agency learning, evaluation reports should use standardised formats and terminology. Lastly, to increase commitment to improvement processes, local health department leaders should possess positive leadership attributes and encourage shared decision making. This study is among the first to conduct a synthesis of the CMO configurations which facilitate or hinder M&E efforts aimed at improving future disaster responses. It makes a significant contribution to the disaster literature and provides an evidence base that can be used to provide pragmatic guidance for improving M&E efforts of local health departments. PROSPERO 2015: CRD42015023526.
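
    Findings like these are typically recorded as context-mechanism-outcome triples. A minimal, purely illustrative way to represent such configurations in code (the structure is an assumption; the strings paraphrase the abstract) is:

    ```python
    # Purely illustrative representation of context-mechanism-outcome
    # (CMO) configurations of the kind synthesised in the review.
    from dataclasses import dataclass

    @dataclass
    class CMO:
        context: str
        mechanism: str
        outcome: str

    cmos = [
        CMO("a culture of learning within the health department",
            "staff feel safe reporting failures",
            "accurate documentation and assessment of the response"),
        CMO("standardised report formats and terminology",
            "lessons are intelligible across agencies",
            "widespread dissemination and inter-agency learning"),
    ]
    for c in cmos:
        print(f"In {c.context}, {c.mechanism}, leading to {c.outcome}.")
    ```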

  12. Study protocol: realist evaluation of effectiveness and sustainability of a community health workers programme in improving maternal and child health in Nigeria.

    PubMed

    Mirzoev, Tolib; Etiaba, Enyi; Ebenso, Bassey; Uzochukwu, Benjamin; Manzano, Ana; Onwujekwe, Obinna; Huss, Reinhard; Ezumah, Nkoli; Hicks, Joseph P; Newell, James; Ensor, Timothy

    2016-06-07

    Achievement of improved maternal and child health (MCH) outcomes continues to be an issue of international priority, particularly for sub-Saharan African countries such as Nigeria. Evidence suggests that the use of Community Health Workers (CHWs) can be effective in broadening access to, and coverage of, health services and improving MCH outcomes in such countries. In this paper, we report the methodology for a 5-year study that aims to evaluate the context, processes, outcomes and longer-term sustainability of a Nigerian CHW scheme. Evaluation of complex interventions requires a comprehensive understanding of intervention context, mechanisms and outcomes; the multidisciplinary, mixed-methods realist approach will facilitate such evaluation. A favourable policy environment within which the study is conducted will ensure the successful uptake of results into policy and practice. A realist evaluation provides the overall methodological framework for this multidisciplinary and mixed-methods research, which will be undertaken in Anambra state. The study will draw upon health economics, social sciences and statistics. The study comprises three steps: (1) initial theory development; (2) theory validation; and (3) theory refinement and development of lessons learned. Specific methods for data collection will include in-depth interviews and focus group discussions with purposefully identified key stakeholders (managers, service providers and service users), document reviews, analyses of quantitative data from the CHW programme and the health information system, and a small-scale survey. The impact of the programme on key output and outcome indicators will be assessed through an interrupted time-series (ITS) analysis of monthly quantitative data from the health information system and programme reports. Ethics approvals for this study were obtained from the University of Leeds and the University of Nigeria. This study will provide a timely and important contribution to health systems strengthening, specifically within Anambra state in southeast Nigeria, but also more widely across Nigeria. This paper should be of interest to researchers interested in adapting and applying robust methodologies for assessing complex health system interventions. The paper will also be useful to policymakers and practitioners interested in commissioning and engaging in such complex evaluations to inform policies and practices.
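
    The protocol names an interrupted time-series analysis of monthly data. A minimal sketch of the standard segmented-regression form of ITS, run on simulated counts rather than study data, is:

    ```python
    # Interrupted time-series (segmented regression) sketch on simulated
    # monthly data; not the study's data or its final model.
    import numpy as np

    rng = np.random.default_rng(0)
    months = np.arange(60)
    intervention = (months >= 30).astype(float)          # programme starts at month 30
    time_since = np.where(months >= 30, months - 30, 0)  # post-intervention trend term

    # Simulated outcome: baseline trend plus a level jump and slope change.
    y = (100 + 0.5 * months + 8 * intervention + 0.7 * time_since
         + rng.normal(0, 3, months.size))

    X = np.column_stack([np.ones_like(months), months, intervention, time_since])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("intercept, trend, level change, slope change:", beta.round(2))
    ```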

  13. Stroke localization and classification using microwave tomography with k-means clustering and support vector machine.

    PubMed

    Guo, Lei; Abbosh, Amin

    2018-05-01

    For stroke patients to have any chance of survival, the stroke type must be classified so that medication can be given within a few hours of the onset of symptoms. In this paper, a microwave-based stroke localization and classification framework is proposed, based on microwave tomography, k-means clustering, and a support vector machine (SVM). The dielectric profile of the brain is first calculated using the Born iterative method, and the amplitude of the dielectric profile is then taken as the input to k-means clustering. The resulting clusters are selected as the feature vector for constructing and testing the SVM. A database of MRI-derived realistic head phantoms at different signal-to-noise ratios is used in the classification procedure. The performance of the proposed framework is evaluated using the receiver operating characteristic (ROC) curve. The results based on a two-dimensional framework show that 88% classification accuracy, with a sensitivity of 91% and a specificity of 87%, can be achieved. Bioelectromagnetics. 39:312-324, 2018. © 2018 Wiley Periodicals, Inc.
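
    The classification stage (k-means features feeding an SVM) can be sketched with scikit-learn. The synthetic "profiles" and the feature construction below are simplifications for illustration, not the authors' Born-iterative reconstruction or their exact feature design.

    ```python
    # Simplified k-means + SVM pipeline on synthetic amplitude profiles;
    # illustrative only, not the paper's reconstruction or features.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    # Two synthetic classes of 64-sample 'dielectric amplitude' profiles.
    X_raw = np.vstack([rng.normal(0.0, 1.0, (100, 64)),
                       rng.normal(0.8, 1.0, (100, 64))])
    y = np.repeat([0, 1], 100)

    def kmeans_features(x, k=3):
        """Sorted cluster centres of a profile's amplitudes as features."""
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(x.reshape(-1, 1))
        return np.sort(km.cluster_centers_.ravel())

    X = np.array([kmeans_features(x) for x in X_raw])
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
    ```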

  14. Extending the psycho-historical framework to understand artistic production.

    PubMed

    Kozbelt, Aaron; Ostrofsky, Justin

    2013-04-01

    We discuss how the psycho-historical framework can be profitably applied to artistic production, facilitating a synthesis of perception-based and knowledge-based perspectives on realistic observational drawing. We note that artists' technical knowledge itself constitutes a major component of an artwork's historical context, and that links between artistic practice and psychological theory may yet yield conclusions in line with universalist perspectives.

  15. Evaluating a Modular Decision Support Application for Colorectal Cancer Screening

    PubMed Central

    Diiulio, Julie B.; Borders, Morgan R.; Sushereba, Christen E.; Saleem, Jason J.; Haverkamp, Donald; Imperiale, Thomas F.

    2017-01-01

    Background: There is a need for health information technology evaluation that goes beyond randomized controlled trials to include consideration of usability, cognition, feedback from representative users, and impact on efficiency, data quality, and clinical workflow. This article presents an evaluation illustrating one approach to this need using the Decision-Centered Design framework. Objective: To evaluate, through a Decision-Centered Design framework, the ability of the Screening and Surveillance App to support primary care clinicians in tracking and managing colorectal cancer testing. Methods: We leveraged two evaluation formats, online and in-person, to obtain feedback from a range of primary care clinicians and to obtain comparative data. Both the online and in-person evaluations used mock patient data to simulate challenging patient scenarios. Primary care clinicians responded to a series of colorectal cancer-related questions about each patient and made recommendations for screening. We collected data on performance, perceived workload, and usability. Key elements of Decision-Centered Design include evaluation in the context of realistic, challenging scenarios and measures designed to explore impact on cognitive performance. Results: Comparison of means revealed increases in accuracy, efficiency, and usability and decreases in perceived mental effort and workload when using the Screening and Surveillance App. Conclusion: The results speak to the benefits of using the Decision-Centered Design approach in the analysis, design, and evaluation of Health Information Technology. Furthermore, the Screening and Surveillance App shows promise for filling decision support gaps in current electronic health records. PMID:28197619

  16. Realism and resources: Towards more explanatory economic evaluation

    PubMed Central

    Anderson, Rob; Hardwick, Rebecca

    2016-01-01

    To be successfully and sustainably adopted, policy-makers, service managers and practitioners want public programmes to be affordable and cost-effective, as well as effective. While the realist evaluation question is often summarised as what works for whom, under what circumstances, we believe the approach can be as salient to answering questions about resource use, costs and cost-effectiveness – the traditional domain of economic evaluation methods. This paper first describes the key similarities and differences between economic evaluation and realist evaluation. It summarises what health economists see as the challenges of evaluating complex interventions, and their suggested solutions. We then use examples of programme theory from a recent realist review of shared care for chronic conditions to illustrate two ways in which realist evaluations might better capture the resource requirements and resource consequences of programmes, and thereby produce explanations of how they are linked to outcomes (i.e. explanations of cost-effectiveness). PMID:27478402

  17. Analysis of System-Wide Investment in the National Airspace System: A Portfolio Analytical Framework and an Example

    NASA Technical Reports Server (NTRS)

    Bhadra, Dipasis; Morser, Frederick R.

    2006-01-01

    In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework for undertaking projects that may address some of the noted deficiencies. By drawing upon well-developed theories from corporate finance, an analytical framework is offered that can be used for choosing the FAA's investments, taking into account risk, expected returns and inherent dependencies across NAS programs. The framework can be expanded to take in multiple assets and realistic parameter values in drawing an efficient risk-return frontier for the FAA's entire set of investment programs.
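
    The corporate-finance machinery alluded to is classical mean-variance portfolio theory. A toy two-program risk-return frontier, under invented return, volatility and correlation assumptions, can be traced as follows:

    ```python
    # Two-asset mean-variance frontier sketch; the expected returns,
    # volatilities and correlation are invented, not FAA figures.
    import numpy as np

    mu = np.array([0.04, 0.09])      # expected returns of two programs
    sd = np.array([0.05, 0.15])      # volatilities of those returns
    rho = 0.2                        # correlation between the programs

    for w in np.linspace(0, 1, 11):  # weight on program 1
        ret = w * mu[0] + (1 - w) * mu[1]
        var = ((w * sd[0])**2 + ((1 - w) * sd[1])**2
               + 2 * w * (1 - w) * rho * sd[0] * sd[1])
        print(f"w={w:.1f}  E[r]={ret:.3f}  sd={np.sqrt(var):.3f}")
    ```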

  18. Towards photorealistic and immersive virtual-reality environments for simulated prosthetic vision: integrating recent breakthroughs in consumer hardware and software.

    PubMed

    Zapf, Marc P; Matteucci, Paul B; Lovell, Nigel H; Zheng, Steven; Suaning, Gregg J

    2014-01-01

    Simulated prosthetic vision (SPV) in normally sighted subjects is an established way of investigating the prospective efficacy of visual prosthesis designs in visually guided tasks such as mobility. To perform meaningful SPV mobility studies in computer-based environments, a credible representation of both the virtual scene to navigate and the experienced artificial vision has to be established. It is therefore prudent to make optimal use of existing hardware and software solutions when establishing a testing framework. The authors aimed at improving the realism and immersion of SPV by integrating state-of-the-art yet low-cost consumer technology. The feasibility of body motion tracking to control movement in photo-realistic virtual environments was evaluated in a pilot study. Five subjects were recruited and performed an obstacle avoidance and wayfinding task using either keyboard and mouse, gamepad or Kinect motion tracking. Walking speed and collisions were analyzed as basic measures for task performance. Kinect motion tracking resulted in lower performance as compared to classical input methods, yet results were more uniform across vision conditions. The chosen framework was successfully applied in a basic virtual task and is suited to realistically simulate real-world scenes under SPV in mobility research. Classical input peripherals remain a feasible and effective way of controlling the virtual movement. Motion tracking, despite its limitations and early state of implementation, is intuitive and can eliminate between-subject differences due to familiarity to established input methods.

  19. A realistic multimodal modeling approach for the evaluation of distributed source analysis: application to sLORETA.

    PubMed

    Cosandier-Rimélé, D; Ramantani, G; Zentner, J; Schulze-Bonhage, A; Dümpelmann, M

    2017-10-01

    Electrical source localization (ESL) deriving from scalp EEG and, in recent years, from intracranial EEG (iEEG), is an established method in epilepsy surgery workup. We aimed to validate the distributed ESL derived from scalp EEG and iEEG, particularly regarding the spatial extent of the source, using a realistic epileptic spike activity simulator. ESL was applied to the averaged scalp EEG and iEEG spikes of two patients with drug-resistant structural epilepsy. The ESL results for both patients were used to outline the location and extent of epileptic cortical patches, which served as the basis for designing a spatiotemporal source model. EEG signals for both modalities were then generated for different anatomic locations and spatial extents. ESL was subsequently performed on simulated signals with sLORETA, a commonly used distributed algorithm. ESL accuracy was quantitatively assessed for iEEG and scalp EEG. The source volume was overestimated by sLORETA at both EEG scales, with the error increasing with source size, particularly for iEEG. For larger sources, ESL accuracy drastically decreased, and reconstruction volumes shifted to the center of the head for iEEG, while remaining stable for scalp EEG. Overall, the mislocalization of the reconstructed source was more pronounced for iEEG. We present a novel multiscale framework for the evaluation of distributed ESL, based on realistic multiscale EEG simulations. Our findings support that reconstruction results for scalp EEG are often more accurate than for iEEG, owing to the superior 3D coverage of the head. Particularly the iEEG-derived reconstruction results for larger, widespread generators should be treated with caution.

  20. A realistic multimodal modeling approach for the evaluation of distributed source analysis: application to sLORETA

    NASA Astrophysics Data System (ADS)

    Cosandier-Rimélé, D.; Ramantani, G.; Zentner, J.; Schulze-Bonhage, A.; Dümpelmann, M.

    2017-10-01

    Objective. Electrical source localization (ESL) deriving from scalp EEG and, in recent years, from intracranial EEG (iEEG), is an established method in epilepsy surgery workup. We aimed to validate the distributed ESL derived from scalp EEG and iEEG, particularly regarding the spatial extent of the source, using a realistic epileptic spike activity simulator. Approach. ESL was applied to the averaged scalp EEG and iEEG spikes of two patients with drug-resistant structural epilepsy. The ESL results for both patients were used to outline the location and extent of epileptic cortical patches, which served as the basis for designing a spatiotemporal source model. EEG signals for both modalities were then generated for different anatomic locations and spatial extents. ESL was subsequently performed on simulated signals with sLORETA, a commonly used distributed algorithm. ESL accuracy was quantitatively assessed for iEEG and scalp EEG. Main results. The source volume was overestimated by sLORETA at both EEG scales, with the error increasing with source size, particularly for iEEG. For larger sources, ESL accuracy drastically decreased, and reconstruction volumes shifted to the center of the head for iEEG, while remaining stable for scalp EEG. Overall, the mislocalization of the reconstructed source was more pronounced for iEEG. Significance. We present a novel multiscale framework for the evaluation of distributed ESL, based on realistic multiscale EEG simulations. Our findings support that reconstruction results for scalp EEG are often more accurate than for iEEG, owing to the superior 3D coverage of the head. Particularly the iEEG-derived reconstruction results for larger, widespread generators should be treated with caution.

  1. A realist evaluation of a physical activity participation intervention for children and youth with disabilities: what works, for whom, in what circumstances, and how?

    PubMed

    Willis, C E; Reid, S; Elliott, C; Rosenberg, M; Nyquist, A; Jahnsen, R; Girdler, S

    2018-03-15

    The need to identify strategies that facilitate involvement in physical activity for children and youth with disabilities is recognised as an urgent priority. This study aimed to describe the association between the context, mechanisms and outcome(s) of a participation-focused physical activity intervention to understand what works, in what conditions, and how. This study was designed as a realist evaluation. Participant recruitment occurred through purposive and theoretical sampling of children and parents participating in the Local Environment Model intervention at Beitostolen Healthsports Centre in Norway. Ethnographic methods comprising participant observation, interviews, and focus groups were employed over 15 weeks in the field. Data analysis was completed using the context-mechanism-outcome framework of realist evaluation. Context-mechanism-outcome connections were generated empirically from the data to create a model indicating how the program activated mechanisms within the program context to enable participation in physical activity. Thirty-one children with a range of disabilities (mean age 12 y 6 m, SD 2 y 2 m; 18 males) and their parents (n = 44; 26 mothers and 18 fathers) participated in the study. Following data synthesis, a refined program theory comprising four context themes, five mechanisms, and six outcomes was identified. The mechanisms (choice, fun, friends, specialised health professionals, and time) were activated in a context that was safe, social, learning-based and family-centred, to elicit outcomes across all levels of the International Classification of Functioning, Disability and Health. The interaction of mechanisms and context as a whole facilitated meaningful outcomes for children and youth with disabilities, and their parents. Whilst optimising participation in physical activity is a primary outcome of the Local Environment Model, the refined program theory suggests the participation-focused approach may act as a catalyst to promote a range of outcomes. Findings from this study may inform future interventions attempting to enable participation in physical activity for children and youth with disabilities.

  2. Institutional misfit and environmental change: A systems approach to address ocean acidification.

    PubMed

    Ekstrom, Julia A; Crona, Beatrice I

    2017-01-15

    Emerging environmental threats often lack sufficient governance to address the full extent of the problem. An example is ocean acidification, which is a growing concern in fishing and aquaculture economies worldwide but has remained a footnote in environmental policy at all governance levels. However, existing legal jurisdictions do account for some aspects of the system relating to ocean acidification, and these may be leveraged to support adapting to and mitigating ocean acidification. We refine and apply a methodological framework that helps objectively evaluate governance from a social-ecological systems perspective. We assess how well a set of extant US institutions fits with the social-ecological interactions pertinent to ocean acidification. The assessment points to measured legal gaps, for which we evaluate the government authorities most appropriate to help fill these gaps. The analysis is conducted on United States federal statutes and regulations. Results show a quantitative improvement of institutional fit over time (2006 to 2013), but a substantial number of measured legal gaps persist, especially around acknowledging local sources of acidification and adaptation strategies to deal with or avoid impacts. We demonstrate the utility of this framework for evaluating the governance surrounding any emerging environmental threat, as a first step towards guiding the development of jurisdictionally realistic solutions. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia B.; Adler, Robert; Hone, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur

    2010-01-01

    A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that landslide forecasting may be more feasible at a regional scale. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship, and the results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements to the algorithm framework, but also highlights several remaining challenges for algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited non-landslide event data for more comprehensive evaluation. Additional factors that may improve algorithm performance accuracy include incorporating additional triggering factors such as tectonic activity, anthropogenic impacts and soil moisture into the algorithm calculation. Despite these limitations, the methodology presented in this regional evaluation is both straightforward to calculate and easy to interpret, making results transferable between regions and allowing findings to be placed within an inter-comparison framework. The regional algorithm scenario represents an important step in advancing regional and global-scale landslide hazard assessment and forecasting.
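
    The regional triggering relationship mentioned is a rainfall intensity-duration (I-D) threshold of the familiar power-law form I = a * D^-b. A hedged sketch of such a check, with placeholder coefficients rather than the study's fitted values, is:

    ```python
    # Illustrative rainfall intensity-duration trigger check; the
    # coefficients a and b are placeholders, not fitted regional values.
    def exceeds_id_threshold(intensity_mm_hr, duration_hr, a=10.0, b=0.6):
        """True if rainfall intensity exceeds the I = a * D**-b threshold."""
        return intensity_mm_hr > a * duration_hr ** (-b)

    for d, i in [(1, 12.0), (6, 4.0), (24, 1.0)]:
        print(f"D={d:>2} h, I={i:>4} mm/h -> trigger: {exceeds_id_threshold(i, d)}")
    ```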

  4. Realistic nurse-led policy implementation, optimization and evaluation: novel methodological exemplar.

    PubMed

    Noyes, Jane; Lewis, Mary; Bennett, Virginia; Widdas, David; Brombley, Karen

    2014-01-01

    To report the first large-scale realistic nurse-led implementation, optimization and evaluation of a complex children's continuing-care policy. Health policies are increasingly complex, involve multiple Government departments and frequently fail to translate into better patient outcomes. Realist methods have not yet been adapted for policy implementation. Research methodology - Evaluation using theory-based realist methods for policy implementation. An expert group developed the policy and supporting tools. Implementation and evaluation design integrated diffusion of innovation theory with multiple case study and adapted realist principles. Practitioners in 12 English sites worked with Consultant Nurse implementers to manipulate the programme theory and logic of new decision-support tools and care pathway to optimize local implementation. Methods included key-stakeholder interviews, developing practical diffusion of innovation processes using key-opinion leaders and active facilitation strategies and a mini-community of practice. New and existing processes and outcomes were compared for 137 children during 2007-2008. Realist principles were successfully adapted to a shorter policy implementation and evaluation time frame. Important new implementation success factors included facilitated implementation that enabled 'real-time' manipulation of programme logic and local context to best-fit evolving theories of what worked; using local experiential opinion to change supporting tools to more realistically align with local context and what worked; and having sufficient existing local infrastructure to support implementation. Ten mechanisms explained implementation success and differences in outcomes between new and existing processes. Realistic policy implementation methods have advantages over top-down approaches, especially where clinical expertise is low and unlikely to diffuse innovations 'naturally' without facilitated implementation and local optimization. © 2013 John Wiley & Sons Ltd.

  5. Evaluating the implementation of a quality improvement process in General Practice using a realist evaluation framework.

    PubMed

    Moule, Pam; Clompus, Susan; Fieldhouse, Jon; Ellis-Jones, Julie; Barker, Jacqueline

    2018-05-25

    Underuse of anticoagulants in atrial fibrillation is known to increase the risk of stroke and is an international problem. The National Institute for Health and Care Excellence guidance CG180 seeks to reduce atrial fibrillation-related strokes through prescription of non-vitamin K antagonist oral anticoagulants. A quality improvement programme was established by the West of England Academic Health Science Network (West of England AHSN) to implement this guidance in General Practice. A realist evaluation identified whether the quality improvement programme worked, determining how and in what circumstances. Six General Practices in one region became the case study sites. Quality improvement team, doctor, and pharmacist meetings within each of the General Practices were recorded at three stages: initial planning, review, and final. Additionally, 15 interviews conducted with the practice leads explored experiences of the quality improvement process. Observation and interview data were analysed and compared against the initial programme theory. The quality improvement resources available were used variably, with the training being valued by all. The initial programme theories were refined. In particular, local workload pressures and individual General Practitioner experiences and pre-conceived ideas were acknowledged. Where key motivators were in place, such as prior experience, the programme achieved optimal outcomes and secured a lasting quality improvement legacy. The employment of a quality improvement programme can deliver practice change and improvement legacy outcomes when particular mechanisms are employed and in contexts where there is a commitment to improve service. © 2018 John Wiley & Sons, Ltd.

  6. The difference between energy consumption and energy cost: Modelling energy tariff structures for water resource recovery facilities.

    PubMed

    Aymerich, I; Rieger, L; Sobhani, R; Rosso, D; Corominas, Ll

    2015-09-15

    The objective of this paper is to demonstrate the importance of incorporating more realistic energy cost models (based on current energy tariff structures) into the existing process models of water resource recovery facilities (WRRFs) when evaluating technologies and cost-saving control strategies. In this paper, we first introduce a systematic framework for modelling energy usage at WRRFs and a generalized structure for describing energy tariffs, including the most common billing terms. Second, this paper introduces a detailed energy cost model based on a Spanish energy tariff structure, coupled with a WRRF process model, to evaluate several control strategies and provide insights into the selection of the contracted power structure. The results for a 1-year evaluation of a 115,000 population-equivalent WRRF showed monthly cost differences ranging from 7 to 30% when comparing the detailed energy cost model to an average energy price. The evaluation of different aeration control strategies also showed that using average energy prices and neglecting energy tariff structures may lead to biased conclusions when selecting operating strategies or comparing technologies or equipment. The proposed framework demonstrated that for cost minimization, control strategies should be paired with a specific optimal contracted power. Hence, the design of operational and control strategies must take into account the local energy tariff. Copyright © 2015 Elsevier Ltd. All rights reserved.
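
    The core point, that a time-of-use tariff and a flat average price can yield materially different costs for the same load profile, can be illustrated with a toy day of aeration energy use. All prices and loads below are invented.

    ```python
    # Toy comparison of a time-of-use tariff against a flat average
    # price for one day's load; prices and loads are invented.
    load_kwh = [120 if 8 <= h < 20 else 80 for h in range(24)]  # hourly load
    peak, offpeak = 0.18, 0.08                                  # EUR/kWh

    tou_cost = sum(kwh * (peak if 8 <= h < 20 else offpeak)
                   for h, kwh in enumerate(load_kwh))
    flat_cost = sum(load_kwh) * (peak + offpeak) / 2

    print(f"time-of-use: {tou_cost:.2f} EUR, flat average: {flat_cost:.2f} EUR")
    ```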

  7. Development of IR imaging system simulator

    NASA Astrophysics Data System (ADS)

    Xiang, Xinglang; He, Guojing; Dong, Weike; Dong, Lu

    2017-02-01

    To overcome the disadvantages of traditional semi-physical simulation and injection simulation equipment in the performance evaluation of the infrared imaging system (IRIS), a low-cost and reconfigurable IRIS simulator, which can simulate the realistic physical process of infrared imaging, is proposed to test and evaluate the performance of the IRIS. According to the theoretical simulation framework and the theoretical models of the IRIS, the architecture of the IRIS simulator is constructed. The 3D scenes are generated and the infrared atmospheric transmission effects are simulated in real time on the computer using OGRE technology. The physical effects of the IRIS are classified as the signal response characteristic, the modulation transfer characteristic and the noise characteristic, and they are simulated in real time on a single-board signal processing platform, based on an FPGA core processor, using a high-speed parallel computation method.
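
    The three simulated effects can be glossed as a simple image chain. The sketch below uses numpy/scipy stand-ins with arbitrary gain, blur and noise values; it says nothing about the OGRE scene generation or the FPGA implementation.

    ```python
    # Toy IR image chain: signal response, an MTF proxy (Gaussian blur)
    # and additive temporal noise; all values are arbitrary.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(0)
    scene = np.zeros((64, 64))
    scene[24:40, 24:40] = 300.0                    # warm square target

    signal = 0.9 * scene + 5.0                     # detector signal response
    blurred = gaussian_filter(signal, sigma=1.5)   # modulation transfer proxy
    frame = blurred + rng.normal(0, 2.0, blurred.shape)  # noise characteristic

    print(f"frame mean {frame.mean():.1f}, std {frame.std():.1f}")
    ```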

  8. Toxicological Evaluation of Realistic Emission Source Aerosols (TERESA): Introduction and overview

    PubMed Central

    Godleski, John J.; Rohr, Annette C.; Kang, Choong M.; Diaz, Edgar A.; Ruiz, Pablo A.; Koutrakis, Petros

    2013-01-01

    Determining the health impacts of sources and components of fine particulate matter (PM2.5) is an important scientific goal. PM2.5 is a complex mixture of inorganic and organic constituents that are likely to differ in their potential to cause adverse health outcomes. The Toxicological Evaluation of Realistic Emissions of Source Aerosols (TERESA) study focused on two PM sources, coal-fired power plants and mobile sources, and sought to investigate the toxicological effects of exposure to emissions from these sources. The set of papers published here documents the power plant experiments. TERESA attempted to delineate the health effects of primary particles, secondary (aged) particles, and mixtures of these with common atmospheric constituents. TERESA involved withdrawal of emissions from the stacks of three coal-fired power plants in the United States. The emissions were aged and atmospherically transformed in a mobile laboratory simulating downwind power plant plume processing. Toxicological evaluations were carried out in laboratory rats exposed to different emission scenarios with extensive exposure characterization. The approach employed in TERESA was ambitious and innovative. Technical challenges included the development of stack sampling technology that prevented condensation of water vapor from the power plant exhaust during sampling and transfer, while minimizing losses of primary particles; development and optimization of a photochemical chamber to provide an aged aerosol for animal exposures; development and evaluation of a denuder system to remove excess gaseous components; and development of a mobile toxicology laboratory. This paper provides an overview of the conceptual framework, design, and methods employed in the study. PMID:21639692

  9. An empirical generative framework for computational modeling of language acquisition.

    PubMed

    Waterfall, Heidi R; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-06-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of generative grammars from raw CHILDES data and give an account of the generative performance of the acquired grammars. Next, we summarize findings from recent longitudinal and experimental work that suggests how certain statistically prominent structural properties of child-directed speech may facilitate language acquisition. We then present a series of new analyses of CHILDES data indicating that the desired properties are indeed present in realistic child-directed speech corpora. Finally, we suggest how our computational results, behavioral findings, and corpus-based insights can be integrated into a next-generation model aimed at meeting the four requirements of our modeling framework.

  10. European Healthy Cities evaluation: conceptual framework and methodology.

    PubMed

    de Leeuw, Evelyne; Green, Geoff; Dyakova, Mariana; Spanswick, Lucy; Palmer, Nicola

    2015-06-01

    This paper presents the methodology, programme logic and conceptual framework that drove the evaluation of the Fifth Phase of the WHO European Healthy Cities Network; by the end of the phase (2009-14), 99 cities had been designated, joining progressively throughout its life. The paper establishes the values, systems and aspirations that these cities sign up to, as foundations for the selection of methodology. We assert that a realist synthesis methodology, driven by a wide range of qualitative and quantitative methods, is the most appropriate perspective from which to address the wide geopolitical, demographic, population and health diversities of these cities. The paper outlines the rationale for a structured multiple case study approach, the deployment of a comprehensive questionnaire, data mining through existing databases including Eurostat, and analysis of the management information generation tools used throughout the period. Response rates were considered extremely high for this type of research. Non-response analyses are described, which show that the data are representative of cities across the spectrum of diversity. This paper provides a foundation for the further analyses of specific areas of interest presented in this supplement. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. FPGA-Based Efficient Hardware/Software Co-Design for Industrial Systems with Consideration of Output Selection

    NASA Astrophysics Data System (ADS)

    Deliparaschos, Kyriakos M.; Michail, Konstantinos; Zolotas, Argyrios C.; Tzafestas, Spyros G.

    2016-05-01

    This work presents a field programmable gate array (FPGA)-based embedded software platform coupled with a software-based plant, forming a hardware-in-the-loop (HIL) setup that is used to validate a systematic sensor selection framework. The framework combines multi-objective optimization, linear-quadratic-Gaussian (LQG)-type control, and the nonlinear model of a maglev suspension. A robustness analysis of the closed loop follows (prior to implementation), supporting the appropriateness of the solution under parametric variation. The analysis also shows that quantization is robust under different controller gains. While the LQG controller is implemented on an FPGA, the physical process is realized in a high-level system modeling environment. FPGA technology enables rapid evaluation of the algorithms and test designs under realistic scenarios, avoiding the heavy time penalty associated with hardware description language (HDL) simulators. The HIL technique yields a significant speed-up in execution time compared to its software-based counterpart model.

  12. Key lessons for designing health literacy professional development courses.

    PubMed

    Naccarella, Lucio; Murphy, Bernice

    2018-02-01

    Health literacy courses for health professionals have emerged in response to health professionals' perceived lack of understanding of health literacy issues, and their failure to routinely adopt health literacy practices. Since 2013 in Victoria, Australia, the Centre for Culture, Ethnicity and Health has delivered an annual health literacy demonstration training course that it developed. Course development and delivery partners included HealthWest Partnership and cohealth. The courses are designed to develop the health literacy knowledge, skills and organisational capacity of the health and community services sector in the western metropolitan region of Melbourne. This study presents key learnings from evaluation data from three health literacy courses using Wenger's professional educational learning design framework. The framework has three educational learning architecture components (engagement, imagination and alignment) and four educational learning architecture dimensions (participation, emergent, local/global, identification). Participatory realist evaluation approaches and qualitative methods were used. The evaluations revealed that the health literacy courses are developing leadership in health literacy, building partnerships among course participants, developing health literacy workforce knowledge and skills, developing ways to use and apply health literacy resources and are serving as a catalyst for building organisational infrastructure. Although the courses were not explicitly developed or implemented using Wenger's educational learning design pedagogic features, the course structure (i.e. facilitation role of course coordinators, providing safe learning environments, encouraging small group work amongst participants, requiring participants to conduct mini-projects and sponsor organisation buy-in) provided opportunities for engagement, imagination and alignment. Wenger's educational learning design framework can inform the design of future key pedagogic features of health literacy courses.

  13. Protocol—the RAMESES II study: developing guidance and reporting standards for realist evaluation

    PubMed Central

    Greenhalgh, Trisha; Wong, Geoff; Jagosh, Justin; Greenhalgh, Joanne; Manzano, Ana; Westhorp, Gill; Pawson, Ray

    2015-01-01

    Introduction: Realist evaluation is an increasingly popular methodology in health services research. For realist evaluations (RE) this project aims to: develop quality and reporting standards and training materials; build capacity for undertaking and critically evaluating them; and produce resources and training materials for lay participants and those seeking to involve them. Methods: To achieve our aims, we will: (1) establish management and governance infrastructure; (2) recruit an interdisciplinary Delphi panel of 35 participants with diverse relevant experience of RE; (3) summarise current literature and expert opinion on best practice in RE; (4) run an online Delphi panel to generate and refine items for quality and reporting standards; (5) capture 'real world' experiences and challenges of RE—for example, by providing ongoing support to realist evaluations, hosting the RAMESES JISCmail list on realist research, and feeding problems and insights from these into the deliberations of the Delphi panel; (6) produce quality and reporting standards; (7) collate examples of the learning and training needs of researchers, students, reviewers and lay members in relation to RE; (8) develop, deliver and evaluate training materials for RE and deliver training workshops; (9) develop and evaluate information and resources for patients and other lay participants in RE (e.g., draft template information sheets and model consent forms); and (10) disseminate training materials and other resources. Planned outputs: (1) quality and reporting standards and training materials for RE; (2) methodological support for RE; (3) increased capacity to support and evaluate RE; (4) accessible, plain-English resources for patients and the public participating in RE. Discussion: Realist evaluation is a relatively new approach to evaluation and its overall place in the field is not yet fully established. As with all primary research approaches, guidance on quality assurance and uniform reporting is an important step towards improving quality and consistency. PMID:26238395

  14. R-EACTR: A Framework for Designing Realistic Cyber Warfare Exercises

    DTIC Science & Technology

    2017-09-11

    The R-EACTR framework builds realism into each aspect of a cyber warfare exercise design (environment, adversary, communications, tactics, and roles); the report (CMU/SEI-2017-TR-005) also includes a case study of one exercise, Cyber Forge 11, where the framework was successfully employed. Keywords: network emulation, logging, reporting, computer network defense service provider (CNDSP), intelligence, reach-back.

  15. Operationalizing the Learning Health Care System in an Integrated Delivery System

    PubMed Central

    Psek, Wayne A.; Stametz, Rebecca A.; Bailey-Davis, Lisa D.; Davis, Daniel; Darer, Jonathan; Faucett, William A.; Henninger, Debra L.; Sellers, Dorothy C.; Gerrity, Gloria

    2015-01-01

    Introduction: The Learning Health Care System (LHCS) model seeks to utilize sophisticated technologies and competencies to integrate clinical operations, research and patient participation in order to continuously generate knowledge, improve care, and deliver value. Transitioning from concept to practical application of an LHCS presents many challenges but can yield opportunities for continuous improvement. There is limited literature and practical experience available in operationalizing the LHCS in the context of an integrated health system. At Geisinger Health System (GHS) a multi-stakeholder group is undertaking to enhance organizational learning and develop a plan for operationalizing the LHCS system-wide. We present a framework for operationalizing continuous learning across an integrated delivery system and lessons learned through the ongoing planning process. Framework: The framework focuses attention on nine key LHCS operational components: Data and Analytics; People and Partnerships; Patient and Family Engagement; Ethics and Oversight; Evaluation and Methodology; Funding; Organization; Prioritization; and Deliverables. Definitions, key elements and examples for each are presented. The framework is purposefully broad for application across different organizational contexts. Conclusion: A realistic assessment of the culture, resources and capabilities of the organization related to learning is critical to defining the scope of operationalization. Engaging patients in clinical care and discovery, including quality improvement and comparative effectiveness research, requires a defensible ethical framework that undergirds a system of strong but flexible oversight. Leadership support is imperative for advancement of the LHCS model. Findings from our ongoing work within the proposed framework may inform other organizations considering a transition to an LHCS. PMID:25992388

  16. Molecular simulation of the adsorption of methane in Engelhard titanosilicate frameworks.

    PubMed

    Pillai, Renjith S; Gomes, José R B; Jorge, Miguel

    2014-07-01

    Molecular simulations were carried out to elucidate the influence of structural heterogeneity and of the presence of extra-framework cations and water molecules on the adsorption of methane in Engelhard titanosilicates, ETS-10 and ETS-4. The simulations employed three different modeling approaches, (i) with fixed cations and water at their single crystal positions, (ii) with fixed cations and water at their optimized positions, and (iii) with mobile extra-framework cations and water molecules. Simulations employing the final two approaches provided a more realistic description of adsorption in these materials, and showed that at least some cations and water molecules are displaced from the crystallographic positions obtained from single crystal data. Upon methane adsorption in the case of ETS-10, the cations move to the large rings, while in the case of ETS-4, the water molecules and cations migrate to more available space in the larger 12-membered ring channels for better accommodation of the methane molecules. For ETS-4, we also considered adsorption in all possible pure polymorph structures and then combined these to provide an estimate of adsorption in a real ETS-4 sample. By comparing simulated adsorption isotherms to experimental data, we were able to show that both the mobility of extra-framework species and the structural heterogeneity should be taken into account for realistic predictions of adsorption in titanosilicate materials.

  17. Performance evaluation of DNA copy number segmentation methods.

    PubMed

    Pierre-Jean, Morgane; Rigaill, Guillem; Neuvial, Pierre

    2015-07-01

    A number of bioinformatic or biostatistical methods are available for analyzing DNA copy number profiles measured from microarray or sequencing technologies. In the absence of rich enough gold standard data sets, the performance of these methods is generally assessed using unrealistic simulation studies, or based on small real data analyses. To make an objective and reproducible performance assessment, we have designed and implemented a framework to generate realistic DNA copy number profiles of cancer samples with known truth. These profiles are generated by resampling publicly available SNP microarray data from genomic regions with known copy-number state. The original data have been extracted from dilutions series of tumor cell lines with matched blood samples at several concentrations. Therefore, the signal-to-noise ratio of the generated profiles can be controlled through the (known) percentage of tumor cells in the sample. This article describes this framework and its application to a comparison study between methods for segmenting DNA copy number profiles from SNP microarrays. This study indicates that no single method is uniformly better than all others. It also helps identifying pros and cons of the compared methods as a function of biologically informative parameters, such as the fraction of tumor cells in the sample and the proportion of heterozygous markers. This comparison study may be reproduced using the open source and cross-platform R package jointseg, which implements the proposed data generation and evaluation framework: http://r-forge.r-project.org/R/?group_id=1562. © The Author 2014. Published by Oxford University Press.
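
    The evaluation idea, generating profiles with known truth and scoring a method against it, can be sketched in a few lines. The piecewise-constant simulation below is a stand-in for the article's SNP-resampling scheme, and the one-change-point scan is not the jointseg package.

    ```python
    # Simulate a copy-number profile with one known breakpoint, segment
    # it, and score the recovered breakpoint; illustrative only.
    import numpy as np

    rng = np.random.default_rng(42)
    bkp_true = 120
    y = np.concatenate([rng.normal(2.0, 0.4, bkp_true),
                        rng.normal(3.2, 0.4, 300 - bkp_true)])

    def best_split(y, margin=5):
        """Single change-point estimate minimising within-segment variance."""
        n = len(y)
        costs = [k * np.var(y[:k]) + (n - k) * np.var(y[k:])
                 for k in range(margin, n - margin)]
        return margin + int(np.argmin(costs))

    bkp_hat = best_split(y)
    print(f"true {bkp_true}, estimated {bkp_hat}, error {abs(bkp_hat - bkp_true)}")
    ```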

  18. Framework for Computer Assisted Instruction Courseware: A Case Study.

    ERIC Educational Resources Information Center

    Betlach, Judith A.

    1987-01-01

    Systematically investigates, defines, and organizes variables related to production of internally designed and implemented computer assisted instruction (CAI) courseware: special needs of users; costs; identification and definition of realistic training needs; CAI definition and design methodology; hardware and software requirements; and general…

  19. Inter-Individual Variability in High-Throughput Risk Prioritization of Environmental Chemicals (Sot)

    EPA Science Inventory

    We incorporate realistic human variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which have...

  20. Lessons learned in using realist evaluation to assess maternal and newborn health programming in rural Bangladesh.

    PubMed

    Adams, Alayne; Sedalia, Saroj; McNab, Shanon; Sarker, Malabika

    2016-03-01

    Realist evaluation furnishes valuable insight to public health practitioners and policy makers about how and why interventions work or don't work. Moving beyond binary measures of success or failure, it provides a systematic approach to understanding what goes on in the 'Black Box' and how implementation decisions in real life contexts can affect intervention effectiveness. This paper reflects on an experience in applying the tenets of realist evaluation to identify optimal implementation strategies for scale-up of Maternal and Newborn Health (MNH) programmes in rural Bangladesh. Supported by UNICEF, the three MNH programmes under consideration employed different implementation models to deliver similar services and meet similar MNH goals. Programme targets included adoption of recommended antenatal, post-natal and essential newborn care practices; health systems strengthening through improved referral, accountability and administrative systems, and increased community knowledge. Drawing on focused examples from this research, seven steps for operationalizing the realist evaluation approach are offered, while emphasizing the need to iterate and innovate in terms of methods and analysis strategies. The paper concludes by reflecting on lessons learned in applying realist evaluation, and the unique insights it yields regarding implementation strategies for successful MNH programming. © The Author 2015. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D.; Bowman, Doug A.; Kopper, Regis

    Virtual reality training systems are commonly used in a variety of domains, and it is important to understand how the realism of a training simulation influences training effectiveness. The paper presents a framework for evaluating the effects of virtual reality fidelity based on an analysis of a simulation's display, interaction, and scenario components. Following this framework, we conducted a controlled experiment to test the effects of fidelity on training effectiveness for a visual scanning task. The experiment varied the levels of field of view and visual realism during a training phase and then evaluated scanning performance with the simulator's highest level of fidelity. To assess scanning performance, we measured target detection and adherence to a prescribed strategy. The results show that both field of view and visual realism significantly affected target detection during training; higher field of view led to better performance and higher visual realism worsened performance. Additionally, the level of visual realism during training significantly affected learning of the prescribed visual scanning strategy, providing evidence that high visual realism was important for learning the technique. The results also demonstrate that task performance during training was not always a sufficient measure of mastery of an instructed technique. That is, if learning a prescribed strategy or skill is the goal of a training exercise, performance in a simulation may not be an appropriate indicator of effectiveness outside of training; evaluation in a more realistic setting may be necessary.

  2. How to Measure Costs and Benefits of eHealth Interventions: An Overview of Methods and Frameworks.

    PubMed

    Bergmo, Trine Strand

    2015-11-09

    Information on the costs and benefits of eHealth interventions is needed, not only to document value for money and to support decision making in the field, but also to form the basis for developing business models and to facilitate payment systems to support large-scale services. In the absence of solid evidence of its effects, key decision makers may doubt the effectiveness, which, in turn, limits investment in, and the long-term integration of, eHealth services. However, it is not realistic to conduct economic evaluations of all eHealth applications and services in all situations, so we need to be able to generalize from those we do conduct. This implies that we have to select the most appropriate methodology and data collection strategy in order to increase the transferability across evaluations. This paper aims to contribute to the understanding of how to apply economic evaluation methodology in the eHealth field. It provides a brief overview of basic health economics principles and frameworks and discusses some methodological issues and challenges in conducting cost-effectiveness analysis of eHealth interventions. Issues regarding the identification, measurement, and valuation of costs and benefits are outlined. Furthermore, this work describes the established techniques of combining costs and benefits, presents the decision rules for identifying the preferred option, and outlines approaches to data collection strategies. Issues related to transferability and complexity are also discussed.
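
    The standard technique for combining costs and benefits is the incremental cost-effectiveness ratio (ICER), compared against a willingness-to-pay threshold. A textbook sketch with invented numbers:

    ```python
    # Textbook ICER decision rule; the costs, effects and threshold are
    # invented for illustration, not drawn from the paper.
    def icer(cost_new, effect_new, cost_old, effect_old):
        """Extra cost per extra unit of effect (e.g., per QALY gained)."""
        return (cost_new - cost_old) / (effect_new - effect_old)

    ratio = icer(cost_new=5200.0, effect_new=6.1,
                 cost_old=4000.0, effect_old=5.8)
    threshold = 20000.0  # hypothetical willingness-to-pay per QALY
    print(f"ICER = {ratio:.0f} per QALY; adopt: {ratio <= threshold}")
    ```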

  3. Evaluation of gravimetric techniques to estimate the microvascular filtration coefficient

    PubMed Central

    Dongaonkar, R. M.; Laine, G. A.; Stewart, R. H.

    2011-01-01

    Microvascular permeability to water is characterized by the microvascular filtration coefficient (Kf). Conventional gravimetric techniques to estimate Kf rely on data obtained from either transient or steady-state increases in organ weight in response to increases in microvascular pressure. Both techniques result in considerably different estimates and neither accounts for interstitial fluid storage and lymphatic return. We therefore developed a theoretical framework to evaluate Kf estimation techniques by 1) comparing conventional techniques to a novel technique that includes effects of interstitial fluid storage and lymphatic return, 2) evaluating the ability of conventional techniques to reproduce Kf from simulated gravimetric data generated by a realistic interstitial fluid balance model, 3) analyzing new data collected from rat intestine, and 4) analyzing previously reported data. These approaches revealed that the steady-state gravimetric technique yields estimates that are not directly related to Kf and are in some cases directly proportional to interstitial compliance. However, the transient gravimetric technique yields accurate estimates in some organs, because the typical experimental duration minimizes the effects of interstitial fluid storage and lymphatic return. Furthermore, our analytical framework reveals that the supposed requirement of tying off all draining lymphatic vessels for the transient technique is unnecessary. Finally, our numerical simulations indicate that our comprehensive technique accurately reproduces the value of Kf in all organs, is not confounded by interstitial storage and lymphatic return, and provides corroboration of the estimate from the transient technique. PMID:21346245
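
    To make the transient technique concrete, a minimal sketch (hypothetical numbers) estimates Kf from the initial slope of an organ-weight record after a step increase in microvascular pressure, the window in which interstitial storage and lymphatic return have not yet distorted the signal:

        import numpy as np

        # Hypothetical organ-weight transient after a step pressure increase.
        t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])                  # time, min
        weight = np.array([10.00, 10.06, 10.11, 10.15, 10.18])   # organ weight, g
        delta_p = 10.0  # imposed step in microvascular pressure, mmHg

        # Transient technique: Kf from the initial rate of weight gain.
        initial_slope = (weight[1] - weight[0]) / (t[1] - t[0])  # g/min
        kf = initial_slope / delta_p  # g/min/mmHg (often normalized per 100 g tissue)
        print(f"Kf ~ {kf:.4f} g/min/mmHg")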

  4. Initial Demonstration of the Real-Time Safety Monitoring Framework for the National Airspace System Using Flight Data

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Daigle, Matthew; Goebel, Kai; Spirkovska, Lilly; Sankararaman, Shankar; Ossenfort, John; Kulkarni, Chetan; McDermott, William; Poll, Scott

    2016-01-01

    As new operational paradigms and additional aircraft are being introduced into the National Airspace System (NAS), maintaining safety in such a rapidly growing environment becomes more challenging. It is therefore desirable to have an automated framework to provide an overview of the current safety of the airspace at different levels of granularity, as well as an understanding of how the state of safety will evolve into the future given the anticipated flight plans, weather forecast, predicted health of assets in the airspace, and so on. Towards this end, as part of our earlier work, we formulated the Real-Time Safety Monitoring (RTSM) framework for monitoring and predicting the state of safety and for predicting unsafe events. In our previous work, the RTSM framework was demonstrated in simulation on three different constructed scenarios. In this paper, we further develop the framework and demonstrate it on real flight data from multiple data sources. Specifically, the flight data is obtained through the Shadow Mode Assessment using Realistic Technologies for the National Airspace System (SMART-NAS) Testbed that serves as a central point of collection, integration, and access of information from these different data sources. By testing and evaluating using real-world scenarios, we may accelerate the acceptance of the RTSM framework towards deployment. In this paper we demonstrate the framework's capability not only to estimate the state of safety in the NAS, but also to predict the time and location of unsafe events such as a loss of separation between two aircraft, or an aircraft encountering convective weather. The experimental results highlight the capability of the approach, and the kind of information that can be provided to operators to improve their situational awareness in the context of safety.
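
    As a toy illustration of one prediction such a framework makes, the sketch below computes the time of a predicted loss of separation by straight-line extrapolation of two aircraft states. This is a generic closest-point-of-approach calculation, not the RTSM implementation:

        import numpy as np

        def time_of_loss_of_separation(p1, v1, p2, v2, threshold=5.0):
            """Earliest time (h) at which predicted distance drops below threshold (NM),
            or None if the paths never come that close. Positions in NM, speeds in kt."""
            dp = np.asarray(p1, float) - np.asarray(p2, float)
            dv = np.asarray(v1, float) - np.asarray(v2, float)
            a = dv @ dv
            if a == 0.0:
                return 0.0 if np.linalg.norm(dp) < threshold else None
            # Solve |dp + t*dv|^2 = threshold^2 for the smallest nonnegative t.
            b = 2.0 * (dp @ dv)
            c = dp @ dp - threshold**2
            disc = b * b - 4.0 * a * c
            if disc < 0.0:
                return None
            t = (-b - np.sqrt(disc)) / (2.0 * a)
            return t if t >= 0.0 else (0.0 if c < 0.0 else None)

        # Two aircraft 40 NM apart, converging head-on at 240 kt each.
        t_los = time_of_loss_of_separation([0, 0], [240, 0], [40, 0], [-240, 0])
        print(f"Predicted loss of separation in {t_los * 60:.1f} minutes")  # 4.4 min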

  5. Coalescent: an open-source and scalable framework for exact calculations in coalescent theory

    PubMed Central

    2012-01-01

    Background Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics. There is no scalable GUI-based user application either. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies and computing the exact probability of data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement and dynamically attach a number of output processors. The user application defines jobs in a plug-in like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts and running jobs can be cancelled where applicable. Conclusions Coalescent theory plays an increasingly important role in analysing molecular population genetic data. Models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach. PMID:23033878
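
    As an example of the kind of exact computation such a framework supports, the sketch below evaluates the Ewens sampling formula, the classical closed form for the probability of an allele configuration under the infinite-alleles model:

        from math import factorial

        def ewens_probability(config, theta):
            """Exact probability of an allele configuration under the infinite-alleles
            model. config[j] = number of allele types observed exactly j+1 times;
            theta = scaled mutation rate."""
            n = sum((j + 1) * a for j, a in enumerate(config))
            rising = 1.0                      # theta * (theta+1) * ... * (theta+n-1)
            for i in range(n):
                rising *= theta + i
            prob = factorial(n) / rising
            for j, a in enumerate(config, start=1):
                prob *= theta**a / (j**a * factorial(a))
            return prob

        # Sample of n = 4 genes: two alleles seen once, one allele seen twice.
        print(ewens_probability([2, 1, 0, 0], theta=1.0))  # 0.25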

  6. ARCH: Adaptive recurrent-convolutional hybrid networks for long-term action recognition

    PubMed Central

    Xin, Miao; Zhang, Hong; Wang, Helong; Sun, Mingui; Yuan, Ding

    2017-01-01

    Recognition of human actions from digital video is a challenging task due to complex interfering factors in uncontrolled realistic environments. In this paper, we propose a learning framework using static, dynamic and sequential mixed features to solve three fundamental problems: spatial domain variation, temporal domain polytrope, and intra- and inter-class diversities. Utilizing a cognitive-based data reduction method and a hybrid “network upon networks” architecture, we extract human action representations which are robust against spatial and temporal interferences and adaptive to variations in both action speed and duration. We evaluated our method on UCF101 and three other challenging datasets. Our results demonstrated the superior performance of the proposed algorithm in human action recognition. PMID:29290647

  7. Realist theory construction for a mixed method multilevel study of neighbourhood context and postnatal depression.

    PubMed

    Eastwood, John G; Kemp, Lynn A; Jalaludin, Bin B

    2016-01-01

    We have recently described a protocol for a study that aims to build a theory of neighbourhood context and postnatal depression. That protocol proposed a critical realist Explanatory Theory Building Method comprising: (1) an emergent phase, (2) a construction phase, and (3) a confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design was described. The protocol also described in detail the Theory Construction Phase, which is presented here. The Theory Construction Phase includes: (1) defining stratified levels; (2) analytic resolution; (3) abductive reasoning; (4) comparative analysis (triangulation); (5) retroduction; (6) postulate and proposition development; (7) comparison and assessment of theories; and (8) conceptual frameworks and model development. The stratified levels of analysis in this study were predominantly social and psychological. The abductive analysis used the theoretical frames of: Stress Process; Social Isolation; Social Exclusion; Social Services; Social Capital; Acculturation Theory; and global-economic-level mechanisms. Realist propositions are presented for each analysis of triangulated data. Inference to best explanation is used to assess and compare theories. A conceptual framework of maternal depression, stress and context is presented that includes examples of mechanisms at psychological, social, cultural and global-economic levels. Stress was identified as a necessary mechanism that has the tendency to cause several outcomes including depression, anxiety, and health-harming behaviours. The conceptual framework subsequently included conditional mechanisms identified through the retroduction, including the stressors of isolation and expectations and the buffers of social support and trust. The meta-theory of critical realism is used here to generate and construct social epidemiological theory using stratified ontology and both abductive and retroductive analysis. The findings will be applied to the development of a middle range theory and subsequent programme theory for local perinatal child and family interventions.

  8. A service-oriented distributed semantic mediator: integrating multiscale biomedical information.

    PubMed

    Mora, Oscar; Engelbrecht, Gerhard; Bisbal, Jesus

    2012-11-01

    Biomedical research continuously generates large amounts of heterogeneous and multimodal data spread over multiple data sources. These data, if appropriately shared and exploited, could dramatically improve the research practice itself, and ultimately the quality of health care delivered. This paper presents DISMED (DIstributed Semantic MEDiator), an open source semantic mediator that provides a unified view of a federated environment of multiscale biomedical data sources. DISMED is a Web-based software application to query and retrieve information distributed over a set of registered data sources, using semantic technologies. It also offers a user-friendly interface specifically designed to simplify the usage of these technologies by non-expert users. Although the architecture of the software mediator is generic and domain independent, in the context of this paper, DISMED has been evaluated for managing biomedical environments and facilitating research with respect to the handling of scientific data distributed in multiple heterogeneous data sources. As part of this contribution, a quantitative evaluation framework has been developed. It consists of a benchmarking scenario and the definition of five realistic use-cases. This framework, created entirely with public datasets, has been used to compare the performance of DISMED against other available mediators. It is also available to the scientific community in order to evaluate progress in the domain of semantic mediation, in a systematic and comparable manner. The results show an average improvement in the execution time by DISMED of 55% compared to the second best alternative in four out of the five use-cases of the experimental evaluation.

  9. Unified Bayesian Estimator of EEG Reference at Infinity: rREST (Regularized Reference Electrode Standardization Technique).

    PubMed

    Hu, Shiang; Yao, Dezhong; Valdes-Sosa, Pedro A

    2018-01-01

    The choice of reference for the electroencephalogram (EEG) is a long-lasting unsolved issue resulting in inconsistent usages and endless debates. Currently, both the average reference (AR) and the reference electrode standardization technique (REST) are two primary, apparently irreconcilable contenders. We propose a theoretical framework to resolve this reference issue by formulating both (a) estimation of potentials at infinity, and (b) determination of the reference, as a unified Bayesian linear inverse problem, which can be solved by maximum a posteriori estimation. We find that AR and REST are very particular cases of this unified framework: AR results from a biophysically non-informative prior, while REST utilizes the prior based on the EEG generative model. To allow for simultaneous denoising and reference estimation, we develop the regularized versions of AR and REST, named rAR and rREST, respectively. Both depend on a regularization parameter that is the noise-to-signal variance ratio. Traditional and new estimators are evaluated with this framework, by both simulations and analysis of real resting EEGs. Toward this end, we leverage the MRI and EEG data from 89 subjects who participated in the Cuban Human Brain Mapping Project. Generated artificial EEGs, with a known ground truth, show that relative error in estimating the EEG potentials at infinity is lowest for rREST. It also reveals that realistic volume conductor models improve the performances of REST and rREST. Importantly, for practical applications, it is shown that an average lead field gives results comparable to the individual lead field. Finally, it is shown that the selection of the regularization parameter with Generalized Cross-Validation (GCV) is close to the "oracle" choice based on the ground truth. When evaluated with the real 89 resting state EEGs, rREST consistently yields the lowest GCV. This study provides a novel perspective to the EEG reference problem by means of a unified inverse solution framework. It may allow additional principled theoretical formulations and numerical evaluation of performance.
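
    For orientation, the average reference itself is a one-line linear operator, as the sketch below shows; REST and rREST replace this projection with operators derived from a lead-field (head) model, which is not reproduced here:

        import numpy as np

        n_channels, n_samples = 32, 1000
        rng = np.random.default_rng(0)
        eeg = rng.standard_normal((n_channels, n_samples))  # hypothetical recording

        # Average reference: subtract the instantaneous mean across channels.
        ar_operator = np.eye(n_channels) - np.ones((n_channels, n_channels)) / n_channels
        eeg_ar = ar_operator @ eeg

        # After average referencing, each time sample sums to (numerically) zero.
        print(np.abs(eeg_ar.sum(axis=0)).max())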

  10. Realistic Expectations for Rock Identification.

    ERIC Educational Resources Information Center

    Westerback, Mary Elizabeth; Azer, Nazmy

    1991-01-01

    Presents a rock classification scheme for use by beginning students. The scheme is based on rock textures (glassy, crystalline, clastic, and organic framework) and observable structures (vesicles and graded bedding). Discusses problems in other rock classification schemes which may produce confusion, misidentification, and anxiety. (10 references)…

  11. Automotive Technology. Career Education Guide.

    ERIC Educational Resources Information Center

    Dependents Schools (DOD), Washington, DC. European Area.

    The curriculum guide is designed to provide students with realistic training in automotive technology theory and practice within the secondary educational framework and to prepare them for entry into an occupation or continuing postsecondary education. The learning modules are grouped into three areas: small engines, automotive technology, and…

  12. The Umbra Simulation and Integration Framework Applied to Emergency Response Training

    NASA Technical Reports Server (NTRS)

    Hamilton, Paul Lawrence; Britain, Robert

    2010-01-01

    The Mine Emergency Response Interactive Training Simulation (MERITS) is intended to prepare personnel to manage an emergency in an underground coal mine. The creation of an effective training environment required realistic emergent behavior in response to simulation events and trainee interventions, exploratory modification of miner behavior rules, realistic physics, and incorporation of legacy code. It also required the ability to add rich media to the simulation without conflicting with normal desktop security settings. Our Umbra Simulation and Integration Framework facilitated agent-based modeling of miners and rescuers and made it possible to work with subject matter experts to quickly adjust behavior through script editing, rather than through lengthy programming and recompilation. Integration of Umbra code with the WebKit browser engine allowed the use of JavaScript-enabled local web pages for media support. This project greatly extended the capabilities of Umbra in support of training simulations and has implications for simulations that combine human behavior, physics, and rich media.

  13. Ca-Pri a Cellular Automata Phenomenological Research Investigation: Simulation Results

    NASA Astrophysics Data System (ADS)

    Iannone, G.; Troisi, A.

    2013-05-01

    Following the introduction of a phenomenological cellular automata (CA) model capable of reproducing city growth and urban sprawl, we develop a toy-model simulation in a realistic framework. The main characteristic of our approach is an evolution algorithm based on inhabitants' preferences. The growth of cells is controlled by means of suitable functions which depend on the initial conditions of the simulation. New urban settlements arise through a logistic evolution of the urban pattern, while urban sprawl is controlled by means of the population evolution function. In order to compare model results with a realistic urban framework we have considered, as the area of study, the island of Capri (Italy) in the Mediterranean Sea. Two different phases of urban evolution on the island have been taken into account: the initial growth induced by geographic suitability, and the urban spread after 1943 driven by the population evolution after that date.
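
    A minimal sketch of the mechanism described above, with assumed parameters and a random suitability surface standing in for the geographic data: a logistic curve sets the target number of urban cells each step, and the most suitable cell adjacent to the existing pattern urbanizes first:

        import numpy as np

        rng = np.random.default_rng(1)
        size, steps = 50, 30
        r, capacity = 0.3, 900.0              # assumed logistic growth parameters
        suitability = rng.random((size, size))
        urban = np.zeros((size, size), bool)
        urban[size // 2, size // 2] = True
        population = 10.0

        for _ in range(steps):
            population += r * population * (1.0 - population / capacity)
            while urban.sum() < int(population):
                padded = np.pad(urban, 1)     # count urban neighbours of every cell
                neighbours = sum(np.roll(np.roll(padded, di, 0), dj, 1)
                                 for di in (-1, 0, 1) for dj in (-1, 0, 1))[1:-1, 1:-1]
                candidates = (~urban) & (neighbours > 0)
                if not candidates.any():
                    break
                idx = np.unravel_index(np.argmax(suitability * candidates),
                                       (size, size))
                urban[idx] = True             # most suitable adjacent cell urbanizes

        print(f"Urban cells after {steps} steps: {urban.sum()}")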

  14. Simulating human behavior for national security human interactions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, Michael Lewis; Hart, Dereck H.; Verzi, Stephen J.

    2007-01-01

    This 3-year research and development effort focused on what we believe is a significant technical gap in existing modeling and simulation capabilities: the representation of plausible human cognition and behaviors within a dynamic, simulated environment. Specifically, the intent of the ''Simulating Human Behavior for National Security Human Interactions'' project was to demonstrate initial simulated human modeling capability that realistically represents intra- and inter-group interaction behaviors between simulated humans and human-controlled avatars as they respond to their environment. Significant progress was made towards simulating human behaviors through the development of a framework that produces realistic characteristics and movement. The simulated humans were created from models designed to be psychologically plausible by being based on robust psychological research and theory. Progress was also made towards enhancing Sandia National Laboratories' existing cognitive models to support culturally plausible behaviors that are important in representing group interactions. These models were implemented in the modular, interoperable, and commercially supported Umbra® simulation framework.

  15. Evidence for the need of realistic radio communications for airline pilot simulator training and evaluation

    DOT National Transportation Integrated Search

    2003-11-05

    This paper presents arguments in favor of realistic representation of radio communications during training and evaluation of airline pilots in the simulator. A survey of airlines showed that radio communications are mainly role-played by Instructor/E...

  16. A geostatistical extreme-value framework for fast simulation of natural hazard events

    PubMed Central

    Stephenson, David B.

    2016-01-01

    We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768
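
    For illustration, the generalized Pareto building block of such a framework can be sampled by inverting its distribution function; the sketch below draws independent exceedances (the Student's t-process dependence is omitted) with illustrative parameters:

        import numpy as np

        def sample_gpd(n, scale, shape, rng):
            """Draw n exceedances above a threshold from a GPD(scale, shape)."""
            u = rng.random(n)
            if shape == 0.0:
                return -scale * np.log(1.0 - u)               # exponential limit
            return scale / shape * ((1.0 - u) ** -shape - 1.0)

        rng = np.random.default_rng(42)
        gusts = 20.0 + sample_gpd(10_000, scale=8.0, shape=0.1, rng=rng)  # m/s over 20
        print(f"99th percentile simulated gust: {np.quantile(gusts, 0.99):.1f} m/s")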

  17. Probabilistic Evaluation of Ecological and Economic Objectives of River Basin Management Reveals a Potential Flaw in the Goal Setting of the EU Water Framework Directive

    NASA Astrophysics Data System (ADS)

    Hjerppe, Turo; Taskinen, Antti; Kotamäki, Niina; Malve, Olli; Kettunen, Juhani

    2017-04-01

    The biological status of European lakes has not improved as expected despite up-to-date legislation and ecological standards. As a result, the realism of objectives and the attainment of related ecological standards are under doubt. This paper gets to the bottom of a river basin management plan of a eutrophic lake in Finland and presents the ecological and economic impacts of environmental and societal drivers and planned management measures. For these purposes, we performed a Monte Carlo simulation of a diffuse nutrient load, lake water quality and cost-benefit models. Simulations were integrated into a Bayesian influence diagram that revealed the basic uncertainties. It turned out that the attainment of good ecological status as qualified in the Water Framework Directive of the European Union is unlikely within given socio-economic constraints. Therefore, management objectives and ecological and economic standards need to be reassessed and reset to provide a realistic goal setting for management. More effort should be put into the evaluation of the total monetary benefits and on the monitoring of lake phosphorus balances to reduce the uncertainties, and the resulting margin of safety and costs and risks of planned management measures.

  18. Probabilistic Evaluation of Ecological and Economic Objectives of River Basin Management Reveals a Potential Flaw in the Goal Setting of the EU Water Framework Directive.

    PubMed

    Hjerppe, Turo; Taskinen, Antti; Kotamäki, Niina; Malve, Olli; Kettunen, Juhani

    2017-04-01

    The biological status of European lakes has not improved as expected despite up-to-date legislation and ecological standards. As a result, the realism of objectives and the attainment of related ecological standards are under doubt. This paper gets to the bottom of a river basin management plan of a eutrophic lake in Finland and presents the ecological and economic impacts of environmental and societal drivers and planned management measures. For these purposes, we performed a Monte Carlo simulation of a diffuse nutrient load, lake water quality and cost-benefit models. Simulations were integrated into a Bayesian influence diagram that revealed the basic uncertainties. It turned out that the attainment of good ecological status as qualified in the Water Framework Directive of the European Union is unlikely within given socio-economic constraints. Therefore, management objectives and ecological and economic standards need to be reassessed and reset to provide a realistic goal setting for management. More effort should be put into the evaluation of the total monetary benefits and on the monitoring of lake phosphorus balances to reduce the uncertainties, and the resulting margin of safety and costs and risks of planned management measures.

  19. Health/Cosmetology. Career Education Guide.

    ERIC Educational Resources Information Center

    Dependents Schools (DOD), Washington, DC. European Area.

    The curriculum guide is designed to provide students with realistic training in theory and practice within the secondary educational framework and prepare them for entry into an occupation or continuing postsecondary education. The learning modules are grouped into branches pertaining to the broad categories of health services and cosmetology.…

  20. Graphic Communications. Career Education Guide.

    ERIC Educational Resources Information Center

    Dependents Schools (DOD), Washington, DC. European Area.

    The curriculum guide is designed to provide students with realistic training in graphic communications theory and practice within the secondary educational framework and to prepare them for entry into an occupation or continuing postsecondary education. The program modules outlined in the guide have been grouped into four areas: printing,…

  1. Equipping community pharmacy workers as agents for health behaviour change: developing and testing a theory-based smoking cessation intervention.

    PubMed

    Steed, Liz; Sohanpal, Ratna; James, Wai-Yee; Rivas, Carol; Jumbe, Sandra; Chater, Angel; Todd, Adam; Edwards, Elizabeth; Macneil, Virginia; Macfarlane, Fraser; Greenhalgh, Trisha; Griffiths, Chris; Eldridge, Sandra; Taylor, Stephanie; Walton, Robert

    2017-08-11

    To develop a complex intervention for community pharmacy staff to promote uptake of smoking cessation services and to increase quit rates. Following the Medical Research Council framework, we used a mixed-methods approach to develop, pilot and then refine the intervention. Phase I: We used information from qualitative studies in pharmacies, systematic literature reviews and the Capability, Opportunity, Motivation-Behaviour framework to inform design of the initial version of the intervention. Phase II: We then tested the acceptability of this intervention with smoking cessation advisers and assessed fidelity using actors who visited pharmacies posing as smokers, in a pilot study. Phase III: We reviewed the content and associated theory underpinning our intervention, taking account of the results of the earlier studies and a realist analysis of published literature. We then confirmed a logic model describing the intended operation of the intervention and used this model to refine the intervention and associated materials. The setting was eight community pharmacies in three inner east London boroughs, with 12 Stop Smoking Advisers participating. The intervention comprised two 150-minute skills-based training sessions focused on communication and behaviour change skills, with between-session practice. The pilot study confirmed acceptability of the intervention and showed preliminary evidence of benefit; however, organisational barriers tended to limit effective operation. The pilot data and realist review pointed to additional use of Diffusion of Innovations Theory to seat the intervention in the wider organisational context. We have developed and refined an intervention to promote smoking cessation services in community pharmacies, which we now plan to evaluate in a randomised controlled trial. UKCRN ID 18446, Pilot. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. Performance comparison between total variation (TV)-based compressed sensing and statistical iterative reconstruction algorithms.

    PubMed

    Tang, Jie; Nett, Brian E; Chen, Guang-Hong

    2009-10-07

    Of all available reconstruction methods, statistical iterative reconstruction algorithms appear particularly promising since they enable accurate physical noise modeling. The newly developed compressive sampling/compressed sensing (CS) algorithm has shown the potential to accurately reconstruct images from highly undersampled data. The CS algorithm can be implemented in the statistical reconstruction framework as well. In this study, we compared the performance of two standard statistical reconstruction algorithms (penalized weighted least squares and q-GGMRF) to the CS algorithm. In assessing the image quality using these iterative reconstructions, it is critical to utilize realistic background anatomy as the reconstruction results are object dependent. A cadaver head was scanned on a Varian Trilogy system at different dose levels. Several figures of merit, including the relative root mean square error and a quality factor which accounts for the noise performance and the spatial resolution, were introduced to objectively evaluate reconstruction performance. A comparison is presented between the three algorithms at a constant undersampling factor and several dose levels. To facilitate this comparison, the original CS method was formulated in the framework of the statistical image reconstruction algorithms. Important conclusions of the measurements from our studies are that (1) for realistic neuro-anatomy, over 100 projections are required to avoid streak artifacts in the reconstructed images even with CS reconstruction, (2) regardless of the algorithm employed, it is beneficial to distribute the total dose to more views as long as each view remains quantum noise limited and (3) the total variation-based CS method is not appropriate for very low dose levels because while it can mitigate streaking artifacts, the images exhibit patchy behavior, which is potentially harmful for medical diagnosis.

  3. Systematic searching for theory to inform systematic reviews: is it feasible? Is it desirable?

    PubMed

    Booth, Andrew; Carroll, Christopher

    2015-09-01

    With recognition of the potential value of theory in understanding how interventions work comes a challenge: how to make identification of theory less haphazard. To explore the feasibility of systematic identification of theory, we searched PubMed for published reviews (1998-2012) that had explicitly sought to identify theory. Systematic searching may be characterised by a structured question, methodological filters and an itemised search procedure. We constructed a template (BeHEMoTh: Behaviour of interest; Health context; Exclusions; Models or Theories) for use when systematically identifying theory. The authors tested the template within two systematic reviews. Of 34 systematic reviews, only 12 reviews (35%) reported a method for identifying theory. Nineteen did not specify how they identified studies containing theory. Data were unavailable for three reviews. Candidate terms include concept(s)/conceptual, framework(s), model(s), and theory/theories/theoretical. Information professionals must overcome inadequate reporting and the use of theory out of context. The review team faces an additional concern in lack of 'theory fidelity'. Based on experience with two systematic reviews, the BeHEMoTh template and procedure offers a feasible and useful approach for identification of theory. Applications include realist synthesis, framework synthesis or review of complex interventions. The procedure requires rigorous evaluation. © 2015 Health Libraries Group.
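
    To illustrate how the template translates into a search strategy, the sketch below assembles a boolean query from example term lists; the terms are illustrative, not the authors' validated strategy:

        # BeHEMoTh blocks: Behaviour of interest, Health context, Exclusions,
        # Models or Theories (using the candidate terms reported above).
        behaviour = ["smoking cessation", "quit*"]
        health_context = ["pharmac*", "community pharmacy"]
        exclusions = ["animal*"]
        models_or_theories = ["concept*", "conceptual", "framework*", "model*", "theor*"]

        def or_block(terms):
            quoted = [f'"{t}"' if " " in t else t for t in terms]
            return "(" + " OR ".join(quoted) + ")"

        query = (or_block(behaviour) + " AND " + or_block(health_context)
                 + " AND " + or_block(models_or_theories)
                 + " NOT " + or_block(exclusions))
        print(query)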

  4. Automatic design of synthetic gene circuits through mixed integer non-linear programming.

    PubMed

    Huynh, Linh; Kececioglu, John; Köppe, Matthias; Tagkopoulos, Ilias

    2012-01-01

    Automatic design of synthetic gene circuits poses a significant challenge to synthetic biology, primarily due to the complexity of biological systems, and the lack of rigorous optimization methods that can cope with the combinatorial explosion as the number of biological parts increases. Current optimization methods for synthetic gene design rely on heuristic algorithms that are usually not deterministic, deliver sub-optimal solutions, and provide no guarantees on convergence or error bounds. Here, we introduce an optimization framework for the problem of part selection in synthetic gene circuits that is based on mixed integer non-linear programming (MINLP), which is a deterministic method that finds the globally optimal solution and guarantees convergence in finite time. Given a synthetic gene circuit, a library of characterized parts, and user-defined constraints, our method can find the optimal selection of parts that satisfy the constraints and best approximates the objective function given by the user. We evaluated the proposed method in the design of three synthetic circuits (a toggle switch, a transcriptional cascade, and a band detector), with both experimentally constructed and synthetic promoter libraries. Scalability and robustness analysis shows that the proposed framework scales well with the library size and the solution space. The work described here is a step towards a unifying, realistic framework for the automated design of biological circuits.

  5. What Today's Educational Technology Needs: Defensible Evaluations and Realistic Implementation.

    ERIC Educational Resources Information Center

    Roweton, William E.; And Others

    It is argued that in order to make computer assisted instruction effective in the schools, educators should pay more attention to implementation issues (including modifying teacher attitudes, changing classroom routines, and offering realistic technical training and support) and to producing understandable product and performance evaluations.…

  6. Faculty Development for Educators: A Realist Evaluation

    ERIC Educational Resources Information Center

    Sorinola, Olanrewaju O.; Thistlethwaite, Jill; Davies, David; Peile, Ed

    2015-01-01

    The effectiveness of faculty development (FD) activities for educators in UK medical schools remains underexplored. This study used a realist approach to evaluate FD and to test the hypothesis that motivation, engagement and perception are key mechanisms of effective FD activities. The authors observed and interviewed 33 course participants at one…

  7. Realistic Simulations of Coronagraphic Observations with WFIRST

    NASA Astrophysics Data System (ADS)

    Rizzo, Maxime; Zimmerman, Neil; Roberge, Aki; Lincowski, Andrew; Arney, Giada; Stark, Chris; Jansen, Tiffany; Turnbull, Margaret; WFIRST Science Investigation Team (Turnbull)

    2018-01-01

    We present a framework to simulate observing scenarios with the WFIRST Coronagraphic Instrument (CGI). The Coronagraph and Rapid Imaging Spectrograph in Python (crispy) is an open-source package that can be used to create CGI data products for analysis and development of post-processing routines. The software convolves time-varying coronagraphic PSFs with realistic astrophysical scenes which contain a planetary architecture, a consistent dust structure, and a background field composed of stars and galaxies. The focal plane can be read out by a WFIRST electron-multiplying CCD model directly, or passed through a WFIRST integral field spectrograph model first. Several elementary post-processing routines are provided as part of the package.

  8. Theory of precipitation effects on dead cylindrical fuels

    Treesearch

    Michael A. Fosberg

    1972-01-01

    Numerical and analytical solutions of the Fickian diffusion equation were used to determine the effects of precipitation on dead cylindrical forest fuels. The analytical solution provided a physical framework. The numerical solutions were then used to refine the analytical solution through a similarity argument. The theoretical solutions predicted realistic rates of...
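
    A numerical sketch in the same spirit: explicit finite differences for radial Fickian diffusion into a cylindrical fuel element whose surface is held saturated after wetting. All parameter values are illustrative, not the study's:

        import numpy as np

        D = 1.0e-9             # assumed moisture diffusivity, m^2/s
        radius, nr = 0.01, 50  # 1 cm fuel radius, radial grid points
        dt = 1.0               # time step, s; stable since dt << dr^2 / (4 D)
        dr = radius / (nr - 1)
        r = np.linspace(0.0, radius, nr)

        m = np.full(nr, 0.10)  # initial moisture fraction
        for _ in range(100_000):          # ~28 hours of simulated wetting
            lap = np.zeros_like(m)
            lap[1:-1] = ((m[2:] - 2 * m[1:-1] + m[:-2]) / dr**2
                         + (m[2:] - m[:-2]) / (2 * dr * r[1:-1]))
            lap[0] = 4 * (m[1] - m[0]) / dr**2   # symmetry condition at the axis
            m += D * dt * lap
            m[-1] = 0.35                          # saturated surface (rain)

        print(f"Moisture at the fuel axis after wetting: {m[0]:.3f}")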

  9. Laboratory Biosafety and Biosecurity Risk Assessment Technical Guidance Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Astuto-Gribble, Lisa M; Caskey, Susan Adele

    2014-07-01

    The purpose of this document is threefold: 1) to describe the laboratory biosafety and biosecurity risk assessment process and its conceptual framework; 2) to provide detailed guidance and suggested methodologies on how to conduct a risk assessment; and 3) to present some practical risk assessment process strategies using realistic laboratory scenarios.

  10. IMPORTANT EXPOSURE FACTORS FOR CHILDREN: AN ANALYSIS OF LABORATORY AND OBSERVATIONAL FIELD DATA CHARACTERIZING CUMULATIVE EXPOSURE TO PESTICIDES

    EPA Science Inventory

    In an effort to facilitate more realistic risk assessments that take into account unique childhood vulnerabilities to environmental toxicants, the U.S. EPA's National Exposure Research Laboratory (NERL) developed a framework for systematically identifying and addressing the most ...

  11. Business/Clerical/Sales. Career Education Guide.

    ERIC Educational Resources Information Center

    Dependents Schools (DOD), Washington, DC. European Area.

    The curriculum guide is designed to provide students with realistic training in business/clerical/sales theory and practices within the secondary educational framework and to prepare them for entry into an occupation or continuing postsecondary education. Each unit plan consists of a description of the area under consideration, estimated hours of…

  12. Intervention Now to Eliminate Repeat Unintended Pregnancy in Teenagers (INTERUPT): a systematic review of intervention effectiveness and cost-effectiveness, and qualitative and realist synthesis of implementation factors and user engagement.

    PubMed

    Aslam, Rabeea'h W; Hendry, Maggie; Booth, Andrew; Carter, Ben; Charles, Joanna M; Craine, Noel; Edwards, Rhiannon Tudor; Noyes, Jane; Ntambwe, Lupetu Ives; Pasterfield, Diana; Rycroft-Malone, Jo; Williams, Nefyn; Whitaker, Rhiannon

    2017-08-15

    Unintended repeat conceptions can result in emotional, psychological and educational harm to young women, often with enduring implications for their life chances. This study aimed to identify which young women are at the greatest risk of repeat unintended pregnancies; which interventions are effective and cost-effective; and what are the barriers to and facilitators for the uptake of these interventions. We conducted a mixed-methods systematic review which included meta-analysis, framework synthesis and application of realist principles, with stakeholder input and service user feedback to address this. We searched 20 electronic databases, including MEDLINE, Excerpta Medica database, Applied Social Sciences Index and Abstracts and Research Papers in Economics, to cover a broad range of health, social science, health economics and grey literature sources. Searches were conducted between May 2013 and June 2014 and updated in August 2015. Twelve randomised controlled trials (RCTs), two quasi-RCTs, 10 qualitative studies and 53 other quantitative studies were identified. The RCTs evaluated psychosocial interventions and an emergency contraception programme. The primary outcome was repeat conception rate: the event rate was 132 of 308 (43%) in the intervention group versus 140 of 289 (48%) for the control group, with a non-significant risk ratio (RR) of 0.92 [95% confidence interval (CI) 0.78-1.08]. Four studies reported subsequent birth rates: 29 of 237 (12%) events for the intervention arm versus 46 out of 224 (21%) for the control arm, with an RR of 0.60 (95% CI 0.39-0.93). Many repeat conceptions occurred in the context of poverty, low expectations and aspirations and negligible opportunities. Qualitative and realist evidence highlighted the importance of context, motivation, future planning and giving young women a central and active role in the development of new interventions. Little or no evidence for the effectiveness or cost-effectiveness of any of the interventions to reduce repeat pregnancy in young women was found. Qualitative and realist evidence helped to explain gaps in intervention design that should be addressed. More theory-based, rigorously evaluated programmes need to be developed to reduce unintended repeat pregnancy in young women. PROSPERO, CRD42012003168 . Cochrane registration number: i = fertility/0068.
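
    The subsequent-birth estimate quoted above can be checked from its raw counts with a standard log risk ratio and Wald interval, as in the sketch below; the review's pooled upper limit (0.93) differs slightly because meta-analysis combines studies rather than crude totals:

        import math

        events_int, n_int = 29, 237   # intervention arm: subsequent births / total
        events_ctl, n_ctl = 46, 224   # control arm

        rr = (events_int / n_int) / (events_ctl / n_ctl)
        se = math.sqrt(1/events_int - 1/n_int + 1/events_ctl - 1/n_ctl)
        lo = math.exp(math.log(rr) - 1.96 * se)
        hi = math.exp(math.log(rr) + 1.96 * se)
        print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR = 0.60 (0.39-0.91)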

  13. A machine learning evaluation of an artificial immune system.

    PubMed

    Glickman, Matthew; Balthrop, Justin; Forrest, Stephanie

    2005-01-01

    ARTIS is an artificial immune system framework which contains several adaptive mechanisms. LISYS is a version of ARTIS specialized for the problem of network intrusion detection. The adaptive mechanisms of LISYS are characterized in terms of their machine-learning counterparts, and a series of experiments is described, each of which isolates a different mechanism of LISYS and studies its contribution to the system's overall performance. The experiments were conducted on a new data set, which is more recent and realistic than earlier data sets. The network intrusion detection problem is challenging because it requires one-class learning in an on-line setting with concept drift. The experiments confirm earlier experimental results with LISYS, and they study in detail how LISYS achieves success on the new data set.

  14. On the Land-Ocean Contrast of Tropical Convection and Microphysics Statistics Derived from TRMM Satellite Signals and Global Storm-Resolving Models

    NASA Technical Reports Server (NTRS)

    Matsui, Toshihisa; Chern, Jiun-Dar; Tao, Wei-Kuo; Lang, Stephen E.; Satoh, Masaki; Hashino, Tempei; Kubota, Takuji

    2016-01-01

    A 14-year climatology of Tropical Rainfall Measuring Mission (TRMM) collocated multi-sensor signal statistics reveals a distinct land-ocean contrast as well as geographical variability of precipitation type, intensity, and microphysics. Microphysics information inferred from the TRMM precipitation radar and Microwave Imager (TMI) shows a large land-ocean contrast for the deep category, suggesting continental convective vigor. Over land, TRMM shows higher echo-top heights and larger maximum echoes, suggesting taller storms and more intense precipitation, as well as larger microwave scattering, suggesting the presence of more and larger frozen convective hydrometeors. This strong land-ocean contrast in deep convection is invariant over seasonal and multi-year time-scales. Consequently, relatively short-term simulations from two global storm-resolving models can be evaluated in terms of their land-ocean statistics using the TRMM Triple-sensor Three-step Evaluation via a satellite simulator. The models evaluated are the NASA Multi-scale Modeling Framework (MMF) and the Non-hydrostatic Icosahedral Cloud Atmospheric Model (NICAM). While both simulations can represent convective land-ocean contrasts in warm precipitation to some extent, near-surface conditions over land are relatively moister in NICAM than in MMF, which appears to be the key driver of the divergent warm precipitation results between the two models. Both the MMF and NICAM produced similar frequencies of large CAPE between land and ocean. The dry MMF boundary layer enhanced microwave scattering signals over land, but only NICAM had an enhanced deep convection frequency over land. Neither model could reproduce a realistic land-ocean contrast in deep convective precipitation microphysics. A realistic contrast between land and ocean remains an issue in global storm-resolving modeling.

  15. A Framework of Multi Objectives Negotiation for Dynamic Supply Chain Model

    NASA Astrophysics Data System (ADS)

    Chai, Jia Yee; Sakaguchi, Tatsuhiko; Shirase, Keiichi

    Trends of globalization and advances in Information Technology (IT) have created opportunities for collaborative manufacturing across national borders. A dynamic supply chain utilizes these advances to enable more flexibility in business cooperation. This research proposes a concurrent decision-making framework for a three-echelon dynamic supply chain model. The dynamic supply chain is formed by autonomous negotiation among agents based on a multi-agent approach. Instead of generating negotiation aspects (such as amount, price and due date) arbitrarily, this framework proposes to utilize the information available at the operational level of an organization in order to generate realistic negotiation aspects. The effectiveness of the proposed model is demonstrated by various case studies.

  16. Effects of Field of View and Visual Complexity on Virtual Reality Training Effectiveness for a Visual Scanning Task

    DOE PAGES

    Ragan, Eric D.; Bowman, Doug A.; Kopper, Regis; ...

    2015-02-13

    Virtual reality training systems are commonly used in a variety of domains, and it is important to understand how the realism of a training simulation influences training effectiveness. The paper presents a framework for evaluating the effects of virtual reality fidelity based on an analysis of a simulation’s display, interaction, and scenario components. Following this framework, we conducted a controlled experiment to test the effects of fidelity on training effectiveness for a visual scanning task. The experiment varied the levels of field of view and visual realism during a training phase and then evaluated scanning performance with the simulator’s highest level of fidelity. To assess scanning performance, we measured target detection and adherence to a prescribed strategy. The results show that both field of view and visual realism significantly affected target detection during training; higher field of view led to better performance and higher visual realism worsened performance. Additionally, the level of visual realism during training significantly affected learning of the prescribed visual scanning strategy, providing evidence that high visual realism was important for learning the technique. The results also demonstrate that task performance during training was not always a sufficient measure of mastery of an instructed technique. That is, if learning a prescribed strategy or skill is the goal of a training exercise, performance in a simulation may not be an appropriate indicator of effectiveness outside of training—evaluation in a more realistic setting may be necessary.

  17. How to Measure Costs and Benefits of eHealth Interventions: An Overview of Methods and Frameworks

    PubMed Central

    2015-01-01

    Information on the costs and benefits of eHealth interventions is needed, not only to document value for money and to support decision making in the field, but also to form the basis for developing business models and to facilitate payment systems to support large-scale services. In the absence of solid evidence of its effects, key decision makers may doubt the effectiveness, which, in turn, limits investment in, and the long-term integration of, eHealth services. However, it is not realistic to conduct economic evaluations of all eHealth applications and services in all situations, so we need to be able to generalize from those we do conduct. This implies that we have to select the most appropriate methodology and data collection strategy in order to increase the transferability across evaluations. This paper aims to contribute to the understanding of how to apply economic evaluation methodology in the eHealth field. It provides a brief overview of basic health economics principles and frameworks and discusses some methodological issues and challenges in conducting cost-effectiveness analysis of eHealth interventions. Issues regarding the identification, measurement, and valuation of costs and benefits are outlined. Furthermore, this work describes the established techniques of combining costs and benefits, presents the decision rules for identifying the preferred option, and outlines approaches to data collection strategies. Issues related to transferability and complexity are also discussed. PMID:26552360

  18. A Realistic Seizure Prediction Study Based on Multiclass SVM.

    PubMed

    Direito, Bruno; Teixeira, César A; Sales, Francisco; Castelo-Branco, Miguel; Dourado, António

    2017-05-01

    A patient-specific algorithm for epileptic seizure prediction, based on multiclass support vector machines (SVM) and multi-channel high-dimensional feature sets, is presented. The feature sets, combined with multiclass classification and post-processing schemes, aim at the generation of alarms and reduced influence of false positives. This study considers 216 patients from the European Epilepsy Database, and includes 185 patients with scalp EEG recordings and 31 with intracranial data. The strategy was tested over a total of 16,729.80 h of inter-ictal data, including 1206 seizures. We found an overall sensitivity of 38.47% and a false positive rate per hour of 0.20. The performance of the method achieved statistical significance in 24 patients (11% of the patients). Despite the encouraging results previously reported in specific datasets, prospective demonstration on long-term EEG recordings has been limited. Our study presents a prospective analysis of a large, heterogeneous, multicentric dataset. The statistical framework, based on conservative assumptions, reflects a realistic approach compared to constrained datasets and/or in-sample evaluations. The improvement of these results, with the definition of an appropriate set of features able to improve the distinction between the pre-ictal and non-pre-ictal states, hence minimizing the effect of confounding variables, remains a key aspect.

  19. Must Metaethical Realism Make a Semantic Claim?

    PubMed

    Kahane, Guy

    2013-02-01

    Mackie drew attention to the distinct semantic and metaphysical claims made by metaethical realists, arguing that although our evaluative discourse is cognitive and objective, there are no objective evaluative facts. This distinction, however, also opens up a reverse possibility: that our evaluative discourse is antirealist, yet objective values do exist. I suggest that this seemingly far-fetched possibility merits serious attention; realism seems committed to its intelligibility, and, despite appearances, it isn't incoherent, ineffable, inherently implausible or impossible to defend. I argue that reflection on this possibility should lead us to revise our understanding of the debate between realists and antirealists. It is not only that the realist's semantic claim is insufficient for realism to be true, as Mackie argued; it's not even necessary. Robust metaethical realism is best understood as making a purely metaphysical claim. It is thus not enough for antirealists to show that our discourse is antirealist. They must directly attack the realist's metaphysical claim.

  20. A unified material decomposition framework for quantitative dual- and triple-energy CT imaging.

    PubMed

    Zhao, Wei; Vernekohl, Don; Han, Fei; Han, Bin; Peng, Hao; Yang, Yong; Xing, Lei; Min, James K

    2018-04-21

    Many clinical applications depend critically on the accurate differentiation and classification of different types of materials in patient anatomy. This work introduces a unified framework for accurate nonlinear material decomposition and applies it, for the first time, to the concept of triple-energy CT (TECT) for enhanced material differentiation and classification, as well as to dual-energy CT (DECT). We express the polychromatic projection as a linear combination of line integrals of material-selective images. The material decomposition is then turned into a problem of minimizing the least-squares difference between measured and estimated CT projections. The optimization problem is solved iteratively by updating the line integrals. The proposed technique is evaluated by using several numerical phantom measurements under different scanning protocols. The triple-energy data acquisition is implemented at the scales of micro-CT and clinical CT imaging with a commercial "TwinBeam" dual-source DECT configuration and a fast kV-switching DECT configuration. Material decomposition and quantitative comparison with a photon counting detector and with the presence of a bow-tie filter are also performed. The proposed method provides quantitative material- and energy-selective images under realistic configurations for both DECT and TECT measurements. Compared to the polychromatic kV CT images, virtual monochromatic images show superior image quality. For the mouse phantom, quantitative measurements show that the differences between gadodiamide and iodine concentrations obtained using TECT and idealized photon counting CT (PCCT) are smaller than 8 and 1 mg/mL, respectively. TECT outperforms DECT for multicontrast CT imaging and is robust with respect to spectrum estimation. For the thorax phantom, the differences between the concentrations of the contrast map and the corresponding true reference values are smaller than 7 mg/mL for all of the realistic configurations. A unified framework for both DECT and TECT imaging has been established for the accurate extraction of material compositions using currently available commercial DECT configurations. The novel technique promises to provide an urgently needed solution for several CT-based diagnostic and therapy applications, especially for the diagnosis of cardiovascular and abdominal diseases where multicontrast imaging is involved. © 2018 American Association of Physicists in Medicine.
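
    To make the linear-combination idea concrete, the sketch below solves the idealized monochromatic two-energy, two-material case, where the model really is linear and a single least-squares solve suffices; the framework above instead minimizes a nonlinear polychromatic objective iteratively, and all coefficients here are illustrative:

        import numpy as np

        # Columns: materials (e.g. water, iodine); rows: energies (low, high kV).
        mu = np.array([[0.26, 1.20],
                       [0.18, 0.50]])   # illustrative attenuation coefficients

        true_lines = np.array([0.8, 0.05])   # hypothetical material path lengths
        p = mu @ true_lines                  # noiseless log-projections
        p_noisy = p + np.random.default_rng(7).normal(0.0, 1e-3, size=2)

        est, *_ = np.linalg.lstsq(mu, p_noisy, rcond=None)
        print("estimated material line integrals:", est)  # close to [0.8, 0.05]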

  1. A New Realistic Evaluation Analysis Method: Linked Coding of Context, Mechanism, and Outcome Relationships

    ERIC Educational Resources Information Center

    Jackson, Suzanne F.; Kolla, Gillian

    2012-01-01

    In attempting to use a realistic evaluation approach to explore the role of Community Parents in early parenting programs in Toronto, a novel technique was developed to analyze the links between contexts (C), mechanisms (M) and outcomes (O) directly from experienced practitioner interviews. Rather than coding the interviews into themes in terms of…

  2. Realist Evaluation in Wraparound: A New Approach in Social Work Evidence-Based Practice

    ERIC Educational Resources Information Center

    Kazi, Mansoor A. F.; Pagkos, Brian; Milch, Heidi A.

    2011-01-01

    Objectives: The purpose of this study was to develop a realist evaluation paradigm in social work evidence-based practice. Method: Wraparound (at Gateway-Longview Inc., New York) used a reliable outcome measure and an electronic database to systematically collect and analyze data on the interventions, the client demographics and circumstances, and…

  3. Problem-Based Learning in Instrumentation: Synergism of Real and Virtual Modular Acquisition Chains

    ERIC Educational Resources Information Center

    Nonclercq, A.; Biest, A. V.; De Cuyper, K.; Leroy, E.; Martinez, D. L.; Robert, F.

    2010-01-01

    As part of an instrumentation course, a problem-based learning framework was selected for laboratory instruction. Two acquisition chains were designed to help students carry out realistic instrumentation problems. The first tool is a virtual (simulated) modular acquisition chain that allows rapid overall understanding of the main problems in…

  4. Closing the Attainment Gap--A Realistic Proposition or an Elusive Pipe-Dream?

    ERIC Educational Resources Information Center

    Mowat, Joan Gaynor

    2018-01-01

    The attainment gap associated with socio-economic status is an international problem that is highly resistant to change. This conceptual paper critiques the drive by the Scottish Government to address the attainment gap through the Scottish Attainment Challenge and the National Improvement Framework. It draws upon a range of theoretical…

  5. Remote sensing data assimilation for a prognostic phenology model

    Treesearch

    R. Stockli; T. Rutishauser; D. Dragoni; J. O' Keefe; P. E. Thornton; M. Jolly; L. Lu; A. S. Denning

    2008-01-01

    Predicting the global carbon and water cycle requires a realistic representation of vegetation phenology in climate models. However most prognostic phenology models are not yet suited for global applications, and diagnostic satellite data can be uncertain and lack predictive power. We present a framework for data assimilation of Fraction of Photosynthetically Active...

  6. A Warping Framework for Wide-Angle Imaging and Perspective Manipulation

    ERIC Educational Resources Information Center

    Carroll, Robert E.

    2013-01-01

    Nearly all photographs are created with lenses that approximate an ideal pinhole camera--that is, a perspective projection. This projection has proven useful not only for creating realistic depictions, but also for its expressive flexibility. Beginning in the Renaissance, the notion of perspective gave artists a systematic way to represent…
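
    For reference, the ideal pinhole projection the passage refers to takes only a few lines: camera-space points are divided by their depth and scaled by a focal length (illustrative values below):

        import numpy as np

        def project(points_3d, focal_length=1.0):
            """Map Nx3 camera-space points to Nx2 image coordinates (pinhole model)."""
            pts = np.asarray(points_3d, float)
            return focal_length * pts[:, :2] / pts[:, 2:3]

        square = np.array([[-1, -1, 4], [1, -1, 4], [1, 1, 8], [-1, 1, 8]])
        print(project(square, focal_length=2.0))
        # Points twice as far away land at coordinates half as large (foreshortening).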

  7. STEM and Model-Eliciting Activities: Responsive Professional Development for K-8 Mathematics Coaches

    ERIC Educational Resources Information Center

    Baker, Courtney; Galanti, Terrie; Birkhead, Sara

    2017-01-01

    This research highlights a university-school division collaboration to pilot a professional development framework for integrating STEM in K-8 mathematics classrooms. The university researchers worked with mathematics coaches to construct a realistic and reasonable vision of STEM integration built upon the design principles of model-eliciting…

  8. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    ERIC Educational Resources Information Center

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  9. An Integrated Decision-Making Framework for Sustainability Assessment: A Case Study of Memorial University

    ERIC Educational Resources Information Center

    Waheed, Bushra; Khan, Faisal; Veitch, Brian; Hawboldt, Kelly

    2011-01-01

    This article presents an overview of the sustainability initiatives at the St. John's campus of Memorial University in Newfoundland and Labrador (Canada). The key initiatives include setting a realistic goal for energy efficiency, becoming carbon neutral, and conducting various research and outreach projects related to sustainability. As…

  10. PHOTOCHEMICAL SIMULATIONS OF POINT SOURCE EMISSIONS WITH THE MODELS-3 CMAQ PLUME-IN-GRID APPROACH

    EPA Science Inventory

    A plume-in-grid (PinG) approach has been designed to provide a realistic treatment for the simulation of the dynamic and chemical processes impacting pollutant species in major point source plumes during a subgrid scale phase within an Eulerian grid modeling framework. The PinG sci...

  11. 76 FR 45221 - Notice of Funding Availability: Inviting Applications for the Food for Progress Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-28

    ...://www.fas.usda.gov/excredits/FoodAid/FFP/FrameworkGuidance.asp . VI. Proposal Review Criteria A. Review... beneficiaries realistic for the proposed activities? (f) Are the beneficiaries and criteria for selection... Applications for the Food for Progress Program Announcement Type: New. Catalog of Federal Domestic Assistance...

  12. Development of a foraging model framework to reliably estimate daily food consumption by young fishes

    USGS Publications Warehouse

    Deslauriers, David; Rosburg, Alex J.; Chipps, Steven R.

    2017-01-01

    We developed a foraging model for young fishes that incorporates handling and digestion rate to estimate daily food consumption. Feeding trials were used to quantify functional feeding response, satiation, and gut evacuation rate. Once parameterized, the foraging model was then applied to evaluate effects of prey type, prey density, water temperature, and fish size on daily feeding rate by age-0 (19–70 mm) pallid sturgeon (Scaphirhynchus albus). Prey consumption was positively related to prey density (for fish >30 mm) and water temperature, but negatively related to prey size and the presence of sand substrate. Model evaluation results revealed good agreement between observed estimates of daily consumption and those predicted by the model (r2 = 0.95). Model simulations showed that fish feeding on Chironomidae or Ephemeroptera larvae were able to gain mass, whereas fish feeding solely on zooplankton lost mass under most conditions. By accounting for satiation and digestive processes in addition to handling time and prey density, the model provides realistic estimates of daily food consumption that can prove useful for evaluating rearing conditions for age-0 fishes.
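
    As a rough illustration of how satiation and digestion can cap a functional-response model of the kind described above, the sketch below combines a Holling type II intake term with a gut-capacity constraint and exponential gut evacuation. All parameter names and values are invented stand-ins, not the authors' fitted coefficients.

```python
import numpy as np

def daily_consumption(prey_density, a, h, gut_capacity, evac_rate, hours=24):
    """Hypothetical daily intake (mg) for a young fish.

    Hourly intake follows a Holling type II functional response but is
    capped by the gut space freed through exponential evacuation -- a
    crude stand-in for the satiation and digestion terms the authors
    parameterized from feeding trials.
    """
    gut, eaten = 0.0, 0.0
    for _ in range(hours):
        # Type II response: intake saturates as prey density rises
        potential = a * prey_density / (1.0 + a * h * prey_density)
        intake = min(potential, max(gut_capacity - gut, 0.0))  # satiation cap
        gut += intake
        eaten += intake
        gut *= np.exp(-evac_rate)  # exponential gut evacuation per hour
    return eaten

# Example run with purely illustrative parameter values
print(daily_consumption(prey_density=50, a=0.8, h=0.05,
                        gut_capacity=12.0, evac_rate=0.25))
```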

  13. Failure Impact Analysis of Key Management in AMI Using Cybernomic Situational Assessment (CSA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Hauser, Katie R

    2013-01-01

    In earlier work, we presented a computational framework for quantifying the security of a system in terms of the average loss a stakeholder stands to sustain as a result of threats to the system. We named this system the Cyberspace Security Econometrics System (CSES). In this paper, we refine the framework and apply it to cryptographic key management within the Advanced Metering Infrastructure (AMI) as an example. The stakeholders, requirements, components, and threats are determined. We then populate the matrices with justified values by addressing the AMI at a higher level, rather than trying to consider every piece of hardware and software involved. We accomplish this task by leveraging the recently established NISTIR 7628 guideline for smart grid security, which allowed us to choose the stakeholders, requirements, components, and threats realistically. We reviewed the literature and worked with an industry technical working group to select three representative threats from a collection of 29. From this subset, we populate the stakes, dependency, and impact matrices, and the threat vector with realistic numbers. Each stakeholder's Mean Failure Cost is then computed.
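
    In the published CSES formulation, this computation is broadly a chain of matrix products mapping threat probabilities through component impacts and requirement dependencies onto stakeholder stakes. The toy sketch below mirrors that chain; every dimension and dollar value is invented for illustration, not taken from the AMI study.

```python
import numpy as np

# Toy dimensions: 2 stakeholders, 3 security requirements,
# 2 AMI components, 3 representative threats (all numbers invented).
ST = np.array([[900., 300., 100.],   # stakes: $ lost per requirement failure
               [400., 800., 200.]])
DP = np.array([[0.6, 0.4],           # P(requirement fails | component fails)
               [0.3, 0.7],
               [0.5, 0.5]])
IM = np.array([[0.2, 0.1, 0.0],      # P(component fails | threat occurs)
               [0.1, 0.3, 0.2]])
PT = np.array([0.05, 0.02, 0.01])    # P(threat materializes) per period

MFC = ST @ DP @ IM @ PT              # mean failure cost per stakeholder
print(MFC)                           # dollars per period, one per stakeholder
```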

  14. Impact of small groups with heterogeneous preference on behavioral evolution in population evacuation.

    PubMed

    Wang, Tao; Huang, Keke; Wang, Zhen; Zheng, Xiaoping

    2015-01-01

    A great number of mechanisms have been proposed to explain individual behavior and population traits, which are of particular significance in evolutionary biology and social behavior analysis. Among them, small groups and heterogeneity are two useful frameworks for this issue. However, the vast majority of existing works consider the two scenarios separately, which is inconsistent with realistic cases in our lives. Here we propose evolutionary games of heterogeneous small groups (namely, different small groups possess different preferences toward the dilemma) to study collective behavior in population evacuation. Importantly, players usually face completely different dilemmas inside and outside the small groups. By means of numerous computational simulations, it is unveiled that the ratio of players in one certain small group directly decides the final behavior of the whole population. Moreover, it can also be concluded that the degree of heterogeneity in preference across small groups plays a key role in the behavioral traits of the system, which may validate some realistic social observations. The proposed framework is thus universally applicable and may shed new light on the solution of social dilemmas.

  15. tDCS for Memory Enhancement: Analysis of the Speculative Aspects of Ethical Issues

    PubMed Central

    Voarino, Nathalie; Dubljević, Veljko; Racine, Eric

    2017-01-01

    Transcranial direct current stimulation (tDCS) is a promising technology to enhance cognitive and physical performance. One of the major areas of interest is the enhancement of memory function in healthy individuals. The early arrival of tDCS on the market for lifestyle uses and cognitive enhancement purposes led to the voicing of some important ethical concerns, especially because, to date, there are no official guidelines or evaluation procedures to tackle these issues. The aim of this article is to review ethical issues related to uses of tDCS for memory enhancement found in the ethics and neuroscience literature and to evaluate how realistic and scientifically well-founded these concerns are. In order to evaluate how plausible or speculative each issue is, we applied the methodological framework described by Racine et al. (2014) for “informed and reflective” speculation in bioethics. This framework could be succinctly presented as requiring: (1) the explicit acknowledgment of factual assumptions and identification of the value attributed to them; (2) the validation of these assumptions with interdisciplinary literature; and (3) the adoption of a broad perspective to support more comprehensive reflection on normative issues. We identified four major considerations associated with the development of tDCS for memory enhancement: safety, autonomy, justice and authenticity. In order to assess the seriousness and likelihood of harm related to each of these concerns, we analyzed the assumptions underlying the ethical issues, and the level of evidence for each of them. We identified seven distinct assumptions: prevalence, social acceptance, efficacy, ideological stance (bioconservative vs. libertarian), potential for misuse, long-term side effects, and the delivery of complete and clear information. We conclude that ethical discussion about memory enhancement via tDCS sometimes involves undue speculation, and closer attention to scientific and social facts would bring a more nuanced analysis. At this time, the most realistic concerns are related to safety and violation of users’ autonomy by a breach of informed consent, as potential immediate and long-term health risks to private users remain unknown or not well defined. Clear and complete information about these risks must be provided to research participants and consumers of tDCS products or related services. Broader public education initiatives and warnings would also be worthwhile to reach those who are constructing their own tDCS devices. PMID:28123362

  16. tDCS for Memory Enhancement: Analysis of the Speculative Aspects of Ethical Issues.

    PubMed

    Voarino, Nathalie; Dubljević, Veljko; Racine, Eric

    2016-01-01

    Transcranial direct current stimulation (tDCS) is a promising technology to enhance cognitive and physical performance. One of the major areas of interest is the enhancement of memory function in healthy individuals. The early arrival of tDCS on the market for lifestyle uses and cognitive enhancement purposes led to the voicing of some important ethical concerns, especially because, to date, there are no official guidelines or evaluation procedures to tackle these issues. The aim of this article is to review ethical issues related to uses of tDCS for memory enhancement found in the ethics and neuroscience literature and to evaluate how realistic and scientifically well-founded these concerns are. In order to evaluate how plausible or speculative each issue is, we applied the methodological framework described by Racine et al. (2014) for "informed and reflective" speculation in bioethics. This framework could be succinctly presented as requiring: (1) the explicit acknowledgment of factual assumptions and identification of the value attributed to them; (2) the validation of these assumptions with interdisciplinary literature; and (3) the adoption of a broad perspective to support more comprehensive reflection on normative issues. We identified four major considerations associated with the development of tDCS for memory enhancement: safety, autonomy, justice and authenticity. In order to assess the seriousness and likelihood of harm related to each of these concerns, we analyzed the assumptions underlying the ethical issues, and the level of evidence for each of them. We identified seven distinct assumptions: prevalence, social acceptance, efficacy, ideological stance (bioconservative vs. libertarian), potential for misuse, long-term side effects, and the delivery of complete and clear information. We conclude that ethical discussion about memory enhancement via tDCS sometimes involves undue speculation, and closer attention to scientific and social facts would bring a more nuanced analysis. At this time, the most realistic concerns are related to safety and violation of users' autonomy by a breach of informed consent, as potential immediate and long-term health risks to private users remain unknown or not well defined. Clear and complete information about these risks must be provided to research participants and consumers of tDCS products or related services. Broader public education initiatives and warnings would also be worthwhile to reach those who are constructing their own tDCS devices.

  17. Semantic modeling for theory clarification: The realist vs liberal international relations perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bray, O.H.

    This paper describes a natural language based, semantic information modeling methodology and explores its use and value in clarifying and comparing political science theories and frameworks. As an example, the paper uses this methodology to clarify and compare some of the basic concepts and relationships in the realist (e.g. Waltz) and the liberal (e.g. Rosenau) paradigms for international relations. The methodology can provide three types of benefits: (1) it can clarify and make explicit exactly what is meant by a concept; (2) it can often identify unanticipated implications and consequences of concepts and relationships; and (3) it can help in identifying and operationalizing testable hypotheses.

  18. Environments for online maritime simulators with cloud computing capabilities

    NASA Astrophysics Data System (ADS)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We address a multiprocessing situation using advanced technologies and distributed applications, with a remote ship scenario and automation of ship operations.

  19. Simple stochastic model for El Niño with westerly wind bursts

    PubMed Central

    Thual, Sulian; Majda, Andrew J.; Chen, Nan; Stechmann, Samuel N.

    2016-01-01

    Atmospheric wind bursts in the tropics play a key role in the dynamics of the El Niño Southern Oscillation (ENSO). A simple modeling framework is proposed that summarizes this relationship and captures major features of the observational record while remaining physically consistent and amenable to detailed analysis. Within this simple framework, wind burst activity evolves according to a stochastic two-state Markov switching–diffusion process that depends on the strength of the western Pacific warm pool, and is coupled to simple ocean–atmosphere processes that are otherwise deterministic, stable, and linear. A simple model with this parameterization and no additional nonlinearities reproduces a realistic ENSO cycle with intermittent El Niño and La Niña events of varying intensity and strength as well as realistic buildup and shutdown of wind burst activity in the western Pacific. The wind burst activity has a direct causal effect on the ENSO variability: in particular, it intermittently triggers regular El Niño or La Niña events, super El Niño events, or no events at all, which enables the model to capture observed ENSO statistics such as the probability density function and power spectrum of eastern Pacific sea surface temperatures. The present framework provides further theoretical and practical insight on the relationship between wind burst activity and the ENSO. PMID:27573821
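
    The switching–diffusion idea is easy to caricature in a few lines: a two-state Markov chain gates the variance of a diffusing wind-burst amplitude, which in turn forces a damped, linear SST anomaly. The sketch below is such a cartoon, with invented rates and coupling constants rather than the authors' calibrated model.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n = 0.1, 5000                 # time step (months) and steps, illustrative
T = np.zeros(n)                   # eastern Pacific SST anomaly (proxy)
a = np.zeros(n)                   # wind-burst amplitude
state = 0                         # 0 = quiet, 1 = active burst regime

for k in range(1, n):
    # Markov switching: activation more likely when the warm pool is
    # charged (crudely proxied here by positive T); rates are invented.
    p_on = dt * (0.05 + 0.2 * max(T[k-1], 0.0))
    p_off = dt * 0.1
    if state == 0 and rng.random() < p_on:
        state = 1
    elif state == 1 and rng.random() < p_off:
        state = 0

    # Diffusion of burst amplitude in the active state, weak noise otherwise
    sigma = 0.5 if state else 0.05
    a[k] = a[k-1] - dt * 0.3 * a[k-1] + sigma * np.sqrt(dt) * rng.normal()

    # Linear, stable, deterministic 'ocean': damped SST forced by the bursts
    T[k] = T[k-1] + dt * (-0.2 * T[k-1] + 0.8 * a[k-1])

print("std of SST anomaly:", round(float(T.std()), 3),
      "| std of burst amplitude:", round(float(a.std()), 3))
```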

  20. An investigation of the role of current and future remote sensing data systems in numerical meteorology

    NASA Technical Reports Server (NTRS)

    Diak, George R.; Smith, William L.

    1993-01-01

    The goals of this research endeavor have been to develop a flexible and relatively complete framework for the investigation of current and future satellite data sources in numerical meteorology. In order to realistically model how satellite information might be used for these purposes, it is necessary that Observing System Simulation Experiments (OSSEs) be as complete as possible. It is therefore desirable that these experiments simulate in entirety the sequence of steps involved in bringing satellite information from the radiance level through product retrieval to a realistic analysis and forecast sequence. In this project we have worked to make this sequence realistic by synthesizing raw satellite data from surrogate atmospheres, deriving satellite products from these data and subsequently producing analyses and forecasts using the retrieved products. The accomplishments made in 1991 are presented. The emphasis was on examining atmospheric soundings and microphysical products which we expect to produce with the launch of the Advanced Microwave Sounding Unit (AMSU), slated for flight in mid 1994.

  1. End-to-end simulation and verification of GNC and robotic systems considering both space segment and ground segment

    NASA Astrophysics Data System (ADS)

    Benninghoff, Heike; Rems, Florian; Risse, Eicke; Brunner, Bernhard; Stelzer, Martin; Krenn, Rainer; Reiner, Matthias; Stangl, Christian; Gnat, Marcin

    2018-01-01

    In the framework of a project called on-orbit servicing end-to-end simulation, the final approach and capture of a tumbling client satellite in an on-orbit servicing mission are simulated. The necessary components are developed and the entire end-to-end chain is tested and verified. This involves both on-board and on-ground systems. The space segment comprises a passive client satellite, and an active service satellite with its rendezvous and berthing payload. The space segment is simulated using a software satellite simulator and two robotic, hardware-in-the-loop test beds, the European Proximity Operations Simulator (EPOS) 2.0 and the OOS-Sim. The ground segment is established as for a real servicing mission, such that realistic operations can be performed from the different consoles in the control room. During the simulation of the telerobotic operation, it is important to provide a realistic communication environment with parameters as they occur in the real world (realistic delay and jitter, for example).

  2. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic nature of representing simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). A MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated, threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.

  3. A realist evaluation of social prescribing: an exploration into the context and mechanisms underpinning a pathway linking primary care with the voluntary sector.

    PubMed

    Bertotti, Marcello; Frostick, Caroline; Hutt, Patrick; Sohanpal, Ratna; Carnes, Dawn

    2018-05-01

    This article adopts a realist approach to evaluate a social prescribing pilot in the areas of Hackney and City in London (United Kingdom). It unpacks the contextual factors and mechanisms that influenced the development of this pilot for the benefit of GPs, commissioners and practitioners, and reflects on the realist approach to evaluation as a tool for the evaluation of health interventions. Primary care faces considerable challenges, including the increase in long-term conditions, GP consultation rates, and widening health inequalities. With its emphasis on linking primary care to non-clinical community services via a social prescribing coordinator (SPC), some models of social prescribing could help reduce the burden on primary care, tackle health inequalities and encourage people to make greater use of non-clinical forms of support. This realist analysis drew on qualitative interviews with users and commissioners, a GP survey, and focus groups and learning events exploring stakeholders' experience. To enable a detailed analysis, we adapted the realist approach by subdividing the social prescribing pathway into stages, each with contextual factors, mechanisms and outcomes. SPCs were pivotal to the effective functioning of the social prescribing service and responsible for the activation and initial beneficial impact on users. Although social prescribing shows significant potential for the benefit of patients and primary care, several challenges need to be considered and overcome, including 'buy in' from some GPs, branding, and funding for the third sector in a context where social care cuts are severely affecting the delivery of health care. With its emphasis on context and mechanisms, the realist evaluation approach is useful in understanding how to identify and improve health interventions, and to analyse in greater detail the contribution of different stakeholders. As the SPC is central to social prescribing, more needs to be done to understand their role conceptually and practically.
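
    The stage-wise adaptation described above suggests a simple data structure: one context-mechanism-outcome (CMO) configuration per pathway stage. A minimal sketch follows; the stage name and entries are our paraphrases for illustration, not the authors' coding scheme.

```python
from dataclasses import dataclass, field

@dataclass
class CMOConfiguration:
    """One context-mechanism-outcome configuration for a pathway stage."""
    stage: str
    contexts: list[str] = field(default_factory=list)
    mechanisms: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)

# Illustrative content loosely paraphrasing the article's pathway;
# the stage name and entries are ours, not the authors'.
referral = CMOConfiguration(
    stage="GP referral",
    contexts=["GP 'buy in' to social prescribing",
              "consultation time pressure"],
    mechanisms=["trust in the social prescribing coordinator (SPC)"],
    outcomes=["appropriate referrals to the voluntary sector"],
)
print(referral)
```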

  4. Current Development Status of an Integrated Tool for Modeling Quasi-static Deformation in the Solid Earth

    NASA Astrophysics Data System (ADS)

    Williams, C. A.; Dicaprio, C.; Simons, M.

    2003-12-01

    With the advent of projects such as the Plate Boundary Observatory and future InSAR missions, spatially dense geodetic data of high quality will provide an increasingly detailed picture of the movement of the earth's surface. To interpret such information, powerful and easily accessible modeling tools are required. We are presently developing such a tool that we feel will meet many of the needs for evaluating quasi-static earth deformation. As a starting point, we begin with a modified version of the finite element code TECTON, which has been specifically designed to solve tectonic problems involving faulting and viscoelastic/plastic earth behavior. As our first priority, we are integrating the code into the GeoFramework, which is an extension of the Python-based Pyre modeling framework. The goal of this framework is to provide simplified user interfaces for powerful modeling codes, to provide easy access to utilities such as meshers and visualization tools, and to provide a tight integration between different modeling tools so they can interact with each other. The initial integration of the code into this framework is essentially complete, and a more thorough integration, where Python-based drivers control the entire solution, will be completed in the near future. We have an evolving set of priorities that we expect to solidify as we receive more input from the modeling community. Current priorities include the development of linear and quadratic tetrahedral elements, the development of a parallelized version of the code using the PETSc libraries, the addition of more complex rheologies, realistic fault friction models, adaptive time stepping, and spherical geometries. In this presentation we describe current progress toward our various priorities, briefly describe the structure of the code within the GeoFramework, and demonstrate some sample applications.

  5. Automatic Design of Synthetic Gene Circuits through Mixed Integer Non-linear Programming

    PubMed Central

    Huynh, Linh; Kececioglu, John; Köppe, Matthias; Tagkopoulos, Ilias

    2012-01-01

    Automatic design of synthetic gene circuits poses a significant challenge to synthetic biology, primarily due to the complexity of biological systems, and the lack of rigorous optimization methods that can cope with the combinatorial explosion as the number of biological parts increases. Current optimization methods for synthetic gene design rely on heuristic algorithms that are usually not deterministic, deliver sub-optimal solutions, and provide no guarantees on convergence or error bounds. Here, we introduce an optimization framework for the problem of part selection in synthetic gene circuits that is based on mixed integer non-linear programming (MINLP), which is a deterministic method that finds the globally optimal solution and guarantees convergence in finite time. Given a synthetic gene circuit, a library of characterized parts, and user-defined constraints, our method can find the optimal selection of parts that satisfy the constraints and best approximates the objective function given by the user. We evaluated the proposed method in the design of three synthetic circuits (a toggle switch, a transcriptional cascade, and a band detector), with both experimentally constructed and synthetic promoter libraries. Scalability and robustness analysis shows that the proposed framework scales well with the library size and the solution space. The work described here is a step towards a unifying, realistic framework for the automated design of biological circuits. PMID:22536398
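
    For a toy library, the constrained part-selection problem can even be solved by exhaustive enumeration, which makes the "globally optimal under constraints" goal concrete; real instances need the MINLP machinery precisely because this enumeration explodes combinatorially. All part names and numbers below are invented.

```python
from itertools import product

# Toy promoter library: name -> (strength, leakiness); values invented.
library = {"pLow": (1.0, 0.05), "pMed": (5.0, 0.20), "pHigh": (20.0, 0.90)}

target = [4.0, 18.0]           # desired strengths for a 2-promoter cascade
max_total_leak = 1.0           # hypothetical user-defined constraint

best = None
for choice in product(library, repeat=len(target)):
    strengths = [library[p][0] for p in choice]
    leak = sum(library[p][1] for p in choice)
    if leak > max_total_leak:          # discard constraint-violating picks
        continue
    err = sum((s - t) ** 2 for s, t in zip(strengths, target))
    if best is None or err < best[0]:
        best = (err, choice)

print(best)    # globally optimal selection for this toy instance
```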

  6. A framework to predict the impacts of shale gas infrastructures on the forest fragmentation of an agroforest region.

    PubMed

    Racicot, Alexandre; Babin-Roussel, Véronique; Dauphinais, Jean-François; Joly, Jean-Sébastien; Noël, Pascal; Lavoie, Claude

    2014-05-01

    We propose a framework to facilitate the evaluation of the impacts of shale gas infrastructures (well pads, roads, and pipelines) on land cover features, especially with regards to forest fragmentation. We used a geographic information system and realistic development scenarios largely inspired by the PA (United States) experience, but adapted to a region of QC (Canada) with an already fragmented forest cover and a high gas potential. The scenario with the greatest impact results from development limited by regulatory constraints only, with no access to private roads for connecting well pads to the public road network. The scenario with the lowest impact additionally integrates ecological constraints (deer yards, maple woodlots, and wetlands). Overall the differences between these two scenarios are relatively minor, with <1 % of the forest cover lost in each case. However, large areas of core forests would be lost in both scenarios and the number of forest patches would increase by 13-21 % due to fragmentation. The pipeline network would have a much greater footprint on the land cover than access roads. Using data acquired since the beginning of the shale gas industry, we show that it is possible, within a reasonable time frame, to produce a robust assessment of the impacts of shale gas extraction. The framework we propose could easily be applied to other contexts or jurisdictions.
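
    The patch-count metric used in such fragmentation assessments is straightforward to compute on a raster forest mask with connected-component labeling. A minimal sketch on synthetic data follows; the corridor widths and cover fraction are arbitrary.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
forest = rng.random((200, 200)) > 0.4        # synthetic forest mask

# Hypothetical infrastructure: a pipeline right-of-way and an access
# road rasterized as cleared (non-forest) strips.
developed = forest.copy()
developed[:, 95:100] = False                 # pipeline corridor
developed[120:123, :] = False                # access road

for name, mask in [("before", forest), ("after", developed)]:
    _, n_patches = ndimage.label(mask)       # connected-component count
    print(name, "| forest cover:", round(float(mask.mean()), 3),
          "| patches:", n_patches)
```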

  7. CyberMedVPS: visual programming for development of simulators.

    PubMed

    Morais, Aline M; Machado, Liliane S

    2011-01-01

    Computer applications based on Virtual Reality (VR) have been outstanding in training and teaching in the medical field due to their ability to simulate realistic situations in which users can practice skills and decision making. However, the frameworks available to develop such simulators demand programming knowledge, making interaction hard for non-programmer users. To address this problem, we present CyberMedVPS, a graphical module for the CyberMed framework that implements Visual Programming concepts to allow the development of simulators by non-programmer professionals of the medical field.

  8. Toxicity of environmentally realistic concentrations of chlorpyrifos and terbuthylazine in indoor microcosms.

    PubMed

    Pereira, Ana Santos; Cerejeira, Maria José; Daam, Michiel A

    2017-09-01

    Few studies have been conducted into the evaluation of environmentally realistic pesticide mixtures using model ecosystems. In the present study, the effects of single and combined environmentally realistic concentrations of the herbicide terbuthylazine and the insecticide chlorpyrifos were evaluated using laboratory microcosms. Direct toxic effects of chlorpyrifos were noted on copepod nauplii and cladocerans and the recovery of the latter was likely related with the decrease observed in rotifer abundances. Terbuthylazine potentiated the effect of chlorpyrifos on feeding rates of Daphnia magna, presumably by triggering the transformation of chlorpyrifos to more toxic oxon-analogs. Possible food-web interactions resulting from multiple chemical (and other) stressors likely to be present in edge-of-field water bodies need to be further evaluated.

  9. Combining rules, background knowledge and change patterns to maintain semantic annotations.

    PubMed

    Cardoso, Silvio Domingos; Chantal, Reynaud-Delaître; Da Silveira, Marcos; Pruski, Cédric

    2017-01-01

    Knowledge Organization Systems (KOS) play a key role in enriching biomedical information in order to make it machine-understandable and shareable. This is done by annotating medical documents, or more specifically, associating concept labels from KOS with pieces of digital information, e.g., images or texts. However, the dynamic nature of KOS may impact the annotations, thus creating a mismatch between the evolved concept and the associated information. To solve this problem, methods to maintain the quality of the annotations are required. In this paper, we define a framework based on rules, background knowledge and change patterns to drive the annotation adaptation process. We evaluate the proposed approach experimentally in realistic case studies and demonstrate the overall performance of our approach in different KOS, considering the precision, recall, F1-score and AUC value of the system.

  10. Combining rules, background knowledge and change patterns to maintain semantic annotations

    PubMed Central

    Cardoso, Silvio Domingos; Chantal, Reynaud-Delaître; Da Silveira, Marcos; Pruski, Cédric

    2017-01-01

    Knowledge Organization Systems (KOS) play a key role in enriching biomedical information in order to make it machine-understandable and shareable. This is done by annotating medical documents, or more specifically, associating concept labels from KOS with pieces of digital information, e.g., images or texts. However, the dynamic nature of KOS may impact the annotations, thus creating a mismatch between the evolved concept and the associated information. To solve this problem, methods to maintain the quality of the annotations are required. In this paper, we define a framework based on rules, background knowledge and change patterns to drive the annotation adaptation process. We evaluate the proposed approach experimentally in realistic case studies and demonstrate the overall performance of our approach in different KOS, considering the precision, recall, F1-score and AUC value of the system. PMID:29854115
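
    A minimal sketch of the underlying bookkeeping: compare two releases of a toy KOS and flag annotations whose concept was removed or relabeled. The two-case rule set below is a caricature of the paper's rules, background knowledge and change patterns, with invented concept ids.

```python
# Two releases of a toy KOS: concept id -> preferred label (all invented).
kos_2016 = {"C001": "heart attack", "C002": "renal failure"}
kos_2017 = {"C001": "myocardial infarction", "C003": "hepatic failure"}

annotations = [                       # document span annotated with a concept
    {"doc": "report_12.txt", "span": "heart attack", "concept": "C001"},
    {"doc": "report_37.txt", "span": "renal failure", "concept": "C002"},
]

for ann in annotations:
    cid = ann["concept"]
    if cid not in kos_2017:                    # concept retired or split
        print(ann["doc"], ": concept", cid, "removed -> needs re-annotation")
    elif kos_2017[cid] != kos_2016.get(cid):   # preferred label changed
        print(ann["doc"], ": relabel", repr(kos_2016[cid]),
              "->", repr(kos_2017[cid]))
```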

  11. Evaluations on the potential productivity of winter wheat based on agro-ecological zone in the world

    NASA Astrophysics Data System (ADS)

    Wang, H.; Li, Q.; Du, X.; Zhao, L.; Lu, Y.; Li, D.; Liu, J.

    2015-04-01

    Wheat is the most widely grown crop globally and an essential source of calories in human diets. Maintaining and increasing global wheat production is therefore strongly linked to food security. In this paper, an evaluation model of winter wheat potential productivity is proposed based on agro-ecological zones and 30 years (1983-2011) of historical winter wheat yield data obtained from FAO, and the potential production of winter wheat worldwide is investigated. The results showed that the realistic potential productivity of winter wheat was highest in Western Europe, at more than 7500 kg/hm2. The realistic potential productivity in the North China Plain was also high, at about 6000 kg/hm2. However, the realistic potential productivity in the United States, a major winter wheat producing country, was not high, at only about 3000 kg/hm2. Outside these main producing areas, realistic potential productivity was very low, mostly less than 1500 kg/hm2, as in the southwestern region of Russia. The gaps between potential and realistic productivity of winter wheat were largest in Kazakhstan and India, where they exceeded 40% of realistic productivity. In Russia, the gap between potential and realistic productivity was lowest, at only 10% of realistic productivity.

  12. Unified Bayesian Estimator of EEG Reference at Infinity: rREST (Regularized Reference Electrode Standardization Technique)

    PubMed Central

    Hu, Shiang; Yao, Dezhong; Valdes-Sosa, Pedro A.

    2018-01-01

    The choice of reference for the electroencephalogram (EEG) is a long-lasting unsolved issue resulting in inconsistent usages and endless debates. Currently, the average reference (AR) and the reference electrode standardization technique (REST) are the two primary, apparently irreconcilable contenders. We propose a theoretical framework to resolve this reference issue by formulating both (a) estimation of potentials at infinity, and (b) determination of the reference, as a unified Bayesian linear inverse problem, which can be solved by maximum a posteriori estimation. We find that AR and REST are very particular cases of this unified framework: AR results from a biophysically non-informative prior, while REST utilizes the prior based on the EEG generative model. To allow for simultaneous denoising and reference estimation, we develop the regularized versions of AR and REST, named rAR and rREST, respectively. Both depend on a regularization parameter that is the noise-to-signal variance ratio. Traditional and new estimators are evaluated with this framework, by both simulations and analysis of real resting EEGs. Toward this end, we leverage the MRI and EEG data from 89 subjects who participated in the Cuban Human Brain Mapping Project. Artificial EEGs generated with a known ground truth show that the relative error in estimating the EEG potentials at infinity is lowest for rREST. They also reveal that realistic volume conductor models improve the performances of REST and rREST. Importantly, for practical applications, it is shown that an average lead field gives results comparable to the individual lead field. Finally, it is shown that the selection of the regularization parameter with Generalized Cross-Validation (GCV) is close to the “oracle” choice based on the ground truth. When evaluated with the real 89 resting-state EEGs, rREST consistently yields the lowest GCV. This study provides a novel perspective on the EEG reference problem by means of a unified inverse solution framework. It may allow additional principled theoretical formulations and numerical evaluation of performance. PMID:29780302
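
    The GCV criterion mentioned above is generic and easy to demonstrate outside the EEG setting. The sketch below selects a ridge-regression regularization parameter by minimizing GCV(lambda) = n * ||(I - A)y||^2 / tr(I - A)^2, where A is the influence matrix; the data are synthetic and the setup is a stand-in, not the rREST estimator itself.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 80, 30
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + 0.5 * rng.normal(size=n)   # noisy observations

def gcv(lmbda):
    # Ridge influence matrix A = X (X'X + lambda I)^-1 X'
    A = X @ np.linalg.solve(X.T @ X + lmbda * np.eye(p), X.T)
    resid = y - A @ y
    return n * (resid @ resid) / (n - np.trace(A)) ** 2

grid = np.logspace(-3, 2, 40)
best = grid[np.argmin([gcv(l) for l in grid])]
print("GCV-selected regularization parameter:", round(float(best), 4))
```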

  13. Critical Realism and School Effectiveness Research in Colombia: The Difference It Should Make

    ERIC Educational Resources Information Center

    Parra, Juan David

    2018-01-01

    This article draws on the case of academic work produced by Colombian scholars, to address the debate on the persistent failure of policy efforts to improve school effectiveness. Realist meta-theory plays a significant role in this research, because it provides a general framework to identify ontological problems and inconsistencies in empirical…

  14. The dynamics of recreation participation: ski touring in Minnesota

    Treesearch

    Timothy B. Knopp; G. Ballman; L. C. Merriam

    1980-01-01

    A realistic model or framework for the analysis of recreation behavior must be both comprehensive and dynamic. Most attempts to explain recreation behavior are static in that they do not allow for changes in the character of an activity or the evolution of a participant's involvement. Even predictive models tend to assume that relationships remain constant over...

  15. Why Are Teachers Absent? Utilising the Capability Approach and Critical Realism to Explain Teacher Performance in Tanzania

    ERIC Educational Resources Information Center

    Tao, Sharon

    2013-01-01

    Tanzanian teachers have been criticised for a variety of behaviours such as absenteeism, lack of preparation and rote-teaching. This paper introduces an analytical framework that attempts to provide explanations for these behaviours by locating Capability Approach concepts within a Critical Realist theory of causation. Qualitative data from three…

  16. Improving Quality of Life Outcomes in Supported Accommodation for People with Intellectual Disability: What Makes a Difference?

    ERIC Educational Resources Information Center

    Bigby, Christine; Beadle-Brown, Julie

    2018-01-01

    Background: The quality of life (QOL) of people with intellectual disability living in supported accommodation services is variable, influenced by many possible factors. Various frameworks have attempted to identify these factors without assigning value, direction of influence or relative impact on outcomes. Methods: A realist review of the…

  17. National Incorporation of Global Human Rights: Worldwide Expansion of National Human Rights Institutions, 1966-2004

    ERIC Educational Resources Information Center

    Koo, Jeong-Woo; Ramirez, Francisco O.

    2009-01-01

    Using an event history framework we analyze the adoption rate of national human rights institutions. Neo-realist perspective predicts adoption rates to be positively influenced by favorable national profiles that lower the costs and make it more reasonable to establish these institutions. From a world polity perspective adoption rates will be…

  18. Towards an interactive electromechanical model of the heart

    PubMed Central

    Talbot, Hugo; Marchesseau, Stéphanie; Duriez, Christian; Sermesant, Maxime; Cotin, Stéphane; Delingette, Hervé

    2013-01-01

    In this work, we develop an interactive framework for rehearsal of and training in cardiac catheter ablation, and for planning cardiac resynchronization therapy. To this end, an interactive and real-time electrophysiology model of the heart is developed to fit patient-specific data. The proposed interactive framework relies on two main contributions. First, an efficient implementation of cardiac electrophysiology is proposed, using the latest graphics processing unit computing techniques. Second, a mechanical simulation is then coupled to the electrophysiological signals to produce realistic motion of the heart. We demonstrate that pathological mechanical and electrophysiological behaviour can be simulated. PMID:24427533
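
    The electrophysiology half of such a model can be caricatured with a FitzHugh-Nagumo excitable cable: a stimulus at one end launches a propagating activation wave, which in the full framework would then drive the mechanical model. The sketch below is a 1-D toy with invented constants, far simpler than the GPU implementation described above.

```python
import numpy as np

# Toy 1-D excitable cable (FitzHugh-Nagumo), explicit Euler integration
n, dt, dx = 200, 0.05, 1.0
v, w = np.zeros(n), np.zeros(n)       # activation and recovery variables
v[:5] = 1.0                           # stimulus at one end of the cable
D, a, eps, beta = 1.0, 0.1, 0.02, 0.5 # all constants invented
reached = np.zeros(n, dtype=bool)

for _ in range(4000):                 # 200 time units
    lap = np.zeros(n)
    lap[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2   # discrete Laplacian
    v = v + dt * (D * lap + v * (1 - v) * (v - a) - w)   # excitable kinetics
    w = w + dt * eps * (v - beta * w)                    # slow recovery
    reached |= v > 0.5                # record cells the wave has excited

print("cells activated by the propagating wave:", int(reached.sum()))
```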

  19. Lorentz Symmetry Violations from Matter-Gravity Couplings with Lunar Laser Ranging

    NASA Astrophysics Data System (ADS)

    Bourgoin, A.; Le Poncin-Lafitte, C.; Hees, A.; Bouquillon, S.; Francou, G.; Angonin, M.-C.

    2017-11-01

    The standard-model extension (SME) is an effective field theory framework aiming at parametrizing any violation to the Lorentz symmetry (LS) in all sectors of physics. In this Letter, we report the first direct experimental measurement of SME coefficients performed simultaneously within two sectors of the SME framework using lunar laser ranging observations. We consider the pure gravitational sector and the classical point-mass limit in the matter sector of the minimal SME. We report no deviation from general relativity and put new realistic stringent constraints on LS violations improving up to 3 orders of magnitude previous estimations.

  20. Auditory steady state responses and cochlear implants: Modeling the artifact-response mixture in the perspective of denoising.

    PubMed

    Mina, Faten; Attina, Virginie; Duroc, Yvan; Veuillet, Evelyne; Truy, Eric; Thai-Van, Hung

    2017-01-01

    Auditory steady state responses (ASSRs) in cochlear implant (CI) patients are contaminated by the spread of a continuous CI electrical stimulation artifact. The aim of this work was to model the electrophysiological mixture of the CI artifact and the corresponding evoked potentials on scalp electrodes in order to evaluate the performance of denoising algorithms in eliminating the CI artifact in a controlled environment. The basis of the proposed computational framework is a neural mass model representing the nodes of the auditory pathways. Six main contributors to auditory evoked potentials from the cochlear level and up to the auditory cortex were taken into consideration. The simulated dynamics were then projected into a 3-layer realistic head model. 32-channel scalp recordings of the CI artifact-response were then generated by solving the electromagnetic forward problem. As an application, the framework's simulated 32-channel datasets were used to compare the performance of 4 commonly used Independent Component Analysis (ICA) algorithms: infomax, extended infomax, jade and fastICA in eliminating the CI artifact. As expected, two major components were detectable in the simulated datasets, a low frequency component at the modulation frequency and a pulsatile high frequency component related to the stimulation frequency. The first can be attributed to the phase-locked ASSR and the second to the stimulation artifact. Among the ICA algorithms tested, simulations showed that infomax was the most efficient and reliable in denoising the CI artifact-response mixture. Denoising algorithms can induce undesirable deformation of the signal of interest in real CI patient recordings. The proposed framework is a valuable tool for evaluating these algorithms in a controllable environment ahead of experimental or clinical applications.
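
    The evaluation logic — mix known sources, unmix with ICA, check which component recovers the response — can be sketched without any head model. The toy below uses scikit-learn's FastICA (the paper found infomax most reliable; FastICA is simply what scikit-learn ships) on a synthetic 40 Hz response plus a pulsatile artifact, with an invented mixing matrix in place of the forward solution.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
t = np.arange(0, 2, 1 / 1000)                    # 2 s at 1 kHz

assr = np.sin(2 * np.pi * 40 * t)                # 40 Hz steady-state response
artifact = (np.arange(t.size) % 10 == 0).astype(float)  # pulsatile train

sources = np.c_[assr, artifact]                  # (samples, 2)
mixing = rng.normal(size=(8, 2))                 # 8 synthetic scalp channels
eeg = sources @ mixing.T + 0.05 * rng.normal(size=(t.size, 8))

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(eeg)               # unmixed components
corr = [abs(np.corrcoef(recovered[:, i], assr)[0, 1]) for i in range(2)]
print("correlation of each component with the 40 Hz response:",
      np.round(corr, 2))
```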

  1. A novel medical image data-based multi-physics simulation platform for computational life sciences.

    PubMed

    Neufeld, Esra; Szczerba, Dominik; Chavannes, Nicolas; Kuster, Niels

    2013-04-06

    Simulating and modelling complex biological systems in computational life sciences requires specialized software tools that can perform medical image data-based modelling, jointly visualize the data and computational results, and handle large, complex, realistic and often noisy anatomical models. The required novel solvers must provide the power to model the physics, biology and physiology of living tissue within the full complexity of the human anatomy (e.g. neuronal activity, perfusion and ultrasound propagation). A multi-physics simulation platform satisfying these requirements has been developed for applications including device development and optimization, safety assessment, basic research, and treatment planning. This simulation platform consists of detailed, parametrized anatomical models, a segmentation and meshing tool, a wide range of solvers and optimizers, a framework for the rapid development of specialized and parallelized finite element method solvers, a Visualization Toolkit (VTK)-based visualization engine, a Python scripting interface for customized applications, a coupling framework, and more. Core components are cross-platform compatible and use open formats. Several examples of applications are presented: hyperthermia cancer treatment planning, tumour growth modelling, evaluating the magneto-haemodynamic effect as a biomarker and physics-based morphing of anatomical models.

  2. Generating realistic environments for cyber operations development, testing, and training

    NASA Astrophysics Data System (ADS)

    Berk, Vincent H.; Gregorio-de Souza, Ian; Murphy, John P.

    2012-06-01

    Training effective cyber operatives requires realistic network environments that incorporate the structural and social complexities representative of the real world. Network traffic generators facilitate repeatable experiments for the development, training and testing of cyber operations. However, current network traffic generators, ranging from simple load testers to complex frameworks, fail to capture the realism inherent in actual environments. In order to improve the realism of network traffic generated by these systems, it is necessary to quantitatively measure the level of realism in generated traffic with respect to the environment being mimicked. We categorize realism measures into statistical, content, and behavioral measurements, and propose various metrics that can be applied at each level to indicate how effectively the generated traffic mimics the real world.
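
    At the statistical level, one concrete metric is a two-sample test between captured and generated inter-arrival times. The sketch below applies a Kolmogorov-Smirnov test to synthetic samples standing in for real captures; it is one plausible metric of this kind, not the paper's specific proposal.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(4)
# Inter-arrival times (seconds): 'captured' vs generated (both synthetic here).
real_iat = rng.lognormal(mean=-3.0, sigma=1.0, size=5000)
generated_iat = rng.exponential(scale=0.05, size=5000)

stat, p = ks_2samp(real_iat, generated_iat)
print(f"KS statistic={stat:.3f}, p={p:.2e}")
# A large statistic / tiny p-value flags a statistical-level realism gap;
# content- and behavior-level measures would need their own metrics.
```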

  3. Framework for National Flood Risk Assessment for Canada

    NASA Astrophysics Data System (ADS)

    Elshorbagy, A. A.; Raja, B.; Lakhanpal, A.; Razavi, S.; Ceola, S.; Montanari, A.

    2016-12-01

    Worldwide, floods are among the most common catastrophic events, resulting in loss of life and property. These natural hazards cannot be avoided, but their consequences can certainly be reduced by having prior knowledge of their occurrence and impact. In the context of floods, the terms occurrence and impact are substituted by flood hazard and flood vulnerability, respectively, which collectively define the flood risk. There is a pressing need to identify flood-prone areas and to quantify the risk associated with them. The present study aims at delivering flood risk maps, which prioritize the potential flood risk areas in Canada. The methodology adopted in this study involves integrating various available spatial datasets such as nightlights satellite imagery, land use, population and the digital elevation model, to build a flexible framework for national flood risk assessment for Canada. The flood risk framework assists in identifying the flood-prone areas and evaluating the associated risk. All these spatial datasets were brought to a common GIS platform for flood risk analysis. The spatial datasets deliver the socioeconomic and topographical information that is required for evaluating the flood vulnerability and flood hazard, respectively. Nightlight imagery was investigated as a proxy for human activity, identifying areas of economic investment. However, other datasets, including existing flood protection measures, were added to arrive at a realistic flood assessment framework. Furthermore, the city of Calgary was used as an example to investigate the effect of using Digital Elevation Models (DEMs) of varying resolutions on risk maps. Along with this, the risk map for the city was further enhanced by including population data, giving the risk map a social dimension. Flood protection measures play a major role by significantly reducing the flood risk of events with a specific return period. An analysis to update the risk maps when information on protection measures is available was carried out for the city of Winnipeg, Canada. The proposed framework is a promising approach to identify and prioritize flood-prone areas, which are in need of intervention or detailed studies.
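
    At its core, such a framework overlays normalized layers on a common grid, with protection measures discounting the hazard-exposure product. The sketch below is a deliberately crude multiplicative index on synthetic rasters, not the study's GIS workflow; all layer values and the protected corridor are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
shape = (100, 100)

# Normalized input layers on a common grid (synthetic stand-ins for
# DEM-derived hazard, nightlights/population exposure, and protection).
hazard = rng.random(shape)          # e.g. low-lying, near-channel cells
exposure = rng.random(shape)        # e.g. nightlights or population density
protection = np.zeros(shape)        # levee-protected cells
protection[40:60, :] = 0.8          # hypothetical protected corridor

risk = hazard * exposure * (1.0 - protection)   # simple multiplicative index
top = risk > np.quantile(risk, 0.99)            # highest-priority cells
print("top-percentile risk cells:", int(top.sum()))
```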

  4. Kinematic evaluation of virtual walking trajectories.

    PubMed

    Cirio, Gabriel; Olivier, Anne-Hélène; Marchal, Maud; Pettré, Julien

    2013-04-01

    Virtual walking, a fundamental task in Virtual Reality (VR), is greatly influenced by the locomotion interface being used, by the specificities of input and output devices, and by the way the virtual environment is represented. No matter how virtual walking is controlled, the generation of realistic virtual trajectories is absolutely required for some applications, especially those dedicated to the study of walking behaviors in VR, navigation through virtual places for architecture, rehabilitation and training. Previous studies evaluating the realism of locomotion trajectories have mostly considered the result of the locomotion task (efficiency, accuracy) and its subjective perception (presence, cybersickness). Few focused on the locomotion trajectory itself, and then only in situations with geometrically constrained tasks. In this paper, we study the realism of unconstrained trajectories produced during virtual walking by addressing the following question: did the user reach his destination by virtually walking along a trajectory he would have followed in similar real conditions? To this end, we propose a comprehensive evaluation framework consisting of a set of trajectographical criteria and a locomotion model to generate reference trajectories. We consider a simple locomotion task where users walk between two oriented points in space. The travel path is analyzed both geometrically and temporally in comparison to simulated reference trajectories. In addition, we demonstrate the framework through a user study which considered an initial set of common and frequent virtual walking conditions, namely different input devices, output display devices, control laws, and visualization modalities. The study provides insight into the relative contributions of each condition to the overall realism of the resulting virtual trajectories.
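
    One simple trajectographical criterion of the kind described is the mean geometric deviation between the user's path and a simulated reference after arc-length resampling. The sketch below implements only that one criterion, with made-up coordinates; the paper's framework combines several geometric and temporal measures.

```python
import numpy as np

def resample(path, n=100):
    """Resample a 2-D polyline to n points, uniformly by arc length."""
    path = np.asarray(path, dtype=float)
    seg = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(path, axis=0), axis=1))]
    s = np.linspace(0.0, seg[-1], n)
    return np.c_[np.interp(s, seg, path[:, 0]), np.interp(s, seg, path[:, 1])]

def mean_deviation(user_path, reference_path):
    u, r = resample(user_path), resample(reference_path)
    return float(np.linalg.norm(u - r, axis=1).mean())  # same units as input

# Made-up user and simulated reference trajectories (meters)
user = [(0, 0), (1.2, 0.4), (2.5, 1.8), (3, 3)]
reference = [(0, 0), (1, 0.5), (2, 1.9), (3, 3)]
print(f"mean geometric deviation: {mean_deviation(user, reference):.2f} m")
```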

  5. Optimal Wastewater Loading under Conflicting Goals and Technology Limitations in a Riverine System.

    PubMed

    Rafiee, Mojtaba; Lyon, Steve W; Zahraie, Banafsheh; Destouni, Georgia; Jaafarzadeh, Nemat

    2017-03-01

    This paper investigates a novel simulation-optimization (S-O) framework for identifying optimal treatment levels and treatment processes for multiple wastewater dischargers to rivers. A commonly used water quality simulation model, Qual2K, was linked to a Genetic Algorithm optimization model for exploration of relevant fuzzy objective-function formulations addressing the imprecision and conflicting goals of pollution control agencies and various dischargers. Results showed a dynamic flow dependence of optimal wastewater loading, with good convergence to a near-global optimum. Explicit consideration of real-world technological limitations, developed here in a new S-O framework, led to better compromise solutions between conflicting goals than those identified within traditional S-O frameworks. The newly developed framework, in addition to being more technologically realistic, is also less complicated and converges on solutions more rapidly than traditional frameworks. This technique marks a significant step forward for the development of holistic, riverscape-based approaches that balance the conflicting needs of stakeholders.
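
    Fuzzy objective functions of this kind typically score each goal with a membership function and then maximize the worst (max-min) satisfaction across stakeholders. The sketch below shows a linear membership for a hypothetical dissolved-oxygen goal with invented thresholds; it is a generic illustration, not Qual2K output or the study's exact formulation.

```python
def fuzzy_satisfaction(do_level, desirable=6.0, minimum=4.0):
    """Membership in [0, 1] for a dissolved-oxygen goal (mg/L).

    Linear fuzzy membership: fully satisfied at or above the desirable
    level, unacceptable at or below the minimum; thresholds invented.
    """
    if do_level >= desirable:
        return 1.0
    if do_level <= minimum:
        return 0.0
    return (do_level - minimum) / (desirable - minimum)

# Max-min compromise: the objective a GA would maximize is the worst
# satisfaction over all river checkpoints (values invented).
checkpoints = [5.2, 4.6, 6.3]
print(min(fuzzy_satisfaction(x) for x in checkpoints))
```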

  6. Using virtual reality to assess user experience.

    PubMed

    Rebelo, Francisco; Noriega, Paulo; Duarte, Emília; Soares, Marcelo

    2012-12-01

    The aim of this article is to discuss how user experience (UX) evaluation can benefit from the use of virtual reality (VR). UX is usually evaluated in laboratory settings. However, considering that UX occurs as a consequence of the interaction between the product, the user, and the context of use, the assessment of UX can benefit from a more ecological test setting. VR provides the means to develop realistic-looking virtual environments with the advantage of allowing greater control of the experimental conditions while granting good ecological validity. The methods used to evaluate UX, as well as their main limitations, are identified. The current VR equipment and its potential applications (as well as its limitations and drawbacks) to overcome some of the limitations in the assessment of UX are highlighted. The relevance of VR for UX studies is discussed, and a VR-based framework for evaluating UX is presented. UX research may benefit from a VR-based methodology in the areas of user research (e.g., assessment of users' expectations derived from their lifestyles) and human-product interaction (e.g., assessment of users' emotions from the first moment of contact with the product and then during the interaction). This article provides knowledge to researchers and professionals engaged in the design of technological interfaces about the usefulness of VR in the evaluation of UX.

  7. Transforming the patient care environment with Lean Six Sigma and realistic evaluation.

    PubMed

    Black, Jason

    2009-01-01

    Lean Six Sigma (LSS) is a structured methodology for transforming processes, but it does not fully consider the complex social interactions that cause processes to form in hospital organizations. By combining LSS implementations with the concept of Realistic Evaluation, a methodology that promotes change by assessing and considering the individual characteristics of an organization's social environment, successful and sustainable process improvement is more likely.

  8. Spatiotemporal Visualization of Time-Series Satellite-Derived CO2 Flux Data Using Volume Rendering and Gpu-Based Interpolation on a Cloud-Driven Digital Earth

    NASA Astrophysics Data System (ADS)

    Wu, S.; Yan, Y.; Du, Z.; Zhang, F.; Liu, R.

    2017-10-01

    The ocean carbon cycle has a significant influence on global climate, and is commonly evaluated using time-series satellite-derived CO2 flux data. Location-aware and globe-based visualization is an important technique for analyzing and presenting the evolution of climate change. To achieve realistic simulation of the spatiotemporal dynamics of ocean carbon, a cloud-driven digital earth platform is developed to support the interactive analysis and display of multi-geospatial data, and an original visualization method based on our digital earth is proposed to demonstrate the spatiotemporal variations of carbon sinks and sources using time-series satellite data. Specifically, a volume rendering technique using half-angle slicing and a particle system is implemented to dynamically display the released or absorbed CO2 gas. To enable location-aware visualization within the virtual globe, we present a 3D particle-mapping algorithm to render particle-slicing textures onto geospace. In addition, a GPU-based interpolation framework using CUDA during real-time rendering is designed to obtain smooth effects in both spatial and temporal dimensions. To demonstrate the capabilities of the proposed method, a series of satellite data is applied to simulate the air-sea carbon cycle in the China Sea. The results show that the suggested strategies provide realistic simulation effects and acceptable interactive performance on the digital earth.
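
    The temporal half of the interpolation is conceptually just a blend between consecutive satellite composites; the paper performs it on the GPU with CUDA during rendering. Below is a CPU stand-in on synthetic grids, with invented grid sizes and values.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, alpha):
    """Linear temporal interpolation between two CO2-flux grids.

    A CPU stand-in for a GPU (CUDA) interpolation kernel; alpha in [0, 1]
    is the normalized time between the two satellite composites.
    """
    return (1.0 - alpha) * frame_a + alpha * frame_b

# Synthetic monthly flux grids (1-degree global raster, values invented)
rng = np.random.default_rng(6)
jan = rng.normal(size=(180, 360))
feb = jan + 0.1
mid_month = interpolate_frames(jan, feb, 0.5)
print(mid_month.shape)
```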

  9. Optimizing performance of hybrid FSO/RF networks in realistic dynamic scenarios

    NASA Astrophysics Data System (ADS)

    Llorca, Jaime; Desai, Aniket; Baskaran, Eswaran; Milner, Stuart; Davis, Christopher

    2005-08-01

    Hybrid Free Space Optical (FSO) and Radio Frequency (RF) networks promise highly available wireless broadband connectivity and quality of service (QoS), particularly suitable for emerging network applications involving extremely high data rate transmissions such as high quality video-on-demand and real-time surveillance. FSO links are prone to atmospheric obscuration (fog, clouds, snow, etc.) and are difficult to align over long distances due to the use of narrow laser beams and the effect of atmospheric turbulence. These problems can be mitigated by using adjunct directional RF links, which provide backup connectivity. In this paper, methodologies for modeling and simulation of hybrid FSO/RF networks are described. Individual link propagation models are derived using scattering theory, as well as experimental measurements. MATLAB is used to generate realistic atmospheric obscuration scenarios, including moving cloud layers at different altitudes. These scenarios are then imported into a network simulator (OPNET) to emulate mobile hybrid FSO/RF networks. This framework allows accurate analysis of the effects of node mobility, atmospheric obscuration and traffic demands on network performance, and precise evaluation of topology reconfiguration algorithms as they react to dynamic changes in the network. Results show how topology reconfiguration algorithms, together with enhancements to TCP/IP protocols which reduce the network response time, enable the network to rapidly detect and act upon link state changes in highly dynamic environments, ensuring optimized network performance and availability.

  10. A computational model for epidural electrical stimulation of spinal sensorimotor circuits.

    PubMed

    Capogrosso, Marco; Wenger, Nikolaus; Raspopovic, Stanisa; Musienko, Pavel; Beauparlant, Janine; Bassi Luciani, Lorenzo; Courtine, Grégoire; Micera, Silvestro

    2013-12-04

    Epidural electrical stimulation (EES) of lumbosacral segments can restore a range of movements after spinal cord injury. However, the mechanisms and neural structures through which EES facilitates movement execution remain unclear. Here, we designed a computational model and performed in vivo experiments to investigate the type of fibers, neurons, and circuits recruited in response to EES. We first developed a realistic finite element computer model of rat lumbosacral segments to identify the currents generated by EES. To evaluate the impact of these currents on sensorimotor circuits, we coupled this model with an anatomically realistic axon-cable model of motoneurons, interneurons, and myelinated afferent fibers for antagonistic ankle muscles. Comparisons between computer simulations and experiments revealed the ability of the model to predict EES-evoked motor responses over multiple intensities and locations. Analysis of the recruited neural structures revealed the lack of direct influence of EES on motoneurons and interneurons. Simulations and pharmacological experiments demonstrated that EES engages spinal circuits trans-synaptically through the recruitment of myelinated afferent fibers. The model also predicted the capacity of spatially distinct EES to modulate side-specific limb movements and, to a lesser extent, extension versus flexion. These predictions were confirmed during standing and walking enabled by EES in spinal rats. These combined results provide a mechanistic framework for the design of spinal neuroprosthetic systems to improve standing and walking after neurological disorders.
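
    A common first-order heuristic for where extracellular currents depolarize fibers, in the spirit of coupling a field solution to axon models, is Rattay's activating function, the second spatial difference of the extracellular potential along the fiber (an illustration only, not the authors' full finite-element/axon-cable pipeline):

    ```python
    import numpy as np

    def activating_function(v_ext, dx):
        """Rattay's activating function: the second spatial difference of
        the extracellular potential along an axon. Strongly positive nodes
        are candidate sites of depolarization under epidural stimulation."""
        return np.diff(v_ext, n=2) / dx**2

    # In a real pipeline v_ext comes from a finite-element solve of the
    # stimulation currents, sampled at the nodes of Ranvier; here a toy
    # point-source potential stands in.
    x = np.linspace(-0.01, 0.01, 201)            # m, position along the fiber
    v_ext = 1e-3 / np.sqrt(x**2 + 1e-6)          # toy extracellular potential
    f = activating_function(v_ext, dx=x[1] - x[0])
    ```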

  11. A proposed ethical framework for vaccine mandates: competing values and the case of HPV.

    PubMed

    Field, Robert I; Caplan, Arthur L

    2008-06-01

    Debates over vaccine mandates raise intense emotions, as reflected in the current controversy over whether to mandate the vaccine against human papilloma virus (HPV), the virus that can cause cervical cancer. Public health ethics so far has failed to facilitate meaningful dialogue between the opposing sides. When stripped of its emotional charge, the debate can be framed as a contest between competing ethical values. This framework can be conceptualized graphically as a conflict between autonomy on the one hand, which militates against government intrusion, and beneficence, utilitarianism, justice, and nonmaleficence on the other, which may lend support to intervention. When applied to the HPV vaccine, this framework would support a mandate based on utilitarianism, if certain conditions are met and if herd immunity is a realistic objective.

  12. The Nature of a Literacy-Based Tutoring Program for At-Risk Youth: Mentorship, Professional Development, and Implementation

    ERIC Educational Resources Information Center

    Lopez-Guerra, Maria Asusena

    2013-01-01

    The purpose of this research was to gain and provide an in-depth, holistic description and interpretation of the knowledge and literacy instruction tutors at Readers Advance provide students. Guided by a post-positivist realist framework and grounded theory methodology, qualitative inquiry design strategies were used to guide this research. This…

  13. Development of Turbulent Biological Closure Parameterizations

    DTIC Science & Technology

    2011-09-30

    LONG-TERM GOAL: The long-term goals of this project are: (1) to develop a theoretical framework to quantify turbulence-induced NPZ interactions, and (2) to apply the theory to develop parameterizations to be used in realistic coupled physical-biological environmental models. OBJECTIVES: Connect the Goodman and Robinson (2008) statistically based probability density function (PDF) theory to Advection-Diffusion-Reaction (ADR) modeling of NPZ interaction.
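
    For reference, the generic advection-diffusion-reaction form into which NPZ (nutrient-phytoplankton-zooplankton) interactions are embedded is

    ```latex
    \frac{\partial C_i}{\partial t} + \mathbf{u}\cdot\nabla C_i
      = \nabla \cdot \left( \kappa \, \nabla C_i \right) + R_i(C_1,\dots,C_n),
    ```

    where C_i is the concentration of component i (N, P or Z), u the fluid velocity, kappa the turbulent diffusivity, and R_i the biological reaction terms; the closure problem is parameterizing the turbulence-induced covariances among the C_i.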

  14. The Educational Value of Visual Cues and 3D-Representational Format in a Computer Animation under Restricted and Realistic Conditions

    ERIC Educational Resources Information Center

    Huk, Thomas; Steinke, Mattias; Floto, Christian

    2010-01-01

    Within the framework of cognitive learning theories, instructional design manipulations have primarily been investigated under tightly controlled laboratory conditions. We carried out two experiments, where the first experiment was conducted in a restricted system-paced setting and is therefore in line with the majority of empirical studies in the…

  15. Recognition of Prior Learning: The Tensions between Its Inclusive Intentions and Constraints on Its Implementation

    ERIC Educational Resources Information Center

    Cooper, Linda; Ralphs, Alan; Harris, Judy

    2017-01-01

    This article provides some insight into the constraints on the potential of recognition of prior learning (RPL) to widen access to educational qualifications. Its focus is on a conceptual framework that emerged from a South African study of RPL practices across four different learning contexts. Working from a social realist perspective, it argues…

  16. Cultural Implications of a Global Context: The Need for the Reference Librarian To Ask Again "Who Is My Client?".

    ERIC Educational Resources Information Center

    McSwiney, Carolyn

    Globalization provides the contextual framework for cultural changes in the library user group. In order to be more effective, and realistically, more client-focused, the reference librarian is challenged to ask again "Who is my client?" in this changing context. This paper presents a positive and practical response to cultural change…

  17. A generic framework for individual-based modelling and physical-biological interaction

    PubMed Central

    2018-01-01

    The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences for the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with a realistic 3D oceanographic model of physics and biogeochemistry describing the environment of the organisms, without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions, comparison of physical circulation models, model ensemble runs and, recently, posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progress in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
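
    The essence of such a coupling interface is that the biological side only ever asks the physical side for local conditions. A stripped-down sketch of that contract (hypothetical names; IBMlib's actual interface is far richer):

    ```python
    import numpy as np

    class OceanField:
        """Stand-in for the hydrodynamic side of the coupling interface:
        anything that can return a velocity for a position and time."""
        def velocity(self, pos, t):
            # Hypothetical stub; a real driver would interpolate a 3D
            # circulation-model grid, as IBMlib's interface layer does.
            return np.array([0.1, 0.0, 0.0])  # m/s, uniform eastward drift

    def advect(field, pos, t, dt):
        """Forward-Euler step of a Lagrangian particle; higher-order
        schemes (e.g. RK4) are the usual production choice."""
        return pos + field.velocity(pos, t) * dt

    pos = np.array([0.0, 0.0, -10.0])   # x, y, depth (toy units)
    for step in range(24):
        pos = advect(OceanField(), pos, t=step * 3600.0, dt=3600.0)
    ```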

  18. Modelling and analysis of the sugar cataract development process using stochastic hybrid systems.

    PubMed

    Riley, D; Koutsoukos, X; Riley, K

    2009-05-01

    Modelling and analysis of biochemical systems such as sugar cataract development (SCD) are critical because they can provide new insights into systems, which cannot be easily tested with experiments; however, they are challenging problems due to the highly coupled chemical reactions that are involved. The authors present a stochastic hybrid system (SHS) framework for modelling biochemical systems and demonstrate the approach for the SCD process. A novel feature of the framework is that it allows modelling the effect of drug treatment on the system dynamics. The authors validate the three sugar cataract models by comparing trajectories computed by two simulation algorithms. Further, the authors present a probabilistic verification method for computing the probability of sugar cataract formation for different chemical concentrations using safety and reachability analysis methods for SHSs. The verification method employs dynamic programming based on a discretisation of the state space and therefore suffers from the curse of dimensionality. To analyse the SCD process, a parallel dynamic programming implementation that can handle large, realistic systems was developed. Although scalability is a limiting factor, this work demonstrates that the proposed method is feasible for realistic biochemical systems.

  19. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions

    PubMed Central

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which can efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262
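
    As a concrete instance of the stochastic generative networks discussed, a restricted Boltzmann machine with one alternating Gibbs-sampling sweep (a minimal sketch, not the authors' specific architecture):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class RBM:
        """Minimal restricted Boltzmann machine, the canonical building
        block of stochastic, generative (deep) neural networks."""
        def __init__(self, n_vis, n_hid):
            self.W = rng.normal(0.0, 0.01, (n_vis, n_hid))
            self.b_v = np.zeros(n_vis)
            self.b_h = np.zeros(n_hid)

        def gibbs_step(self, v):
            """One alternating Gibbs sweep: sample hidden units given the
            visible layer, then reconstruct the visible layer."""
            h = rng.random(self.b_h.shape) < sigmoid(v @ self.W + self.b_h)
            v_new = rng.random(self.b_v.shape) < sigmoid(h @ self.W.T + self.b_v)
            return v_new.astype(float), h.astype(float)

    rbm = RBM(n_vis=6, n_hid=3)
    v, h = rbm.gibbs_step(np.array([1.0, 0.0, 1.0, 1.0, 0.0, 0.0]))
    ```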

  1. A model for seasonal changes in GPS positions and seismic wave speeds due to thermoelastic and hydrologic variations

    USGS Publications Warehouse

    Tsai, V.C.

    2011-01-01

    It is known that GPS time series contain a seasonal variation that is not due to tectonic motions, and it has recently been shown that crustal seismic velocities may also vary seasonally. In order to explain these changes, a number of hypotheses have been given, among which thermoelastic and hydrology-induced stresses and strains are leading candidates. Unfortunately, though, since a general framework does not exist for understanding such seasonal variations, it is currently not possible to quickly evaluate the plausibility of these hypotheses. To fill this gap in the literature, I generalize a two-dimensional thermoelastic strain model to provide an analytic solution for the displacements and wave speed changes due to either thermoelastic stresses or hydrologic loading, which consists of poroelastic stresses and purely elastic stresses. The thermoelastic model assumes a periodic surface temperature, and the hydrologic models similarly assume a periodic near-surface water load. Since all three models are two-dimensional and periodic, they are expected to only approximate any realistic scenario; but the models nonetheless provide a quantitative framework for estimating the effects of thermoelastic and hydrologic variations. Quantitative comparison between the models and observations is further complicated by the large uncertainty in some of the relevant parameters. Despite this uncertainty, though, I find that maximum realistic thermoelastic effects are unlikely to explain a large fraction of the observed annual variation in a typical GPS displacement time series or of the observed annual variations in seismic wave speeds in southern California. Hydrologic loading, on the other hand, may be able to explain a larger fraction of both the annual variations in displacements and seismic wave speeds. Neither model is likely to explain all of the seismic wave speed variations inferred from observations. However, more definitive conclusions cannot be made until the model parameters are better constrained. Copyright © 2011 by the American Geophysical Union.
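
    The conductive half-space solution behind such periodic-temperature forcing is standard: for a surface temperature oscillating as ΔT cos(ωt), the temperature at depth z is

    ```latex
    T(z,t) = \Delta T \, e^{-z/d} \cos\!\left(\omega t - z/d\right),
    \qquad d = \sqrt{2\kappa/\omega},
    ```

    so the annual wave is attenuated and phase-lagged with the thermal skin depth d, of order a few meters for typical rock diffusivities; the induced thermoelastic strain inherits this shallow, delayed forcing.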

  2. Nonlinear characterization of a bolted, industrial structure using a modal framework

    NASA Astrophysics Data System (ADS)

    Roettgen, Daniel R.; Allen, Matthew S.

    2017-02-01

    This article presents measurements from a sub assembly of an off-the-shelf automotive exhaust system containing a bolted-flange connection and uses a recently proposed modal framework to develop a nonlinear dynamic model for the structure. The nonlinear identification and characterization methods used are reviewed to highlight the strengths of the current approach and the areas where further development is needed. This marks the first use of these new testing and nonlinear identification tools, and the associated modal framework, on production hardware with a realistic joint and realistic torque levels. To screen the measurements for nonlinearities, we make use of a time frequency analysis routine designed for transient responses called the zeroed early-time fast Fourier transform (ZEFFT). This tool typically reveals the small frequency shifts and distortions that tend to occur near each mode that is affected by the nonlinearity. The damping in this structure is found to be significantly nonlinear and a Hilbert transform is used to characterize the damping versus amplitude behavior. A model is presented that captures these effects for each mode individually (e.g. assuming negligible nonlinear coupling between modes), treating each mode as a single degree-of-freedom oscillator with a spring and viscous damping element in parallel with a four parameter Iwan model. The parameters of this model are identified for each of the structure's modes that exhibited nonlinearity and the resulting nonlinear model is shown to capture the stiffness and damping accurately over a large range of response amplitudes.
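
    A sketch of the Hilbert-transform step used to characterize amplitude-dependent damping from a ring-down measurement (a generic signal-processing illustration, not the authors' exact code):

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def envelope_decay(ring_down, fs):
        """Hilbert-envelope damping characterization: the analytic signal's
        magnitude is the instantaneous amplitude, and the slope of its
        logarithm gives an amplitude-dependent decay rate (~ zeta*omega_n)."""
        env = np.abs(hilbert(ring_down))
        t = np.arange(ring_down.size) / fs
        decay = -np.gradient(np.log(env), t)
        return env, decay

    fs = 2048.0
    t = np.arange(0.0, 2.0, 1.0 / fs)
    signal = np.exp(-0.8 * t) * np.cos(2 * np.pi * 40.0 * t)  # toy ring-down
    env, decay = envelope_decay(signal, fs)  # decay ~ 0.8 away from the edges
    ```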

  3. What are the assets and weaknesses of HFO detectors? A benchmark framework based on realistic simulations

    PubMed Central

    Pizzo, Francesca; Bartolomei, Fabrice; Wendling, Fabrice; Bénar, Christian-George

    2017-01-01

    High-frequency oscillations (HFO) have been suggested as biomarkers of epileptic tissues. While visual marking of these short and small oscillations is tedious and time-consuming, automatic HFO detectors have not yet met a large consensus. Even though detectors have been shown to perform well when validated against visual marking, the large number of false detections due to their lack of robustness hinders their clinical application. In this study, we developed a validation framework based on realistic and controlled simulations to quantify precisely the assets and weaknesses of current detectors. We constructed a dictionary of synthesized elements—HFOs and epileptic spikes—from different patients and brain areas by extracting these elements from the original data using discrete wavelet transform coefficients. These elements were then added to their corresponding simulated background activity (preserving patient- and region-specific spectra). We tested five existing detectors against this benchmark. Unlike other studies comparing detectors, we not only ranked them according to their performance but also investigated the reasons leading to these results. Our simulations, thanks to their realism and their variability, enabled us to highlight unreported issues of current detectors: (1) the lack of robust estimation of the background activity, (2) the underestimated impact of the 1/f spectrum, and (3) the inadequate criteria defining an HFO. We believe that our benchmark framework could be a valuable tool to translate HFOs into a clinical environment. PMID:28406919
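
    One of the flagged issues, the underestimated impact of the 1/f spectrum, is easy to reproduce: background activity with a power-law spectrum can be synthesized by shaping white noise in the frequency domain (a generic sketch; the benchmark's backgrounds additionally preserve patient-specific spectra):

    ```python
    import numpy as np

    def power_law_background(n, fs, exponent=1.0, seed=0):
        """Synthesize 1/f**exponent background activity by shaping white
        noise in the frequency domain -- the kind of realistic background
        that stresses amplitude-threshold HFO detectors."""
        rng = np.random.default_rng(seed)
        freqs = np.fft.rfftfreq(n, 1.0 / fs)
        spectrum = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
        spectrum[1:] /= freqs[1:] ** (exponent / 2)  # shape the amplitude spectrum
        spectrum[0] = 0.0                            # zero the DC component
        return np.fft.irfft(spectrum, n)

    bg = power_law_background(n=10_000, fs=2048.0)   # ~5 s of simulated background
    ```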

  4. High Resolution Visualization Applied to Future Heavy Airlift Concept Development and Evaluation

    NASA Technical Reports Server (NTRS)

    FordCook, A. B.; King, T.

    2012-01-01

    This paper explores the use of high-resolution 3D visualization tools for assessing the feasibility and advantages of future military cargo airlift concepts and evaluating compatibility with existing and future payload requirements. Realistic 3D graphic representations of future airlifters are immersed in rich, supporting environments to demonstrate concepts of operations to key personnel for evaluation, feedback, and development of critical joint support. Accurate concept visualizations are reviewed by commanders, platform developers, loadmasters, soldiers, scientists, engineers, and key decision makers at various stages of development. The insight gained through the review of these physically and operationally realistic visualizations is essential to refining design concepts to meet competing requirements in a fiscally conservative defense finance environment. In addition, highly accurate 3D geometric models of existing and evolving large military vehicles are loaded into existing and proposed aircraft cargo bays. In this virtual aircraft test-loading environment, materiel developers, engineers, managers, and soldiers can realistically evaluate the compatibility of current and next-generation airlifters with proposed cargo.

  5. Safer passenger car front shapes for pedestrians: A computational approach to reduce overall pedestrian injury risk in realistic impact scenarios.

    PubMed

    Li, Guibing; Yang, Jikuang; Simms, Ciaran

    2017-03-01

    Vehicle front shape has a significant influence on pedestrian injuries and the optimal design for overall pedestrian protection remains an elusive goal, especially considering the variability of vehicle-to-pedestrian accident scenarios. Therefore, this study aims to develop and evaluate an efficient framework for vehicle front shape optimization for pedestrian protection accounting for the broad range of real-world impact scenarios and their distributions in recent accident data. Firstly, a framework for vehicle front shape optimization for pedestrian protection was developed based on coupling of multi-body simulations and a genetic algorithm. This framework was then applied for optimizing passenger car front shape for pedestrian protection, and its predictions were evaluated using accident data and kinematic analyses. The results indicate that the optimization shows a good convergence and predictions of the optimization framework are corroborated when compared to the available accident data, and the optimization framework can distinguish 'good' and 'poor' vehicle front shapes for pedestrian safety. Thus, it is feasible and reliable to use the optimization framework for vehicle front shape optimization for reducing overall pedestrian injury risk. The results also show the importance of considering the broad range of impact scenarios in vehicle front shape optimization. A safe passenger car for overall pedestrian protection should have a wide and flat bumper (covering pedestrians' legs from the lower leg up to the shaft of the upper leg with generally even contacts), a bonnet leading edge height around 750 mm, a short bonnet (<800 mm) with a shallow or steep angle (either >17° or <12°) and a shallow windscreen (≤30°). Sensitivity studies based on simulations at the population level indicate that the demands for a safe passenger car front shape for head and leg protection are generally consistent, but partially conflict with pelvis protection. In particular, both head and leg injury risk increase with increasing bumper lower height and depth, and decrease with increasing bonnet leading edge height, while pelvis injury risk increases with increasing bonnet leading edge height. However, the effects of bonnet leading edge height and windscreen design on head injury risk are complex and require further analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
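
    The optimization loop itself is conceptually simple; the substance lies in the fitness function aggregating injury risk over the accident-scenario distribution. A bare-bones evolutionary sketch (the risk function below is a hypothetical stand-in for the coupled multi-body simulations):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def overall_injury_risk(shape):
        """Hypothetical stand-in for the multi-body simulations: maps
        front-shape parameters (bumper height/depth, bonnet leading edge
        height, bonnet length/angle, ...) to a scenario-weighted risk."""
        return float(np.sum((shape - 0.5) ** 2))

    def optimize(pop, n_gen=50, elite=4, sigma=0.05):
        """Bare-bones evolutionary loop: rank designs by risk, keep an
        elite, refill the population with mutated copies of the elite
        (a full genetic algorithm adds crossover and constraints)."""
        for _ in range(n_gen):
            pop = pop[np.argsort([overall_injury_risk(p) for p in pop])]
            parents = pop[rng.integers(0, elite, size=len(pop) - elite)]
            pop[elite:] = parents + rng.normal(0.0, sigma, parents.shape)
        return pop[0]  # best-ranked front-shape parameter vector

    best = optimize(rng.random((32, 5)))
    ```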

  6. Role-playing for more realistic technical skills training.

    PubMed

    Nikendei, C; Zeuch, A; Dieckmann, P; Roth, C; Schäfer, S; Völkl, M; Schellberg, D; Herzog, W; Jünger, J

    2005-03-01

    Clinical skills are an important and necessary part of clinical competence. Simulation plays an important role in many fields of medical education. Although role-playing is common in communication training, there are no reports about the use of student role-plays in the training of technical clinical skills. This article describes an educational intervention with analysis of pre- and post-intervention self-selected student survey evaluations. After one term of skills training, a thorough evaluation showed that the skills-lab training did not seem very realistic nor was it very demanding for trainees. To create a more realistic training situation and to enhance students' involvement, case studies and role-plays with defined roles for students (i.e. intern, senior consultant) were introduced into half of the sessions. Results of the evaluation in the second term showed that sessions with role-playing were rated significantly higher than sessions without role-playing.

  7. Evaluating structure selection in the hydrothermal growth of FeS2 pyrite and marcasite

    DOE PAGES

    Kitchaev, Daniil A.; Ceder, Gerbrand

    2016-12-14

    While the ab initio prediction of the properties of solids and their optimization towards new proposed materials is becoming established, little predictive theory exists as to which metastable materials can be made and how, impeding their experimental realization. Here we propose a quasi-thermodynamic framework for predicting the hydrothermal synthetic accessibility of metastable materials and apply this model to understanding the phase selection between the pyrite and marcasite polymorphs of FeS2. We demonstrate that phase selection in this system can be explained by the surface stability of the two phases as a function of ambient pH within nano-size regimes relevant to nucleation. This result suggests that a first-principles understanding of nano-size phase stability in realistic synthesis environments can serve to explain or predict the synthetic accessibility of structural polymorphs, providing a guideline to experimental synthesis via efficient computational materials design.

  8. Evaluating the Influence of the Client Behavior in Cloud Computing.

    PubMed

    Souza Pardo, Mário Henrique; Centurion, Adriana Molina; Franco Eustáquio, Paulo Sérgio; Carlucci Santana, Regina Helena; Bruschi, Sarita Mazzini; Santana, Marcos José

    2016-01-01

    This paper proposes a novel approach for the implementation of simulation scenarios, providing a client entity for cloud computing systems. The client entity allows the creation of scenarios in which the client behavior has an influence on the simulation, making the results more realistic. The proposed client entity is based on several characteristics that affect the performance of a cloud computing system, including different modes of submission and their behavior when the waiting time between requests (think time) is considered. The proposed characterization of the client enables the sending of either individual requests or group of Web services to scenarios where the workload takes the form of bursts. The client entity is included in the CloudSim, a framework for modelling and simulation of cloud computing. Experimental results show the influence of the client behavior on the performance of the services executed in a cloud computing system.
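
    The think-time idea is the crux: the client is not a constant request firehose but alternates bursts of submissions with pauses. A schematic generator of burst arrivals with exponential think times (illustrative only; not CloudSim's actual client-entity API):

    ```python
    import random

    def client_arrivals(n_requests, think_time_mean, burst_size=1, seed=42):
        """Generate request timestamps for a simulated cloud client: after
        each burst of submissions the client 'thinks' for an exponentially
        distributed pause before submitting again."""
        random.seed(seed)
        t, times = 0.0, []
        while len(times) < n_requests:
            times.extend([t] * burst_size)               # burst of simultaneous requests
            t += random.expovariate(1.0 / think_time_mean)  # think time
        return times[:n_requests]

    arrivals = client_arrivals(100, think_time_mean=2.0, burst_size=5)
    ```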

  9. Evaluating the Influence of the Client Behavior in Cloud Computing

    PubMed Central

    Centurion, Adriana Molina; Franco Eustáquio, Paulo Sérgio; Carlucci Santana, Regina Helena; Bruschi, Sarita Mazzini; Santana, Marcos José

    2016-01-01

    This paper proposes a novel approach for the implementation of simulation scenarios, providing a client entity for cloud computing systems. The client entity allows the creation of scenarios in which the client behavior has an influence on the simulation, making the results more realistic. The proposed client entity is based on several characteristics that affect the performance of a cloud computing system, including different modes of submission and their behavior when the waiting time between requests (think time) is considered. The proposed characterization of the client enables the sending of either individual requests or group of Web services to scenarios where the workload takes the form of bursts. The client entity is included in the CloudSim, a framework for modelling and simulation of cloud computing. Experimental results show the influence of the client behavior on the performance of the services executed in a cloud computing system. PMID:27441559

  10. Complex Geometric Models of Diffusion and Relaxation in Healthy and Damaged White Matter

    PubMed Central

    Farrell, Jonathan A.D.; Smith, Seth A.; Reich, Daniel S.; Calabresi, Peter A.; van Zijl, Peter C.M.

    2010-01-01

    Which aspects of tissue microstructure affect diffusion weighted MRI signals? Prior models, many of which use Monte-Carlo simulations, have focused on relatively simple models of the cellular microenvironment and have not considered important anatomic details. With the advent of higher-order analysis models for diffusion imaging, such as high-angular-resolution diffusion imaging (HARDI), more realistic models are necessary. This paper presents and evaluates the reproducibility of simulations of diffusion in complex geometries. Our framework is quantitative, does not require specialized hardware, is easily implemented with little programming experience, and is freely available as open-source software. Models may include compartments with different diffusivities, permeabilities, and T2 time constants using both parametric (e.g., spheres and cylinders) and arbitrary (e.g., mesh-based) geometries. Three-dimensional diffusion displacement-probability functions are mapped with high reproducibility, and thus can be readily used to assess reproducibility of diffusion-derived contrasts. PMID:19739233
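
    At its core such a simulator is a constrained random walk. A toy version for a single impermeable spherical compartment (the framework described adds permeability, T2 weighting, and arbitrary mesh geometries):

    ```python
    import numpy as np

    def walk_in_sphere(n_spins=1000, n_steps=500, dt=1e-5, D=2e-9, r=5e-6):
        """Monte-Carlo spins diffusing in an impermeable sphere: Gaussian
        steps with per-axis std sqrt(2*D*dt); steps that would exit the
        compartment are rejected (a crude reflecting boundary)."""
        rng = np.random.default_rng(0)
        pos = np.zeros((n_spins, 3))
        for _ in range(n_steps):
            trial = pos + rng.normal(0.0, np.sqrt(2 * D * dt), pos.shape)
            inside = np.linalg.norm(trial, axis=1) < r
            pos[inside] = trial[inside]
        return pos  # displacement distribution -> diffusion propagator

    # ~5 ms of diffusion (D = 2e-9 m^2/s) restricted to a 5 um sphere.
    positions = walk_in_sphere()
    ```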

  11. Developing a realistic-prototyping road user cost evaluation tool for FDOT.

    DOT National Transportation Integrated Search

    2008-12-31

    The objective of this project is to develop a realistic-prototyping RUC (Road User Cost) calculation tool that is user-friendly and utilizes a limited number of data inputs that are easy to use. The tool can help engineers to estimate RUC on specif...

  12. Epidemiology and causation: a realist view.

    PubMed Central

    Renton, A

    1994-01-01

    In this paper the controversy over how to decide whether associations between factors and diseases are causal is placed within a description of the public health and scientific relevance of epidemiology. It is argued that the rise in popularity of the Popperian view of science, together with a perception of the aims of epidemiology as being to identify appropriate public health interventions, have focussed this debate on unresolved questions of inferential logic, leaving largely unanalysed the notions of causation and of disease at the ontological level. A realist ontology of causation of disease and pathogenesis is constructed within the framework of "scientific materialism", and is shown to provide a coherent basis from which to decide causes and to deal with problems of confounding and interaction in epidemiological research. It is argued that a realist analysis identifies a richer role for epidemiology as an integral part of an ontologically unified medical science. It is this unified medical science as a whole rather than epidemiological observation or experiment which decides causes and, in turn, provides a key element to the foundations of rational public health decision making. PMID:8138775

  13. Towards an ontology for data quality in integrated chronic disease management: a realist review of the literature.

    PubMed

    Liaw, S T; Rahimi, A; Ray, P; Taggart, J; Dennis, S; de Lusignan, S; Jalaludin, B; Yeo, A E T; Talaei-Khoei, A

    2013-01-01

    Effective use of routine data to support integrated chronic disease management (CDM) and population health is dependent on underlying data quality (DQ) and, for cross-system use of data, semantic interoperability. An ontological approach to DQ is a potential solution but research in this area is limited and fragmented. The aim was to identify mechanisms, including ontologies, to manage DQ in integrated CDM and to determine whether improved DQ will better measure health outcomes. We conducted a realist review of English-language studies (January 2001-March 2011) that addressed data quality, used ontology-based approaches and were relevant to CDM. We screened 245 papers, excluded 26 duplicates, 135 on abstract review and 31 on full-text review, leaving 61 papers for critical appraisal. Of the 33 papers that examined ontologies in chronic disease management, 13 defined data quality and 15 used ontologies for DQ. Most saw DQ as a multidimensional construct, the most used dimensions being completeness, accuracy, correctness, consistency and timeliness. The majority of studies reported tool design and development (80%), implementation (23%), and descriptive evaluations (15%). Ontological approaches were used to address semantic interoperability, decision support, flexibility of information management and integration/linkage, and complexity of information models. DQ lacks a consensus conceptual framework and definition. DQ and ontological research is relatively immature with few rigorous evaluation studies published. Ontology-based applications could support automated processes to address DQ and semantic interoperability in repositories of routinely collected data to deliver integrated CDM. We advocate moving to ontology-based design of information systems to enable more reliable use of routine data to measure health mechanisms and impacts. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  14. Characterization of photomultiplier tubes with a realistic model through GPU-boosted simulation

    NASA Astrophysics Data System (ADS)

    Anthony, M.; Aprile, E.; Grandi, L.; Lin, Q.; Saldanha, R.

    2018-02-01

    The accurate characterization of a photomultiplier tube (PMT) is crucial in a wide variety of applications. However, current methods do not give fully accurate representations of the response of a PMT, especially at very low light levels. In this work, we present a new and more realistic model of the response of a PMT, called the cascade model, and use it to characterize two different PMTs at various voltages and light levels. The cascade model is shown to outperform the more common Gaussian model in almost all circumstances and to agree well with a newly introduced model-independent approach. The technical and computational challenges of this model are also presented along with the employed solution of developing a robust GPU-based analysis framework for this and other non-analytical models.
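
    The physical picture behind a cascade-style response model is Poisson multiplication at each dynode, which naturally departs from a single Gaussian at low light levels. A toy sketch of that picture (stage gains hypothetical; the paper's actual model and GPU fitting are more involved):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def cascade_charge(n_pe, gains, n_events=10_000):
        """Toy dynode cascade: each photoelectron is multiplied stage by
        stage with Poisson statistics at every dynode, producing the
        non-Gaussian single-photoelectron charge spectrum."""
        out = np.full(n_events, n_pe)
        for g in gains:
            out = rng.poisson(g * out)
        return out

    # Mean gain = product of stage gains = 4 * 3**4 = 324 electrons per PE.
    charge = cascade_charge(n_pe=1, gains=[4, 3, 3, 3, 3])
    ```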

  15. Protocol for fermionic positive-operator-valued measures

    NASA Astrophysics Data System (ADS)

    Arvidsson-Shukur, D. R. M.; Lepage, H. V.; Owen, E. T.; Ferrus, T.; Barnes, C. H. W.

    2017-11-01

    In this paper we present a protocol for the implementation of a positive-operator-valued measure (POVM) on massive fermionic qubits. We present methods for implementing nondispersive qubit transport, spin rotations, and spin polarizing beam-splitter operations. Our scheme attains linear-optics-like control of the spatial extent of the qubits by considering ground-state electrons trapped in the minima of surface acoustic waves in semiconductor heterostructures. Furthermore, we numerically simulate a high-fidelity POVM that carries out Procrustean entanglement distillation in the framework of our scheme, using experimentally realistic potentials. Our protocol can be applied not only to pure ensembles with particle pairs of known identical entanglement, but also to realistic ensembles of particle pairs with a distribution of entanglement entropies. This paper provides an experimentally realizable design for future quantum technologies.
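
    For reference, the defining property any POVM implementation must realize: a set of positive semidefinite operators summing to the identity, with outcome probabilities given by expectation values (a generic two-outcome example, unrelated to the specific surface-acoustic-wave hardware):

    ```python
    import numpy as np

    # A POVM is a set of positive semidefinite operators E_k with
    # sum_k E_k = I; outcome probabilities are p_k = <psi|E_k|psi>.
    E0 = np.array([[0.8, 0.0], [0.0, 0.2]])
    E1 = np.eye(2) - E0                       # completeness by construction
    psi = np.array([1.0, 1.0]) / np.sqrt(2)   # normalized qubit state
    probs = [psi.conj() @ E @ psi for E in (E0, E1)]
    assert np.isclose(sum(probs), 1.0)        # probabilities sum to one
    ```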

  16. Accidental symmetries and massless quarks in the economical 3-3-1 model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montero, J. C.; Sánchez–Vega, B. L.

    In the framework of a 3-3-1 model with a minimal scalar sector, known as the economical 3-3-1 model, we study its capabilities of generating realistic quark masses. After a detailed study of the symmetries of the model, before and after the spontaneous symmetry breaking, we find a remaining axial symmetry that prevents some quarks from gaining mass at all orders in perturbation theory. Since this accidental symmetry is anomalous, we also briefly consider the possibility of generating their masses through nonperturbative effects. However, we find that nonperturbative effects are not enough to generate the measured masses for the three massless quarks. Hence, these results imply that the economical 3-3-1 model is not a realistic description of the electroweak interaction.

  17. Equilibrium pricing in an order book environment: Case study for a spin model

    NASA Astrophysics Data System (ADS)

    Meudt, Frederik; Schmitt, Thilo A.; Schäfer, Rudi; Guhr, Thomas

    2016-07-01

    When modeling stock market dynamics, the price formation is often based on an equilibrium mechanism. In real stock exchanges, however, the price formation is governed by the order book. It is thus interesting to check if the resulting stylized facts of a model with equilibrium pricing change, remain the same or, more generally, are compatible with the order book environment. We tackle this issue in the framework of a case study by embedding the Bornholdt-Kaizoji-Fujiwara spin model into the order book dynamics. To this end, we use a recently developed agent-based model that realistically incorporates the order book. We find realistic stylized facts. We conclude for the studied case that equilibrium pricing is not needed and that the corresponding assumption of a "fundamental" price may be abandoned.

  18. The Stopit! programme to reduce bullying and undermining behaviour in hospitals.

    PubMed

    Benmore, Graham; Henderson, Steven; Mountfield, Joanna; Wink, Brian

    2018-05-21

    Purpose: The impact of bullying and undermining behaviours in the National Health Service on costs, patient safety and retention of staff was well understood even before the Illing report, published in 2013, which reviewed the efficacy of training interventions designed to reduce bullying and harassment. The purpose of this paper is to provide an example of a good programme well evaluated. Design/methodology/approach: The methodology follows a broad realist approach, by specifying the underlying programme assumptions and intention of the designers. Three months after the event, Q-sort methodology was employed to group participants into one of three context-mechanism-outcome (CMO) groups. Interviews were then undertaken with members of two of these groups, to evaluate how the programme had influenced each. Findings: Q-sort identified a typology of three beneficiaries from the Stopit! workshops, characterised as professionals, colleagues and victims. Each group had acted upon different parts of the programme, depending chiefly upon their current and past experiences of bullying in hospitals. Research limitations/implications: The paper demonstrates the effectiveness of using the Q-sort method to identify relevant CMOs in a realist evaluation framework. Practical implications: The paper considers the effectiveness of the programme in reducing bullying, rather than teaching victims to cope, and how it may be strengthened based upon the research findings and Illing recommendations. Social implications: Workplace bullying is invariably implicated in scandals concerning poor hospital practice, poor patient outcomes and staff illness. All too frequently, the sector responds by offering training in resilience, which, though helpful, places the onus on the victim to cope rather than on the employer to reduce or eliminate the practice. This paper documents and evaluates an attempt to change workplace practices to directly address bullying and undermining. Originality/value: The paper describes a new programme broadly consistent with the Illing report's endorsements. Second, it illustrates a novel evaluation method that rigorously highlights the contexts, mechanisms and outcomes at the pilot stage of an intervention, identifying contexts and mechanisms via factor analysis using Q-sort methodology.

  19. Development and validation of an artificial wetlab training system for the lumbar discectomy.

    PubMed

    Adermann, Jens; Geissler, Norman; Bernal, Luis E; Kotzsch, Susanne; Korb, Werner

    2014-09-01

    Initial research indicated that realistic haptic simulators with an adapted training concept are needed to enhance the training for spinal surgery. A cognitive task analysis (CTA) was performed to define a realistic and helpful scenario-based simulation. Based on the results, a simulator for lumbar discectomy was developed. Additionally, a realistic training operating room was built for a pilot workshop, and the results were validated. The CTA showed a need for realistic scenario-based training in spine surgery. The developed simulator consists of synthetic bone structures, synthetic soft tissue and an advanced bleeding system. Due to the close interdisciplinary cooperation between surgeons, engineers and psychologists, the iterative multicentre validation showed that the simulator is visually and haptically realistic. The simulator offers integrated sensors for the evaluation of the traction being used and the compression during surgery. The participating surgeons in the pilot workshop rated the simulator and the training concept as very useful for the improvement of their surgical skills. In the context of the present work a precise definition for the simulator and training concept was developed. The additional implementation of sensors allows the objective evaluation of the surgical training by the trainer. Compared to other training simulators and concepts, the high degree of objectivity strengthens the acceptance of the feedback. The measured data of the nerve root tension and the compression of the dura can be used for intraoperative control and a detailed postoperative evaluation.

  20. Automatic Texture Reconstruction of 3d City Model from Oblique Images

    NASA Astrophysics Data System (ADS)

    Kang, Junhua; Deng, Fei; Li, Xinwei; Wan, Fang

    2016-06-01

    In recent years, photorealistic 3D city models have become increasingly important in various geospatial applications related to virtual city tourism, 3D GIS, urban planning, and real-estate management. Besides the acquisition of high-precision 3D geometric data, texture reconstruction is also a crucial step for generating high-quality and visually realistic 3D models. However, most texture reconstruction approaches are prone to texture fragmentation and memory inefficiency. In this paper, we introduce an automatic framework of texture reconstruction to generate textures from oblique images for photorealistic visualization. Our approach includes three major steps: mesh parameterization, texture atlas generation and texture blending. Firstly, a mesh parameterization procedure comprising mesh segmentation and mesh unfolding is performed to reduce geometric distortion in the process of mapping 2D textures to the 3D model. Secondly, in the texture atlas generation step, the texture of each segmented region in the texture domain is reconstructed from all visible images with exterior orientation and interior orientation parameters. Thirdly, to avoid color discontinuities at boundaries between texture regions, the final texture map is generated by blending texture maps from several corresponding images. We evaluated our texture reconstruction framework on a dataset of a city. The resulting mesh model can be textured with the created texture without resampling. Experimental results show that our method can effectively mitigate the occurrence of texture fragmentation. It is demonstrated that the proposed framework is effective and useful for automatic texture reconstruction of 3D city models.

  1. A Framework for Effective Assessment of Model-based Projections of Biodiversity to Inform the Next Generation of Global Conservation Targets

    NASA Astrophysics Data System (ADS)

    Myers, B.; Beard, T. D.; Weiskopf, S. R.; Jackson, S. T.; Tittensor, D.; Harfoot, M.; Senay, G. B.; Casey, K.; Lenton, T. M.; Leidner, A. K.; Ruane, A. C.; Ferrier, S.; Serbin, S.; Matsuda, H.; Shiklomanov, A. N.; Rosa, I.

    2017-12-01

    Biodiversity and ecosystem services underpin political targets for the conservation of biodiversity; however, previous incarnations of these biodiversity-related targets have not relied on integrated model-based projections of possible outcomes under climate and land-use change. Although a few global biodiversity models are available, most biodiversity models lie along a continuum of geography and components of biodiversity. Model-based projections of the future of global biodiversity are critical to support policymakers in the development of informed global conservation targets, but the scientific community lacks a clear strategy for integrating diverse data streams in developing, and evaluating the performance of, such biodiversity models. Therefore, in this paper, we propose a framework for ongoing testing and refinement of model-based projections of biodiversity trends and change, by linking a broad variety of biodiversity models with data streams generated by advances in remote sensing, coupled with new and emerging in-situ observation technologies to inform development of essential biodiversity variables, future global biodiversity targets, and indicators. Our two main objectives are to (1) develop a framework for model testing and refining projections of a broad range of biodiversity models, focusing on global models, through the integration of diverse data streams and (2) identify the realistic outputs that can be developed and determine coupled approaches using remote sensing and new and emerging in-situ observations (e.g., metagenomics) to better inform the next generation of global biodiversity targets.

  2. Robust Real-Time Musculoskeletal Modeling Driven by Electromyograms.

    PubMed

    Durandau, Guillaume; Farina, Dario; Sartori, Massimo

    2018-03-01

    Current clinical biomechanics involves lengthy data acquisition and time-consuming offline analyses with biomechanical models not operating in real-time for man-machine interfacing. We developed a method that enables online analysis of neuromusculoskeletal function in vivo in the intact human. We used electromyography (EMG)-driven musculoskeletal modeling to simulate all transformations from muscle excitation onset (EMGs) to mechanical moment production around multiple lower-limb degrees of freedom (DOFs). We developed a calibration algorithm that enables adjusting musculoskeletal model parameters specifically to an individual's anthropometry and force-generating capacity. We incorporated the modeling paradigm into a computationally efficient, generic framework that can be interfaced in real-time with any movement data collection system. The framework demonstrated the ability of computing forces in 13 lower-limb muscle-tendon units and resulting moments about three joint DOFs simultaneously in real-time. Remarkably, it was capable of extrapolating beyond calibration conditions, i.e., predicting accurate joint moments during six unseen tasks and one unseen DOF. The proposed framework can dramatically reduce evaluation latency in current clinical biomechanics and open up new avenues for establishing prompt and personalized treatments, as well as for establishing natural interfaces between patients and rehabilitation systems. The integration of EMG with numerical modeling will enable simulating realistic neuromuscular strategies in conditions including muscular/orthopedic deficit, which could not be robustly simulated via pure modeling formulations. This will enable translation to clinical settings and development of healthcare technologies including real-time bio-feedback of internal mechanical forces and direct patient-machine interfacing.
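
    A typical front end in EMG-driven modeling is first-order activation dynamics turning the rectified, normalized EMG envelope into muscle activation (a minimal sketch with generic time constants; the authors' full pipeline adds calibrated muscle-tendon mechanics and joint-moment summation):

    ```python
    import numpy as np

    def emg_to_activation(emg_env, dt, tau_act=0.015, tau_deact=0.060):
        """First-order activation dynamics: the EMG envelope is low-pass
        filtered with a faster time constant during activation than during
        deactivation, mimicking calcium dynamics in muscle."""
        a = np.zeros_like(emg_env)
        for i in range(1, len(emg_env)):
            u = emg_env[i]
            tau = tau_act if u > a[i - 1] else tau_deact
            a[i] = a[i - 1] + dt * (u - a[i - 1]) / tau
        return a

    t = np.arange(0.0, 1.0, 0.001)
    envelope = np.clip(np.sin(2 * np.pi * 1.0 * t), 0, None)  # toy EMG envelope
    activation = emg_to_activation(envelope, dt=0.001)
    ```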

  3. Strengthening primary health care in low- and middle-income countries: generating evidence through evaluation.

    PubMed

    Rule, John; Ngo, Duc Anh; Oanh, Tran Thi Mai; Asante, Augustine; Doyle, Jennifer; Roberts, Graham; Taylor, Richard

    2014-07-01

    Since the publication of the World Health Report 2008, there has been renewed interest in the potential of primary health care (PHC) to deliver global health policy agendas. The WHO Western Pacific Regional Strategy 2010 states that health systems in low- and middle-income countries (LMICs) can be strengthened using PHC values as core principles. This review article explores the development of an evidence-based approach for assessing the effectiveness of PHC programs and interventions in LMICs. A realist review method was used to investigate whether there is any internationally consistent approach to evaluating PHC. Studies from LMICs using an explicit methodology or framework for measuring PHC effectiveness were collated. Databases of published articles were searched, and a review of gray literature was undertaken to identify relevant reports. The review found no consistent approach for assessing the effectiveness of PHC interventions in LMICs. An innovative approach used in China, which developed a set of core community health facility indicators based on stakeholder input, does show some potential for use in other LMIC contexts. © 2013 APJPH.

  4. Live Speech Driven Head-and-Eye Motion Generators.

    PubMed

    Le, Binh H; Ma, Xiaohan; Deng, Zhigang

    2012-11-01

    This paper describes a fully automated framework to generate realistic head motion, eye gaze, and eyelid motion simultaneously based on live (or recorded) speech input. Its central idea is to learn separate yet interrelated statistical models for each component (head motion, gaze, or eyelid motion) from a prerecorded facial motion data set: 1) Gaussian Mixture Models and a gradient descent optimization algorithm are employed to generate head motion from speech features; 2) a Nonlinear Dynamic Canonical Correlation Analysis model is used to synthesize eye gaze from head motion and speech features; and 3) nonnegative linear regression is used to model voluntary eyelid motion, and a log-normal distribution is used to describe involuntary eye blinks. Several user studies are conducted to evaluate the effectiveness of the proposed speech-driven head and eye motion generator using the well-established paired comparison methodology. Our evaluation results clearly show that this approach can significantly outperform the state-of-the-art head and eye motion generation algorithms. In addition, a novel mocap+video hybrid data acquisition technique is introduced to record high-fidelity head movement, eye gaze, and eyelid motion simultaneously.

  5. Development of fine-resolution analyses and expanded large-scale forcing properties. Part I: Methodology and evaluation

    DOE PAGES

    Li, Zhijin; Vogelmann, Andrew M.; Feng, Sha; ...

    2015-01-20

    We produce fine-resolution, three-dimensional fields of meteorological and other variables for the U.S. Department of Energy's Atmospheric Radiation Measurement (ARM) Southern Great Plains site. The Community Gridpoint Statistical Interpolation system is implemented in a multiscale data assimilation (MS-DA) framework that is used within the Weather Research and Forecasting model at a cloud-resolving resolution of 2 km. The MS-DA algorithm uses existing reanalysis products and constrains fine-scale atmospheric properties by assimilating high-resolution observations. A set of experiments shows that the data assimilation analysis realistically reproduces the intensity, structure, and time evolution of clouds and precipitation associated with a mesoscale convective system. Evaluations also show that the large-scale forcing derived from the fine-resolution analysis has an overall accuracy comparable to the existing ARM operational product. For enhanced applications, the fine-resolution fields are used to characterize the contribution of subgrid variability to the large-scale forcing and to derive hydrometeor forcing, which are presented in companion papers.

  6. A Framework for Facility Modification.

    DTIC Science & Technology

    1987-09-01

    ...effective, Army training must be performance oriented, demanding and realistic. Effective training with today's complex weapons and combined arms fighting... Communication is vital to the successful integration of new technologies into existing organizations. Timely and effective communication of...

  7. The Language Factor in Elementary Mathematics Assessments: Computational Skills and Applied Problem Solving in a Multidimensional IRT Framework

    ERIC Educational Resources Information Center

    Hickendorff, Marian

    2013-01-01

    The results of an exploratory study into measurement of elementary mathematics ability are presented. The focus is on the abilities involved in solving standard computation problems on the one hand and problems presented in a realistic context on the other. The objectives were to assess to what extent these abilities are shared or distinct, and…

  8. Model-based framework for multi-axial real-time hybrid simulation testing

    NASA Astrophysics Data System (ADS)

    Fermandois, Gaston A.; Spencer, Billie F.

    2017-10-01

    Real-time hybrid simulation is an efficient and cost-effective dynamic testing technique for performance evaluation of structural systems subjected to earthquake loading with rate-dependent behavior. A loading assembly with multiple actuators is required to impose realistic boundary conditions on physical specimens. However, such a testing system is expected to exhibit significant dynamic coupling of the actuators and suffer from time lags that are associated with the dynamics of the servo-hydraulic system, as well as control-structure interaction (CSI). One approach to reducing experimental errors considers a multi-input, multi-output (MIMO) controller design, yielding accurate reference tracking and noise rejection. In this paper, a framework for multi-axial real-time hybrid simulation (maRTHS) testing is presented. The methodology employs a real-time feedback-feedforward controller for multiple actuators commanded in Cartesian coordinates. Kinematic transformations between actuator space and Cartesian space are derived for all six degrees of freedom of the moving platform. Then, a frequency domain identification technique is used to develop an accurate MIMO transfer function of the system. Further, a Cartesian-domain model-based feedforward-feedback controller is implemented for time lag compensation and to increase the robustness of the reference tracking for given model uncertainty. The framework is implemented using the 1/5th-scale Load and Boundary Condition Box (LBCB) located at the University of Illinois at Urbana-Champaign. To demonstrate the efficacy of the proposed methodology, a single-story frame subjected to earthquake loading is tested. One of the columns in the frame is represented physically in the laboratory as a cantilevered steel column. For real-time execution, the numerical substructure, kinematic transformations, and controllers are implemented on a digital signal processor. Results show excellent performance of the maRTHS framework when six degrees of freedom are controlled at the interface between substructures.

  9. Estimating yield gaps at the cropping system level.

    PubMed

    Guilpart, Nicolas; Grassini, Patricio; Sadras, Victor O; Timsina, Jagadish; Cassman, Kenneth G

    2017-05-01

    Yield gap analyses of individual crops have been used to estimate opportunities for increasing crop production at local to global scales, thus providing information crucial to food security. However, increases in crop production can also be achieved by improving cropping system yield through modification of spatial and temporal arrangement of individual crops. In this paper we define the cropping system yield potential as the output from the combination of crops that gives the highest energy yield per unit of land and time, and the cropping system yield gap as the difference between actual energy yield of an existing cropping system and the cropping system yield potential. Then, we provide a framework to identify alternative cropping systems which can be evaluated against the current ones. A proof-of-concept is provided with irrigated rice-maize systems at four locations in Bangladesh that represent a range of climatic conditions in that country. The proposed framework identified (i) realistic alternative cropping systems at each location, and (ii) two locations where expected improvements in crop production from changes in cropping intensity (number of crops per year) were 43% to 64% higher than from improving the management of individual crops within the current cropping systems. The proposed framework provides a tool to help assess food production capacity of new systems (e.g. with increased cropping intensity) arising from climate change, and assess resource requirements (water and N) and associated environmental footprint per unit of land and production of these new systems. By expanding yield gap analysis from individual crops to the cropping system level and applying it to new systems, this framework could also be helpful to bridge the gap between yield gap analysis and cropping/farming system design.
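
    In symbols (notation ours, not the authors'), the definitions amount to

    ```latex
    Y_{\text{gap}} \;=\; Y_{\text{pot}} \;-\; Y_{\text{act}},
    \qquad
    Y \;=\; \frac{1}{T} \sum_{c \,\in\, \text{system}} y_c \, e_c ,
    ```

    where y_c is the yield of crop c in the rotation, e_c its energy content, T the rotation length, Y_pot the highest achievable energy yield per unit land and time among candidate crop combinations, and Y_act the energy yield of the existing system.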

  10. Iterative refinement of implicit boundary models for improved geological feature reproduction

    NASA Astrophysics Data System (ADS)

    Martin, Ryan; Boisvert, Jeff B.

    2017-12-01

    Geological domains contain non-stationary features that cannot be described by a single direction of continuity. Non-stationary estimation frameworks generate more realistic curvilinear interpretations of subsurface geometries. A radial basis function (RBF) based implicit modeling framework using domain decomposition is developed that permits introduction of locally varying orientations and magnitudes of anisotropy for boundary models to better account for the local variability of complex geological deposits. The interpolation framework is paired with a method to automatically infer the locally predominant orientations, which results in a rapid and robust iterative non-stationary boundary modeling technique that can refine locally anisotropic geological shapes automatically from the sample data. The method also permits quantification of the volumetric uncertainty associated with the boundary modeling. The methodology is demonstrated on a porphyry dataset and shows improved local geological features.
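
    The implicit-modeling core is interpolation of signed point data whose zero level set is the boundary. A miniature isotropic version with SciPy (the paper's contribution, locally varying anisotropy via domain decomposition, is omitted here):

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # Signed samples: positive inside the geological domain, negative
    # outside; the implicit boundary is the zero level set of the
    # RBF interpolant fitted to them.
    pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0], [-1.0, -1.0]])
    signed = np.array([1.0, 0.5, 0.5, -1.0, -1.0])
    rbf = RBFInterpolator(pts, signed, kernel="thin_plate_spline")

    # Evaluate on a grid; contouring the field at zero extracts the boundary.
    grid = np.stack(np.meshgrid(np.linspace(-2, 3, 50),
                                np.linspace(-2, 3, 50)), axis=-1).reshape(-1, 2)
    field = rbf(grid)
    ```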

  11. Towards a Realist Sociology of Education: A Polyphonic Review Essay

    ERIC Educational Resources Information Center

    Grenfell, Michael; Hood, Susan; Barrett, Brian D.; Schubert, Dan

    2017-01-01

    This review essay evaluates Karl Maton's "Knowledge and Knowers: Towards a Realist Sociology of Education" as a recent examination of the sociological causes and effects of education in the tradition of the French social theorist Pierre Bourdieu and the British educational sociologist Basil Bernstein. Maton's book synthesizes the…

  12. Standardized Patients Provide Realistic and Worthwhile Experiences for Athletic Training Students

    ERIC Educational Resources Information Center

    Walker, Stacy E.; Weidner, Thomas G.

    2010-01-01

    Context: Standardized patients are more prominently used to both teach and evaluate students' clinical skills and abilities. Objective: To investigate whether athletic training students perceived an encounter with a standardized patient (SP) as realistic and worthwhile and to determine their perceived comfort in future lower extremity evaluations…

  13. FRED (a Framework for Reconstructing Epidemic Dynamics): an open-source software system for modeling infectious diseases and control strategies using census-based populations.

    PubMed

    Grefenstette, John J; Brown, Shawn T; Rosenfeld, Roni; DePasse, Jay; Stone, Nathan T B; Cooley, Phillip C; Wheaton, William D; Fyshe, Alona; Galloway, David D; Sriram, Anuroop; Guclu, Hasan; Abraham, Thomas; Burke, Donald S

    2013-10-08

    Mathematical and computational models provide valuable tools that help public health planners to evaluate competing health interventions, especially for novel circumstances that cannot be examined through observational or controlled studies, such as pandemic influenza. The spread of diseases like influenza depends on the mixing patterns within the population, and these mixing patterns depend in part on local factors including the spatial distribution and age structure of the population, the distribution of size and composition of households, employment status and commuting patterns of adults, and the size and age structure of schools. Finally, public health planners must take into account the health behavior patterns of the population, patterns that often vary according to socioeconomic factors such as race, household income, and education levels. FRED (a Framework for Reconstructing Epidemic Dynamics) is a freely available open-source agent-based modeling system based closely on models used in previously published studies of pandemic influenza. This version of FRED uses open-access census-based synthetic populations that capture the demographic and geographic heterogeneities of the population, including realistic household, school, and workplace social networks. FRED epidemic models are currently available for every state and county in the United States, and for selected international locations. State and county public health planners can use FRED to explore the effects of possible influenza epidemics in specific geographic regions of interest and to help evaluate the effect of interventions such as vaccination programs and school closure policies. FRED is available under a free open source license in order to contribute to the development of better modeling tools and to encourage open discussion of modeling tools being used to evaluate public health policies. We also welcome participation by other researchers in the further development of FRED.
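
    FRED itself is a large open-source system; the toy sketch below only illustrates the kind of group-structured transmission it builds on, with households as the sole mixing group and invented parameter values (it is not FRED's actual engine).

        import random

        def simulate(households, p_transmit=0.05, p_recover=0.2,
                     days=100, seed=1):
            # households: list of lists of agent ids (a partition of agents).
            rng = random.Random(seed)
            n = sum(len(h) for h in households)
            state = ["S"] * n
            state[0] = "I"  # index case
            curve = []
            for _ in range(days):
                nxt = state[:]
                for hh in households:
                    sources = [a for a in hh if state[a] == "I"]
                    for a in hh:
                        if state[a] == "S" and any(
                                rng.random() < p_transmit for _ in sources):
                            nxt[a] = "I"
                for a in range(n):  # recovery acts on yesterday's infectious
                    if state[a] == "I" and rng.random() < p_recover:
                        nxt[a] = "R"
                state = nxt
                curve.append(state.count("I"))
            return curve

        prevalence = simulate([[0, 1, 2], [3, 4], [5, 6, 7, 8]])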

  14. Realistic Radio Communications in Pilot Simulator Training

    NASA Technical Reports Server (NTRS)

    Burki-Cohen, Judith; Kendra, Andrew J.; Kanki, Barbara G.; Lee, Alfred T.

    2000-01-01

    Simulators used for total training and evaluation of airline pilots must satisfy stringent criteria in order to assure their adequacy for training and checking maneuvers. Air traffic control and company radio communications simulation, however, may still be left to role-play by the already taxed instructor/evaluators in spite of their central importance in every aspect of the flight environment. The underlying premise of this research is that providing a realistic radio communications environment would increase safety by enhancing pilot training and evaluation. This report summarizes the first-year efforts of assessing the requirement and feasibility of simulating radio communications automatically. A review of the training and crew resource/task management literature showed both practical and theoretical support for the need for realistic radio communications simulation. A survey of 29 instructor/evaluators from 14 airlines revealed that radio communications are mainly role-played by the instructor/evaluators. This increases instructor/evaluators' own workload while unrealistically lowering pilot communications load compared to actual operations, with a concomitant loss in training/evaluation effectiveness. A technology review searching for an automated means of providing radio communications to and from aircraft with minimal human effort showed that while promising, the technology is still immature. Further research and the need for establishing a proof-of-concept are also discussed.

  15. The use of benthic indicators in Europe: from the Water Framework Directive to the Marine Strategy Framework Directive.

    PubMed

    Van Hoey, Gert; Borja, Angel; Birchenough, Silvana; Buhl-Mortensen, Lene; Degraer, Steven; Fleischer, Dirk; Kerckhof, Francis; Magni, Paolo; Muxika, Iñigo; Reiss, Henning; Schröder, Alexander; Zettler, Michael L

    2010-12-01

    The Water Framework Directive (WFD) and the Marine Strategy Framework Directive (MSFD) are the European umbrella regulations for water systems. It is a challenge for the scientific community to translate the principles of these directives into realistic and accurate approaches. The aim of this paper, prepared by the Benthos Ecology Working Group of ICES, is to describe how those principles have been translated, what the challenges were, and the best way forward. We have tackled the following principles: the ecosystem-based approach, the development of benthic indicators, the definition of 'pristine' or sustainable conditions, the detection of pressures and the development of monitoring programs. We concluded that testing and integrating the different approaches was facilitated during the WFD process, which led to further insights and improvements that the MSFD can rely upon. Expert involvement in the entire implementation process proved to be of vital importance. Copyright © 2010 Elsevier Ltd. All rights reserved.

  16. Host and adsorbate dynamics in silicates with flexible frameworks: Empirical force field simulation of water in silicalite

    NASA Astrophysics Data System (ADS)

    Bordat, Patrice; Cazade, Pierre-André; Baraille, Isabelle; Brown, Ross

    2010-03-01

    Molecular dynamics simulations are performed on the pure silica zeolite silicalite (MFI framework code), maintaining via a new force field both framework flexibility and realistic account of electrostatic interactions with adsorbed water. The force field is similar to the well-known "BKS" model [B. W. H. van Beest et al., Phys. Rev. Lett. 64, 1955 (1990)], but with reduced partial atomic charges and reoptimized covalent bond potential wells. The present force field reproduces the monoclinic to orthorhombic transition of silicalite. The force field correctly represents the hydrophobicity of pure silica silicalite, both the adsorption energy, and the molecular diffusion constants of water. Two types of adsorption, specific and weak unspecific, are predicted on the channel walls and at the channel intersection. We discuss molecular diffusion of water in silicalite, deducing a barrier to crossing between the straight and the zigzag channels. Analysis of the thermal motion shows that at room temperature, framework oxygen atoms incurring into the zeolite channels significantly influence the dynamics of adsorbed water.
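
    The functional form at stake is compact: the BKS family combines a Coulomb term with a Buckingham repulsion-dispersion pair. The Python sketch below writes it out; the constant is e²/(4πε₀) in (kJ/mol)·Å units, while qi, qj, A, b, C are species-pair parameters that would here take the paper's reduced charges and reoptimized values (not reproduced in the abstract).

        import numpy as np

        KE = 1389.35  # e^2/(4*pi*eps0) in kJ/mol * Angstrom

        def bks_pair_energy(r, qi, qj, A, b, C):
            # Coulomb + Buckingham: short-range repulsion and dispersion.
            # r in Angstrom; returns energy in kJ/mol.
            return KE * qi * qj / r + A * np.exp(-b * r) - C / r**6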

  17. Combined EDL-Mobility Planning for Planetary Missions

    NASA Technical Reports Server (NTRS)

    Kuwata, Yoshiaki; Balaram, Bob

    2011-01-01

    This paper presents an analysis framework for planetary missions that have coupled mobility and EDL (Entry-Descent-Landing) systems. Traditional systems engineering approaches to mobility missions such as MERs (Mars Exploration Rovers) and MSL (Mars Science Laboratory) study the EDL system and the mobility system independently, without explicit trade-offs between them or risk minimization of the overall system. A major challenge is that EDL operation is inherently uncertain, and its analysis results, such as the landing footprint, are described by a probability density function (PDF). The proposed approach first builds a mobility cost-to-go map that encodes the driving cost from any point on the map to a science target location. The cost could include a variety of metrics such as traverse distance, time, wheel rotation on soft soil, and closeness to hazards. It then convolves the mobility cost-to-go map with the landing PDF given by the EDL system, yielding a histogram of driving cost that can be used to evaluate the overall risk of the mission. By capturing the coupling between EDL and mobility explicitly, this analysis framework enables quantitative trade-offs between EDL and mobility system performance, as well as the characterization of risks in a statistical way. Simulation results are presented using realistic Mars terrain data.
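
    The combination step lends itself to a short sketch: weight every terrain cell's driving cost by the probability of landing there. The Python below is a schematic reading of that idea, with hypothetical array inputs rather than the mission tooling itself.

        import numpy as np

        def driving_cost_statistics(cost_map, landing_pdf, bins=20):
            # cost_map and landing_pdf: arrays on the same terrain grid.
            w = landing_pdf / landing_pdf.sum()   # renormalise the PDF cells
            expected_cost = float((cost_map * w).sum())
            hist, edges = np.histogram(cost_map, bins=bins, weights=w)
            return expected_cost, hist, edges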

  18. Describing complex cells in primary visual cortex: a comparison of context and multi-filter LN models.

    PubMed

    Westö, Johan; May, Patrick J C

    2018-05-02

    Receptive field (RF) models are an important tool for deciphering neural responses to sensory stimuli. The two currently popular RF models are multi-filter linear-nonlinear (LN) models and context models. Models are, however, never correct and they rely on assumptions to keep them simple enough to be interpretable. As a consequence, different models describe different stimulus-response mappings, which may or may not be good approximations of real neural behavior. In the current study, we take up two tasks: First, we introduce new ways to estimate context models with realistic nonlinearities, that is, with logistic and exponential functions. Second, we evaluate context models and multi-filter LN models in terms of how well they describe recorded data from complex cells in cat primary visual cortex. Our results, based on single-spike information and correlation coefficients, indicate that context models outperform corresponding multi-filter LN models of equal complexity (measured in terms of number of parameters), with the best increase in performance being achieved by the novel context models. Consequently, our results suggest that the multi-filter LN-model framework is suboptimal for describing the behavior of complex cells: the context-model framework is clearly superior while still providing interpretable quantifications of neural behavior.
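
    For readers unfamiliar with the LN family, a single-step sketch helps fix ideas: linear filters project the stimulus, and a static nonlinearity (here logistic, one of the "realistic" choices mentioned above) maps the combined drive to a response. This is one simple variant for illustration, not the authors' estimator.

        import numpy as np

        def ln_rate(stimulus, filters, weights, bias):
            # stimulus: (d,); filters: (k, d); weights: (k,); bias: scalar.
            drive = weights @ (filters @ stimulus) + bias
            return 1.0 / (1.0 + np.exp(-drive))  # logistic output nonlinearity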

  19. A pervasive visual-haptic framework for virtual delivery training.

    PubMed

    Abate, Andrea F; Acampora, Giovanni; Loia, Vincenzo; Ricciardi, Stefano; Vasilakos, Athanasios V

    2010-03-01

    Thanks to advances in virtual reality (VR) technologies and haptic systems, virtual simulators are increasingly becoming a viable alternative to physical simulators in medicine and surgery, though many challenges still remain. In this study, a pervasive visual-haptic framework aimed at training obstetricians and midwives in vaginal delivery is described. The haptic feedback is provided by means of two hand-based haptic devices able to reproduce force feedback on fingers and arms, thus enabling much more realistic manipulation with respect to stylus-based solutions. The interactive simulation is not solely driven by an approximated model of complex forces and physical constraints but, instead, is approached by a formal modeling of the whole labor and of the assistance/intervention procedures, performed by means of a timed automata network and applied to a parametrical 3-D model of the anatomy able to mimic a wide range of configurations. This novel methodology is able to represent not only the sequence of the main events associated with either a spontaneous or an operative childbirth process, but also to help validate the manual intervention, as the actions performed by the user during the simulation are evaluated according to established medical guidelines. A discussion of the first results, as well as of the challenges still unaddressed, is included.

  20. On the thermomechanical coupling in dissipative materials: A variational approach for generalized standard materials

    NASA Astrophysics Data System (ADS)

    Bartels, A.; Bartel, T.; Canadija, M.; Mosler, J.

    2015-09-01

    This paper deals with the thermomechanical coupling in dissipative materials. The focus lies on finite strain plasticity theory and the temperature increase resulting from plastic deformation. For this type of problem, two fundamentally different modeling approaches can be found in the literature: (a) models based on thermodynamical considerations and (b) models based on the so-called Taylor-Quinney factor. While a naive straightforward implementation of thermodynamically consistent approaches usually leads to an over-prediction of the temperature increase due to plastic deformation, models relying on the Taylor-Quinney factor often violate fundamental physical principles such as the first and the second law of thermodynamics. In this paper, a thermodynamically consistent framework is elaborated which indeed allows the realistic prediction of the temperature evolution. In contrast to previously proposed frameworks, it is based on a fully three-dimensional, finite strain setting and it naturally covers coupled isotropic and kinematic hardening - also based on non-associative evolution equations. Considering a variationally consistent description based on incremental energy minimization, it is shown that the aforementioned problem (thermodynamical consistency and a realistic temperature prediction) is essentially equivalent to correctly defining the decomposition of the total energy into stored and dissipative parts. Interestingly, this decomposition shows strong analogies to the Taylor-Quinney factor. In this respect, the Taylor-Quinney factor can be well motivated from a physical point of view. Furthermore, certain intervals for this factor can be derived in order to guarantee that fundamental physical principles are fulfilled a priori. Representative examples demonstrate the predictive capabilities of the final constitutive modeling framework.
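
    The role of the factor can be made concrete with the usual rate form of plastic heating, in which the Taylor-Quinney factor scales the plastic work rate converted into heat (a standard textbook statement, not the paper's full variational setting):

        \dot{q} \;=\; \beta \, \boldsymbol{\sigma} : \dot{\boldsymbol{\varepsilon}}^{p},
        \qquad 0 \le \beta \le 1,

    with \boldsymbol{\sigma} the stress, \dot{\boldsymbol{\varepsilon}}^{p} the plastic strain rate, and \beta the fraction of plastic power dissipated as heat; the paper's energy decomposition effectively bounds the admissible values of \beta.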

  1. Status of the MIND simulation and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cervera Villanueva, A.; Martin-Albo, J.; Laing, A.

    2010-03-30

    A realistic simulation of the Neutrino Factory detectors is required in order to fully understand the sensitivity of such a facility to the remaining parameters and degeneracies of the neutrino mixing matrix. We describe the status of a modular software framework being developed to accommodate such a study. The results of initial studies of the reconstruction software and expected efficiency curves in the context of the golden channel are given.

  2. Explaining Iran’s Foreign Policy, 1979-2009

    DTIC Science & Technology

    2010-12-01

    of reasons. From a strictly realist perspective, Iran’s behavior may seem maddeningly inconsistent. Decisions made by the leadership to prolong...objective of this book is to offer a framework to help U.S. policymakers and analysts better understand existing and evolving leadership dynamics...neoconservatives and liberals sold on Kantian peace. While the realities of the Iraq war may have muted calls for regime change in Iran, recent

  3. A Science of Social Work, and Social Work as an Integrative Scientific Discipline: Have We Gone Too Far, or Not Far Enough?

    ERIC Educational Resources Information Center

    Brekke, John S.

    2014-01-01

    There are two purposes to this article. The first is to update the science of social work framework. The second is to use recent discussions on the nature of realist science and on social work science to propose a definition of social work as an integrative scientific discipline that complements its definition as a profession.

  4. Combining Campbell Standards and the Realist Evaluation Approach: The Best of Two Worlds?

    ERIC Educational Resources Information Center

    van der Knaap, Leontien M.; Leeuw, Frans L.; Bogaerts, Stefan; Nijssen, Laura T. J.

    2008-01-01

    This article presents an approach to systematic reviews that combines the Campbell Collaboration Crime and Justice standards and the realist notion of contexts-mechanisms-outcomes (CMO) configurations. Both approaches have their advantages and drawbacks, and the authors will make a case for combining both approaches to profit from their advantages…

  5. A High-Order Method Using Unstructured Grids for the Aeroacoustic Analysis of Realistic Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    Atkins, Harold L.; Lockard, David P.

    1999-01-01

    A method for the prediction of acoustic scatter from complex geometries is presented. The discontinuous Galerkin method provides a framework for the development of a high-order method using unstructured grids. The method's compact form contributes to its accuracy and efficiency, and makes the method well suited for distributed memory parallel computing platforms. Mesh refinement studies are presented to validate the expected convergence properties of the method, and to establish the absolute levels of error one can expect at a given level of resolution. For a two-dimensional shear layer instability wave and for three-dimensional wave propagation, the method is demonstrated to be insensitive to mesh smoothness. Simulations of scatter from a two-dimensional slat configuration and a three-dimensional blended-wing-body demonstrate the capability of the method to efficiently treat realistic geometries.

  6. Convective penetration in stars

    NASA Astrophysics Data System (ADS)

    Pratt, Jane; Baraffe, Isabelle; Goffrey, Tom; Constantino, Tom; Popov, M. V.; Walder, Rolf; Folini, Doris; TOFU Collaboration

    To interpret the high-quality data produced by recent space missions, it is necessary to study convection under realistic stellar conditions. We describe the multi-dimensional, time-implicit, fully compressible, hydrodynamic, implicit large eddy simulation code MUSIC, currently being developed at the University of Exeter. We use MUSIC to study convection during an early stage in the evolution of our sun where the convection zone covers approximately half of the solar radius. This model of the young sun possesses a realistic stratification in density, temperature, and luminosity. We approach convection in a stellar context using extreme value theory and derive a new model for convective penetration, targeted for one-dimensional stellar evolution calculations. The research leading to these results has received funding from the European Research Council under the European Union's Seventh Framework Programme (FP7/2007-2013)/ERC Grant agreement no. 320478.

  7. Risk in Science Instruction. The Realist and Constructivist Paradigms of Risk

    NASA Astrophysics Data System (ADS)

    Hansen, Julia; Hammann, Marcus

    2017-11-01

    Risk is always present in people's lives: diseases, new technologies, socio-scientific issues (SSIs) such as climate change, and advances in medicine, to name just a few examples, all carry risks. To be able to navigate risks in everyday life, as well as to participate in social debate on risk-related issues, students need to develop risk competence. Science education can be a powerful tool in supporting students' risk competence, which is an important component of scientific literacy. As there are different definitions of risk within the scientific community, the aims of this article are (1) to review the literature on two major theoretical frameworks for conceptualising risk, the realist and the constructivist paradigms of risk, and (2) to connect both in order to suggest a working definition of what can be understood as risk competence in science instruction.

  8. Monte Carlo simulation of the operational quantities at the realistic mixed neutron-photon radiation fields CANEL and SIGMA.

    PubMed

    Lacoste, V; Gressier, V

    2007-01-01

    The Institute for Radiological Protection and Nuclear Safety owns two facilities producing realistic mixed neutron-photon radiation fields, CANEL, an accelerator driven moderator modular device, and SIGMA, a graphite moderated americium-beryllium assembly. These fields are representative of some of those encountered at nuclear workplaces, and the corresponding facilities are designed and used for calibration of various instruments, such as survey meters, personal dosimeters or spectrometric devices. In the framework of the European project EVIDOS, irradiations of personal dosimeters were performed at CANEL and SIGMA. Monte Carlo calculations were performed to estimate the reference values of the personal dose equivalent at both facilities. The Hp(10) values were calculated for three different angular positions, 0 degrees, 45 degrees and 75 degrees, of an ICRU phantom located at the position of irradiation.

  9. Healthy Cities Phase V evaluation: further synthesizing realism.

    PubMed

    de Leeuw, Evelyne; Green, Geoff; Tsouros, Agis; Dyakova, Mariana; Farrington, Jill; Faskunger, Johan; Grant, Marcus; Ison, Erica; Jackisch, Josephine; Lafond, Leah Janss; Lease, Helen; Mackiewicz, Karolina; Östergren, Per-Olof; Palmer, Nicola; Ritsatakis, Anna; Simos, Jean; Spanswick, Lucy; Webster, Premila; Zamaro, Gianna; Crown, June; Kickbusch, Ilona; Rasmussen, Niels; Scally, Gabriel; Biddle, Marian; Earl, Suzanne; Petersen, Connie; Devlin, Joan

    2015-06-01

    In this article we reflect on the quality of a realist synthesis paradigm applied to the evaluation of Phase V of the WHO European Healthy Cities Network. The programmatic application of this approach has led to very high response rates and a wealth of important data. All articles in this Supplement report that cities in the network move from small-scale, time-limited projects predominantly focused on health lifestyles to the significant inclusion of policies and programmes on systems and values for good health governance. The evaluation team felt that, due to time and resource limitations, it was unable to fully exploit the potential of realist synthesis. In particular, the synthetic integration of different strategic foci of Phase V designation areas did not come to full fruition. We recommend better and more sustained integration of realist synthesis in the practice of Healthy Cities in future Phases. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. Benefits and Limitations of Real Options Analysis for the Practice of River Flood Risk Management

    NASA Astrophysics Data System (ADS)

    Kind, Jarl M.; Baayen, Jorn H.; Botzen, W. J. Wouter

    2018-04-01

    Decisions on long-lived flood risk management (FRM) investments are complex because the future is uncertain. Flexibility and robustness can be used to deal with future uncertainty. Real options analysis (ROA) provides a welfare-economics framework to design and evaluate robust and flexible FRM strategies under risk or uncertainty. Although its potential benefits are large, ROA is hardly used in today's FRM practice. In this paper, we investigate the benefits and limitations of ROA by applying it to a realistic FRM case study for an entire river branch. We illustrate how ROA identifies optimal short-term investments and values future options. We develop robust dike investment strategies and value the flexibility offered by additional room-for-the-river measures. We benchmark the results of ROA against those of a standard cost-benefit analysis and show ROA's potential policy implications. The ROA for a realistic case requires a high level of geographical detail, a large ensemble of scenarios, and the inclusion of stakeholders' preferences. We found several limitations of applying ROA. It is complex. In particular, relevant sources of uncertainty need to be recognized, quantified, integrated, and discretized in scenarios, requiring subjective choices and expert judgment. Decision trees have to be generated and stakeholders' preferences have to be translated into decision rules. On the basis of this study, we give general recommendations to use high discharge scenarios for the design of measures with high fixed costs and few alternatives. Lower scenarios may be used when alternatives offer future flexibility.
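
    The core comparison can be caricatured in a few lines: the value of flexibility is the gap in probability-weighted cost between a commit-now strategy and one that defers part of the investment. All scenario probabilities and costs below are invented for illustration and have no connection to the case study.

        import numpy as np

        p = np.array([0.5, 0.3, 0.2])                 # discharge scenarios
        cost_fixed = np.array([300.0, 300.0, 300.0])  # build the big dike now
        cost_flex = np.array([180.0, 180.0, 430.0])   # heighten later if needed

        ev_fixed = p @ cost_fixed
        ev_flex = p @ cost_flex
        option_value = ev_fixed - ev_flex  # worth of keeping the option open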

  11. An exploration of group-based HIV/AIDS treatment and care models in Sub-Saharan Africa using a realist evaluation (Intervention-Context-Actor-Mechanism-Outcome) heuristic tool: a systematic review.

    PubMed

    Mukumbang, Ferdinand C; Van Belle, Sara; Marchal, Bruno; van Wyk, Brian

    2017-08-25

    It is increasingly acknowledged that differentiated care models hold potential to manage large volumes of patients on antiretroviral therapy (ART). Various group-based models of ART service delivery aimed at decongesting local health facilities, encouraging patient retention in care, and enhancing adherence to medication have been implemented across sub-Saharan Africa. Evidence from the literature suggests that these models of ART service delivery are more effective than corresponding facility-based care and superior to individual-based models. Nevertheless, there is little understanding of how these care models work to achieve their intended outcomes. The aim of this study was to review the theories explicating how and why group-based ART models work, using a realist evaluation framework. A systematic review of the literature on group-based ART support models in sub-Saharan Africa was conducted. We searched the Google Scholar and PubMed databases and supplemented these with a reference chase of the identified articles. We applied a theory-driven approach, narrative synthesis, to synthesise the data. Data were analysed using the thematic content analysis method and synthesised according to aspects of the Intervention-Context-Actor-Mechanism-Outcome heuristic-analytic tool, a realist evaluation theory-building tool. Twelve articles reporting primary studies on group-based models of ART service delivery were included in the review. The six studies that employed a quantitative study design failed to identify aspects of the context and mechanisms that work to trigger the outcomes of group-based models. While the four studies that applied a qualitative design and the two that used a mixed-methods design identified some of the aspects of the context and mechanisms that could trigger the outcomes of group-based ART models, these studies did not explain the relationship(s) between the theory elements and how they interact to produce the outcome(s). Although we could distill various components of the Intervention-Context-Actor-Mechanism-Outcome analytic tool from different studies exploring group-based programmes, we could not identify a salient programme theory based on the Intervention-Context-Actor-Mechanism-Outcome heuristic analysis. The scientific community, policy makers and programme implementers would benefit more if explanatory findings of how, why, for whom and in what circumstances programmes work were presented, rather than just reports on the outcomes of the interventions.

  12. A Historical Forcing Ice Sheet Model Validation Framework for Greenland

    NASA Astrophysics Data System (ADS)

    Price, S. F.; Hoffman, M. J.; Howat, I. M.; Bonin, J. A.; Chambers, D. P.; Kalashnikova, I.; Neumann, T.; Nowicki, S.; Perego, M.; Salinger, A.

    2014-12-01

    We propose an ice sheet model testing and validation framework for Greenland for the years 2000 to the present. Following Perego et al. (2014), we start with a realistic ice sheet initial condition that is in quasi-equilibrium with climate forcing from the late 1990s. This initial condition is integrated forward in time while simultaneously applying (1) surface mass balance forcing (van Angelen et al., 2013) and (2) outlet glacier flux anomalies, defined using a new dataset of Greenland outlet glacier flux for the past decade (Enderlin et al., 2014). Modeled rates of mass and elevation change are compared directly to remote sensing observations obtained from GRACE and ICESat. Here, we present a detailed description of the proposed validation framework including the ice sheet model and model forcing approach, the model-to-observation comparison process, and initial results comparing model output and observations for the time period 2000-2013.

  13. A formal framework of scenario creation and analysis of extreme hydrological events

    NASA Astrophysics Data System (ADS)

    Lohmann, D.

    2007-12-01

    We present a formal framework for hydrological risk analysis. Different measures of risk are introduced, such as average annual loss and occurrence exceedance probability. These are important measures for insurance companies, for example, to determine the cost of insurance. One key aspect of investigating the potential consequences of extreme hydrological events (floods and droughts) is the creation of meteorological scenarios that reflect realistic spatial and temporal patterns of precipitation and also have correct local statistics. 100,000 years of these meteorological scenarios are used in a calibrated rainfall-runoff-flood-loss-risk model to produce flood and drought events that have never been observed. The results of this hazard model are statistically analyzed and linked to socio-economic data and vulnerability functions to show the impact of severe flood events. We show results from the Risk Management Solutions (RMS) Europe Flood Model to introduce this formal framework.
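
    Given a table of simulated annual losses, the two measures named above reduce to a mean and a quantile, as in this Python sketch (the split between annual totals and annual maximum event losses reflects the usual aggregate/occurrence distinction; inputs are hypothetical):

        import numpy as np

        def risk_measures(annual_total, annual_max_event, return_period=100.0):
            # AAL: mean of annual total losses across simulated years.
            aal = float(np.mean(annual_total))
            # OEP loss at return period T: loss exceeded by the largest
            # event of the year on average once every T years.
            oep_loss = float(np.quantile(annual_max_event,
                                         1.0 - 1.0 / return_period))
            return aal, oep_loss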

  14. From puddles to planet: modeling approaches to vector-borne diseases at varying resolution and scale.

    PubMed

    Eckhoff, Philip A; Bever, Caitlin A; Gerardin, Jaline; Wenger, Edward A; Smith, David L

    2015-08-01

    Since the original Ross-Macdonald formulations of vector-borne disease transmission, there has been a broad proliferation of mathematical models of vector-borne disease, but many of these models retain most or all of the simplifying assumptions of the original formulations. Recently, there has been a new expansion of mathematical frameworks that contain explicit representations of the vector life cycle including aquatic stages, multiple vector species, host heterogeneity in biting rate, realistic vector feeding behavior, and spatial heterogeneity. In particular, there are now multiple frameworks for spatially explicit dynamics with movements of vector, host, or both. These frameworks are flexible and powerful, but require additional data to take advantage of these features. For a given question posed, utilizing a range of models with varying complexity and assumptions can provide a deeper understanding of the answers derived from models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
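
    For orientation, one common statement of the classic Ross-Macdonald basic reproduction number, whose simplifying assumptions the newer frameworks relax, is

        R_0 = \frac{m a^{2} b c \, e^{-\mu n}}{r \mu},

    with m the mosquito-to-host ratio, a the biting rate, b and c the vector-to-host and host-to-vector transmission probabilities, n the extrinsic incubation period, \mu the mosquito death rate, and r the host recovery rate.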

  15. XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework

    NASA Astrophysics Data System (ADS)

    Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò

    2017-08-01

    We present a new simulation framework, XIMPOL, based on the python programming language and the Scipy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems, but also for developing and testing end-to-end analysis chains.

  16. Autumn Algorithm-Computation of Hybridization Networks for Realistic Phylogenetic Trees.

    PubMed

    Huson, Daniel H; Linz, Simone

    2018-01-01

    A minimum hybridization network is a rooted phylogenetic network that displays two given rooted phylogenetic trees using a minimum number of reticulations. Previous mathematical work on their calculation has usually assumed the input trees to be bifurcating, correctly rooted, or that they both contain the same taxa. These assumptions do not hold in biological studies and "realistic" trees have multifurcations, are difficult to root, and rarely contain the same taxa. We present a new algorithm for computing minimum hybridization networks for a given pair of "realistic" rooted phylogenetic trees. We also describe how the algorithm might be used to improve the rooting of the input trees. We introduce the concept of "autumn trees", a nice framework for the formulation of algorithms based on the mathematics of "maximum acyclic agreement forests". While the main computational problem is hard, the run-time depends mainly on how different the given input trees are. In biological studies, where the trees are reasonably similar, our parallel implementation performs well in practice. The algorithm is available in our open source program Dendroscope 3, providing a platform for biologists to explore rooted phylogenetic networks. We demonstrate the utility of the algorithm using several previously studied data sets.

  17. Policy guidance on threats to legislative interventions in public health: a realist synthesis.

    PubMed

    Wong, Geoff; Pawson, Ray; Owen, Lesley

    2011-04-10

    Legislation is one of the most powerful weapons for improving population health and is often used by policy and decision makers. Little research exists to guide them as to whether legislation is feasible and/or will succeed. We aimed to produce a coherent and transferable evidence based framework of threats to legislative interventions to assist the decision making process and to test this through the 'case study' of legislation to ban smoking in cars carrying children. We conceptualised legislative interventions as complex social interventions and so used the realist synthesis method to systematically review the literature for evidence. 99 articles were found through searches on five electronic databases (MEDLINE, HMIC, EMBASE, PsychINFO, Social Policy and Practice) and iterative purposive searching. Our initial searches sought any studies that contained information on smoking in vehicles carrying children. Throughout the review we continued where needed to search for additional studies of any type that would conceptually contribute to helping build and/or test our framework. Our framework identified a series of transferable threats to public health legislation. When applied to smoking bans in vehicles, problem misidentification, public support, opposition, and enforcement issues were particularly prominent threats. Our framework enabled us to understand and explain the nature of each threat and to infer the most likely outcome if such legislation were to be proposed in a jurisdiction where no such ban existed. Specifically, the micro-environment of a vehicle can contain highly hazardous levels of second hand smoke. Public support for such legislation is high amongst smokers and non-smokers and their underlying motivations were very similar - wanting to practice the Millian principle of protecting children from harm. Evidence indicated that the tobacco industry was not likely to oppose legislation and arguments that such a law would be 'unenforceable' were unfounded. It is possible to develop a coherent and transferable evidence based framework of the ideas and assumptions behind the threats to legislative intervention that may assist policy and decision makers to analyse and judge if legislation is feasible and/or likely to succeed.

  19. Quantifying structural uncertainty on fault networks using a marked point process within a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Aydin, Orhun; Caers, Jef Karel

    2017-08-01

    Faults are one of the building blocks for subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and the interpreter's intuition pertaining to fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods exist that address specific sources of fault network uncertainty and the complexities of fault modeling, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, the marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set based approach. A Markov Chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from the Nankai Trough & Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data, with only partially visible faults and many faults missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone tectonics similar to the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces. We show the proposed methodology generates realistic fault network models conditioned to data and a conceptual model of the underlying tectonics.
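
    For reference, the unnormalised density of a (here unmarked) Strauss process makes the pattern-versus-intensity trade-off explicit:

        f(\mathbf{x}) \;\propto\; \beta^{\,n(\mathbf{x})} \, \gamma^{\,s_{r}(\mathbf{x})},

    where n(\mathbf{x}) is the number of points (fault seeds), s_{r}(\mathbf{x}) the number of point pairs closer than the interaction distance r, \beta controls intensity, and 0 \le \gamma \le 1 penalises clustering (\gamma = 1 recovers a Poisson process); marks carrying fault geometry would enter through an additional mark density.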

  20. Felker's Five Keys to Self-Concept Enhancement: Secondary Classroom Research.

    ERIC Educational Resources Information Center

    Bernhoft, Franklin O.

    A study incorporated Donald Felker's 5 Keys to Self-Concept Enhancement in 20 minutes of timed writing weekly or bi-weekly for three months using the Coopersmith Adult Form as pre-post measure. Felker's 5 Keys are: (1) adults, praise yourselves; (2) help children evaluate realistically; (3) teach children to set realistic goals; (4) teach children…

  1. Project REALISTIC: Evaluation and Modification of REAding, LIStening, and ArithmeTIC Needs in Military Jobs Having Civilian Counterparts.

    ERIC Educational Resources Information Center

    Sticht, Thomas G.; And Others

    The papers in this collection present a description of, and the results of, research in Work Unit REALISTIC. In addition to the first paper which is an overview, the three papers are: "Psychometric Determination of Relationships Among Literacy Skills and Job Proficiency,""Reading Ability, Readability, and Readership: Identifying…

  2. The Effects of Talking-Head with Various Realism Levels on Students' Emotions in Learning

    ERIC Educational Resources Information Center

    Mohamad Ali, Ahmad Zamzuri; Hamdan, Mohd Najib

    2017-01-01

    The aim of this study was to evaluate the effects of various realistic levels of talking-head on students' emotions in pronunciation learning. Four talking-head characters with varying levels of realism were developed and tested: a nonrealistic three-dimensional character, a realistic three-dimensional character, a two-dimensional character, and…

  3. Multimodal biometrics for identity documents (MBioID).

    PubMed

    Dessimoz, Damien; Richiardi, Jonas; Champod, Christophe; Drygajlo, Andrzej

    2007-04-11

    The MBioID initiative has been set up to address the following germane question: What biometric technologies could be deployed in identity documents in the foreseeable future, and how? This research effort proposes to look at current and future practices and systems of establishing and using biometric identity documents (IDs) and evaluate their effectiveness in large-scale developments. The first objective of the MBioID project is to present a review document establishing the current state-of-the-art related to the use of multimodal biometrics in an ID application. This research report gives the main definitions, properties and the framework of use related to biometrics, an overview of the main standards developed in the biometric industry and standardisation organisations to ensure interoperability, as well as some of the legal framework and the issues associated with biometrics such as privacy and personal data protection. The state-of-the-art in terms of technological development is also summarised for a range of single biometric modalities (2D and 3D face, fingerprint, iris, on-line signature and speech), chosen according to ICAO recommendations and availabilities, and for various multimodal approaches. This paper gives a summary of the main elements of that report. The second objective of the MBioID project is to propose relevant acquisition and evaluation protocols for a large-scale deployment of biometric IDs. Combined with the protocols, a multimodal database will be acquired in a realistic way, in order to be as close as possible to a real biometric ID deployment. In this paper, the issues and solutions related to the acquisition setup are briefly presented.

  4. Modeling Supermassive Black Holes in Cosmological Simulations

    NASA Astrophysics Data System (ADS)

    Tremmel, Michael

    My thesis work has focused on improving the implementation of supermassive black hole (SMBH) physics in cosmological hydrodynamic simulations. SMBHs are ubiquitous in massive galaxies, as well as bulge-less galaxies and dwarfs, and are thought to be a critical component to massive galaxy evolution. Still, much is unknown about how SMBHs form, grow, and affect their host galaxies. Cosmological simulations are an invaluable tool for understanding the formation of galaxies, self-consistently tracking their evolution with realistic merger and gas accretion histories. SMBHs are often modeled in these simulations (generally as a necessity to produce realistic massive galaxies), but their implementations are commonly simplified in ways that can limit what can be learned. Current and future observations are opening new windows into the lifecycle of SMBHs and their host galaxies, but require more detailed, physically motivated simulations. Within the novel framework I have developed, SMBHs 1) are seeded at early times without a priori assumptions of galaxy occupation, 2) grow in a way that accounts for the angular momentum of gas, and 3) experience realistic orbital evolution. I show how this model, properly tuned with a novel parameter optimization technique, results in realistic galaxies and SMBHs. Utilizing the unique ability of these simulations to capture the dynamical evolution of SMBHs, I present the first self-consistent prediction for the formation timescales of close SMBH pairs, precursors to SMBH binaries and merger events potentially detected by future gravitational wave experiments.

  5. How do you modernize a health service? A realist evaluation of whole-scale transformation in london.

    PubMed

    Greenhalgh, Trisha; Humphrey, Charlotte; Hughes, Jane; Macfarlane, Fraser; Butler, Ceri; Pawson, Ray

    2009-06-01

    Large-scale, whole-systems interventions in health care require imaginative approaches to evaluation that go beyond assessing progress against predefined goals and milestones. This project evaluated a major change effort in inner London, funded by a charitable donation of approximately $21 million, which spanned four large health care organizations, covered three services (stroke, kidney, and sexual health), and sought to "modernize" these services with a view to making health care more efficient, effective, and patient centered. This organizational case study draws on the principles of realist evaluation, a largely qualitative approach that is centrally concerned with testing and refining program theories by exploring the complex and dynamic interaction among context, mechanism, and outcome. This approach used multiple data sources and methods in a pragmatic and reflexive manner to build a picture of the case and follow its fortunes over the three-year study period. The methods included ethnographic observation, semistructured interviews, and scrutiny of documents and other contemporaneous materials. As well as providing ongoing formative feedback to the change teams in specific areas of activity, we undertook a more abstract, interpretive analysis, which explored the context-mechanism-outcome relationship using the guiding question "what works, for whom, under what circumstances?" In this example of large-scale service transformation, numerous projects and subprojects emerged, fed into one another, and evolved over time. Six broad mechanisms appeared to be driving the efforts of change agents: integrating services across providers, finding and using evidence, involving service users in the modernization effort, supporting self-care, developing the workforce, and extending the range of services. Within each of these mechanisms, different teams chose widely differing approaches and met with differing success. The realist analysis of the fortunes of different subprojects identified aspects of context and mechanism that accounted for observed outcomes (both intended and unintended). This study was one of the first applications of realist evaluation to a large-scale change effort in health care. Even when an ambitious change program shifts from its original goals and meets unforeseen challenges (indeed, precisely because the program morphs and adapts over time), realist evaluation can draw useful lessons about how particular preconditions make particular outcomes more likely, even though it cannot produce predictive guidance or a simple recipe for success. Noting recent calls by others for the greater use of realist evaluation in health care, this article considers some of the challenges and limitations of this method in the light of this experience and suggests that its use will require some fundamental changes in the worldview of some health services researchers.

  6. Thermodynamic power of non-Markovianity

    PubMed Central

    Bylicka, Bogna; Tukiainen, Mikko; Chruściński, Dariusz; Piilo, Jyrki; Maniscalco, Sabrina

    2016-01-01

    The natural framework to discuss thermodynamics at the quantum level is the theory of open quantum systems. Memory effects arising from strong system-environment correlations may lead to information back-flow, that is, non-Markovian behaviour. The relation between non-Markovianity and quantum thermodynamics has been until now largely unexplored. Here we show by means of Landauer’s principle that memory effects control the amount of work extraction by erasure in the presence of realistic environments. PMID:27323947

  7. A Realistic Framework for Delay-Tolerant Network Routing in Open Terrains with Continuous Churn

    NASA Astrophysics Data System (ADS)

    Mahendran, Veeramani; Anirudh, Sivaraman K.; Murthy, C. Siva Ram

    The conventional analysis of Delay-Tolerant Network (DTN) routing assumes that the terrain over which nodes move is closed, implying that when the nodes hit a boundary, they either wrap around or get reflected. In this work, we study the effect of relaxing this closed terrain assumption on the routing performance, where a continuous stream of nodes enter the terrain and get absorbed upon hitting the boundary.

  8. Survey of Approaches to Generate Realistic Synthetic Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Lee, Sangkeun; Powers, Sarah S

    A graph is a flexible data structure that can represent relationships between entities. As with other data analysis tasks, the use of realistic graphs is critical to obtaining valid research results. Unfortunately, using the actual ("real-world") graphs for research and new algorithm development is difficult due to the presence of sensitive information in the data or due to the scale of data. This results in practitioners developing algorithms and systems that employ synthetic graphs instead of real-world graphs. Generating realistic synthetic graphs that provide reliable statistical confidence to algorithmic analysis and system evaluation involves addressing technical hurdles in a broad set of areas. This report surveys the state of the art in approaches to generate realistic graphs that are derived from fitted graph models on real-world graphs.
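
    As a concrete instance of the model-based approaches such surveys cover, the snippet below fits only the crudest statistics, node count and mean degree, using a Barabási-Albert generator from networkx; real evaluations would match richer statistics and consider other models (e.g. R-MAT or stochastic block models).

        import networkx as nx

        def synthetic_like(n_nodes, avg_degree, seed=0):
            # Barabási-Albert attaches each new node with m edges,
            # giving an average degree of roughly 2 * m.
            m = max(1, round(avg_degree / 2))
            return nx.barabasi_albert_graph(n_nodes, m, seed=seed)

        g = synthetic_like(n_nodes=10_000, avg_degree=6)
        print(g.number_of_nodes(),
              2 * g.number_of_edges() / g.number_of_nodes())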

   9. Challenges to Recruiting Population Representative Samples of Female Sex Workers in China Using Respondent Driven Sampling

    PubMed Central

    Merli, M. Giovanna; Moody, James; Smith, Jeffrey; Li, Jing; Weir, Sharon; Chen, Xiangsheng

    2014-01-01

    We explore the network coverage of a sample of female sex workers (FSWs) in China recruited through Respondent Driven Sampling (RDS) as part of an effort to evaluate, with empirical data, the claim that RDS yields population representation. We take advantage of unique information on the social networks of FSWs obtained from two overlapping studies -- RDS and a venue-based sampling approach (PLACE) -- and use an exponential random graph modeling (ERGM) framework from local networks to construct a likely network from which our observed RDS sample is drawn. We then run recruitment chains over this simulated network to assess the assumption that the RDS chain referral process samples participants in proportion to their degree, and the extent to which RDS satisfactorily covers certain parts of the network. We find evidence that, contrary to assumptions, RDS oversamples low-degree nodes and geographically central areas of the network. Unlike previous evaluations of RDS, which have explored the performance of RDS sampling chains on a non-hidden population, or the performance of simulated chains over previously mapped realistic social networks, our study provides a robust, empirically grounded evaluation of the performance of RDS chains on a real-world hidden population. PMID:24834869

  10. Patients-people-place: developing a framework for researching organizational culture during health service redesign and change.

    PubMed

    Gale, Nicola K; Shapiro, Jonathan; McLeod, Hugh S T; Redwood, Sabi; Hewison, Alistair

    2014-08-20

    Organizational culture is considered by policy-makers, clinicians, health service managers and researchers to be a crucial mediator in the success of implementing health service redesign. It is a challenge to find a method to capture cultural issues that is both theoretically robust and meaningful to those working in the organizations concerned. As part of a comparative study of service redesign in three acute hospital organizations in England, UK, a framework for collecting data reflective of culture was developed that was informed by previous work in the field and social and cultural theory. As part of a larger mixed method comparative case study of hospital service redesign, informed by realist evaluation, the authors developed a framework for researching organisational culture during health service redesign and change. This article documents the development of the model, which involved an iterative process of data analysis, critical interdisciplinary discussion in the research team, and feedback from staff in the partner organisations. Data from semi-structured interviews with 77 key informants are used to illustrate the model. In workshops with NHS partners to share and debate the early findings of the study, organizational culture was identified as a key concept to explore because it was perceived to underpin the whole redesign process. The Patients-People-Place framework for studying culture focuses on three thematic areas ('domains') and three levels of culture in which the data could be organised. The framework can be used to help explain the relationship between observable behaviours and cultural artefacts, the values and habits of social actors and the basic assumptions underpinning an organization's culture in each domain. This paper makes a methodological contribution to the study of culture in health care organizations. It offers guidance and a practical approach to investigating the inherently complex phenomenon of culture in hospital organizations. The Patients-People-Place framework could be applied in other settings as a means of ensuring the three domains and three levels that are important to an organization's culture are addressed in future health service research.

  11. The SMART personalised self-management system for congestive heart failure: results of a realist evaluation.

    PubMed

    Bartlett, Yvonne K; Haywood, Annette; Bentley, Claire L; Parker, Jack; Hawley, Mark S; Mountain, Gail A; Mawson, Susan

    2014-11-25

    Technology has the potential to provide support for self-management to people with congestive heart failure (CHF). This paper describes the results of a realist evaluation of the SMART Personalised Self-Management System (PSMS) for CHF. The PSMS was used, at home, by seven people with CHF. Data describing system usage and usability as well as questionnaire and interview data were evaluated in terms of the context, mechanism and outcome hypotheses (CMOs) integral to realist evaluation. The CHF PSMS improved heart failure related knowledge in those with low levels of knowledge at baseline, through providing information and quizzes. Furthermore, participants perceived the self-regulatory aspects of the CHF PSMS as being useful in encouraging daily walking. The CMOs were revised to describe the context of use, and how this influences both the mechanisms and the outcomes. Participants with CHF engaged with the PSMS despite some technological problems. Some positive effects on knowledge were observed as well as the potential to assist with changing physical activity behaviour. Knowledge of CHF and physical activity behaviour change are important self-management targets for CHF, and this study provides evidence to direct the further development of a technology to support these targets.

  12. Realistic simulations of a cyclotron spiral inflector within a particle-in-cell framework

    NASA Astrophysics Data System (ADS)

    Winklehner, Daniel; Adelmann, Andreas; Gsell, Achim; Kaman, Tulin; Campo, Daniela

    2017-12-01

    We present an upgrade to the particle-in-cell ion beam simulation code opal that enables us to run highly realistic simulations of the spiral inflector system of a compact cyclotron. This upgrade includes a new geometry class and field solver that can handle the complicated boundary conditions posed by the electrode system in the central region of the cyclotron, both in terms of particle termination and in the calculation of self-fields. Results are benchmarked against the analytical solution of a coasting beam. As a practical example, the spiral inflector and the first revolution in a 1 MeV/amu test cyclotron, located at Best Cyclotron Systems, Inc., are modeled and compared to the simulation results. We find that opal can now handle arbitrary boundary geometries with relative ease. Simulated injection efficiencies and beam shape compare well with measured efficiencies and a preliminary measurement of the beam distribution after injection.
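
    The upgrade gives the boundary geometry two jobs: terminating particles that strike the electrodes and constraining the self-field solve. A toy 2-D electrostatic push in Python illustrates the first job schematically; this is not opal's actual API, and the electrode geometry, fields, and units are invented:

        import numpy as np

        def inside_electrode(x, y):
            """Toy electrode region: two plates and a post; real codes use CAD-derived boundaries."""
            return (np.abs(y) > 0.9) | ((x > 0.4) & (x < 0.5) & (y > 0.0))

        def push(pos, vel, efield, dt, q_over_m=1.0):
            """Leapfrog-style push that terminates particles entering the electrode region.

            efield(pos) -> (N, 2) array; a stand-in for the applied-plus-self-field
            solve that the real code performs on a mesh respecting the same boundary.
            """
            vel = vel + q_over_m * efield(pos) * dt
            pos = pos + vel * dt
            alive = ~inside_electrode(pos[:, 0], pos[:, 1])   # particle termination
            return pos[alive], vel[alive]

        # Coasting-beam-style smoke test: with zero field, no particle should be lost
        rng = np.random.default_rng(42)
        pos = rng.uniform(-0.3, 0.3, size=(1000, 2))
        vel = np.tile([0.01, 0.0], (1000, 1))
        for _ in range(100):
            pos, vel = push(pos, vel, lambda p: np.zeros_like(p), dt=0.1)
        print(len(pos), "macro-particles remaining")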

  13. Prevention implications of AIDS discourses among South African women.

    PubMed

    Strebel, A

    1996-08-01

    Social constructionist and feminist analyses have done much to extend the understanding of AIDS beyond the biomedical to include social accounts of the constitution of AIDS knowledge and meanings. However, these frameworks have not translated easily into realistic responses to the paradox of women being seen as responsible for HIV prevention while lacking the power to implement safe sex behavior. This study explores the range and interplay of discursive themes which South African women drew on regarding AIDS and identifies constraints and opportunities for realistic prevention. The research involved 14 focus group discussions with women. Two main interpretative repertoires regarding AIDS were identified from the texts: one concerning the medicalization and the other the stigmatization of the disease. Although these representations were not unchallenged, the pervasive sense was one of denial of one's own risk, fear, and fatalism. However, the analysis highlighted the complexity of the issues to be faced in developing effective prevention initiatives.

  14. SiC-based neutron detector in quasi-realistic working conditions: efficiency and stability at room and high temperature under fast neutron irradiations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferone, Raffaello; Issa, Fatima; Ottaviani, Laurent

    In the framework of the European I SMART project, we have designed and made new SiC-based nuclear radiation detectors able to operate in harsh environments and to detect both fast and thermal neutrons. In this paper, we report experimental results of a fast neutron irradiation campaign at high temperature (106 deg. C) in quasi-realistic working conditions. Our device does not suffer from the high temperature, and the spectra show strong stability, preserving their features. These experiments, as well as others in progress, show the ability of the I SMART SiC-based device to operate in harsh environments where other materials would strongly suffer from degradation. Work is still needed to test our device at higher temperatures and to enhance efficiency in order to make it fully exploitable from an industrial point of view. (authors)

  15. The Direct Lighting Computation in Global Illumination Methods

    NASA Astrophysics Data System (ADS)

    Wang, Changyaw Allen

    1994-01-01

    Creating realistic images is a computationally expensive process, but it is very important for applications such as interior design, product design, education, virtual reality, and movie special effects. To generate realistic images, state-of-the-art rendering techniques are employed to simulate global illumination, which accounts for the interreflection of light among objects. In this document, we formalize the global illumination problem as an eight-dimensional integral and discuss various methods that can accelerate the process of approximating this integral. We focus on the direct lighting computation, which accounts for the light reaching the viewer from the emitting sources after exactly one reflection, on Monte Carlo sampling methods, and on light source simplification. Results include a new sample generation method, a framework for predicting the total number of samples used in a solution, and a generalized Monte Carlo approach for computing the direct lighting from an environment, which for the first time makes ray tracing feasible for highly complex environments.
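
    The direct lighting estimator referred to here averages emitted radiance times BRDF times a geometry term, divided by the sample pdf, over points sampled on the light. A minimal sketch for a diffuse surface and a parallelogram area light (uniform area sampling, so pdf = 1/area); all scene values are illustrative:

        import numpy as np

        def direct_lighting(x, n, corner, edge_u, edge_v, L_e, rho, occluded, n_samples=256):
            """Monte Carlo direct lighting at point x (normal n) from a parallelogram light.

            Uniform area sampling: pdf = 1/area, so each sample contributes
            L_e * (rho/pi) * cos_x * cos_y / r^2 * area.
            """
            normal_scaled = np.cross(edge_u, edge_v)
            area = np.linalg.norm(normal_scaled)
            light_n = normal_scaled / area              # emitting side of the light
            total = 0.0
            for _ in range(n_samples):
                u, v = np.random.rand(2)
                y = corner + u * edge_u + v * edge_v    # sample point on the light
                d = y - x
                r2 = d @ d
                w = d / np.sqrt(r2)                     # unit direction to the light
                cos_x = max(n @ w, 0.0)
                cos_y = max(-(light_n @ w), 0.0)        # light must face the surface
                if cos_x > 0.0 and cos_y > 0.0 and not occluded(x, y):
                    total += L_e * (rho / np.pi) * cos_x * cos_y / r2 * area
            return total / n_samples

        # A floor point under a 1 m^2 light 2 m overhead, nothing blocking the path
        est = direct_lighting(
            x=np.array([0.0, 0.0, 0.0]), n=np.array([0.0, 0.0, 1.0]),
            corner=np.array([-0.5, -0.5, 2.0]),
            edge_u=np.array([0.0, 1.0, 0.0]), edge_v=np.array([1.0, 0.0, 0.0]),
            L_e=10.0, rho=0.7, occluded=lambda a, b: False)
        print(f"estimated reflected radiance: {est:.3f}")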

  16. BeamDyn: A High-Fidelity Wind Turbine Blade Solver in the FAST Modular Framework: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Q.; Sprague, M.; Jonkman, J.

    2015-01-01

    BeamDyn, a Legendre-spectral-finite-element implementation of geometrically exact beam theory (GEBT), was developed to meet the design challenges associated with highly flexible composite wind turbine blades. In this paper, the governing equations of GEBT are reformulated into a nonlinear state-space form to support its coupling within the modular framework of the FAST wind turbine computer-aided engineering (CAE) tool. Different time integration schemes (implicit and explicit) were implemented and examined for wind turbine analysis. Numerical examples are presented to demonstrate the capability of this new beam solver. An example analysis of a realistic wind turbine blade, the CX-100, is also presented as validation.

  17. State Event Models for the Formal Analysis of Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles

    2014-01-01

    The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "fullcontrol" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control, mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.

  18. An artificial intelligence framework for compensating transgressions and its application to diet management.

    PubMed

    Anselma, Luca; Mazzei, Alessandro; De Michieli, Franco

    2017-04-01

    Today, there is considerable interest in personal healthcare. The pervasiveness of technology allows us to precisely track human behavior; however, when developing an intelligent assistant exploiting data acquired through such technologies, a critical issue has to be taken into account, namely that of supporting the user in the event of any transgression with respect to the optimal behavior. In this paper we present a reasoning framework based on Simple Temporal Problems that can be applied to a general class of problems, which we call cake&carrot problems, to support reasoning in the presence of human transgression. The reasoning framework offers a number of facilities to ensure smart management of possible "wrong behaviors" by a user so as to reach the goals defined by the problem. This paper describes the framework by means of the prototypical use case of the diet domain. Indeed, following a healthy diet can be a difficult task for both practical and psychological reasons, and dietary transgressions are hard to avoid. Therefore, the framework is tolerant of dietary transgressions and adapts the following meals to facilitate users in recovering from such transgressions. Finally, through a simulation involving a real hospital menu, we show that the framework can effectively achieve good results in a realistic scenario. Copyright © 2017 Elsevier Inc. All rights reserved.
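
    A Simple Temporal Problem reduces to shortest paths on a distance graph: a constraint lo <= t_j - t_i <= hi becomes edges w(i, j) = hi and w(j, i) = -lo, and the problem is consistent exactly when the graph has no negative cycle. A minimal consistency check via Floyd-Warshall, with invented meal-timing numbers rather than the paper's knowledge base:

        import numpy as np

        def stp_consistent(n, constraints):
            """Floyd-Warshall consistency check for a Simple Temporal Problem.

            constraints: list of (i, j, lo, hi) meaning lo <= t_j - t_i <= hi.
            """
            d = np.full((n, n), np.inf)
            np.fill_diagonal(d, 0.0)
            for i, j, lo, hi in constraints:
                d[i, j] = min(d[i, j], hi)      # t_j - t_i <= hi
                d[j, i] = min(d[j, i], -lo)     # t_i - t_j <= -lo
            for k in range(n):
                for i in range(n):
                    for j in range(n):
                        d[i, j] = min(d[i, j], d[i, k] + d[k, j])
            return all(d[i, i] >= 0 for i in range(n))   # negative cycle => inconsistent

        # 0 = breakfast, 1 = lunch, 2 = dinner (hours; illustrative only):
        # lunch 4-6 h after breakfast, dinner 6-8 h after lunch, but a
        # "transgression" squeezes breakfast-to-dinner into at most 9 h.
        ok = stp_consistent(3, [(0, 1, 4, 6), (1, 2, 6, 8), (0, 2, 0, 9)])
        print("consistent:", ok)    # False: the 10 h minimum exceeds the 9 h cap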

  19. A performance study of WebDav access to storages within the Belle II collaboration

    NASA Astrophysics Data System (ADS)

    Pardi, S.; Russo, G.

    2017-10-01

    WebDav and HTTP are becoming popular protocols for data access in the High Energy Physics community. The most widely used Grid and Cloud storage solutions provide such interfaces; in this scenario, tuning and performance evaluation become crucial aspects in promoting the adoption of these protocols within the Belle II community. In this work, we present the results of a large-scale test activity, made with the goal of evaluating the performance and reliability of the WebDav protocol and studying its possible adoption for user analysis. More specifically, we considered a pilot infrastructure composed of a set of storage elements configured with the WebDav interface, hosted at the Belle II sites. The performance tests include a comparison with xrootd and gridftp. As reference tests we used a set of analysis jobs running under the Belle II software framework, accessing the input data with the ROOT I/O library, in order to simulate as closely as possible a realistic user activity. The final analysis shows the possibility of achieving promising performance with WebDav on different storage systems, and gives interesting feedback for the Belle II community and for other high energy physics experiments.
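
    Since a WebDAV file read is an ordinary HTTP GET, a throughput micro-benchmark of the kind run in such campaigns can be sketched with the Python requests library; the endpoint URL and credentials below are placeholders, not Belle II infrastructure:

        import time
        import requests  # third-party: pip install requests

        def webdav_read_throughput(url, auth=None, chunk=1 << 20):
            """Stream a remote file (WebDAV reads are plain HTTP GETs) and report MB/s."""
            start = time.perf_counter()
            nbytes = 0
            with requests.get(url, auth=auth, stream=True, timeout=60) as r:
                r.raise_for_status()
                for block in r.iter_content(chunk_size=chunk):
                    nbytes += len(block)
            return nbytes / 1e6 / (time.perf_counter() - start)

        # Placeholder endpoint; a real campaign loops over storage elements and file
        # sizes and repeats each transfer to average out network noise.
        # mbps = webdav_read_throughput("https://se.example.org:2880/belle/data/sample.root",
        #                               auth=("user", "pass"))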

  20. Wavelet-based localization of oscillatory sources from magnetoencephalography data.

    PubMed

    Lina, J M; Chowdhury, R; Lemay, E; Kobayashi, E; Grova, C

    2014-08-01

    Transient brain oscillatory activities recorded with electroencephalography (EEG) or magnetoencephalography (MEG) are characteristic features in physiological and pathological processes. This study is aimed at describing, evaluating, and illustrating with clinical data a new method for localizing the sources of oscillatory cortical activity recorded by MEG. The method combines time-frequency representation and an entropic regularization technique in a common framework, assuming that brain activity is sparse in time and space. Spatial sparsity relies on the assumption that brain activity is organized among cortical parcels. Sparsity in time is achieved by transposing the inverse problem in the wavelet representation, for both data and sources. We propose an estimator of the wavelet coefficients of the sources based on the maximum entropy on the mean (MEM) principle. The full dynamics of the sources is obtained from the inverse wavelet transform, and principal component analysis of the reconstructed time courses is applied to extract oscillatory components. This methodology is evaluated using realistic simulations of single-trial signals, combining fast and sudden discharges (spikes) along with bursts of oscillating activity. The method is finally illustrated with a clinical application using MEG data acquired on a patient with right orbitofrontal epilepsy.
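
    The final extraction step named here, principal component analysis over the reconstructed source time courses, is straightforward to sketch with an SVD. The synthetic data below stand in for the MEM-reconstructed sources; nothing about the MEM estimator itself is implemented:

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(600) / 600.0                     # 1 s of activity sampled at 600 Hz
        burst = np.exp(-(t - 0.5) ** 2 / (2 * 0.05 ** 2)) * np.sin(2 * np.pi * 20 * t)
        # 50 "parcel" time courses sharing a 20 Hz burst, plus noise, as stand-ins
        # for MEM-reconstructed sources
        sources = np.outer(rng.normal(size=50), burst) + 0.3 * rng.normal(size=(50, t.size))

        # PCA via SVD of the centred sources-by-time matrix; the leading right
        # singular vectors are candidate oscillatory components
        X = sources - sources.mean(axis=1, keepdims=True)
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        explained = s ** 2 / np.sum(s ** 2)
        print(f"variance explained by PC1: {100 * explained[0]:.0f}%")
        component = Vt[0]                              # dominant oscillatory time course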

  1. A realist review of mobile phone-based health interventions for non-communicable disease management in sub-Saharan Africa.

    PubMed

    Opoku, Daniel; Stephani, Victor; Quentin, Wilm

    2017-02-06

    The prevalence of non-communicable diseases (NCDs) is increasing in sub-Saharan Africa. At the same time, the use of mobile phones is rising, expanding the opportunities for the implementation of mobile phone-based health (mHealth) interventions. This review aims to understand how, why, for whom, and in what circumstances mHealth interventions against NCDs improve treatment and care in sub-Saharan Africa. Four main databases (PubMed, Cochrane Library, Web of Science, and Google Scholar) and references of included articles were searched for studies reporting effects of mHealth interventions on patients with NCDs in sub-Saharan Africa. All studies published up until May 2015 were included in the review. Following a realist review approach, middle-range theories were identified and integrated into a Framework for Understanding the Contribution of mHealth Interventions to Improved Access to Care for patients with NCDs in sub-Saharan Africa. The main indicators of the framework consist of predisposing characteristics, needs, enabling resources, perceived usefulness, and perceived ease of use. Studies were analyzed in depth to populate the framework. The search identified 6137 titles for screening, of which 20 were retained for the realist synthesis. The contribution of mHealth interventions to improved treatment and care is that they facilitate (remote) access to previously unavailable (specialized) services. Three contextual factors (predisposing characteristics, needs, and enabling resources) influence if patients and providers believe that mHealth interventions are useful and easy to use. Only if they believe mHealth to be useful and easy to use, will mHealth ultimately contribute to improved access to care. The analysis of included studies showed that the most important predisposing characteristics are a positive attitude and a common language of communication. The most relevant needs are a high burden of disease and a lack of capacity of first-contact providers. Essential enabling resources are the availability of a stable communications network, accessible maintenance services, and regulatory policies. Policy makers and program managers should consider predisposing characteristics and needs of patients and providers as well as the necessary enabling resources prior to the introduction of an mHealth intervention. Researchers would benefit from placing greater attention on the context in which mHealth interventions are being implemented instead of focusing (too strongly) on the technical aspects of these interventions.

  2. Intracochlear pressure measurements to study bone conduction transmission: State-of-the-art and proof of concept of the experimental procedure

    NASA Astrophysics Data System (ADS)

    Borgers, Charlotte; van Wieringen, Astrid; D'hondt, Christiane; Verhaert, Nicolas

    2018-05-01

    The cochlea is the main contributor to bone conduction perception. Measurements of differential pressure in the cochlea give a good estimation of the cochlear input provided by bone conduction stimulation. Recent studies have proven the feasibility of intracochlear pressure measurements in chinchillas and in human temporal bones to study bone conduction. However, similar measurements in fresh-frozen whole human cadaveric heads could give a more realistic representation of the five different transmission pathways of bone conduction to the cochlea compared to human temporal bones. The aim of our study is to develop and validate a framework for intracochlear pressure measurements to evaluate different aspects of bone conduction in whole human cadaveric heads. A proof of concept describing our experimental setup is provided together with the procedure. Additionally, we also present a method to fix the stapes footplate in order to simulate otosclerosis in human temporal bones. The effectiveness of this method is verified by some preliminary results.

  3. Carbon footprint analysis as a tool for energy and environmental management in small and medium-sized enterprises

    NASA Astrophysics Data System (ADS)

    Giama, E.; Papadopoulos, A. M.

    2018-01-01

    The reduction of carbon emissions has become a top priority in the decision-making process for governments and companies, the strict European legislation framework being a major driving force behind this effort. On the other hand, many companies face difficulties in estimating their footprint and in linking the results derived from environmental evaluation processes with an integrated energy management strategy, which will eventually lead to energy-efficient and cost-effective solutions. The paper highlights the need for companies to establish integrated environmental management practices, with tools such as carbon footprint analysis to monitor the energy performance of production processes. Concepts and methods are analysed, and selected indicators are presented by means of benchmarking, monitoring and reporting the results so that they can be used effectively by the companies. The study is based on data from more than 90 Greek small and medium enterprises, followed by a comprehensive discussion of cost-effective and realistic energy-saving measures.

  4. Mapping remodeling of thalamocortical projections in the living reeler mouse brain by diffusion tractography

    PubMed Central

    Harsan, Laura-Adela; Dávid, Csaba; Reisert, Marco; Schnell, Susanne; Hennig, Jürgen; von Elverfeldt, Dominik; Staiger, Jochen F.

    2013-01-01

    A major challenge in neuroscience is to accurately decipher in vivo the entire brain circuitry (connectome) at a microscopic level. Currently, the only methodology providing a global noninvasive window into structural brain connectivity is diffusion tractography. The extent to which the reconstructed pathways reflect realistic neuronal networks depends, however, on data acquisition and postprocessing factors. Through a unique combination of approaches, we designed and evaluated herein a framework for reliable fiber tracking and mapping of the living mouse brain connectome. One important wiring scheme, connecting gray matter regions and passing fiber-crossing areas, was closely examined: the lemniscal thalamocortical (TC) pathway. We quantitatively validated the TC projections inferred from in vivo tractography with correlative histological axonal tracing in the same wild-type and reeler mutant mice. We demonstrated noninvasively that changes in patterning of the cortical sheet, such as highly disorganized cortical lamination in reeler, led to spectacular compensatory remodeling of the TC pathway. PMID:23610438

  5. Real-time simulation of soft tissue deformation and electrocautery procedures in laparoscopic rectal cancer radical surgery.

    PubMed

    Sui, Yuan; Pan, Jun J; Qin, Hong; Liu, Hao; Lu, Yun

    2017-12-01

    Laparoscopic surgery (LS), also referred to as minimally invasive surgery, is a modern surgical technique which is widely applied. The fulcrum effect makes LS a non-intuitive motor skill with a steep learning curve. A hybrid model of tetrahedrons and a multi-layer triangular mesh is constructed to simulate the deformable behavior of the rectum and surrounding tissues in the Position-Based Dynamics (PBD) framework. A heat-conduction-based electric-burn technique is employed to simulate the electrocautery procedure. The simulator has been applied to training for laparoscopic rectal cancer surgery. The experimental results show that trainees can operate in real time with high degrees of stability and fidelity. A preliminary study was performed to evaluate the realism and usefulness. This prototype simulator has been tested and verified by colorectal surgeons through a pilot study. They believed both the visual and the haptic performance of the simulation are realistic and helpful for enhancing laparoscopic skills. Copyright © 2017 John Wiley & Sons, Ltd.
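
    The core of a PBD solver is constraint projection: predicted positions are corrected directly so that constraints hold, then velocities are recomputed from the position change. A minimal sketch of the distance-constraint case with equal particle masses; the real simulator's tetrahedral volume constraints and collision handling are omitted:

        import numpy as np

        def pbd_step(p, v, edges, rest, dt=1e-2, iters=10, stiffness=1.0):
            """One PBD step: predict positions, project constraints, update velocities."""
            x = p + v * dt                        # prediction (no external forces here)
            for _ in range(iters):                # Gauss-Seidel over the constraints
                for (i, j), d0 in zip(edges, rest):
                    delta = x[j] - x[i]
                    dist = np.linalg.norm(delta)
                    if dist < 1e-12:
                        continue
                    corr = stiffness * (dist - d0) / dist * delta
                    x[i] += 0.5 * corr            # equal masses: split the correction
                    x[j] -= 0.5 * corr
            v = (x - p) / dt                      # PBD velocity update
            return x, v

        # An edge stretched 50% beyond rest length snaps back toward it in one step
        p = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
        v = np.zeros_like(p)
        p, v = pbd_step(p, v, edges=[(0, 1)], rest=[1.0])
        print("edge length after projection:", np.linalg.norm(p[1] - p[0]))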

  6. Multiscale Modeling of UHTC: Thermal Conductivity

    NASA Technical Reports Server (NTRS)

    Lawson, John W.; Murry, Daw; Squire, Thomas; Bauschlicher, Charles W.

    2012-01-01

    We are developing a multiscale framework in computational modeling for the ultra high temperature ceramics (UHTC) ZrB2 and HfB2. These materials are characterized by high melting point, good strength, and reasonable oxidation resistance. They are candidate materials for a number of applications in extreme environments including sharp leading edges of hypersonic aircraft. In particular, we used a combination of ab initio methods, atomistic simulations and continuum computations to obtain insights into fundamental properties of these materials. Ab initio methods were used to compute basic structural, mechanical and thermal properties. From these results, a database was constructed to fit a Tersoff style interatomic potential suitable for atomistic simulations. These potentials were used to evaluate the lattice thermal conductivity of single crystals and the thermal resistance of simple grain boundaries. Finite element method (FEM) computations using atomistic results as inputs were performed with meshes constructed on SEM images thereby modeling the realistic microstructure. These continuum computations showed the reduction in thermal conductivity due to the grain boundary network.

  7. Turbine blade-tip clearance excitation forces

    NASA Technical Reports Server (NTRS)

    Martinez-Sanchez, M.; Greitzer, E. M.

    1985-01-01

    The results of an effort to assess the existing knowledge and plan the required experimentation in the area of turbine blade-tip excitation forces are summarized. The work was carried out in three phases. The first was a literature search and evaluation, which served to highlight the state of the art and to expose the need for an articulated theoretical-experimental effort to provide not only design data, but also a rational framework for their extrapolation to new configurations and regimes. The second phase was a start in this direction, in which several of the explicit or implicit assumptions contained in the usual formulations of the Alford force effect were removed and a rigorous linearized flow analysis of the behavior of a nonsymmetric actuator disc was carried out. In the third phase, a preliminary design was conducted for a turbine test facility that would be used to measure both the excitation forces themselves and the flow patterns responsible for them, over a realistic range of dimensionless parameters.

  8. Teaching clinical reasoning and decision-making skills to nursing students: Design, development, and usability evaluation of a serious game.

    PubMed

    Johnsen, Hege Mari; Fossum, Mariann; Vivekananda-Schmidt, Pirashanthie; Fruhling, Ann; Slettebø, Åshild

    2016-10-01

    Serious games (SGs) are a type of simulation technology that may provide nursing students with the opportunity to practice their clinical reasoning and decision-making skills in a safe and authentic environment. Despite the growing number of SGs developed for healthcare professionals, few SGs are video based or address the domain of home health care. This paper aims to describe the design, development, and usability evaluation of a video based SG for teaching clinical reasoning and decision-making skills to nursing students who care for patients with chronic obstructive pulmonary disease (COPD) in home healthcare settings. A prototype SG was developed. A unified framework of usability called TURF (Task, User, Representation, and Function) and SG theory were employed to ensure a user-centered design. The educational content was based on the clinical decision-making model, Bloom's taxonomy, and a Bachelor of Nursing curriculum. A purposeful sample of six participants evaluated the SG prototype in a usability laboratory. Cognitive walkthrough evaluations, a questionnaire, and individual interviews were used for the usability evaluation. The data were analyzed using qualitative deductive content analysis based on the TURF framework elements and related usability heuristics. The SG was perceived as being realistic, clinically relevant, and at an adequate level of complexity for the intended users. Usability issues regarding functionality and the user-computer interface design were identified. However, the SG was perceived as being easy to learn, and participants suggested that the SG could serve as a supplement to traditional training in laboratory and clinical settings. Using video based scenarios with an authentic COPD patient and a home healthcare registered nurse as actors contributed to increased realism. Using different theoretical approaches in the SG design was considered an advantage of the design process. The SG was perceived as being useful, usable, and satisfying. The achievement of the desired functionality and the minimization of user-computer interface issues emphasize the importance of conducting a usability evaluation during the SG development process. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  9. Job and Work Evaluation: A Literature Review.

    ERIC Educational Resources Information Center

    Heneman, Robert L.

    2003-01-01

    Describes advantages and disadvantages of work evaluation methods: ranking, market pricing, banding, classification, single-factor, competency, point-factor, and factor comparison. Compares work evaluation perspectives: traditional, realist, market advocate, strategist, organizational development, social reality, contingency theory, competency,…

  10. Conformal completion of the standard model with a fourth generation

    NASA Astrophysics Data System (ADS)

    Ho, Chiu Man; Hung, Pham Q.; Kephart, Thomas W.

    2012-06-01

    We study dynamical electroweak symmetry breaking with a fourth generation within the Z_n orbifolded AdS_5 ⊗ S^5 framework. A realistic Z_7 example is discussed. The initial theory reduces dynamically, due to the induced condensates, to a four-family trinification near a TeV-scale conformal fixed point where the gauge hierarchy problem does not exist. We predict new gauge bosons and bifundamental fermions and scalars accessible by the LHC.

  11. Palliating patients who have unresectable colorectal cancer: creating the right framework and salient symptom management.

    PubMed

    Dunn, Geoffrey P

    2006-08-01

    The last phases of colorectal malignant illness may be the most challenging and saddening for all involved, but they offer opportunities to become the most rewarding. This transformation of hopelessness to fulfillment requires a willingness by surgeon, patient, and patient's family to trust one another to realistically set goals of care, stick together, and not let the treatment of the disease become a surrogate for treating the suffering that characterizes grave illness.

  12. Interannual to Decadal Variability of Ocean Evaporation as Viewed from Climate Reanalyses

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Bosilovich, Michael G.; Roberts, Jason B.; Wang, Hailan

    2015-01-01

    Questions we'll address: Given the uncoupled framework of "AMIP" (Atmospheric Model Intercomparison Project) experiments, what can they tell us regarding evaporation variability? Do Reduced Observations Reanalyses (RedObs), which assimilate only surface (SFC) pressure (and wind), provide a more realistic picture of evaporation variability? What signals of interannual variability (e.g. El Nino/Southern Oscillation (ENSO)) and decadal variability (Interdecadal Pacific Oscillation (IPO)) are detectable with this hierarchy of evaporation estimates?

  13. Sparse coding for flexible, robust 3D facial-expression synthesis.

    PubMed

    Lin, Yuxu; Song, Mingli; Quynh, Dao Thi Phuong; He, Ying; Chen, Chun

    2012-01-01

    Computer animation researchers have been extensively investigating 3D facial-expression synthesis for decades. However, flexible, robust production of realistic 3D facial expressions is still technically challenging. A proposed modeling framework applies sparse coding to synthesize 3D expressive faces, using specified coefficients or expression examples. It also robustly recovers facial expressions from noisy and incomplete data. This approach can synthesize higher-quality expressions in less time than the state-of-the-art techniques.
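
    Sparse coding of an expression over a dictionary of example faces amounts to solving min_w 0.5*||x - Dw||^2 + lambda*||w||_1 and then synthesising D @ w. A minimal ISTA solver in Python with random stand-in data; this is not face-mesh data and not necessarily the paper's exact formulation:

        import numpy as np

        def ista(D, x, lam=0.05, steps=500):
            """Iterative soft-thresholding for min_w 0.5*||x - D w||^2 + lam*||w||_1."""
            L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
            w = np.zeros(D.shape[1])
            for _ in range(steps):
                z = w - D.T @ (D @ w - x) / L          # gradient step
                w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrink
            return w

        rng = np.random.default_rng(1)
        D = rng.normal(size=(60, 200))                 # dictionary (e.g. example expressions)
        w_true = np.zeros(200)
        w_true[[3, 40, 99]] = [1.0, -0.5, 0.8]
        x = D @ w_true + 0.01 * rng.normal(size=60)    # observed "expression"
        w = ista(D, x)
        print("support recovered:", np.flatnonzero(np.abs(w) > 0.05))
        # a new face would then be synthesised as D @ w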

  14. International cooperation in the Space Station programme - Assessing the experience to date

    NASA Technical Reports Server (NTRS)

    Logsdon, John M.

    1991-01-01

    The origins and framework for cooperation in the Space Station program are outlined. Particular attention is paid to issues and commitments between the countries and to the political context of the Station partnership. A number of conclusions concerning international cooperation in space are drawn based on the Space Station experience. Among these conclusions is the assertion that an international partnership requires realistic assessments, mutual trust, and strong commitments in order to work.

  15. Why Is the Moon Synchronously Rotating?

    DTIC Science & Technology

    2013-06-19

    and a retrograde initial rotation. Key words: Moon – planets and satellites: dynamical evolution and stability. 1 INTRODUCTION The origin of...tides, which should not be used for planets and moons of terrestrial composition (Efroimsky & Makarov 2013). In recent years, a more realistic model... (Efroimsky & Williams 2009; Efroimsky 2012). In the framework of this model, the capture of Mercury into the current 3:2 spin–orbit resonance becomes a

  16. Designing Intelligent Computer Aided Instruction Systems with Integrated Knowledge Representation Schemes

    DTIC Science & Technology

    1990-06-01

    the form of structured objects was first pioneered by Marvin Minsky. In his seminal article "A Framework for Representing Knowledge" he introduced... Minsky felt that the existing methods of knowledge representation were too finely grained and he proposed that knowledge is more than just a...not work" in realistic, complex domains. (Minsky, 1981, pp. 95-128) According to Minsky, "A frame is a data-structure for representing a stereotyped

  17. Final Technical Report summarizing Purdue research activities as part of the DOE JET Topical Collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molnar, Denes

    2015-09-01

    This report summarizes research activities at Purdue University done as part of the DOE JET Topical Collaboration. These mainly involve calculation of covariant radiative energy loss in the (Djordjevic-)Gyulassy-Levai-Vitev ((D)GLV) framework for relativistic A+A reactions at RHIC and LHC energies using realistic bulk medium evolution with both transverse and longitudinal expansion. The single PDF file provided also includes a report from the entire JET Collaboration.

  18. A realistic 3+1D Viscous Hydro Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romatschke, Paul

    2015-05-31

    DoE funds were used as bridge funds for the faculty position for the PI at the University of Colorado. The total funds for Years 3-5 of the JET Topical Collaboration amounted to about 50 percent of the academic-year salary of the PI. The PI contributed to the JET Topical Collaboration by developing, testing and applying algorithms for a realistic simulation of the bulk medium created in relativistic ion collisions. Specifically, two approaches were studied, one based on a new Lattice-Boltzmann (LB) framework, and one on a more traditional viscous hydrodynamics framework. Both approaches were found to be viable in principle, with the LB approach being more elegant but needing still more time to develop. The traditional approach led to the super-hybrid model of ion collisions dubbed 'superSONIC', and has been successfully used for phenomenology of relativistic heavy-ion and light-on-heavy-ion collisions. In the time-frame of the JET Topical Collaboration, the Colorado group has published 15 articles in peer-reviewed journals, three of which were published in Physical Review Letters. The group graduated one Master's student during this time-frame and two more PhD students are expected to graduate in the next few years. The PI has given more than 28 talks and presentations during this period.

  19. Real-time simulation of combined short-wave and long-wave infrared vision on a head-up display

    NASA Astrophysics Data System (ADS)

    Peinecke, Niklas; Schmerwitz, Sven

    2014-05-01

    Landing under adverse weather conditions can be challenging, even if the airfields are well known to the pilots. This is true for civil as well as military aviation. Within the scope of this paper we concentrate especially on fog conditions. The work has been conducted within the project ALICIA. ALICIA is a research and development project co-funded by the European Commission under the Seventh Framework Programme. ALICIA aims at developing new and scalable cockpit applications which can extend operations of aircraft in degraded conditions: All Conditions Operations. One of the systems developed is a head-up display that can display generated symbology together with a raster-mode infrared image. We will detail how we implemented a real-time-enabled simulation of a combined short-wave and long-wave infrared image for landing. A major challenge was to integrate several already existing simulation solutions, e.g., for visual simulation and sensors, with the required databases. For the simulations, DLR's in-house sensor simulation framework F3S was used, together with a commercially available airport model that had to be heavily modified in order to provide realistic infrared data. Special effort was invested in achieving a realistic impression of runway lighting under foggy conditions. We will present results and sketch further improvements for future simulations.

  20. Predicting electromyographic signals under realistic conditions using a multiscale chemo-electro-mechanical finite element model.

    PubMed

    Mordhorst, Mylena; Heidlauf, Thomas; Röhrle, Oliver

    2015-04-06

    This paper presents a novel multiscale finite element-based framework for modelling electromyographic (EMG) signals. The framework combines (i) a biophysical description of the excitation-contraction coupling at the half-sarcomere level, (ii) a model of the action potential (AP) propagation along muscle fibres, (iii) a continuum-mechanical formulation of force generation and deformation of the muscle, and (iv) a model for predicting the intramuscular and surface EMG. Owing to the biophysical description of the half-sarcomere, the model inherently accounts for physiological properties of skeletal muscle. To demonstrate this, the influence of membrane fatigue on the EMG signal during sustained contractions is investigated. During a stimulation period of 500 ms at 100 Hz, the predicted EMG amplitude decreases by 40% and the AP propagation velocity decreases by 15%. Further, the model can take into account contraction-induced deformations of the muscle. This is demonstrated by simulating fixed-length contractions of an idealized geometry and a model of the human tibialis anterior muscle (TA). The model of the TA furthermore demonstrates that the proposed finite element model is capable of simulating realistic geometries, complex fibre architectures, and can include different types of heterogeneities. In addition, the TA model accounts for a distributed innervation zone, different fibre types and appeals to motor unit discharge times that are based on a biophysical description of the α motor neurons.

  1. Predicting electromyographic signals under realistic conditions using a multiscale chemo–electro–mechanical finite element model

    PubMed Central

    Mordhorst, Mylena; Heidlauf, Thomas; Röhrle, Oliver

    2015-01-01

    This paper presents a novel multiscale finite element-based framework for modelling electromyographic (EMG) signals. The framework combines (i) a biophysical description of the excitation–contraction coupling at the half-sarcomere level, (ii) a model of the action potential (AP) propagation along muscle fibres, (iii) a continuum-mechanical formulation of force generation and deformation of the muscle, and (iv) a model for predicting the intramuscular and surface EMG. Owing to the biophysical description of the half-sarcomere, the model inherently accounts for physiological properties of skeletal muscle. To demonstrate this, the influence of membrane fatigue on the EMG signal during sustained contractions is investigated. During a stimulation period of 500 ms at 100 Hz, the predicted EMG amplitude decreases by 40% and the AP propagation velocity decreases by 15%. Further, the model can take into account contraction-induced deformations of the muscle. This is demonstrated by simulating fixed-length contractions of an idealized geometry and a model of the human tibialis anterior muscle (TA). The model of the TA furthermore demonstrates that the proposed finite element model is capable of simulating realistic geometries, complex fibre architectures, and can include different types of heterogeneities. In addition, the TA model accounts for a distributed innervation zone, different fibre types and appeals to motor unit discharge times that are based on a biophysical description of the α motor neurons. PMID:25844148

  2. Positivism and Realism in the Writings of Moritz Schlick

    NASA Astrophysics Data System (ADS)

    Lewis, Joia A.

    1990-01-01

    Moritz Schlick, the founder of the Vienna Circle, is best known for his logical positivist writings of the late 1920s and early 1930s. He is traditionally seen as having dropped his earlier realist views for an anti-realist positivism. This picture obscures both the complexity and profundity of Schlick's own philosophical development and important issues in understanding positivist and realist assumptions in current debates among philosophers of science. This dissertation seeks to contribute to the contemporary debates through an analysis of Schlick's early as well as his later work, and through an examination and evaluation of the principles and motives upon which Schlick based his claims.

  3. An agent-based hydroeconomic model to evaluate water policies in Jordan

    NASA Astrophysics Data System (ADS)

    Yoon, J.; Gorelick, S.

    2014-12-01

    Modern water systems can be characterized by a complex network of institutional and private actors that represent competing sectors and interests. Identifying solutions to enhance water security in such systems calls for analysis that can adequately account for this level of complexity and interaction. Our work focuses on the development of a hierarchical, multi-agent, hydroeconomic model that attempts to realistically represent complex interactions between hydrologic and multi-faceted human systems. The model is applied to Jordan, one of the most water-poor countries in the world. In recent years, the water crisis in Jordan has escalated due to an ongoing drought and influx of refugees from regional conflicts. We adopt a modular approach in which biophysical modules simulate natural and engineering phenomena, and human modules represent behavior at multiple scales of decision making. The human modules employ agent-based modeling, in which agents act as autonomous decision makers at the transboundary, state, organizational, and user levels. A systematic nomenclature and conceptual framework is used to characterize model agents and modules. Concepts from the Unified Modeling Language (UML) are adopted to promote clear conceptualization of model classes and process sequencing, establishing a foundation for full deployment of the integrated model in a scalable object-oriented programming environment. Although the framework is applied to the Jordanian water context, it is generalizable to other regional human-natural freshwater supply systems.
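
    The hierarchical multi-agent structure can be sketched as agent objects whose step() methods are invoked top-down each simulated period. The classes, rationing rule, and numbers below are illustrative inventions, not the Jordan model's:

        class HouseholdAgent:
            """User-level agent: requests water and records what it receives."""
            def __init__(self, demand):
                self.demand = demand          # m3 requested this period
                self.received = 0.0

            def step(self, allocation):
                self.received = min(self.demand, allocation)
                return self.received

        class UtilityAgent:
            """Organisational-level agent: rations the available supply."""
            def __init__(self, households):
                self.households = households

            def step(self, supply):
                share = supply / len(self.households)     # naive equal-rationing rule
                return sum(h.step(share) for h in self.households)

        # One simulated period under drought-reduced supply
        homes = [HouseholdAgent(d) for d in (10.0, 25.0, 40.0)]
        delivered = UtilityAgent(homes).step(supply=45.0)
        print(f"delivered {delivered:.0f} of {sum(h.demand for h in homes):.0f} m3 demanded")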

  4. Multicriteria Decision Framework for Cybersecurity Risk Assessment and Management.

    PubMed

    Ganin, Alexander A; Quach, Phuoc; Panwar, Mahesh; Collier, Zachary A; Keisler, Jeffrey M; Marchese, Dayton; Linkov, Igor

    2017-09-05

    Risk assessors and managers face many difficult challenges related to novel cyber systems. Among these challenges are the constantly changing nature of cyber systems caused by technical advances, their distribution across the physical, information, and sociocognitive domains, and the complex network structures often including thousands of nodes. Here, we review probabilistic and risk-based decision-making techniques applied to cyber systems and conclude that existing approaches typically do not address all components of the risk assessment triplet (threat, vulnerability, consequence) and lack the ability to integrate across multiple domains of cyber systems to provide guidance for enhancing cybersecurity. We present a decision-analysis-based approach that quantifies threat, vulnerability, and consequences through a set of criteria designed to assess the overall utility of cybersecurity management alternatives. The proposed framework bridges the gap between risk assessment and risk management, allowing an analyst to ensure a structured and transparent process of selecting risk management alternatives. The use of this technique is illustrated for a hypothetical, but realistic, case study exemplifying the process of evaluating and ranking five cybersecurity enhancement strategies. The approach presented does not necessarily eliminate biases and subjectivity necessary for selecting countermeasures, but provides justifiable methods for selecting risk management actions consistent with stakeholder and decision-maker values and technical data. Published 2017. This article is a U.S. Government work and is in the public domain in the U.S.A.
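
    The scoring idea, criteria-weighted utilities aggregated per management alternative, can be shown with a weighted sum. Criteria, weights, and scores below are invented for illustration; the paper's actual criteria set and value functions may differ:

        import numpy as np

        criteria = ["threat reduction", "vulnerability reduction",
                    "consequence mitigation", "cost (inverted)"]
        weights = np.array([0.30, 0.30, 0.25, 0.15])    # stakeholder-elicited, sum to 1

        alternatives = ["MFA rollout", "network segmentation", "staff training",
                        "backup hardening", "threat intel feed"]
        # Rows: alternatives; columns: criteria, each scored on a common 0-1 scale
        scores = np.array([
            [0.6, 0.8, 0.3, 0.7],
            [0.5, 0.9, 0.6, 0.4],
            [0.7, 0.5, 0.4, 0.8],
            [0.2, 0.3, 0.9, 0.6],
            [0.8, 0.4, 0.2, 0.5],
        ])

        utility = scores @ weights                      # overall utility per alternative
        for name, u in sorted(zip(alternatives, utility), key=lambda p: -p[1]):
            print(f"{u:.3f}  {name}")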

  5. Human Behavior Analysis by Means of Multimodal Context Mining

    PubMed Central

    Banos, Oresti; Villalonga, Claudia; Bang, Jaehun; Hur, Taeho; Kang, Donguk; Park, Sangbeom; Huynh-The, Thien; Le-Ba, Vui; Amin, Muhammad Bilal; Razzaq, Muhammad Asif; Khan, Wahajat Ali; Hong, Choong Seon; Lee, Sungyoung

    2016-01-01

    There is sufficient evidence proving the impact that negative lifestyle choices have on people’s health and wellness. Changing unhealthy behaviours requires raising people’s self-awareness and also providing healthcare experts with a thorough and continuous description of the user’s conduct. Several monitoring techniques have been proposed in the past to track users’ behaviour; however, these approaches are either subjective and prone to misreporting, such as questionnaires, or only focus on a specific component of context, such as activity counters. This work presents an innovative multimodal context mining framework to inspect and infer human behaviour in a more holistic fashion. The proposed approach extends beyond the state-of-the-art, since it not only explores a sole type of context, but also combines diverse levels of context in an integral manner. Namely, low-level contexts, including activities, emotions and locations, are identified from heterogeneous sensory data through machine learning techniques. Low-level contexts are combined using ontological mechanisms to derive a more abstract representation of the user’s context, here referred to as high-level context. An initial implementation of the proposed framework supporting real-time context identification is also presented. The developed system is evaluated for various realistic scenarios making use of a novel multimodal context open dataset and data on-the-go, demonstrating prominent context-aware capabilities at both low and high levels. PMID:27517928

  6. Human Behavior Analysis by Means of Multimodal Context Mining.

    PubMed

    Banos, Oresti; Villalonga, Claudia; Bang, Jaehun; Hur, Taeho; Kang, Donguk; Park, Sangbeom; Huynh-The, Thien; Le-Ba, Vui; Amin, Muhammad Bilal; Razzaq, Muhammad Asif; Khan, Wahajat Ali; Hong, Choong Seon; Lee, Sungyoung

    2016-08-10

    There is sufficient evidence proving the impact that negative lifestyle choices have on people's health and wellness. Changing unhealthy behaviours requires raising people's self-awareness and also providing healthcare experts with a thorough and continuous description of the user's conduct. Several monitoring techniques have been proposed in the past to track users' behaviour; however, these approaches are either subjective and prone to misreporting, such as questionnaires, or only focus on a specific component of context, such as activity counters. This work presents an innovative multimodal context mining framework to inspect and infer human behaviour in a more holistic fashion. The proposed approach extends beyond the state-of-the-art, since it not only explores a sole type of context, but also combines diverse levels of context in an integral manner. Namely, low-level contexts, including activities, emotions and locations, are identified from heterogeneous sensory data through machine learning techniques. Low-level contexts are combined using ontological mechanisms to derive a more abstract representation of the user's context, here referred to as high-level context. An initial implementation of the proposed framework supporting real-time context identification is also presented. The developed system is evaluated for various realistic scenarios making use of a novel multimodal context open dataset and data on-the-go, demonstrating prominent context-aware capabilities at both low and high levels.

  7. Examining the feasibility of mixture risk assessment: A case study using a tiered approach with data of 67 pesticides from the Joint FAO/WHO Meeting on Pesticide Residues (JMPR).

    PubMed

    Evans, Richard M; Scholze, Martin; Kortenkamp, Andreas

    2015-10-01

    The way in which mixture risk assessment (MRA) should be included in chemical risk assessment is a current topic of debate. We used data from 67 recent pesticide evaluations to build a case study using Hazard Index calculations to form risk estimates in a tiered MRA approach in line with a Framework proposed by WHO/IPCS. The case study is used to illustrate the approach and to add detail to the existing Framework, and includes many more chemicals than previous case studies. A low-tier MRA identified risk as being greater than acceptable, but refining risk estimates in higher tiers was not possible due to data requirements not being readily met. Our analysis identifies data requirements, which typically expand dramatically in higher tiers, as being the likely cause for an MRA to fail in many realistic cases. This forms a major obstacle to routine implementation of MRA and shows the need for systematic generation and collection of toxicological data. In low tiers, hazard quotient inspection identifies chemicals that contribute most to the HI value and thus require attention if further refinement is needed. Implementing MRA requires consensus on issues such as scope setting, criteria for performing refinement, and decision criteria for actions. Copyright © 2015 Elsevier Ltd. All rights reserved.
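
    The Hazard Index used in the low tier is the sum of hazard quotients, each the ratio of an exposure estimate to its acceptable level; HI > 1 flags greater-than-acceptable mixture risk, and inspecting the individual quotients shows which chemicals drive it. A worked sketch with invented numbers:

        def hazard_index(exposures, limits):
            """HI = sum of hazard quotients HQ_i = exposure_i / limit_i (matching units)."""
            hqs = {name: exposures[name] / limits[name] for name in exposures}
            return sum(hqs.values()), hqs

        # Hypothetical intakes vs acceptable daily intakes, mg/kg bw/day
        exposures = {"pesticide A": 0.004, "pesticide B": 0.010, "pesticide C": 0.002}
        adis      = {"pesticide A": 0.010, "pesticide B": 0.020, "pesticide C": 0.003}

        hi, hqs = hazard_index(exposures, adis)
        print(f"HI = {hi:.2f}")                 # ~1.57 > 1: refinement in a higher tier
        print("largest contributor:", max(hqs, key=hqs.get))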

  8. Predicting residential exposure to phthalate plasticizer emitted from vinyl flooring: a mechanistic analysis.

    PubMed

    Xu, Ying; Hubal, Elaine A Cohen; Clausen, Per A; Little, John C

    2009-04-01

    A two-room model is developed to estimate the emission rate of di-2-ethylhexyl phthalate (DEHP) from vinyl flooring and the evolving gas-phase and adsorbed surface concentrations in a realistic indoor environment. Because the DEHP emission rate measured in a test chamber may be quite different from the emission rate from the same material in the indoor environment, the model provides a convenient means to predict emissions and transport in a more realistic setting. Adsorption isotherms for phthalates and plasticizers on interior surfaces, such as carpet, wood, dust, and human skin, are derived from previous field and laboratory studies. Log-linear relationships between equilibrium parameters and chemical vapor pressure are obtained. The predicted indoor air DEHP concentration at steady state is 0.15 microg/m3. Room 1 reaches steady state within about one year, while the adjacent room reaches steady state about three months later. Ventilation rate has a strong influence on DEHP emission rate, while total suspended particle concentration has a substantial impact on gas-phase concentration. Exposure to DEHP via inhalation, dermal absorption, and oral ingestion of dust is evaluated. The model clarifies the mechanisms that govern the release of DEHP from vinyl flooring and the subsequent interactions with interior surfaces, airborne particles, dust, and human skin. Although further model development, parameter identification, and model validation are needed, our preliminary model provides a mechanistic framework that elucidates exposure pathways for phthalate plasticizers, and can most likely be adapted to predict emissions and transport of other semivolatile organic compounds, such as brominated flame retardants and biocides, in a residential environment.
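
    The skeleton of such a model is a pair of coupled well-mixed mass balances. A deliberately simplified sketch with forward-Euler integration and invented parameters; the study's actual model adds the boundary-layer-limited source, particle partitioning, and surface sorption (which is what stretches the approach to steady state over months), all omitted here:

        # Invented parameters: emission only into room 1; both rooms ventilated and mixed
        V1, V2 = 30.0, 30.0     # room volumes, m3
        Q = 15.0                # ventilation flow per room, m3/h
        Q12 = 50.0              # inter-room air exchange, m3/h
        E = 20.0                # constant gas-phase emission into room 1, ug/h

        def step(c1, c2, dt=0.01):
            """Forward Euler on the coupled well-mixed mass balances (ug/m3, hours)."""
            dc1 = (E + Q12 * (c2 - c1) - Q * c1) / V1
            dc2 = (Q12 * (c1 - c2) - Q * c2) / V2
            return c1 + dc1 * dt, c2 + dc2 * dt

        c1 = c2 = 0.0
        for _ in range(100_000):            # 1000 h; this stripped-down system
            c1, c2 = step(c1, c2)           # equilibrates in hours, whereas surface
                                            # sorption delays the real one for months
        print(f"steady state: room 1 = {c1:.2f}, room 2 = {c2:.2f} ug/m3")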

  9. Could Expanded Freight Rail Reduce Air Pollution from Trucks?

    NASA Astrophysics Data System (ADS)

    Bickford, E. E.; Holloway, T.; Johnston, M.

    2010-12-01

    Cars, trucks and trains are a significant source of emissions that impact both climate and air quality on regional to global scales. Diesel vehicles, most used for freight transport, account for 42% of on-road nitrogen oxide emissions, 58% of on-road fine particulate emissions, and 21% of on-road carbon dioxide emissions. With freight tonnage projected to increase 28% by 2018, and freight trucks the fastest growing source of transportation emissions, we evaluate the potential for increased rail capacity to reduce the environmental impacts of trucks. Most widely available mobile source emissions inventories contain insufficient spatial detail to quantify realistic emission scenario options, and none to date have been linked with commodity flow information in a manner appropriate to consider the true potential of rail substitution. To support a truck-to-rail analysis, and other policy assessments requiring roadway-by-roadway analysis, we have developed a freight emissions inventory for the Upper Midwest based on the Federal Highway Administration’s Freight Analysis Framework version 2.2 and the Environmental Protection Agency’s on-road emissions model, Mobile6.2. Using a Geographical Information System (GIS), we developed emissions scenarios for truck-to-rail modal shifts where 95% of freight tonnage on trips longer than 400 miles is shifted off of trucks and onto railways. Scenarios will be analyzed with the Community Multiscale Air Quality (CMAQ) regional model to assess air quality impacts of associated changes. By using well-respected transportation data and realistic assumptions, results from this study have the potential to inform decisions on transportation sustainability, carbon management, public health, and air quality.

  10. Providing oxygen to children in hospitals: a realist review

    PubMed Central

    Tosif, Shidan; Gray, Amy; Qazi, Shamim; Campbell, Harry; Peel, David; McPake, Barbara; Duke, Trevor

    2017-01-01

    Objective: To identify and describe interventions to improve oxygen therapy in hospitals in low-resource settings, and to determine the factors that contribute to success and failure in different contexts. Methods: Using realist review methods, we scanned the literature and contacted experts in the field to identify possible mechanistic theories of how interventions to improve oxygen therapy systems might work. Then we systematically searched online databases for evaluations of improved oxygen systems in hospitals in low- or middle-income countries. We extracted data on the effectiveness, processes and underlying theory of selected projects, and used these data to test the candidate theories and identify the features of successful projects. Findings: We included 20 improved oxygen therapy projects (45 papers) from 15 countries. These used various approaches to improving oxygen therapy, and reported clinical, quality of care and technical outcomes. Four effectiveness studies demonstrated positive clinical outcomes for childhood pneumonia, with large variation between programmes and hospitals. We identified factors that help or hinder success, and proposed a practical framework depicting the key requirements for hospitals to effectively provide oxygen therapy to children. To improve clinical outcomes, oxygen improvement programmes must achieve good access to oxygen and good use of oxygen, facilitated by broad quality-improvement capacity, strong managerial and policy support, and multidisciplinary teamwork. Conclusion: Our findings can inform practitioners and policy-makers about how to improve oxygen therapy in low-resource settings, and may be relevant for other interventions involving the introduction of health technologies. PMID:28479624

  11. On coarse projective integration for atomic deposition in amorphous systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chuang, Claire Y.; Sinno, Talid; Han, Sang M.

    2015-10-07

    Direct molecular dynamics simulation of atomic deposition under realistic conditions is notoriously challenging because of the wide range of time scales that must be captured. Numerous simulation approaches have been proposed to address the problem, often requiring a compromise between model fidelity, algorithmic complexity, and computational efficiency. Coarse projective integration, an example application of the “equation-free” framework, offers an attractive balance between these constraints. Here, periodically applied, short atomistic simulations are employed to compute time derivatives of slowly evolving coarse variables that are then used to numerically integrate differential equations over relatively large time intervals. A key obstacle to the application of this technique in realistic settings is the “lifting” operation in which a valid atomistic configuration is recreated from knowledge of the coarse variables. Using Ge deposition on amorphous SiO2 substrates as an example application, we present a scheme for lifting realistic atomistic configurations comprised of collections of Ge islands on amorphous SiO2 using only a few measures of the island size distribution. The approach is shown to provide accurate initial configurations to restart molecular dynamics simulations at arbitrary points in time, enabling the application of coarse projective integration for this morphologically complex system.
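
    The projective step itself is simple: run a short burst of the fine-scale model, estimate the coarse time derivative from the burst, and leap forward with it. A schematic in Python with a toy stochastic relaxation standing in for molecular dynamics; lifting and restriction are trivial here, which is exactly the part that is hard in the deposition problem above:

        import numpy as np

        def fine_burst(y, n_steps, dt, rate=0.5):
            """Stand-in for a short MD run: noisy relaxation of a coarse observable y."""
            for _ in range(n_steps):
                y += -rate * y * dt + 0.001 * np.random.randn()
            return y

        def coarse_projective_integration(y0, t_end, burst_steps=20, dt=0.01, leap=0.5):
            """Alternate short fine-scale bursts with large projective (Euler) leaps."""
            t, y = 0.0, y0
            while t < t_end:
                y_start = y
                y = fine_burst(y, burst_steps, dt)         # burst of fine dynamics
                dydt = (y - y_start) / (burst_steps * dt)  # estimated coarse derivative
                y += leap * dydt                           # leap over 'leap' time units
                t += burst_steps * dt + leap
            return y

        print(coarse_projective_integration(y0=1.0, t_end=10.0))  # decays toward 0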

  12. On Coarse Projective Integration for Atomic Deposition in Amorphous Systems

    DOE PAGES

    Chuang, Claire Y.; Han, Sang M.; Zepeda-Ruiz, Luis A.; ...

    2015-10-02

    Direct molecular dynamics simulation of atomic deposition under realistic conditions is notoriously challenging because of the wide range of timescales that must be captured. Numerous simulation approaches have been proposed to address the problem, often requiring a compromise between model fidelity, algorithmic complexity and computational efficiency. Coarse projective integration, an example application of the ‘equation-free’ framework, offers an attractive balance between these constraints. Here, periodically applied, short atomistic simulations are employed to compute gradients of slowly-evolving coarse variables that are then used to numerically integrate differential equations over relatively large time intervals. A key obstacle to the application of this technique in realistic settings is the ‘lifting’ operation in which a valid atomistic configuration is recreated from knowledge of the coarse variables. Using Ge deposition on amorphous SiO2 substrates as an example application, we present a scheme for lifting realistic atomistic configurations comprised of collections of Ge islands on amorphous SiO2 using only a few measures of the island size distribution. In conclusion, the approach is shown to provide accurate initial configurations to restart molecular dynamics simulations at arbitrary points in time, enabling the application of coarse projective integration for this morphologically complex system.

  13. Investigating the organisational impacts of quality improvement: a protocol for a realist evaluation of improvement approaches drawing on the Resource Based View of the Firm

    PubMed Central

    Burton, Christopher R; Rycroft Malone, Jo; Robert, Glenn; Willson, Alan; Hopkins, Angela

    2014-01-01

    Introduction Little is understood about the role of quality improvement in enabling health organisations to survive and thrive in the contemporary context of financial and economic challenges. We will draw on the theoretical foundations of the ‘Resource Based View of the Firm’ (RBV) to develop insights into why health organisations engage in improvement work, how impacts are conceptualised, and ‘what works’ in delivering these impacts. Specifically, RBV theorises that the mix and use of resources across different organisations may explain differences in performance. Whether improvement work influences these resources is unclear. Methods and analysis Case study research will be conducted across health organisations participating in four approaches to improvement, including: a national improvement programme; a multiorganisational partnership around implementation; an organisational strategy for quality improvement; and a coproduction project designed to enhance the experience of a clinical service from the perspective of patients. Data will comprise in-depth interviews with key informants, observation of key events, and documents, analysed within and then across cases. Adopting a realist perspective, the core tenets of RBV will be evaluated as a programme theory, focusing on the interplay between organisational conditions and behavioural or resource responses that are reported through engagement in improvement. Ethics and dissemination The study has been approved by Bangor University Ethics Committee. The investigation will not judge the relative merits of different approaches to healthcare quality improvement. Rather, we will develop unique insights into the organisational consequences and dependencies of quality improvement, providing an opportunity to add to the explanatory potential of RBV in this and other contexts. In addition to scientific and lay reports of the study findings, research outputs will include a framework for constructing the economic impacts of quality improvement and practical guidance for health service managers that maximises the impacts of investment in quality improvement. PMID:25082421

  14. Integration of robotic surgery into routine practice and impacts on communication, collaboration, and decision making: a realist process evaluation protocol.

    PubMed

    Randell, Rebecca; Greenhalgh, Joanne; Hindmarsh, Jon; Dowding, Dawn; Jayne, David; Pearman, Alan; Gardner, Peter; Croft, Julie; Kotze, Alwyn

    2014-05-02

    Robotic surgery offers many potential benefits for patients. While an increasing number of healthcare providers are purchasing surgical robots, there are reports that the technology is failing to be introduced into routine practice. Additionally, in robotic surgery, the surgeon is physically separated from the patient and the rest of the team, with the potential to negatively impact teamwork in the operating theatre. The aim of this study is to ascertain: how and under what circumstances robotic surgery is effectively introduced into routine practice; and how and under what circumstances robotic surgery impacts teamwork, communication and decision making, and subsequent patient outcomes. We will undertake a process evaluation alongside a randomised controlled trial comparing laparoscopic and robotic surgery for the curative treatment of rectal cancer. Realist evaluation provides an overall framework for the study. The study will be in three phases. In Phase I, grey literature will be reviewed to identify stakeholders' theories concerning how robotic surgery becomes embedded into surgical practice and its impacts. These theories will be refined and added to through interviews conducted across English hospitals that are using robotic surgery for rectal cancer resection with staff at different levels of the organisation, along with a review of documentation associated with the introduction of robotic surgery. In Phase II, a multi-site case study will be conducted across four English hospitals to test and refine the candidate theories. Data will be collected using multiple methods: the structured observation tool OTAS (Observational Teamwork Assessment for Surgery); video recordings of operations; ethnographic observation; and interviews. In Phase III, interviews will be conducted at the four case sites with staff representing a range of surgical disciplines, to assess the extent to which the results of Phase II are generalisable and to refine the resulting theories to reflect the experience of a broader range of surgical disciplines. The study will provide (i) guidance to healthcare organisations on factors likely to facilitate successful implementation and integration of robotic surgery, and (ii) guidance on how to ensure effective communication and teamwork when undertaking robotic surgery.

  15. Integration of robotic surgery into routine practice and impacts on communication, collaboration, and decision making: a realist process evaluation protocol

    PubMed Central

    2014-01-01

    Background Robotic surgery offers many potential benefits for patients. While an increasing number of healthcare providers are purchasing surgical robots, there are reports that the technology is failing to be introduced into routine practice. Additionally, in robotic surgery, the surgeon is physically separated from the patient and the rest of the team, with the potential to negatively impact teamwork in the operating theatre. The aim of this study is to ascertain: how and under what circumstances robotic surgery is effectively introduced into routine practice; and how and under what circumstances robotic surgery impacts teamwork, communication and decision making, and subsequent patient outcomes. Methods and design We will undertake a process evaluation alongside a randomised controlled trial comparing laparoscopic and robotic surgery for the curative treatment of rectal cancer. Realist evaluation provides an overall framework for the study. The study will be in three phases. In Phase I, grey literature will be reviewed to identify stakeholders’ theories concerning how robotic surgery becomes embedded into surgical practice and its impacts. These theories will be refined and added to through interviews conducted across English hospitals that are using robotic surgery for rectal cancer resection with staff at different levels of the organisation, along with a review of documentation associated with the introduction of robotic surgery. In Phase II, a multi-site case study will be conducted across four English hospitals to test and refine the candidate theories. Data will be collected using multiple methods: the structured observation tool OTAS (Observational Teamwork Assessment for Surgery); video recordings of operations; ethnographic observation; and interviews. In Phase III, interviews will be conducted at the four case sites with staff representing a range of surgical disciplines, to assess the extent to which the results of Phase II are generalisable and to refine the resulting theories to reflect the experience of a broader range of surgical disciplines. The study will provide (i) guidance to healthcare organisations on factors likely to facilitate successful implementation and integration of robotic surgery, and (ii) guidance on how to ensure effective communication and teamwork when undertaking robotic surgery. PMID:24885669

  16. Development and validation of a metal mixture bioavailability model (MMBM) to predict chronic toxicity of Ni-Zn-Pb mixtures to Ceriodaphnia dubia.

    PubMed

    Nys, Charlotte; Janssen, Colin R; De Schamphelaere, Karel A C

    2017-01-01

    Recently, several bioavailability-based models have been shown to predict acute metal mixture toxicity with reasonable accuracy. However, the application of such models to chronic mixture toxicity is less well established. Therefore, in the present study we developed a chronic metal mixture bioavailability model (MMBM) by combining the existing chronic daphnid bioavailability models for Ni, Zn, and Pb with the independent action (IA) model, assuming strict non-interaction between the metals for binding at the metal-specific biotic ligand sites. To evaluate the predictive capacity of the MMBM, chronic (7d) reproductive toxicity of Ni-Zn-Pb mixtures to Ceriodaphnia dubia was investigated in four different natural waters (pH range: 7-8; Ca range: 1-2 mM; Dissolved Organic Carbon range: 5-12 mg/L). In each water, mixture toxicity was investigated at equitoxic metal concentration ratios as well as at environmental (i.e. realistic) metal concentration ratios. Statistical analysis of mixture effects revealed that observed interactive effects depended on the metal concentration ratio investigated when evaluated relative to the concentration addition (CA) model, but not when evaluated relative to the IA model. This indicates that interactive effects observed in an equitoxic experimental design cannot always be simply extrapolated to environmentally realistic exposure situations. Generally, the IA model predicted Ni-Zn-Pb mixture toxicity more accurately than the CA model. Overall, the MMBM predicted Ni-Zn-Pb mixture toxicity (expressed as % reproductive inhibition relative to a control) in 85% of the treatments with less than 20% error. Moreover, the MMBM predicted chronic toxicity of the ternary Ni-Zn-Pb mixture at least as accurately as the toxicity of the individual metal treatments (RMSE_mix = 16; RMSE_Zn-only = 18; RMSE_Ni-only = 17; RMSE_Pb-only = 23). Based on the present study, we believe MMBMs can be a promising tool to account for the effects of water chemistry on metal mixture toxicity during chronic exposure and could be used in metal risk assessment frameworks. Copyright © 2016 Elsevier Ltd. All rights reserved.
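
    The two reference models evaluated above can be written down compactly. The sketch below contrasts an independent action (IA) prediction with a concentration addition (CA) prediction, assuming two-parameter log-logistic dose-response curves for each metal; all parameter values are illustrative and are not the study's fitted bioavailability models.

      import numpy as np
      from scipy.optimize import brentq

      def effect(conc, ec50, slope):
          """Fractional effect (0..1) from a log-logistic dose-response curve."""
          return 1.0 / (1.0 + (ec50 / conc) ** slope)

      def ia_mixture_effect(concs, ec50s, slopes):
          """IA: combine single-metal effects assuming independent action."""
          no_effect = np.prod([1.0 - effect(c, e, s)
                               for c, e, s in zip(concs, ec50s, slopes)])
          return 1.0 - no_effect

      def ca_mixture_effect(concs, ec50s, slopes):
          """CA: find the effect level x at which summed toxic units equal 1."""
          def toxic_units_minus_one(x):
              ecx = [e * (x / (1.0 - x)) ** (1.0 / s)   # ECx of each curve
                     for e, s in zip(ec50s, slopes)]
              return sum(c / ec for c, ec in zip(concs, ecx)) - 1.0
          return brentq(toxic_units_minus_one, 1e-9, 1.0 - 1e-9)

      # illustrative Ni-Zn-Pb concentrations and curve parameters
      print(ia_mixture_effect([5, 50, 10], [20, 200, 40], [2, 2, 2]))
      print(ca_mixture_effect([5, 50, 10], [20, 200, 40], [2, 2, 2]))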

  17. Evaluation of critical nuclear power plant electrical cable response to severe thermal fire conditions

    NASA Astrophysics Data System (ADS)

    Taylor, Gabriel James

    The failure of electrical cables exposed to severe thermal fire conditions is a safety concern for operating commercial nuclear power plants (NPPs). The Nuclear Regulatory Commission (NRC) has promoted the use of risk-informed and performance-based methods for fire protection, which resulted in a need to develop realistic methods to quantify the risk of fire to NPP safety. Recent electrical cable testing has been conducted to provide empirical data on the failure modes and likelihood of fire-induced damage. This thesis evaluated numerous aspects of the data. Circuit characteristics affecting fire-induced electrical cable failure modes have been evaluated. In addition, thermal failure temperatures corresponding to cable functional failures have been evaluated to develop realistic single-point thermal failure thresholds and probability distributions for specific cable insulation types. Finally, the data were used to evaluate the prediction capabilities of a one-dimensional conductive heat transfer model used to predict cable failure.
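
    As a rough illustration of how single-point thresholds and probability distributions can be derived from such failure data, the sketch below fits a normal distribution to hypothetical failure temperatures for one insulation type and reads off a conservative percentile; the numbers are invented, not values from the thesis.

      import numpy as np
      from scipy import stats

      # hypothetical functional-failure temperatures (deg C) for one insulation type
      fail_temp = np.array([205, 215, 222, 228, 231, 240, 247, 255])
      loc, scale = stats.norm.fit(fail_temp)       # fitted mean and std deviation
      p05 = stats.norm.ppf(0.05, loc, scale)       # conservative single-point threshold
      print(f"mean = {loc:.0f} C, sd = {scale:.0f} C, 5th percentile = {p05:.0f} C")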

  18. An ice sheet model validation framework for the Greenland ice sheet.

    PubMed

    Price, Stephen F; Hoffman, Matthew J; Bonin, Jennifer A; Howat, Ian M; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P; Evans, Katherine J; Kennedy, Joseph H; Lenaerts, Jan; Lipscomb, William H; Perego, Mauro; Salinger, Andrew G; Tuminaro, Raymond S; van den Broeke, Michiel R; Nowicki, Sophie M J

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
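
    A quantitative score of the kind such a framework produces can be as simple as a mean elevation difference and an RMSE between the simulated and observed surfaces on a common grid. The sketch below assumes plain NumPy arrays and an optional basin mask; it illustrates the idea only, since the real tool ingests altimetry and gravimetry products.

      import numpy as np

      def elevation_score(sim, obs, mask=None):
          """Mean difference and RMSE over valid (e.g. basin) grid cells."""
          d = sim - obs
          if mask is not None:
              d = d[mask]                    # restrict to a basin or region
          d = d[np.isfinite(d)]              # drop missing observations
          return d.mean(), np.sqrt(np.mean(d ** 2))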

  19. The LSST Metrics Analysis Framework (MAF)

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Zeljko; Krughoff, K. Simon; Petry, Catherine E.; Ridgway, Stephen T.

    2015-01-01

    Studying potential observing strategies or cadences for the Large Synoptic Survey Telescope (LSST) is a complicated but important problem. To address this, LSST has created an Operations Simulator (OpSim) to create simulated surveys, including realistic weather and sky conditions. Analyzing the results of these simulated surveys for the wide variety of science cases to be considered for LSST is, however, difficult. We have created a Metric Analysis Framework (MAF), an open-source python framework, to be a user-friendly, customizable and easily extensible tool to help analyze the outputs of the OpSim. MAF reads the pointing history of the LSST generated by the OpSim, then enables the subdivision of these pointings based on position on the sky (RA/Dec, etc.) or the characteristics of the observations (e.g. airmass or sky brightness) and a calculation of how well these observations meet a specified science objective (or metric). An example simple metric could be the mean single visit limiting magnitude for each position in the sky; a more complex metric might be the expected astrometric precision. The output of these metrics can be generated for a full survey, for specified time intervals, or for regions of the sky, and can be easily visualized using a web interface. An important goal for MAF is to facilitate analysis of the OpSim outputs for a wide variety of science cases. A user can often write a new metric to evaluate OpSim for new science goals in less than a day once they are familiar with the framework. Some of these new metrics are illustrated in the accompanying poster, "Analyzing Simulated LSST Survey Performance With MAF". While MAF has been developed primarily for application to OpSim outputs, it can be applied to any dataset. The most obvious examples are examining pointing histories of other survey projects or telescopes, such as CFHT.
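
    The core MAF pattern, subdividing a pointing history by sky position and evaluating a metric per slice, can be illustrated without the framework itself. The sketch below computes the simple example metric from the abstract, the mean single-visit limiting magnitude per sky position, on a coarse RA/Dec grid; the array names and binning are assumptions, not the actual MAF API.

      import numpy as np

      def mean_limiting_mag_by_sky(ra, dec, m5, n_ra=36, n_dec=18):
          """Mean single-visit limiting magnitude on a coarse RA/Dec grid."""
          i = np.clip((ra / 360.0 * n_ra).astype(int), 0, n_ra - 1)
          j = np.clip(((dec + 90.0) / 180.0 * n_dec).astype(int), 0, n_dec - 1)
          sums = np.zeros((n_ra, n_dec))
          counts = np.zeros_like(sums)
          np.add.at(sums, (i, j), m5)        # accumulate per sky bin
          np.add.at(counts, (i, j), 1)
          with np.errstate(invalid="ignore"):
              return np.where(counts > 0, sums / counts, np.nan)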

  20. Use of a plastic eraser for ear reconstruction training.

    PubMed

    Erdogan, Basar; Morioka, Daichi; Hamada, Taishi; Kusano, Taro; Win, Khin Malar

    2018-01-01

    Microtia reconstruction is a challenging procedure, especially in developing nations. The most complex part is learning how to fabricate a framework from costal cartilage. We herein propose a training regimen for ear reconstruction with the use of a plastic eraser. The texture of a plastic eraser made from polyvinyl chloride is similar to that of human costal cartilage. The first step of the training is carving out the sixth through eighth rib cartilages from a block of plastic eraser. The second step is a fabrication of the framework from plastic rib cartilages, referring to a template from the intact auricle. As plastic erasers are inexpensive and universally available, inexperienced surgeons can repeatedly perform this framework training. Following several of these training sessions in developing nations, the co-authors and local surgeons successfully performed their microtia reconstructions in a reasonable operative time. This realistic carving model allows surgeons to gain experience before performing an actual ear reconstruction, even in resource-constrained circumstances.

  1. Cloud Feedbacks on Greenhouse Warming in a Multi-Scale Modeling Framework with a Higher-Order Turbulence Closure

    NASA Technical Reports Server (NTRS)

    Cheng, Anning; Xu, Kuan-Man

    2015-01-01

    Five-year simulation experiments with a multi-scale modeling framework (MMF) incorporating an advanced intermediately prognostic higher-order turbulence closure (IPHOC) in its cloud-resolving model (CRM) component, also known as SPCAM-IPHOC (superparameterized Community Atmosphere Model), are performed to understand the fast tropical (30S-30N) cloud response to an instantaneous doubling of CO2 concentration with SST held fixed at present-day values. SPCAM-IPHOC substantially improves the representation of low-level clouds compared with SPCAM, so the cloud responses to greenhouse warming in SPCAM-IPHOC are expected to be more realistic. The changes in rising motion, surface precipitation, cloud cover, and shortwave and longwave cloud radiative forcing in SPCAM-IPHOC under greenhouse warming will be presented.

  2. Anisotropic neutron stars in R² gravity

    NASA Astrophysics Data System (ADS)

    Folomeev, Vladimir

    2018-06-01

    We consider static neutron stars within the framework of R² gravity. The neutron fluid is described by three different types of realistic equations of state (soft, moderately stiff, and stiff). Using the observational data on the neutron star mass-radius relation, it is demonstrated that the characteristics of the objects supported by the isotropic fluid agree with the observations only if one uses the soft equation of state. We show that the inclusion of the fluid anisotropy also enables one to employ stiffer equations of state to model configurations that satisfy the observational constraints. Also, using the standard thin accretion disk model, we demonstrate potentially observable differences, which allow us to distinguish the neutron stars constructed within the modified gravity framework from those described in Einstein's general relativity.

  3. A stochastic agent-based model of pathogen propagation in dynamic multi-relational social networks

    PubMed Central

    Khan, Bilal; Dombrowski, Kirk; Saad, Mohamed

    2015-01-01

    We describe a general framework for modeling and stochastic simulation of epidemics in realistic dynamic social networks, which incorporates heterogeneity in the types of individuals, types of interconnecting risk-bearing relationships, and types of pathogens transmitted across them. Dynamism is supported through arrival and departure processes, continuous restructuring of risk relationships, and changes to pathogen infectiousness, as mandated by natural history; dynamism is regulated through constraints on the local agency of individual nodes and their risk behaviors, while simulation trajectories are validated using system-wide metrics. To illustrate its utility, we present a case study that applies the proposed framework towards a simulation of HIV in artificial networks of intravenous drug users (IDUs) modeled using data collected in the Social Factors for HIV Risk survey. PMID:25859056
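
    The update step of such a simulation can be sketched compactly: transmit the pathogen along current risk relationships, then restructure a fraction of those relationships. The toy below uses networkx with illustrative rates and is not the authors' framework.

      import random
      import networkx as nx

      def transmission_step(G, infected, beta=0.05, rewire_p=0.01):
          """One step: transmit along edges, then rewire some relationships."""
          new_infected = set(infected)
          for u in infected:
              for v in G.neighbors(u):
                  if v not in infected and random.random() < beta:
                      new_infected.add(v)
          # dynamism: a small fraction of risk relationships turn over each step
          for u, v in list(G.edges()):
              if random.random() < rewire_p:
                  G.remove_edge(u, v)
                  w = random.choice(list(G.nodes()))
                  if w != u and not G.has_edge(u, w):
                      G.add_edge(u, w)
          return new_infected

      G = nx.erdos_renyi_graph(200, 0.03, seed=1)
      infected = {0, 1, 2}
      for _ in range(50):
          infected = transmission_step(G, infected)
      print(len(infected), "infected after 50 steps")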

  4. Quantum interval-valued probability: Contextuality and the Born rule

    NASA Astrophysics Data System (ADS)

    Tai, Yu-Tsung; Hanson, Andrew J.; Ortiz, Gerardo; Sabry, Amr

    2018-05-01

    We present a mathematical framework based on quantum interval-valued probability measures to study the effect of experimental imperfections and finite precision measurements on defining aspects of quantum mechanics such as contextuality and the Born rule. While foundational results such as the Kochen-Specker and Gleason theorems are valid in the context of infinite precision, they fail to hold in general in a world with limited resources. Here we employ an interval-valued framework to establish bounds on the validity of those theorems in realistic experimental environments. In this way, not only can we quantify the idea of finite-precision measurement within our theory, but we can also suggest a possible resolution of the Meyer-Mermin debate on the impact of finite-precision measurement on the Kochen-Specker theorem.

  5. Constrained optimization framework for interface-aware sub-scale dynamics models for voids closure in Lagrangian hydrodynamics

    DOE PAGES

    Barlow, Andrew; Klima, Matej; Shashkov, Mikhail

    2018-04-02

    In hydrocodes, voids are used to represent vacuum and model free boundaries between vacuum and real materials. We give a systematic description of a new treatment of void closure in the framework of the multimaterial arbitrary Lagrangian–Eulerian (ALE) methods. This includes a new formulation of the interface-aware sub-scale-dynamics (IA-SSD) closure model for multimaterial cells with voids, which is used in the Lagrangian stage of our indirect ALE scheme. The results of the comprehensive testing of the new model are presented for one- and two-dimensional multimaterial calculations in the presence of voids. Finally, we also present a sneak peek of a realistic shaped charge calculation in the presence of voids and solids.

  6. Constrained optimization framework for interface-aware sub-scale dynamics models for voids closure in Lagrangian hydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barlow, Andrew; Klima, Matej; Shashkov, Mikhail

    In hydrocodes, voids are used to represent vacuum and model free boundaries between vacuum and real materials. We give a systematic description of a new treatment of void closure in the framework of the multimaterial arbitrary Lagrangian–Eulerian (ALE) methods. This includes a new formulation of the interface-aware sub-scale-dynamics (IA-SSD) closure model for multimaterial cells with voids, which is used in the Lagrangian stage of our indirect ALE scheme. The results of the comprehensive testing of the new model are presented for one- and two-dimensional multimaterial calculations in the presence of voids. Finally, we also present a sneak peek of a realistic shaped charge calculation in the presence of voids and solids.

  7. Auditory steady state responses and cochlear implants: Modeling the artifact-response mixture in the perspective of denoising

    PubMed Central

    Mina, Faten; Attina, Virginie; Duroc, Yvan; Veuillet, Evelyne; Truy, Eric; Thai-Van, Hung

    2017-01-01

    Auditory steady state responses (ASSRs) in cochlear implant (CI) patients are contaminated by the spread of a continuous CI electrical stimulation artifact. The aim of this work was to model the electrophysiological mixture of the CI artifact and the corresponding evoked potentials on scalp electrodes in order to evaluate the performance of denoising algorithms in eliminating the CI artifact in a controlled environment. The basis of the proposed computational framework is a neural mass model representing the nodes of the auditory pathways. Six main contributors to auditory evoked potentials from the cochlear level and up to the auditory cortex were taken into consideration. The simulated dynamics were then projected into a 3-layer realistic head model. 32-channel scalp recordings of the CI artifact-response were then generated by solving the electromagnetic forward problem. As an application, the framework’s simulated 32-channel datasets were used to compare the performance of 4 commonly used Independent Component Analysis (ICA) algorithms: infomax, extended infomax, jade and fastICA in eliminating the CI artifact. As expected, two major components were detectable in the simulated datasets, a low frequency component at the modulation frequency and a pulsatile high frequency component related to the stimulation frequency. The first can be attributed to the phase-locked ASSR and the second to the stimulation artifact. Among the ICA algorithms tested, simulations showed that infomax was the most efficient and reliable in denoising the CI artifact-response mixture. Denoising algorithms can induce undesirable deformation of the signal of interest in real CI patient recordings. The proposed framework is a valuable tool for evaluating these algorithms in a controllable environment ahead of experimental or clinical applications. PMID:28350887
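
    The denoising comparison this framework enables can be prototyped on synthetic data: mix a low-frequency response-like component with a pulsatile artifact-like component across 32 channels, then unmix with FastICA from scikit-learn. The signal shapes below are crude stand-ins for the neural mass model outputs, not the study's simulations.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      t = np.linspace(0, 1, 4000)
      response = np.sin(2 * np.pi * 40 * t)                  # ASSR-like component
      artifact = (np.sin(2 * np.pi * 900 * t) > 0.99) * 5.0  # pulsatile artifact-like
      sources = np.c_[response, artifact]
      mixing = rng.normal(size=(32, 2))                      # project to 32 "electrodes"
      X = sources @ mixing.T + 0.05 * rng.normal(size=(4000, 32))

      ica = FastICA(n_components=2, random_state=0)
      S_est = ica.fit_transform(X)                           # estimated sources
      # correlate estimates with the true artifact to identify the artifact component
      for k in range(2):
          r = np.corrcoef(S_est[:, k], artifact)[0, 1]
          print(f"component {k}: |corr with artifact| = {abs(r):.2f}")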

  8. VIC-CropSyst-v2: A regional-scale modeling platform to simulate the nexus of climate, hydrology, cropping systems, and human decisions

    NASA Astrophysics Data System (ADS)

    Malek, Keyvan; Stöckle, Claudio; Chinnayakanahalli, Kiran; Nelson, Roger; Liu, Mingliang; Rajagopalan, Kirti; Barik, Muhammad; Adam, Jennifer C.

    2017-08-01

    Food supply is affected by a complex nexus of land, atmosphere, and human processes, including short- and long-term stressors (e.g., drought and climate change, respectively). A simulation platform that captures these complex elements can be used to inform policy and best management practices to promote sustainable agriculture. We have developed a tightly coupled framework using the macroscale variable infiltration capacity (VIC) hydrologic model and the CropSyst agricultural model. A mechanistic irrigation module was also developed for inclusion in this framework. Because VIC-CropSyst combines two widely used and mechanistic models (for crop phenology, growth, management, and macroscale hydrology), it can provide realistic and hydrologically consistent simulations of water availability, crop water requirements for irrigation, and agricultural productivity for both irrigated and dryland systems. This allows VIC-CropSyst to provide managers and decision makers with reliable information on regional water stresses and their impacts on food production. Additionally, VIC-CropSyst is being used in conjunction with socioeconomic models, river system models, and atmospheric models to simulate feedback processes between regional water availability, agricultural water management decisions, and land-atmosphere interactions. The performance of VIC-CropSyst was evaluated on both regional (over the US Pacific Northwest) and point scales. Point-scale evaluation involved using two flux tower sites located in agricultural fields in the US (Nebraska and Illinois). The agreement between recorded and simulated evapotranspiration (ET), applied irrigation water, soil moisture, leaf area index (LAI), and yield indicated that, although the model is intended to work on regional scales, it also captures field-scale processes in agricultural areas.

  9. Conceptualizing and Managing Medical Emergencies Where No Formal Paramedical System Exists: Perspectives from a Remote Indigenous Community in Canada.

    PubMed

    Curran, Jeffrey; Ritchie, Stephen D; Beardy, Jackson; VanderBurgh, David; Born, Karen; Lewko, John; Orkin, Aaron M

    2018-02-04

    (1) Background: Remote communities in Canada lack an equitable emergency medical response capacity compared to other communities. Community-based emergency care (CBEC) training for laypeople is a model that has the potential to enhance the medical emergency response capacity in isolated and resource-limited contexts. The purpose of this study was to understand the characteristics of medical emergencies and to conceptualize and present a framework for what a medical emergency is for one remote Indigenous community in northwestern Ontario, in order to inform the development of CBEC training. (2) Methods: This study adhered to the principles of community-based participatory research and realist evaluation; it was an integrated component of the formative evaluation of the second Sachigo Lake Wilderness Emergency Response Education Initiative (SLWEREI) training course in 2012. Twelve members of Sachigo Lake First Nation participated in the training course, along with local nursing staff, police officers, community Elders, and course instructors (n = 24 total), who participated in interviews, focus groups, and a collaborative discussion of local health issues in the development of the SLWEREI. (3) Results: The qualitative results are organized into sections that describe the types of local health emergencies and the informal response system of community members in addressing these emergencies. Prominent themes of health adversity that emerged were an inability to manage chronic conditions and fears of exacerbations, the lack of capacity for addressing mental illness, and the high prevalence of injury for community members. (4) Discussion: A three-point framework of what constitutes local perceptions of an emergency emerged from the findings in this study: (1) a sense of isolation; (2) a condition with a potentially adverse outcome; and (3) a need for help.

  10. Estimation of Distributed Groundwater Pumping Rates in Yolo County, CA—Intercomparison of Two Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Maples, S.; Fogg, G. E.; Harter, T.

    2015-12-01

    Accurate estimation of groundwater (GW) budgets and effective management of agricultural GW pumping remain a challenge in much of California's Central Valley (CV) due to a lack of irrigation well metering. CVHM and C2VSim are two regional-scale integrated hydrologic models that provide estimates of historical and current CV distributed pumping rates. However, both models estimate GW pumping using conceptually different agricultural water models with uncertainties that have not been adequately investigated. Here, we evaluate differences in distributed agricultural GW pumping and recharge estimates related to important differences in the conceptual framework and model assumptions used to simulate surface water (SW) and GW interaction across the root zone. Differences in the magnitude and timing of GW pumping and recharge were evaluated for a subregion (~1000 mi²) coincident with Yolo County, CA, to provide similar initial and boundary conditions for both models. Synthetic, multi-year datasets of land-use, precipitation, evapotranspiration (ET), and SW deliveries were prescribed for each model to provide realistic end-member scenarios for GW-pumping demand and recharge. Results show differences in the magnitude and timing of GW-pumping demand, deep percolation, and recharge. Discrepancies are related, in large part, to model differences in the estimation of ET requirements and representation of soil-moisture conditions. CVHM partitions ET demand, while C2VSim uses a bulk ET rate, resulting in differences in both crop-water and GW-pumping demand. Additionally, CVHM assumes steady-state soil-moisture conditions, and simulates deep percolation as a function of irrigation inefficiencies, while C2VSim simulates deep percolation as a function of transient soil-moisture storage conditions. These findings show that estimates of GW-pumping demand are sensitive to these important conceptual differences, which can impact conjunctive-use water management decisions in the CV.

  11. Conceptualizing and Managing Medical Emergencies Where No Formal Paramedical System Exists: Perspectives from a Remote Indigenous Community in Canada

    PubMed Central

    Curran, Jeffrey; Ritchie, Stephen D.; Beardy, Jackson; VanderBurgh, David; Born, Karen; Lewko, John; Orkin, Aaron M.

    2018-01-01

    (1) Background: Remote communities in Canada lack an equitable emergency medical response capacity compared to other communities. Community-based emergency care (CBEC) training for laypeople is a model that has the potential to enhance the medical emergency response capacity in isolated and resource-limited contexts. The purpose of this study was to understand the characteristics of medical emergencies and to conceptualize and present a framework for what a medical emergency is for one remote Indigenous community in northwestern Ontario, in order to inform the development of CBEC training. (2) Methods: This study adhered to the principles of community-based participatory research and realist evaluation; it was an integrated component of the formative evaluation of the second Sachigo Lake Wilderness Emergency Response Education Initiative (SLWEREI) training course in 2012. Twelve members of Sachigo Lake First Nation participated in the training course, along with local nursing staff, police officers, community Elders, and course instructors (n = 24 total), who participated in interviews, focus groups, and a collaborative discussion of local health issues in the development of the SLWEREI. (3) Results: The qualitative results are organized into sections that describe the types of local health emergencies and the informal response system of community members in addressing these emergencies. Prominent themes of health adversity that emerged were an inability to manage chronic conditions and fears of exacerbations, the lack of capacity for addressing mental illness, and the high prevalence of injury for community members. (4) Discussion: A three-point framework of what constitutes local perceptions of an emergency emerged from the findings in this study: (1) a sense of isolation; (2) a condition with a potentially adverse outcome; and (3) a need for help. PMID:29401706

  12. Towards a Cognitively Realistic Computational Model of Team Problem Solving Using ACT-R Agents and the ELICIT Experimentation Framework

    DTIC Science & Technology

    2014-06-01

    intelligence analysis processes. However, as has been noted in previous work (e.g., [42]), there are a number of important differences between the nature of the...problem encountered in the context of the ELICIT task and the problems dealt with by intelligence analysts. Perhaps most importantly, the fact that a...see Section 7). 6 departure from the reality of most intelligence analysis situations: in most real-world intelligence analysis problems agents have

  13. Tsunami and acoustic-gravity waves in water of constant depth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendin, Gali; Stiassnie, Michael

    2013-08-15

    A study of wave radiation by a rather general bottom displacement, in a compressible ocean of otherwise constant depth, is carried out within the framework of a three-dimensional linear theory. Simple analytic expressions for the flow field, at large distance from the disturbance, are derived. Realistic numerical examples indicate that the Acoustic-Gravity waves, which significantly precede the Tsunami, are expected to leave a measurable signature on bottom-pressure records that should be considered for early detection of Tsunami.
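
    The early-detection argument rests on simple arithmetic: acoustic-gravity waves travel near the speed of sound in water, while the tsunami travels at roughly the shallow-water speed sqrt(g*h). A back-of-envelope check, with illustrative depth and distance values:

      import math

      g, h = 9.81, 4000.0            # gravity (m/s^2), illustrative ocean depth (m)
      c_sound = 1500.0               # approximate speed of sound in seawater (m/s)
      c_tsunami = math.sqrt(g * h)   # shallow-water long-wave speed, about 198 m/s

      distance = 1000e3              # 1000 km from the source, illustrative
      lead_time = distance / c_tsunami - distance / c_sound
      print(f"acoustic-gravity signal leads the tsunami by ~{lead_time / 60:.0f} minutes")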

  14. Invariant Tori in the Secular Motions of the Three-body Planetary Systems

    NASA Astrophysics Data System (ADS)

    Locatelli, Ugo; Giorgilli, Antonio

    We consider the problem of the applicability of the KAM theorem to a realistic problem of three bodies. In the framework of the averaged dynamics over the fast angles for the Sun-Jupiter-Saturn system we can prove the perpetual stability of the orbit. The proof is based on semi-numerical algorithms requiring both explicit algebraic manipulations of series and analytical estimates. The proof is made rigorous by using interval arithmetic in order to control the numerical errors.

  15. An open-source framework for stress-testing non-invasive foetal ECG extraction algorithms.

    PubMed

    Andreotti, Fernando; Behar, Joachim; Zaunseder, Sebastian; Oster, Julien; Clifford, Gari D

    2016-05-01

    Over the past decades, many studies have been published on the extraction of non-invasive foetal electrocardiogram (NI-FECG) from abdominal recordings. Most of these contributions claim to obtain excellent results in detecting foetal QRS (FQRS) complexes in terms of location. A small subset of authors have investigated the extraction of morphological features from the NI-FECG. However, due to the shortage of available public databases, the large variety of performance measures employed and the lack of open-source reference algorithms, most contributions cannot be meaningfully assessed. This article attempts to address these issues by presenting a standardised methodology for stress testing NI-FECG algorithms, including absolute data, as well as extraction and evaluation routines. To that end, a large database of realistic artificial signals was created, totaling 145.8 h of multichannel data and over one million FQRS complexes. An important characteristic of this dataset is the inclusion of several non-stationary events (e.g. foetal movements, uterine contractions and heart rate fluctuations) that are critical for evaluating extraction routines. To demonstrate our testing methodology, three classes of NI-FECG extraction algorithms were evaluated: blind source separation (BSS), template subtraction (TS) and adaptive methods (AM). Experiments were conducted to benchmark the performance of eight NI-FECG extraction algorithms on the artificial database focusing on: FQRS detection and morphological analysis (foetal QT and T/QRS ratio). The overall median FQRS detection accuracies (i.e. considering all non-stationary events) for the best performing methods in each group were 99.9% for BSS, 97.9% for AM and 96.0% for TS. Both FQRS detections and morphological parameters were shown to heavily depend on the extraction techniques and signal-to-noise ratio. Particularly, it is shown that their evaluation in the source domain, obtained after using a BSS technique, should be avoided. Data, extraction algorithms and evaluation routines were released as part of the fecgsyn toolbox on PhysioNet under a GNU GPL open-source license. This contribution provides a standard framework for benchmarking and regulatory testing of NI-FECG extraction algorithms.
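
    Benchmarks of this kind typically score FQRS detections by matching them to reference beat locations within a tolerance window. A minimal sketch of such scoring, assuming a 50 ms tolerance (the toolbox's exact conventions may differ):

      import numpy as np

      def score_fqrs(ref, det, tol=0.05):
          """Sensitivity, PPV and F1 for detected vs. reference beat times (s)."""
          ref, det = np.asarray(ref, float), np.asarray(det, float)
          used = np.zeros(len(det), dtype=bool)
          tp = 0
          for r in ref:
              # nearest unused detection within the tolerance window
              idx = np.where(~used & (np.abs(det - r) <= tol))[0]
              if idx.size:
                  used[idx[np.argmin(np.abs(det[idx] - r))]] = True
                  tp += 1
          fn, fp = len(ref) - tp, len(det) - tp
          se = tp / (tp + fn)
          ppv = tp / (tp + fp)
          f1 = 2 * se * ppv / (se + ppv) if (se + ppv) > 0 else 0.0
          return se, ppv, f1

      print(score_fqrs([0.40, 0.82, 1.25], [0.41, 0.83, 1.10, 1.26]))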

  16. Evaluation Framework for Telemedicine Using the Logical Framework Approach and a Fishbone Diagram

    PubMed Central

    2015-01-01

    Objectives Technological advances using telemedicine and telehealth are growing in healthcare fields, but the evaluation framework for them is inconsistent and limited. This paper suggests a comprehensive evaluation framework for telemedicine system implementation and will support related stakeholders' decision-making by promoting general understanding and resolving arguments and controversies. Methods This study focused on developing a comprehensive evaluation framework by summarizing themes across the range of evaluation techniques and organizing foundational evaluation frameworks generally applicable through studies and cases of diverse telemedicine. Evaluation factors related to aspects of information technology, the satisfaction of service providers and consumers, cost, quality, and information security are organized using the fishbone diagram. Results It was not easy to develop a monitoring and evaluation framework for telemedicine since evaluation frameworks for telemedicine are very complex with many potential inputs, activities, outputs, outcomes, and stakeholders. A conceptual framework was developed that incorporates the key dimensions that need to be considered in the evaluation of telehealth implementation for a formal structured approach to the evaluation of a service. The suggested framework consists of six major dimensions and the subsequent branches for each dimension. Conclusions To implement telemedicine and telehealth services, stakeholders should make decisions based on sufficient evidence in quality and safety measured by the comprehensive evaluation framework. Further work would be valuable in applying more comprehensive evaluations to verify and improve the comprehensive framework across a variety of contexts with more factors and participant group dimensions. PMID:26618028

  17. Evaluation Framework for Telemedicine Using the Logical Framework Approach and a Fishbone Diagram.

    PubMed

    Chang, Hyejung

    2015-10-01

    Technological advances using telemedicine and telehealth are growing in healthcare fields, but the evaluation framework for them is inconsistent and limited. This paper suggests a comprehensive evaluation framework for telemedicine system implementation and will support related stakeholders' decision-making by promoting general understanding and resolving arguments and controversies. This study focused on developing a comprehensive evaluation framework by summarizing themes across the range of evaluation techniques and organizing foundational evaluation frameworks generally applicable through studies and cases of diverse telemedicine. Evaluation factors related to aspects of information technology, the satisfaction of service providers and consumers, cost, quality, and information security are organized using the fishbone diagram. It was not easy to develop a monitoring and evaluation framework for telemedicine since evaluation frameworks for telemedicine are very complex with many potential inputs, activities, outputs, outcomes, and stakeholders. A conceptual framework was developed that incorporates the key dimensions that need to be considered in the evaluation of telehealth implementation for a formal structured approach to the evaluation of a service. The suggested framework consists of six major dimensions and the subsequent branches for each dimension. To implement telemedicine and telehealth services, stakeholders should make decisions based on sufficient evidence in quality and safety measured by the comprehensive evaluation framework. Further work would be valuable in applying more comprehensive evaluations to verify and improve the comprehensive framework across a variety of contexts with more factors and participant group dimensions.

  18. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  19. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  20. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  1. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  2. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  3. Evaluating the implementation of a national clinical programme for diabetes to standardise and improve services: a realist evaluation protocol.

    PubMed

    McHugh, S; Tracey, M L; Riordan, F; O'Neill, K; Mays, N; Kearney, P M

    2016-07-28

    Over the last three decades in response to the growing burden of diabetes, countries worldwide have developed national and regional multifaceted programmes to improve the monitoring and management of diabetes and to enhance the coordination of care within and across settings. In Ireland in 2010, against a backdrop of limited dedicated strategic planning and engrained variation in the type and level of diabetes care, a national programme was established to standardise and improve care for people with diabetes in Ireland, known as the National Diabetes Programme (NDP). The NDP comprises a range of organisational and service delivery changes to support evidence-based practices and policies. This realist evaluation protocol sets out the approach that will be used to identify and explain which aspects of the programme are working, for whom and in what circumstances to produce the outcomes intended. This mixed method realist evaluation will develop theories about the relationship between the context, mechanisms and outcomes of the diabetes programme. In stage 1, to identify the official programme theories, documentary analysis and qualitative interviews were conducted with national stakeholders involved in the design, development and management of the programme. In stage 2, as part of a multiple case study design with one case per administrative region in the health system, qualitative interviews are being conducted with frontline staff and service users to explore their responses to, and reasoning about, the programme's resources (mechanisms). Finally, administrative data will be used to examine intermediate implementation outcomes such as service uptake, acceptability, and fidelity to models of care. This evaluation is using the principles of realist evaluation to examine the implementation of a national programme to standardise and improve services for people with diabetes in Ireland. The concurrence of implementation and evaluation has enabled us to produce formative feedback for the NDP while also supporting the refinement and revision of initial theories about how the programme is being implemented in the dynamic and unstable context of the Irish healthcare system.

  4. Introducing concurrency in the Gaudi data processing framework

    NASA Astrophysics Data System (ADS)

    Clemencic, Marco; Hegner, Benedikt; Mato, Pere; Piparo, Danilo

    2014-06-01

    In the past, the increasing demands for HEP processing resources could be fulfilled by the ever increasing clock-frequencies and by distributing the work to more and more physical machines. Limitations in power consumption of both CPUs and entire data centres are bringing an end to this era of easy scalability. To get the most CPU performance per watt, future hardware will be characterised by less and less memory per processor, as well as thinner, more specialized and more numerous cores per die, and rather heterogeneous resources. To fully exploit the potential of the many cores, HEP data processing frameworks need to allow for parallel execution of reconstruction or simulation algorithms on several events simultaneously. We describe our experience in introducing concurrency related capabilities into Gaudi, a generic data processing software framework, which is currently being used by several HEP experiments, including the ATLAS and LHCb experiments at the LHC. After a description of the concurrent framework and the most relevant design choices driving its development, we describe the behaviour of the framework in a more realistic environment, using a subset of the real LHCb reconstruction workflow, and present our strategy and the used tools to validate the physics outcome of the parallel framework against the results of the present, purely sequential LHCb software. We then summarize the measurement of the code performance of the multithreaded application in terms of memory and CPU usage.
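
    The inter-event concurrency described here is easy to illustrate in miniature, though the real framework is C++ with a task-based scheduler. A toy Python sketch that processes several events simultaneously, with reconstruct() as a hypothetical stand-in for an algorithm chain whose output can be validated against a sequential run:

      from concurrent.futures import ProcessPoolExecutor
      import math

      def reconstruct(event_id):
          """Stand-in for a CPU-bound reconstruction algorithm chain."""
          x = float(event_id)
          for _ in range(100_000):
              x = math.sin(x) + 1.0
          return event_id, x

      if __name__ == "__main__":
          events = list(range(32))
          # process several events simultaneously, as a concurrent framework would
          with ProcessPoolExecutor(max_workers=4) as pool:
              results = dict(pool.map(reconstruct, events))
          # results can now be compared event-by-event against a sequential run
          print("processed", len(results), "events")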

  5. Using coupled hydrogeophysical models and data assimilation to enhance the information content in geoelectrical leak detection

    NASA Astrophysics Data System (ADS)

    Tso, C. H. M.; Johnson, T. C.; Song, X.; Chen, X.; Binley, A. M.

    2017-12-01

    Time-lapse electrical resistivity tomography (ERT) measurements provide indirect observation of hydrological processes in the Earth's shallow subsurface at high spatial and temporal resolutions. ERT has been used for a number of decades to detect leaks and monitor the evolution of associated contaminant plumes. However, this has been limited to a few hazardous environmental sites. Furthermore, an assessment of uncertainty in such applications has thus far been neglected, despite the clear need to provide site managers with appropriate information for decision making purposes. There is a need to establish a framework that allows leak detection with uncertainty assessment from geophysical observations. Ideally such a framework should allow the incorporation of additional data sources in order to reduce uncertainty in predictions. To tackle these issues, we propose an ensemble-based data assimilation framework that evaluates proposed hydrological models (i.e. different hydrogeological units, different leak locations and loads) against observed time-lapse ERT measurements. Each proposed hydrological model is run through the parallel coupled hydrogeophysical code PFLOTRAN-E4D (Johnson et al., 2016) to obtain simulated ERT measurements. The ensemble of model proposals is then updated based on data misfit. Our approach does not focus on obtaining detailed images of hydraulic properties or plume movement. Rather, it seeks to estimate the contaminant mass discharge (CMD) across a user-defined plane in space probabilistically. The proposed approach avoids the ambiguity in interpreting detailed hydrological processes from geophysical images. The resultant distributions of CMD give a straightforward metric, with realistic uncertainty bounds, for decision making. The proposed framework is also computationally efficient so that it can exploit large, long-term ERT datasets, making it possible to track time-varying loadings of plume sources. In this presentation, we illustrate our framework on synthetic data and field data collected from an ERT trial simulating a leak at the Sellafield nuclear facility in the UK (Kuras et al., 2016). We compare our results to interpretation from geophysical inversion and discuss the additional information that hydrological model proposals provide.
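
    The ensemble update can be sketched as likelihood weighting: score each proposed hydrological model by the misfit between its simulated and the observed ERT data, then summarise the contaminant mass discharge (CMD) across the ensemble probabilistically. The sketch below assumes independent Gaussian errors; the variable names are illustrative, not the PFLOTRAN-E4D interface.

      import numpy as np

      def ensemble_weights(sim_ert, obs_ert, sigma=0.05):
          """sim_ert: (n_models, n_data) simulated data; obs_ert: (n_data,)."""
          misfit = np.sum((sim_ert - obs_ert) ** 2, axis=1) / (2 * sigma ** 2)
          logw = -(misfit - misfit.min())      # stabilise before exponentiating
          w = np.exp(logw)
          return w / w.sum()

      def cmd_posterior(cmd_per_model, weights):
          """Weighted mean and standard deviation of CMD across the ensemble."""
          mean = np.sum(weights * cmd_per_model)
          var = np.sum(weights * (cmd_per_model - mean) ** 2)
          return mean, np.sqrt(var)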

  6. Understanding the motivation and performance of community health volunteers involved in the delivery of health programmes in Kampala, Uganda: a realist evaluation protocol

    PubMed Central

    Vareilles, Gaëlle; Pommier, Jeanine; Kane, Sumit; Pictet, Gabriel; Marchal, Bruno

    2015-01-01

    Introduction The recruitment of community health volunteers to support the delivery of health programmes is a well-established approach in many countries, particularly where health services are not readily available. However, studies on management of volunteers are scarce and current research on human resource management of volunteers faces methodological challenges. This paper presents the protocol of a realist evaluation that aims at identifying the factors influencing the performance of community health volunteers involved in the delivery of a Red Cross immunisation programme in Kampala (Uganda) with a specific focus on motivation. Methods and analysis The realist evaluation cycle structures the protocol. To develop the theoretical basis for the evaluation, the authors conducted interviews and reviewed the literature on community health volunteers’ performance, management and organisational behaviour. This led to the formulation of the initial programme theory, which links the intervention inputs (capacity-building strategies) to the expected outcomes (positive work behaviour) with mechanisms that point in the direction of drivers of motivation. The contextual elements include components such as organisational culture, resource availability, etc. A case study design will be adopted. We define a case as a Red Cross branch, run by a programme manager, and will select two cases at the district level in Kampala. Mixed methods will be used in data collection, including individual interviews of volunteers, participant observation and document review. The thematic analysis will be based on the initial programme theory and will seek for context-mechanism-outcome configurations. Findings from the two cases will be compared. Discussion We discuss the scope for applying realist evaluation and the methodological challenges we encountered in developing this protocol. Ethics and dissemination The study was approved by the Ethical Committee at Rennes University Hospital, France. Results will be published in scientific journals, and communicated to respondents and relevant institutions. PMID:25631314

  7. Improved harmonisation from policy dialogue? Realist perspectives from Guinea and Chad.

    PubMed

    Kwamie, Aku; Nabyonga-Orem, Juliet

    2016-07-18

    Harmonisation is a key principle of the Paris Declaration. The Universal Health Coverage (UHC) Partnership, an initiative of the European Union, the Government of Luxembourg and the World Health Organization, supported health policy dialogues between 2012 and 2015 in identified countries in the WHO African Region. The UHC Partnership has amongst its key objectives to strengthen national health policy development. In Guinea and Chad, policy dialogue focused on elaborating the national health plan and other key documents. This study is an analytical reflection inspired by realist evaluative approaches to understand whether policy dialogue led to improved harmonisation amongst health actors in Guinea and Chad, and if so, how and why. Interviews were conducted in Guinea and Chad with key informants at the national and sub-national government levels, civil society, and development partners. A review of relevant policy documents and reports was added to data collection to construct a full picture of the policy dialogue process. Context-mechanism-outcome configurations were used as the realist framework to guide the analysis on how participants' understanding of what policy dialogue was and the way the policy dialogue process unfolded led to improved harmonisation. Improved harmonisation as a result of policy dialogue was perceived to be stronger in Guinea than in Chad. While in both countries the participants held a shared view of what policy dialogue was and what it could achieve, and both policy dialogue processes were considered to be well implemented (i.e., well-facilitated, evidence-based, participatory, and consisted of recurring meetings and activities), certain contextual factors in Chad tempered the view of harmonisation as having improved. These were the pre-existence of dialogic policy processes that had exposed the actors to the potential that policy dialogue could have; a focus on elaborating provincial level strategies, which gave the sense that the process was more bottom-up; and the perception that there were acute resource constraints, which conditioned partners' interactions. Policy dialogue improves harmonisation in terms of fostering information exchange amongst partners; however, it does not appear to influence the operational procedures of the actors. This has implications for aid effectiveness.

  8. Implementation and evaluation of a simulation curriculum for paediatric residency programs including just-in-time in situ mock codes.

    PubMed

    Sam, Jonathan; Pierse, Michael; Al-Qahtani, Abdullah; Cheng, Adam

    2012-02-01

    To develop, implement and evaluate a simulation-based acute care curriculum in a paediatric residency program using an integrated and longitudinal approach. Curriculum framework consisting of three modular, year-specific courses and longitudinal just-in-time, in situ mock codes. Paediatric residency program at BC Children's Hospital, Vancouver, British Columbia. The three year-specific courses focused on the critical first 5 min, complex medical management and crisis resource management, respectively. The just-in-time in situ mock codes simulated the acute deterioration of an existing ward patient, prepared the actual multidisciplinary code team, and primed the surrounding crisis support systems. Each curriculum component was evaluated with surveys using a five-point Likert scale. A total of 40 resident surveys were completed after each of the modular courses, and an additional 28 surveys were completed for the overall simulation curriculum. The highest Likert scores were for hands-on skill stations, immersive simulation environment and crisis resource management teaching. Survey results also suggested that just-in-time mock codes were realistic, reinforced learning, and prepared ward teams for patient deterioration. A simulation-based acute care curriculum was successfully integrated into a paediatric residency program. It provides a model for integrating simulation-based learning into other training programs, as well as a model for any hospital that wishes to improve paediatric resuscitation outcomes using just-in-time in situ mock codes.

  9. Searching for the mechanisms of change: a protocol for a realist review of batterer treatment programmes

    PubMed Central

    Cheff, Rebecca; Finn, Debbie; Davloor, Whitney; O'Campo, Patricia

    2016-01-01

    Introduction Conflicting results reported by evaluations of typical batterer intervention programmes leave many judicial officials and policymakers uncertain about the best way to respond to domestic violence, and whether to recommend and fund these programmes. Traditional evaluations and systematic reviews tend to focus predominantly on whether the programmes ‘worked’ (eg, reduced recidivism), often at the exclusion of understanding for whom they may or may not have worked, under what circumstances, and why. Methods and analysis We are undertaking a realist review of the batterer treatment programme literature with the aim of addressing this gap. In keeping with the goals of realist review, our primary aims are to identify the theory that underlies these programmes, highlight the mechanisms that trigger changes in participant behaviour and, finally, explain why these programmes help some individuals reduce their use of violence, and under what conditions they are or are not effective. We begin by describing the process of perpetrator treatment, and by proposing an initial theoretical model of behaviour change that will be tested by our review. We then describe the criteria for inclusion of an evaluation into the review, the search strategy we will use to identify the studies, and the plan for data extraction and analysis. Ethics and dissemination The results of this review will be written up using the RAMESES Guidelines for Realist Synthesis, and disseminated through peer-reviewed publications aimed at the practitioner community, as well as presented at community forums and at violence against women conferences. Ethics approval was not needed. PMID:27053268

  10. Constraints on Lorentz symmetry violations using lunar laser ranging observations

    NASA Astrophysics Data System (ADS)

    Bourgoin, Adrien

    2016-12-01

    General Relativity (GR) and the standard model of particle physics provide a comprehensive description of the four interactions of nature. A quantum gravity theory is expected to merge these two pillars of modern physics. According to unification theories, such a combination would lead to a breaking of a fundamental symmetry common to both GR and the standard model of particle physics: Lorentz symmetry. Lorentz symmetry violations in all fields of physics can be parametrized by an effective field theory framework called the standard-model extension (SME). Local Lorentz invariance violations in the gravitational sector should impact the orbital motion of bodies inside the solar system, such as the Moon. The accurate lunar laser ranging (LLR) data can therefore be analyzed to study the lunar motion precisely and to look for irregularities. For this purpose, ELPN (Ephéméride Lunaire Parisienne Numérique), a new lunar ephemeris, has been integrated in the SME framework. This new numerical solution of the lunar motion provides time series dated in temps dynamique barycentrique (TDB), including the barycentric position and velocity of the Earth-Moon vector, the lunar libration angles, the time-scale difference between terrestrial time and TDB, and partial derivatives integrated from the variational equations. ELPN predictions have been used to analyze LLR observations. In the GR framework, the standard deviations of the residuals turned out to be of the same order of magnitude as those of the INPOP13b and DE430 ephemerides. In the framework of the minimal SME, the LLR data analysis provided constraints on local Lorentz invariance violations. Special attention was paid to the analysis of uncertainties in order to provide the most realistic constraints. First, linear combinations of SME coefficients were derived and fitted to the LLR observations; then, realistic uncertainties were determined with a resampling method. The LLR data analysis did not reveal local Lorentz invariance violations arising in the lunar orbit. Therefore, GR predictions are recovered with absolute precisions of the order of 10⁻⁹ to 10⁻¹².

  11. Protocol for process evaluation of a randomised controlled trial of family-led rehabilitation post stroke (ATTEND) in India

    PubMed Central

    Liu, Hueiming; Lindley, Richard; Alim, Mohammed; Felix, Cynthia; Gandhi, Dorcas B C; Verma, Shweta J; Tugnawat, Deepak Kumar; Syrigapu, Anuradha; Ramamurthy, Ramaprabhu Krishnappa; Pandian, Jeyaraj D; Walker, Marion; Forster, Anne; Anderson, Craig S; Langhorne, Peter; Murthy, Gudlavalleti Venkata Satyanarayana; Shamanna, Bindiganavale Ramaswamy; Hackett, Maree L; Maulik, Pallab K; Harvey, Lisa A; Jan, Stephen

    2016-01-01

    Introduction We are undertaking a randomised controlled trial (fAmily led rehabiliTaTion aftEr stroke in INDia, ATTEND) evaluating the training of a family carer to enable maximal rehabilitation of patients with stroke-related disability, as a potentially affordable, culturally acceptable and effective intervention for use in India. A process evaluation is needed to understand how and why this complex intervention may be effective, and to capture important barriers and facilitators to its implementation. We describe the protocol for our process evaluation to encourage the development of process evaluation methodology and transparency in reporting. Methods and analysis The realist and RE-AIM (Reach, Effectiveness, Adoption, Implementation and Maintenance) frameworks informed the design. Mixed methods include semistructured interviews with health providers, patients and their carers; analysis of quantitative process data describing the fidelity and dose of the intervention; observations of trial set-up and implementation; and analysis of cost data from the perspective of patients and their families, and of programme budgets. These qualitative and quantitative data will be analysed iteratively prior to knowing the quantitative outcomes of the trial, and then triangulated with the results from the primary outcome evaluation. Ethics and dissemination The process evaluation has received ethical approval for all sites in India. In low-income and middle-income countries, the available human capital can form an approach to reducing the evidence-practice gap, compared with the high-cost alternatives available in established market economies. This process evaluation will provide insights into how such a programme can be implemented in practice and brought to scale. Through local stakeholder engagement and dissemination of findings globally, we hope to build on patient-centred, cost-effective and sustainable models of stroke rehabilitation. Trial registration number CTRI/2013/04/003557. PMID:27633636

  12. Evaluation of Apache Hadoop for parallel data analysis with ROOT

    NASA Astrophysics Data System (ADS)

    Lehrack, S.; Duckeck, G.; Ebke, J.

    2014-06-01

    The Apache Hadoop software is a Java-based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets that can be split into arbitrary chunks, and it must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives for distributing analysis data were considered: either the data were stored in HDFS and processed with MapReduce, or the data were accessed via a standard Grid storage system (dCache Tier-2) with MapReduce used only as the execution back-end. The measurements focused, on the one hand, on safely storing analysis data on HDFS at reasonable data rates and, on the other hand, on processing data quickly and reliably with MapReduce. In the evaluation of HDFS, read/write data rates from the local Hadoop cluster were measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses were used and event rates were compared to PROOF.

  13. A comprehensive health service evaluation and monitoring framework.

    PubMed

    Reeve, Carole; Humphreys, John; Wakerman, John

    2015-12-01

    To develop a framework for evaluating and monitoring a primary health care service, integrating hospital and community services. A targeted literature review of primary health service evaluation frameworks was performed to inform the development of the framework specifically for remote communities. Key principles underlying primary health care evaluation were determined and sentinel indicators developed to operationalise the evaluation framework. This framework was then validated with key stakeholders. The framework includes Donabedian's three seminal domains of structure, process and outcomes to determine health service performance. These in turn are dependent on sustainability, quality of patient care and the determinants of health to provide a comprehensive health service evaluation framework. The principles underpinning primary health service evaluation were pertinent to health services in remote contexts. Sentinel indicators were developed to fit the demographic characteristics and health needs of the population. Consultation with key stakeholders confirmed that the evaluation framework was applicable. Data collected routinely by health services can be used to operationalise the proposed health service evaluation framework. Use of an evaluation framework which links policy and health service performance to health outcomes will assist health services to improve performance as part of a continuous quality improvement cycle. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. A Hierarchical Framework for Demand-Side Frequency Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moya, Christian; Zhang, Wei; Lian, Jianming

    2014-06-02

    With large-scale plans to integrate renewable generation, more resources will be needed to compensate for the uncertainty associated with intermittent generation resources. Under such conditions, performing frequency control using only supply-side resources becomes not only prohibitively expensive but also technically difficult. It is therefore important to explore how a sufficient proportion of the loads could assume a routine role in frequency control to maintain the stability of the system at an acceptable cost. In this paper, a novel hierarchical decentralized framework for frequency-based load control is proposed. The framework involves two decision layers. The top decision layer determines the optimal droop gain required from the aggregated load response on each bus using a robust decentralized control approach. The second layer consists of a large number of devices, which switch probabilistically during contingencies so that the aggregated power change matches the desired droop amount according to the updated gains. The proposed framework is based on the classical nonlinear multi-machine power system model, and can deal with time-varying system operating conditions while respecting the physical constraints of individual devices. Realistic simulation results based on a 68-bus system are provided to demonstrate the effectiveness of the proposed strategy.
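
    The second decision layer lends itself to a simple illustration: if the top layer requests a load reduction ΔP on a bus, each controllable device can switch off with probability ΔP divided by the total controllable load, so that the expected aggregate change matches the request. The sketch below is a toy version of that idea under invented names and numbers; the actual controller in the paper additionally computes robust droop gains and respects device constraints.

        # Toy probabilistic load switching for a requested droop response.
        import random

        def desired_load_change(gain_k, delta_f):
            """Droop response: requested load reduction (W) for a frequency deviation."""
            return -gain_k * delta_f  # delta_f < 0 (under-frequency) -> positive reduction

        def probabilistic_switching(device_loads, delta_p, rng):
            """Switch each ON device off with probability delta_p / total controllable load."""
            total = sum(device_loads)
            p_off = min(1.0, max(0.0, delta_p / total))
            shed = sum(load for load in device_loads if rng.random() < p_off)
            return shed, p_off

        loads = [1500.0] * 200 + [500.0] * 400  # e.g. heaters and fridges, in watts
        request = desired_load_change(gain_k=2.0e5, delta_f=-0.5)  # 100 kW requested
        shed, p = probabilistic_switching(loads, request, random.Random(42))
        print(f"switch-off probability {p:.3f}, load shed ~{shed / 1e3:.1f} kW")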

  15. Framework and algorithms for illustrative visualizations of time-varying flows on unstructured meshes

    DOE PAGES

    Rattner, Alexander S.; Guillen, Donna Post; Joshi, Alark; ...

    2016-03-17

    Photo- and physically realistic techniques are often insufficient for visualization of fluid flow simulations, especially for 3D and time-varying studies. Substantial research effort has been dedicated to the development of non-photorealistic and illustration-inspired visualization techniques for compact and intuitive presentation of such complex datasets. However, a great deal of work has been reproduced in this field, as many research groups have developed specialized visualization software. Additionally, interoperability between illustrative visualization software is limited due to diverse processing and rendering architectures employed in different studies. In this investigation, a framework for illustrative visualization is proposed and implemented in MarmotViz, a ParaView plug-in, enabling its use on a variety of computing platforms with various data file formats and mesh geometries. Region-of-interest identification and feature-tracking algorithms incorporated into this tool are described. Implementations of multiple illustrative effect algorithms are also presented to demonstrate the use and flexibility of this framework. By providing an integrated framework for illustrative visualization of CFD data, MarmotViz can serve as a valuable asset for the interpretation of simulations of ever-growing scale.

  16. Transitional care for formerly incarcerated persons with HIV: protocol for a realist review.

    PubMed

    Tsang, Jenkin; Mishra, Sharmistha; Rowe, Janet; O'Campo, Patricia; Ziegler, Carolyn; Kouyoumdjian, Fiona G; Matheson, Flora I; Bayoumi, Ahmed M; Zahid, Shatabdy; Antoniou, Tony

    2017-02-13

    Little is known about the mechanisms that influence the success or failure of programs to facilitate re-engagement with health and social services for formerly incarcerated persons with HIV. This review aims to identify how interventions to address such transitions work, for whom and under what circumstances. We will use realist review methodology to conduct our analysis. We will systematically search electronic databases and grey literature for English language qualitative and quantitative studies of interventions. Two investigators will independently screen citations and full-text articles, abstract data, appraise study quality and synthesize the literature. Data analysis will include identifying context-mechanism-outcome configurations, exploring and comparing patterns in these configurations, making comparisons across contexts and developing explanatory frameworks. This review will identify mechanisms that influence the success or failure of transition interventions for formerly incarcerated individuals with HIV. The findings will be integrated with those from complementary qualitative and quantitative studies to inform future interventions. PROSPERO CRD42016040054.

  17. Multiple Ion Binding Equilibria, Reaction Kinetics, and Thermodynamics in Dynamic Models of Biochemical Pathways

    PubMed Central

    Vinnakota, Kalyan C.; Wu, Fan; Kushmerick, Martin J.; Beard, Daniel A.

    2009-01-01

    The operation of biochemical systems in vivo and in vitro is strongly influenced by complex interactions between biochemical reactants and ions such as H+, Mg2+, K+, and Ca2+. These are important second messengers in metabolic and signaling pathways that directly influence the kinetics and thermodynamics of biochemical systems. Herein we describe the biophysical theory and computational methods to account for multiple ion binding to biochemical reactants and demonstrate the crucial effects of ion binding on biochemical reaction kinetics and thermodynamics. In simulations of realistic systems, the concentrations of these ions change with time due to dynamic buffering and competitive binding. In turn, the effective thermodynamic properties vary as functions of cation concentrations and important environmental variables such as temperature and overall ionic strength. Physically realistic simulations of biochemical systems require incorporating all of these phenomena into a coherent mathematical description. Several applications to physiological systems are demonstrated based on this coherent simulation framework. PMID:19216922

  18. A poroelastic model coupled to a fluid network with applications in lung modelling.

    PubMed

    Berger, Lorenz; Bordas, Rafel; Burrowes, Kelly; Grau, Vicente; Tavener, Simon; Kay, David

    2016-01-01

    We develop a lung ventilation model based on a continuum poroelastic representation of lung parenchyma that is strongly coupled to a pipe network representation of the airway tree. The continuous system of equations is discretized using a low-order stabilised finite element method. The framework is applied to a realistic lung anatomical model derived from computed tomography data and an artificially generated airway tree to model the conducting airway region. Numerical simulations produce physiologically realistic solutions and demonstrate the effect of airway constriction and reduced tissue elasticity on ventilation, tissue stress and alveolar pressure distribution. The key advantage of the model is the ability to provide insight into the mutual dependence between ventilation and deformation. This is essential when studying lung diseases, such as chronic obstructive pulmonary disease and pulmonary fibrosis. Thus the model can be used to form a better understanding of integrated lung mechanics in both the healthy and diseased states. Copyright © 2015 John Wiley & Sons, Ltd.

  19. Double beta decays of ¹⁰⁶Cd

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suhonen, Jouni

    2011-12-16

    The two-neutrino (2ν2β) and neutrinoless (0ν2β) double beta decays of ¹⁰⁶Cd are studied for the transitions to the 0⁺ ground state and to the 0⁺ and 2⁺ excited states of ¹⁰⁶Pd, using realistic many-body wave functions calculated in the framework of the quasiparticle random-phase approximation. Effective, G-matrix-derived nuclear forces are used in realistic single-particle model spaces. All the possible channels, β⁺β⁺, β⁺EC, and ECEC, are discussed for both the 2ν2β and 0ν2β decays. The associated half-lives are computed, and particular attention is devoted to the study of the detectability of the resonant neutrinoless double electron capture (R0νECEC) process in ¹⁰⁶Cd. The calculations of the present article constitute the thus far most complete and up-to-date investigation of the double-beta-decay properties of ¹⁰⁶Cd.
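
    For orientation, the nuclear matrix elements computed in such QRPA studies enter the standard light-neutrino-exchange factorization of the neutrinoless half-life; a textbook form (not quoted from this paper) is

        \left[ T_{1/2}^{0\nu} \right]^{-1} = G^{0\nu}(Q,Z)\, \left| M^{0\nu} \right|^{2} \left( \frac{\langle m_{\beta\beta} \rangle}{m_e} \right)^{2},

    where G^{0ν} is a calculable phase-space factor, M^{0ν} is the nuclear matrix element and ⟨m_ββ⟩ is the effective Majorana neutrino mass.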

  20. Water versus DNA: New insights into proton track-structure modeling in radiobiology and radiotherapy

    DOE PAGES

    Champion, Christophe; Quinto, Michele A.; Monti, Juan M.; ...

    2015-09-25

    Water is a common surrogate of DNA for modelling the charged-particle-induced ionizing processes in living tissue exposed to radiation. The present study aims at scrutinizing the validity of this approximation and revealing new insights into proton-induced energy transfers by a comparative analysis between water and a realistic biological medium. In this context, a self-consistent quantum mechanical modelling of the ionization and electron capture processes is reported within the continuum distorted wave-eikonal initial state framework for both isolated water molecules and DNA components impacted by proton beams. Their respective probabilities of occurrence, expressed in terms of total cross sections, as well as their energetic signatures (potential and kinetic), are assessed in order to clearly emphasize the differences existing between realistic building blocks of living matter and the controverted water-medium surrogate. The consequences for radiobiology and radiotherapy are then discussed, in particular in view of treatment-planning refinements aiming at better radiotherapy strategies.

  1. Water versus DNA: new insights into proton track-structure modelling in radiobiology and radiotherapy.

    PubMed

    Champion, C; Quinto, M A; Monti, J M; Galassi, M E; Weck, P F; Fojón, O A; Hanssen, J; Rivarola, R D

    2015-10-21

    Water is a common surrogate of DNA for modelling the charged-particle-induced ionizing processes in living tissue exposed to radiation. The present study aims at scrutinizing the validity of this approximation and revealing new insights into proton-induced energy transfers by a comparative analysis between water and a realistic biological medium. In this context, a self-consistent quantum mechanical modelling of the ionization and electron capture processes is reported within the continuum distorted wave-eikonal initial state framework for both isolated water molecules and DNA components impacted by proton beams. Their respective probabilities of occurrence, expressed in terms of total cross sections, as well as their energetic signatures (potential and kinetic), are assessed in order to clearly emphasize the differences existing between realistic building blocks of living matter and the controverted water-medium surrogate. The consequences for radiobiology and radiotherapy are then discussed, in particular in view of treatment-planning refinements aiming at better radiotherapy strategies.

  2. From geometry to algebra and vice versa: Realistic mathematics education principles for analyzing geometry tasks

    NASA Astrophysics Data System (ADS)

    Jupri, Al

    2017-04-01

    In this article we address how Realistic Mathematics Education (RME) principles, including the intertwinement and reality principles, are used to analyze geometry tasks. To do so, we carried out a small-scale study in three phases. First, we analyzed four geometry problems - considered as tasks inviting the use of problem solving and reasoning skills - theoretically in the light of the RME principles. Second, we administered two of the problems to 31 undergraduate students of a mathematics education program and the other two problems to 16 master's students of a primary mathematics education program. Finally, we analyzed the students' written work and compared these empirical results with the theoretical ones. We found that there are discrepancies between what we expected theoretically and what occurred empirically in terms of mathematization and of the intertwinement of mathematical concepts from geometry to algebra and vice versa. We conclude that the RME principles provide a fruitful framework for analyzing geometry tasks that, for instance, are intended to assess student problem solving and reasoning skills.

  3. CFD simulation of flow through heart: a perspective review.

    PubMed

    Khalafvand, S S; Ng, E Y K; Zhong, L

    2011-01-01

    The heart is an organ which pumps blood around the body by contraction of its muscular wall. There is a coupled system in the heart comprising the motion of the wall and the motion of the blood; both motions must be computed simultaneously, which makes biological computational fluid dynamics (CFD) difficult. The wall of the heart is not rigid, and hence proper boundary conditions are essential for CFD modelling; fluid-wall interaction is very important for realistic CFD modelling, and the many assumptions commonly made in CFD simulations of the heart keep them far from a real model. A realistic fluid-structure interaction approach, modelling the structure by the finite element method and the fluid flow by CFD, uses more realistic coupling algorithms. This type of method is very powerful for resolving the complex properties of the cardiac structure and the sensitive interaction of fluid and structure. The final goal of heart modelling is to simulate total heart function by integrating cardiac anatomy, electrical activation, mechanics, metabolism and fluid mechanics together in a computational framework.

  4. A video, text, and speech-driven realistic 3-d virtual head for human-machine interface.

    PubMed

    Yu, Jun; Wang, Zeng-Fu

    2015-05-01

    A multiple-inputs-driven realistic facial animation system based on a 3-D virtual head for human-machine interfaces is proposed. The system can be driven independently by video, text, and speech, and thus can interact with humans through diverse interfaces. The combination of a parameterized model and a muscular model is used to obtain a tradeoff between computational efficiency and high realism of 3-D facial animation. The online appearance model is used to track 3-D facial motion from video in the framework of particle filtering, and multiple measurements, i.e., the pixel color values of the input image and the Gabor wavelet coefficients of the illumination ratio image, are infused to reduce the influence of lighting and person dependence on the construction of the online appearance model. The tri-phone model is used to reduce the computational consumption of visual co-articulation in speech-synchronized viseme synthesis without sacrificing performance. Objective and subjective experiments show that the system is suitable for human-machine interaction.

  5. Particle-Size-Grouping Model of Precipitation Kinetics in Microalloyed Steels

    NASA Astrophysics Data System (ADS)

    Xu, Kun; Thomas, Brian G.

    2012-03-01

    The formation, growth, and size distribution of precipitates greatly affect the microstructure and properties of microalloyed steels. Computational particle-size-grouping (PSG) kinetic models based on population balances are developed to simulate precipitate particle growth resulting from collision and diffusion mechanisms. First, the generalized PSG method for collision is explained clearly and verified. Then, a new PSG method is proposed to model diffusion-controlled precipitate nucleation, growth, and coarsening with complete mass conservation and no fitting parameters. Compared with the original population-balance models, this PSG method saves significant computation and preserves enough accuracy to model a realistic range of particle sizes. Finally, the new PSG method is combined with an equilibrium phase fraction model for plain carbon steels and is applied to simulate the precipitated fraction of aluminum nitride and the size distribution of niobium carbide during isothermal aging processes. Good matches are found with experimental measurements, suggesting that the new PSG method offers a promising framework for the future development of realistic models of precipitation.
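
    The collision mechanism in such population-balance models is classically described by the discrete Smoluchowski coagulation equation; in textbook notation (ours, not quoted from the paper), the number density n_k of particles built from k monomers evolves with collision kernel β_ij as

        \frac{\mathrm{d}n_k}{\mathrm{d}t} = \frac{1}{2} \sum_{i+j=k} \beta_{ij}\, n_i n_j \;-\; n_k \sum_{j \ge 1} \beta_{kj}\, n_j .

    The PSG idea is to solve such balances for a modest number of geometrically spaced size groups rather than for every k, which is what makes a realistic range of particle sizes computationally tractable.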

  6. A conceptual framework of computations in mid-level vision

    PubMed Central

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest using datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044

  7. Simulation of patient flow in multiple healthcare units using process and data mining techniques for model identification.

    PubMed

    Kovalchuk, Sergey V; Funkner, Anastasia A; Metsker, Oleg G; Yakovlev, Aleksey N

    2018-06-01

    An approach to building a hybrid simulation of patient flow is introduced, combining data-driven methods to automate model identification. The approach is described with a conceptual framework and basic methods for the combination of different techniques. An implementation of the proposed approach for the simulation of acute coronary syndrome (ACS) care was developed and used in an experimental study. A combination of data, text, and process mining techniques and machine learning approaches for the analysis of electronic health records (EHRs), together with discrete-event simulation (DES) and queueing theory for the simulation of patient flow, is proposed. The analysis of EHRs for ACS patients enabled the identification of several classes of clinical pathways (CPs), which were used to implement a more realistic simulation of the patient flow. The developed solution was implemented using Python libraries (SimPy, SciPy, and others). The proposed approach enables a more realistic and detailed simulation of the patient flow within a group of related departments. An experimental study shows an improved simulation of patient length of stay for the ACS patient flow obtained from EHRs at the Almazov National Medical Research Centre in Saint Petersburg, Russia. The proposed approach, methods, and solutions provide a conceptual, methodological, and programming framework for the implementation of a simulation of complex and diverse scenarios within a flow of patients for different purposes: decision making, training, management optimization, and others. Copyright © 2018 Elsevier Inc. All rights reserved.
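
    Since SimPy is named as one of the libraries used, a minimal discrete-event sketch of patient flow may help make the DES component concrete. Everything below (the unit, its capacity, the arrival and length-of-stay distributions) is a hypothetical placeholder, not the authors' model.

        # Minimal SimPy sketch of patient flow through a bed-limited unit.
        import random
        import simpy

        def patient(env, name, beds, mean_stay_h):
            with beds.request() as bed:            # queue for a free bed
                yield bed
                stay = random.expovariate(1.0 / mean_stay_h)
                yield env.timeout(stay)            # occupy the bed for the stay
                print(f"{name} discharged at t={env.now:.1f} h after {stay:.1f} h")

        def arrivals(env, beds, mean_interarrival_h=2.0):
            i = 0
            while True:
                yield env.timeout(random.expovariate(1.0 / mean_interarrival_h))
                i += 1
                env.process(patient(env, f"patient-{i}", beds, mean_stay_h=48.0))

        random.seed(1)
        env = simpy.Environment()
        beds = simpy.Resource(env, capacity=10)    # e.g. a 10-bed unit
        env.process(arrivals(env, beds))
        env.run(until=24 * 7)                      # simulate one week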

  8. A conceptual framework of computations in mid-level vision.

    PubMed

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest using datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations.

  9. Using Machine Learning in Adversarial Environments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warren Leon Davis

    Intrusion/anomaly detection systems are among the first lines of cyber defense. Commonly, they either use signatures or machine learning (ML) to identify threats, but fail to account for sophisticated attackers trying to circumvent them. We propose to embed machine learning within a game-theoretic framework that performs adversarial modeling, develops methods for optimizing operational response based on ML, and integrates the resulting optimization codebase into the existing ML infrastructure developed by the Hybrid LDRD. Our approach addresses three key shortcomings of ML in adversarial settings: 1) resulting classifiers are typically deterministic and, therefore, easy to reverse engineer; 2) ML approaches only address the prediction problem, but do not prescribe how one should operationalize predictions, nor account for operational costs and constraints; and 3) ML approaches do not model attackers' responses and can be circumvented by sophisticated adversaries. The principal novelty of our approach is to construct an optimization framework that blends ML, operational considerations, and a model predicting attackers' reactions, with the goal of computing an optimal moving-target defense. One important challenge is to construct a model of an adversary that is tractable, yet realistic. We aim to advance the science of attacker modeling by considering game-theoretic methods, and by engaging experimental subjects with red-teaming experience in trying to actively circumvent an intrusion detection system, and learning a predictive model of such circumvention activities. In addition, we will generate metrics to test that a particular model of an adversary is consistent with available data.

  10. Real-time video streaming in mobile cloud over heterogeneous wireless networks

    NASA Astrophysics Data System (ADS)

    Abdallah-Saleh, Saleh; Wang, Qi; Grecos, Christos

    2012-06-01

    Recently, the concept of Mobile Cloud Computing (MCC) has been proposed to offload the resource requirements in computational capabilities, storage and security from mobile devices into the cloud. Internet video applications such as real-time streaming are expected to be ubiquitously deployed and supported over the cloud for mobile users, who typically encounter a range of wireless networks of diverse radio access technologies while roaming. However, real-time video streaming for mobile cloud users across heterogeneous wireless networks presents multiple challenges. The network-layer quality of service (QoS) provision to support high-quality mobile video delivery in this demanding scenario remains an open research question, and this in turn affects the application-level visual quality and impedes mobile users' perceived quality of experience (QoE). In this paper, we devise a framework to support real-time video streaming in this new mobile video networking paradigm and evaluate the performance of the proposed framework empirically through a lab-based yet realistic testing platform. One particular issue we focus on is the effect of users' mobility on the QoS of video streaming over the cloud. We design and implement a hybrid platform comprising a test-bed and an emulator, on which our concepts of mobile cloud computing, video streaming and heterogeneous wireless networks are implemented and integrated to allow the testing of our framework. As representative heterogeneous wireless networks, the popular WLAN (Wi-Fi) and MAN (WiMAX) networks are incorporated in order to evaluate the effects of handovers between these different radio access technologies. The H.264/AVC (Advanced Video Coding) standard is employed for real-time video streaming from a server to mobile users (client nodes) in the networks. Mobility support is introduced to enable a continuous streaming experience for a mobile user across the heterogeneous wireless network. Real-time video stream packets are captured for analytical purposes on the mobile user node. Experimental results are obtained and analysed, and future work is identified towards further improvement of the current design and implementation. With this new mobile video networking concept and paradigm implemented and evaluated, the results and observations obtained from this study form the basis of a more in-depth, comprehensive understanding of the various challenges and opportunities in supporting high-quality real-time video streaming in mobile cloud over heterogeneous wireless networks.

  11. Flocking algorithm for autonomous flying robots.

    PubMed

    Virágh, Csaba; Vásárhelyi, Gábor; Tarcai, Norbert; Szörényi, Tamás; Somorjai, Gergő; Nepusz, Tamás; Vicsek, Tamás

    2014-06-01

    Animal swarms displaying a variety of typical flocking patterns would not exist without the underlying safe, optimal and stable dynamics of the individuals. The emergence of these universal patterns can be efficiently reconstructed with agent-based models. If we want to reproduce these patterns with artificial systems, such as autonomous aerial robots, agent-based models can also be used in their control algorithms. However, finding the proper algorithms and thus understanding the essential characteristics of the emergent collective behaviour requires thorough and realistic modeling of the robot and also the environment. In this paper, we first present an abstract mathematical model of an autonomous flying robot. The model takes into account several realistic features, such as time delay and locality of communication, inaccuracy of the on-board sensors and inertial effects. We present two decentralized control algorithms. One is based on a simple self-propelled flocking model of animal collective motion, the other is a collective target tracking algorithm. Both algorithms contain a viscous friction-like term, which aligns the velocities of neighbouring agents parallel to each other. We show that this term can be essential for reducing the inherent instabilities of such a noisy and delayed realistic system. We discuss simulation results on the stability of the control algorithms, and perform real experiments to show the applicability of the algorithms on a group of autonomous quadcopters. In our case, bio-inspiration works in two ways. On the one hand, the whole idea of trying to build and control a swarm of robots comes from the observation that birds tend to flock to optimize their behaviour as a group. On the other hand, by using a realistic simulation framework and studying the group behaviour of autonomous robots we can learn about the major factors influencing the flight of bird flocks.
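
    The viscous friction-like alignment term mentioned above can be sketched in a few lines: each agent's velocity is relaxed towards the mean velocity of its neighbours within communication range. The gain and range below are invented for illustration; the published model additionally handles communication delay, sensor noise and inertia.

        # Schematic velocity-alignment (viscous friction-like) term.
        import numpy as np

        def alignment_term(positions, velocities, comm_range=10.0, c_frict=0.5):
            """Per-agent velocity correction from neighbour alignment."""
            correction = np.zeros_like(velocities)
            for i in range(len(positions)):
                dist = np.linalg.norm(positions - positions[i], axis=1)
                neighbours = (dist > 0) & (dist < comm_range)  # exclude self
                if neighbours.any():
                    correction[i] = c_frict * np.mean(
                        velocities[neighbours] - velocities[i], axis=0)
            return correction

        rng = np.random.default_rng(0)
        pos = rng.uniform(0.0, 20.0, size=(8, 2))  # 8 agents in a 20 m x 20 m area
        vel = rng.normal(0.0, 1.0, size=(8, 2))
        vel += alignment_term(pos, vel)            # one damping step towards consensus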

  12. DC electrophoresis and viscosity of realistic salt-free concentrated suspensions: non-equilibrium dissociation-association processes.

    PubMed

    Ruiz-Reina, Emilio; Carrique, Félix; Lechuga, Luis

    2014-03-01

    Most of the suspensions usually found in industrial applications are concentrated, aqueous and in contact with atmospheric CO2. The case of suspensions with a high concentration of added salt is relatively well understood and has been considered in many studies. In this work we are concerned with the case of concentrated suspensions that contain no ions other than: (1) those stemming from the charged colloidal particles (the added counterions that counterbalance their surface charge); (2) the H(+) and OH(-) ions from water dissociation; and (3) the ions generated by atmospheric CO2 contamination. We call these systems "realistic salt-free suspensions". We show some theoretical results on the electrophoretic mobility of a colloidal particle and the electroviscous effect in realistic salt-free concentrated suspensions. The theoretical framework is based on a cell model that accounts for particle-particle interactions in concentrated suspensions and has been successfully applied to many different phenomena in concentrated suspensions. The water dissociation and CO2 contamination can be described at two different levels of approximation: (a) by local-equilibrium mass-action equations, when the reactions are assumed to be so fast that chemical equilibrium is attained everywhere in the suspension, or (b) by non-equilibrium dissociation-association kinetic equations, when some reactions are considered not rapid enough to ensure local chemical equilibrium. The two approaches give rise to different results in the range from dilute to semidilute suspensions, causing possible discrepancies when comparing standard theories and experiments concerning transport properties of realistic salt-free suspensions. Copyright © 2013 Elsevier Inc. All rights reserved.
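
    The two levels of approximation can be illustrated with the water-dissociation reaction itself (notation ours, not the authors'): local equilibrium imposes the mass-action law at every point, whereas the kinetic description lets the ion concentrations relax at finite rates,

        \text{(a)}\quad c_{\mathrm{H^+}}\, c_{\mathrm{OH^-}} = K_w , \qquad \text{(b)}\quad \left. \frac{\partial c_{\mathrm{H^+}}}{\partial t} \right|_{\mathrm{reaction}} = k_d - k_a\, c_{\mathrm{H^+}}\, c_{\mathrm{OH^-}} ,

    with K_w = k_d / k_a, so that (b) reduces to (a) wherever the reaction rates are fast compared with the transport time scales.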

  13. Development of a new approach to cumulative effects assessment: a northern river ecosystem example.

    PubMed

    Dubé, Monique; Johnson, Brian; Dunn, Gary; Culp, Joseph; Cash, Kevin; Munkittrick, Kelly; Wong, Isaac; Hedley, Kathlene; Booty, William; Lam, David; Resler, Oskar; Storey, Alex

    2006-02-01

    If sustainable development of Canadian waters is to be achieved, a realistic and manageable framework is required for assessing cumulative effects. The objective of this paper is to describe an approach for aquatic cumulative effects assessment that was developed under the Northern Rivers Ecosystem Initiative. The approach is based on a review of existing monitoring practices in Canada and the presence of existing thresholds for aquatic ecosystem health assessments. It suggests that a sustainable framework is possible for cumulative effects assessment of Canadian waters that would result in integration of national indicators of aquatic health, integration of national initiatives (e.g., water quality index, environmental effects monitoring), and provide an avenue where long-term monitoring programs could be integrated with baseline and follow-up monitoring conducted under the environmental assessment process.

  14. An Analytical Model for the Performance Analysis of Concurrent Transmission in IEEE 802.15.4

    PubMed Central

    Gezer, Cengiz; Zanella, Alberto; Verdone, Roberto

    2014-01-01

    Interference is a serious cause of performance degradation for IEEE802.15.4 devices. The effect of concurrent transmissions in IEEE 802.15.4 has been generally investigated by means of simulation or experimental activities. In this paper, a mathematical framework for the derivation of the chip, symbol and packet error probability of a typical IEEE 802.15.4 receiver in the presence of interference is proposed. Both non-coherent and coherent demodulation schemes are considered by our model under the assumption of the absence of thermal noise. Simulation results are also added to assess the validity of the mathematical framework when the effect of thermal noise cannot be neglected. Numerical results show that the proposed analysis is in agreement with measurement results in the literature under realistic working conditions. PMID:24658624

  15. An analytical model for the performance analysis of concurrent transmission in IEEE 802.15.4.

    PubMed

    Gezer, Cengiz; Zanella, Alberto; Verdone, Roberto

    2014-03-20

    Interference is a serious cause of performance degradation for IEEE802.15.4 devices. The effect of concurrent transmissions in IEEE 802.15.4 has been generally investigated by means of simulation or experimental activities. In this paper, a mathematical framework for the derivation of the chip, symbol and packet error probability of a typical IEEE 802.15.4 receiver in the presence of interference is proposed. Both non-coherent and coherent demodulation schemes are considered by our model under the assumption of the absence of thermal noise. Simulation results are also added to assess the validity of the mathematical framework when the effect of thermal noise cannot be neglected. Numerical results show that the proposed analysis is in agreement with measurement results in the literature under realistic working conditions.
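
    For a feel for the quantities involved, a widely used closed-form AWGN approximation for the 2.4 GHz O-QPSK PHY maps SINR to bit error rate, and assuming independent bit errors then yields a packet error rate. The sketch below implements that textbook approximation; it is simpler than, and not equivalent to, the chip-level coherent/non-coherent interference model derived in the paper.

        # Textbook AWGN approximation for IEEE 802.15.4 (2.4 GHz O-QPSK) link errors.
        from math import comb, exp

        def ber_802154(sinr_linear):
            """Approximate bit error rate for the 2.4 GHz O-QPSK DSSS PHY."""
            return (8.0 / 15.0) * (1.0 / 16.0) * sum(
                (-1) ** k * comb(16, k) * exp(20.0 * sinr_linear * (1.0 / k - 1.0))
                for k in range(2, 17))

        def per(sinr_db, payload_bytes=22):
            """Packet error rate assuming independent bit errors over the frame."""
            ber = ber_802154(10.0 ** (sinr_db / 10.0))
            return 1.0 - (1.0 - ber) ** (8 * payload_bytes)

        for sinr in (-2, 0, 2, 4):
            print(f"SINR {sinr:+d} dB -> PER {per(sinr):.3e}")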

  16. Evaluating the Development of Biocatalytic Technology for the Targeted Removal of Perchlorate from Drinking Water.

    PubMed

    Hutchison, Justin M; Guest, Jeremy S; Zilles, Julie L

    2017-06-20

    Removing micropollutants is challenging in part because of their toxicity at low concentrations. A biocatalytic approach could harness the high affinity of enzymes for their substrates to address this challenge. The potential of biocatalysis relative to mature (nonselective ion exchange, selective ion exchange, and whole-cell biological reduction) and emerging (catalysis) perchlorate-removal technologies was evaluated through a quantitative sustainable design framework, and research objectives were prioritized to advance economic and environmental sustainability. In its current undeveloped state, the biocatalytic technology was approximately 1 order of magnitude higher in cost and environmental impact than nonselective ion exchange. Biocatalyst production was highly correlated with cost and impact. Realistic improvement scenarios targeting biocatalyst yield, biocatalyst immobilization for reuse, and elimination of an electron shuttle could reduce total costs to $0.034 m⁻³ and global warming potential (GWP) to 0.051 kg CO₂-eq m⁻³: roughly 6.5% of the cost and 7.3% of the GWP of background drinking water treatment, and competitive with the best-performing technology, selective ion exchange. With less stringent perchlorate regulatory limits, ion exchange technologies had increased cost and impact, in contrast to biocatalytic and catalytic technologies. Targeted advances in biocatalysis could provide affordable and sustainable treatment options to protect the public from micropollutants.

  17. A Weather Radar Simulator for the Evaluation of Polarimetric Phased Array Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byrd, Andrew D.; Ivic, Igor R.; Palmer, Robert D.

    A radar simulator capable of generating time series data for a polarimetric phased array weather radar has been designed and implemented. The received signals are composed from a high-resolution numerical weather prediction model. Thousands of scattering centers, each with an independent randomly generated Doppler spectrum, populate the field of view of the radar. The moments of the scattering center spectra are derived from the numerical weather model, and the scattering center positions are updated based on the three-dimensional wind field. In order to accurately emulate the effects of system-induced cross-polar contamination, the array is modeled using a complete set of dual-polarization radiation patterns. The simulator offers reconfigurable element patterns and positions as well as access to independent time series data for each element, resulting in easy implementation of any beamforming method. It also allows for arbitrary waveform designs and is able to model the effects of quantization on waveform performance. Simultaneous, alternating, quasi-simultaneous, and pulse-to-pulse phase-coded modes of polarimetric signal transmission have been implemented. This framework allows for realistic emulation of the effects of cross-polar fields on weather observations, as well as the evaluation of possible techniques for the mitigation of those effects.

  18. Root-cause estimation of ultrasonic scattering signatures within a complex textured titanium

    NASA Astrophysics Data System (ADS)

    Blackshire, James L.; Na, Jeong K.; Freed, Shaun

    2016-02-01

    The nondestructive evaluation of polycrystalline materials has been an active area of research for many decades, and continues to be an area of growth in recent years. Titanium alloys in particular have become a critical material system used in modern turbine engine applications, where an evaluation of the local microstructure properties of engine disk/blade components is desired for performance and remaining-life assessments. Current NDE methods are often limited to estimating ensemble material properties or detecting localized voids, inclusions, or damage features within a material. Recent advances in computational NDE and material science characterization methods are providing new and unprecedented access to heterogeneous material properties, which permits microstructure-sensing interactions to be studied in detail. In the present research, Integrated Computational Materials Engineering (ICME) methods and tools are being leveraged to gain a comprehensive understanding of root-cause ultrasonic scattering processes occurring within a textured titanium aerospace material. Destructive, nondestructive, and computational methods are combined within the ICME framework to collect, holistically integrate, and study complex ultrasound scattering using realistic 2-dimensional representations of the microstructure properties. Progress towards validating the computational sensing methods is discussed, along with insight into the key scattering processes occurring within the bulk microstructure, and how they manifest in pulse-echo immersion ultrasound measurements.

  19. Determining Electrical Properties Based on B1 Fields Measured in an MR Scanner Using a Multi-channel Transmit/Receive Coil: a General Approach

    PubMed Central

    Liu, Jiaen; Zhang, Xiaotong; Van de Moortele, Pierre-Francois; Schmitter, Sebastian

    2013-01-01

    Electrical Property Tomography (EPT) is a recently developed noninvasive technology to image the electrical conductivity and permittivity of biological tissues at the Larmor frequency in Magnetic Resonance (MR) scanners. The absolute phase of the complex radio-frequency (RF) magnetic field (B1) is necessary for electrical property calculation. However, due to the lack of practical methods to directly measure absolute B1 phases, current EPT techniques rely on B1 phase estimation based on certain assumptions about object anatomy, coil structure and/or electromagnetic wave behavior associated with the main magnetic field, limiting the broader application of EPT. In this study, using a multi-channel transmit/receive coil, the framework of a new general approach for EPT is introduced which does not depend on the assumptions utilized in previous studies. Using a human head model with realistic geometry, a series of computer simulations at 7T was conducted to evaluate the proposed method under different noise levels. Results showed that the proposed method can be used to reconstruct conductivity and permittivity images with noticeable accuracy and stability. The feasibility of this approach was further evaluated in a phantom experiment at 7T. PMID:23743673
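
    For context, the central relation behind Helmholtz-based EPT, valid for a locally homogeneous medium under an e^{iωt} time convention (sign conventions vary across the literature), reads

        \frac{\nabla^{2} B_1^{+}}{B_1^{+}} = -\mu_0\, \omega^{2}\, \varepsilon + i\, \mu_0\, \omega\, \sigma ,

    so that σ and ε follow from the imaginary and real parts of the left-hand side; recovering the full complex B1+ field (magnitude and absolute phase) is exactly what the multi-channel approach above is designed to do without the usual symmetry assumptions.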

  20. An evaluation paradigm for cumulative impact analysis

    NASA Astrophysics Data System (ADS)

    Stakhiv, Eugene Z.

    1988-09-01

    Cumulative impact analysis is examined from a conceptual decision-making perspective, focusing on its implicit and explicit purposes as suggested within the policy and procedures for environmental impact analysis of the National Environmental Policy Act of 1969 (NEPA) and its implementing regulations. In this article it is also linked to different evaluation and decision-making conventions, contrasting a regulatory context with a comprehensive planning framework. The specific problems that make the application of cumulative impact analysis a virtually intractable evaluation requirement are discussed in connection with the federal regulation of wetlands uses. The relatively familiar US Army Corps of Engineers' (the Corps) permit program, in conjunction with the Environmental Protection Agency's (EPA) responsibilities in managing its share of the Section 404 regulatory program requirements, is used throughout as the realistic context for highlighting certain pragmatic evaluation aspects of cumulative impact assessment. To understand the purposes of cumulative impact analysis (CIA), a key distinction must be made between the implied comprehensive and multiobjective evaluation purposes of CIA, promoted through the principles and policies contained in NEPA, and the more commonly conducted and limited assessment of cumulative effects (ACE), which focuses largely on the ecological effects of human actions. Based on current evaluation practices within the Corps' and EPA's permit programs, it is shown that the commonly used screening approach to regulating wetlands uses is not compatible with the purposes of CIA, nor is the environmental impact statement (EIS) an appropriate vehicle for evaluating the variety of objectives and trade-offs needed as part of CIA. A heuristic model that incorporates the basic elements of CIA is developed, including the idea of trade-offs among social, economic, and environmental protection goals carried out within the context of environmental carrying capacity.

  1. Validation of Shared and Specific Independent Component Analysis (SSICA) for Between-Group Comparisons in fMRI

    PubMed Central

    Maneshi, Mona; Vahdat, Shahabeddin; Gotman, Jean; Grova, Christophe

    2016-01-01

    Independent component analysis (ICA) has been widely used to study functional magnetic resonance imaging (fMRI) connectivity. However, the application of ICA in multi-group designs is not straightforward. We have recently developed a new method named “shared and specific independent component analysis” (SSICA) to perform between-group comparisons in the ICA framework. SSICA is sensitive to extract those components which represent a significant difference in functional connectivity between groups or conditions, i.e., components that could be considered “specific” for a group or condition. Here, we investigated the performance of SSICA on realistic simulations and task fMRI data, and compared the results with one of the state-of-the-art group ICA approaches for inferring between-group differences. We examined SSICA robustness with respect to the number of allowable extracted specific components and the between-group orthogonality assumptions. Furthermore, we proposed a modified formulation of the back-reconstruction method to generate group-level t-statistics maps based on SSICA results. We also evaluated the consistency and specificity of the specific components extracted by SSICA. The results on realistic simulated and real fMRI data showed that SSICA outperforms the regular group ICA approach in terms of reconstruction and classification performance. We demonstrated that SSICA is a powerful data-driven approach to detect patterns of differences in functional connectivity across groups/conditions, particularly in model-free designs such as resting-state fMRI. Our findings in task fMRI show that SSICA confirms the results of general linear model (GLM) analysis and, when combined with clustering analysis, complements GLM findings by providing additional information regarding the reliability and specificity of networks. PMID:27729843

  2. Influence of Superparameterization and a Higher-Order Turbulence Closure on Rainfall Bias Over Amazonia in Community Atmosphere Model Version 5: How Parameterization Changes Rainfall

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Kai; Fu, Rong; Shaikh, Muhammad J.

    We evaluate the Community Atmosphere Model Version 5 (CAM5) with a higher-order turbulence closure scheme, named Cloud Layers Unified By Binormals (CLUBB), and a Multiscale Modeling Framework (MMF) with two different microphysics configurations to investigate their influences on rainfall simulations over Southern Amazonia. The two microphysics configurations in MMF are one-moment cloud microphysics without aerosol treatment (SAM1MOM) and two-moment cloud microphysics coupled with aerosol treatment (SAM2MOM). Results show that both MMF-SAM2MOM and CLUBB effectively reduce the low biases of rainfall, mainly during the wet season. CLUBB reduces low biases of humidity in the lower troposphere, with further reduced shallow clouds. The latter enables more surface solar flux, leading to stronger convection and more rainfall. MMF, especially MMF-SAM2MOM, destabilizes the atmosphere with more moisture and higher temperatures in the atmospheric boundary layer, allowing the growth of more extreme convection and further generating more deep convection. MMF-SAM2MOM significantly increases rainfall in the afternoon, but it does not reduce the early bias of the diurnal rainfall peak; CLUBB, on the other hand, delays the afternoon peak time and produces more precipitation in the early morning, due to a more realistic gradual transition between shallow and deep convection. MMF appears to be able to realistically capture the observed increase of relative humidity prior to deep convection, especially in its two-moment configuration. In contrast, in CAM5 and CAM5 with CLUBB, the occurrence of deep convection appears to be a result of stronger heating rather than higher relative humidity.

  3. Collaborative action around implementation in Collaborations for Leadership in Applied Health Research and Care: towards a programme theory.

    PubMed

    Rycroft-Malone, Jo; Wilkinson, Joyce; Burton, Christopher R; Harvey, Gill; McCormack, Brendan; Graham, Ian; Staniszewska, Sophie

    2013-10-01

    In theory, greater interaction between researchers and practitioners should result in increased potential for implementation. However, we know little about whether this is the case, or what mechanisms might operate to make it happen. This paper reports findings from a study that is identifying and tracking implementation mechanisms, processes, influences and impacts in real time, over time in the Collaborations for Leadership in Applied Health Research and Care (CLAHRCs). This is a longitudinal, realist evaluation case study. The development of the conceptual framework and initial hypotheses involved literature reviewing and stakeholder consultation. Primary data were collected through interviews, observations and documents within three CLAHRCs, and analysed thematically against the framework and hypotheses. The first round of data collection shows that the mechanisms of collaborative action, relationship building, engagement, motivation, knowledge exchange and learning are important to the processes and outcomes of CLAHRCs' activity, including their capacity for implementation. These mechanisms operated in different contexts such as competing agendas, availability of resources and the CLAHRCs' brand. Contexts and mechanisms result in different impacts, including the CLAHRCs' approach to implementation, quality of collaboration, commitment and ownership, and degree of sharing and managing knowledge. Emerging features of a middle-range theory of implementation within collaboration include alignment in organizational structures and cognitive processes, history of partnerships, and responsiveness and resilience in rapidly changing contexts. CLAHRCs' potential to mobilize knowledge may be further realized by how they develop insights into their function as collaborative entities.

  4. GOCE gravity field simulation based on actual mission scenario

    NASA Astrophysics Data System (ADS)

    Pail, R.; Goiginger, H.; Mayrhofer, R.; Höck, E.; Schuh, W.-D.; Brockmann, J. M.; Krasbutter, I.; Fecher, T.; Gruber, T.

    2009-04-01

    In the framework of the ESA-funded project "GOCE High-level Processing Facility", an operational hardware and software system for the scientific processing (Level 1B to Level 2) of GOCE data has been set up by the European GOCE Gravity Consortium EGG-C. One key component of this software system is the processing of a spherical harmonic Earth gravity field model and the corresponding full variance-covariance matrix from the precise GOCE orbit and from calibrated and corrected satellite gravity gradiometry (SGG) data. In the framework of the time-wise approach, a combination of several processing strategies has been set up for the optimum exploitation of the information content of the GOCE data: the Quick-Look Gravity Field Analysis is applied to derive a fast diagnosis of the GOCE system performance and to monitor the quality of the input data, while in the Core Solver processing a rigorous high-precision solution of the very large normal equation systems is derived by applying parallel processing techniques on a PC cluster. Before real GOCE data become available, the expected GOCE gravity field performance is evaluated by means of a realistic numerical case study based on the actual GOCE orbit and mission scenario and on simulation data stemming from the most recent ESA end-to-end simulation. Results from this simulation as well as recently developed features of the software system are presented. Additionally, some aspects of data combination with complementary data sources are addressed.

  5. SU-F-BRF-01: A GPU Framework for Developing Interactive High-Resolution Patient-Specific Biomechanical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neylon, J; Qi, S; Sheng, K

    2014-06-15

    Purpose: To develop a GPU-based framework that can generate high-resolution and patient-specific biomechanical models from a given simulation CT and contoured structures, optimized to run at interactive speeds, for addressing adaptive radiotherapy objectives. Method: A mass-spring-damping (MSD) model was generated from a given simulation CT. The model's mass elements were generated for every voxel of anatomy and positioned in a deformation space in the GPU memory. MSD connections were established between neighboring mass elements in a dense distribution. Contoured internal structures allowed control over the elastic material properties of different tissues. Once the model was initialized in GPU memory, skeletal anatomy was actuated using rigid-body transformations, while soft tissues were governed by elastic corrective forces and constraints, which included tensile forces, shear forces, and spring damping forces. The model was validated by applying a known load to a soft tissue block and comparing the observed deformation to ground-truth calculations from established elastic mechanics. Results: Our analyses showed that both local and global load experiments yielded results with a correlation coefficient R² > 0.98 compared to ground truth. Models were generated for several anatomical regions. Head and neck models accurately simulated posture changes by rotating the skeletal anatomy in three dimensions. Pelvic models were developed for realistic deformations for changes in bladder volume. Thoracic models demonstrated breast deformation due to gravity when changing treatment position from supine to prone. The GPU framework performed at greater than 30 iterations per second for over 1 million mass elements with up to 26 MSD connections each. Conclusions: Realistic simulations of site-specific, complex posture and physiological changes were achieved at interactive speeds using patient data. Incorporating such a model with live patient tracking would facilitate real-time assessment of variations in the actual anatomy and delivered dose for adaptive intervention and re-planning.
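
    For orientation, the mass-spring-damping update the abstract describes can be sketched in a few lines. The sketch below runs on the CPU with invented parameters and a semi-implicit Euler step, whereas the paper's implementation is a dense per-voxel GPU computation.

```python
# CPU sketch of one mass-spring-damping (MSD) integration step with tensile
# and damping forces; parameters are invented stand-ins.
import numpy as np

def msd_step(pos, vel, edges, rest_len, k=100.0, c=1.0, mass=1.0, dt=1e-3):
    """Semi-implicit Euler update for mass elements connected by springs."""
    force = np.zeros_like(pos)
    for (i, j), L0 in zip(edges, rest_len):
        d = pos[j] - pos[i]
        L = np.linalg.norm(d)
        if L == 0.0:
            continue
        n = d / L
        f = k * (L - L0) * n                     # tensile spring force
        f += c * np.dot(vel[j] - vel[i], n) * n  # spring damping force
        force[i] += f
        force[j] -= f
    vel = vel + dt * force / mass
    pos = pos + dt * vel
    return pos, vel

pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0]])   # two mass elements
vel = np.zeros_like(pos)
pos, vel = msd_step(pos, vel, edges=[(0, 1)], rest_len=[1.0])
print(pos)   # the stretched spring pulls the elements together
```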

  6. Can metric-based approaches really improve multi-model climate projections? A perfect model framework applied to summer temperature change in France.

    NASA Astrophysics Data System (ADS)

    Boé, Julien; Terray, Laurent

    2014-05-01

    Ensemble approaches for climate change projections have become ubiquitous. Because of large model-to-model variations and, generally, lack of rationale for the choice of a particular climate model against others, it is widely accepted that future climate change and its impacts should not be estimated based on a single climate model. Generally, as a default approach, the multi-model ensemble mean (MMEM) is considered to provide the best estimate of climate change signals. The MMEM approach is based on the implicit hypothesis that all the models provide equally credible projections of future climate change. This hypothesis is unlikely to be true, and ideally one would want to give more weight to more realistic models. A major issue with this alternative approach lies in the assessment of the relative credibility of future climate projections from different climate models, as they can only be evaluated against present-day observations: which present-day metric(s) should be used to decide which models are "good" and which models are "bad" in the future climate? Once a supposedly informative metric has been found, other issues arise. What is the best statistical method to combine multiple model results taking into account their relative credibility as measured by a given metric? How to be sure in the end that the metric-based estimate of future climate change is not in fact less realistic than the MMEM? It is impossible to provide strict answers to those questions in the climate change context. Yet, in this presentation, we propose a methodological approach based on a perfect model framework that could bring some useful elements of answer to the questions previously mentioned. The basic idea is to take a random climate model in the ensemble and treat it as if it were the truth (results of this model, in both past and future climate, are called "synthetic observations"). Then, all the other members from the multi-model ensemble are used to derive, through a metric-based approach, a posterior estimate of climate change based on the synthetic observation of the metric. Finally, it is possible to compare the posterior estimate to the synthetic observation of future climate change to evaluate the skill of the method. The main objective of this presentation is to describe and apply this perfect model framework to test different methodological issues associated with non-uniform model weighting and similar metric-based approaches. The methodology presented is general, but will be applied to the specific case of summer temperature change in France, for which previous works have suggested potentially useful metrics associated with soil-atmosphere and cloud-temperature interactions. The relative performances of different simple statistical approaches to combine multiple model results based on metrics will be tested. The impact of ensemble size, observational errors, internal variability, and model similarity will be characterized. The potential improvements associated with metric-based approaches compared to the MMEM in terms of errors and uncertainties will be quantified.
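
    The perfect model idea lends itself to a compact sketch: pick one ensemble member as synthetic truth, weight the others by the closeness of their present-day metric, and compare the weighted estimate with the plain MMEM. The toy metric-change relationship and Gaussian weighting below are illustrative assumptions, not the authors' statistical method.

```python
# Toy perfect model experiment: each model in turn is "truth"; the rest are
# combined either as a plain mean (MMEM) or weighted by metric closeness.
import numpy as np

rng = np.random.default_rng(1)
n_models = 30
metric = rng.normal(0.0, 1.0, n_models)                       # present-day metric
change = 2.0 + 0.8 * metric + rng.normal(0.0, 0.3, n_models)  # future change (toy link)

err_w, err_mmem = [], []
for truth in range(n_models):
    others = np.delete(np.arange(n_models), truth)
    # Gaussian weight: models whose metric matches the synthetic observation
    w = np.exp(-0.5 * ((metric[others] - metric[truth]) / 0.5) ** 2)
    posterior = np.average(change[others], weights=w)         # metric-based estimate
    mmem = change[others].mean()                              # unweighted ensemble mean
    err_w.append(abs(posterior - change[truth]))
    err_mmem.append(abs(mmem - change[truth]))

print(f"weighted: {np.mean(err_w):.3f}  MMEM: {np.mean(err_mmem):.3f}")
```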

  7. Monte Carlo generators for studies of the 3D structure of the nucleon

    DOE PAGES

    Avakian, Harut; D'Alesio, U.; Murgia, F.

    2015-01-23

    In this study, extraction of transverse momentum and space distributions of partons from measurements of spin and azimuthal asymmetries requires the development of a self-consistent analysis framework, accounting for evolution effects and allowing control of systematic uncertainties due to variations of input parameters and models. Development of realistic Monte Carlo generators, accounting for TMD evolution effects, spin-orbit and quark-gluon correlations, will be crucial for future studies of quark-gluon dynamics in general and the 3D structure of the nucleon in particular.

  8. Modeling and simulation of high dimensional stochastic multiscale PDE systems at the exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zabaras, Nicolas J.

    2016-11-08

    Predictive modeling of multiscale and multiphysics systems requires accurate, data-driven characterization of the input uncertainties and an understanding of how they propagate across scales and alter the final solution. This project develops a rigorous mathematical framework and scalable uncertainty quantification algorithms to efficiently construct realistic low-dimensional input models and surrogate low-complexity systems for the analysis, design, and control of physical systems represented by multiscale stochastic PDEs. The work can be applied to many areas, including physical and biological processes, from climate modeling to systems biology.

  9. A Bumpy Road and a Bridge too Far? An Analysis of the Realistic Bridging and Horizontal Construction Capabilities of the Canadian Military Engineers in the Force 2013 Structure

    DTIC Science & Technology

    2012-05-17

    the national security strategy? Liddell Hart describes this difficult challenge faced by all socially conservative states. They must find the type of... challenges and circumstances while working within an economic framework that necessitates choice.”23 TTCP describes the CBP process as a method that involves...a capability gap. Then the Army purchased quick solutions and fielded them for immediate deployment to theatre. Their generation satisfied an urgent

  10. People and Teams Matter in Organizational Change: Professionals’ and Managers’ Experiences of Changing Governance and Incentives in Primary Care

    PubMed Central

    Allan, Helen T; Brearley, Sally; Byng, Richard; Christian, Sara; Clayton, Julie; Mackintosh, Maureen; Price, Linnie; Smith, Pam; Ross, Fiona

    2014-01-01

    Objectives: To explore the experiences of governance and incentives during organizational change for managers and clinical staff. Study Setting: Three primary care settings in England in 2006–2008. Study Design: Data collection involved three group interviews with 32 service users, individual interviews with 32 managers, and 56 frontline professionals in three sites. The Realistic Evaluation framework was used in analysis to examine the effects of new policies and their implementation. Principal Findings: Integrating new interprofessional teams to work effectively is a slow process, especially if structures in place do not acknowledge the painful feelings involved in change and do not support staff during periods of uncertainty. Conclusions: Eliciting multiple perspectives, often dependent on individual occupational positioning or place in new team configurations, illuminates the need to incorporate the emotional as well as technocratic and system factors when implementing change. Some suggestions are made for facilitating change in health care systems. These are discussed in the context of similar health care reform initiatives in the United States. PMID:23829292

  11. Hierarchical bounding structures for efficient virial computations: Towards a realistic molecular description of cholesterics

    NASA Astrophysics Data System (ADS)

    Tortora, Maxime M. C.; Doye, Jonathan P. K.

    2017-12-01

    We detail the application of bounding volume hierarchies to accelerate second-virial evaluations for arbitrary complex particles interacting through hard and soft finite-range potentials. This procedure, based on the construction of neighbour lists through the combined use of recursive atom-decomposition techniques and binary overlap search schemes, is shown to scale sub-logarithmically with particle resolution in the case of molecular systems with high aspect ratios. Its implementation within an efficient numerical and theoretical framework based on classical density functional theory enables us to investigate the cholesteric self-assembly of a wide range of experimentally relevant particle models. We illustrate the method through the determination of the cholesteric behavior of hard, structurally resolved twisted cuboids, and report quantitative evidence of the long-predicted phase handedness inversion with increasing particle thread angles near the phenomenological threshold value of 45°. Our results further highlight the complex relationship between microscopic structure and helical twisting power in such model systems, which may be attributed to subtle geometric variations of their chiral excluded-volume manifold.

  12. A dataset of stereoscopic images and ground-truth disparity mimicking human fixations in peripersonal space

    PubMed Central

    Canessa, Andrea; Gibaldi, Agostino; Chessa, Manuela; Fato, Marco; Solari, Fabio; Sabatini, Silvio P.

    2017-01-01

    Binocular stereopsis is the ability of a visual system, belonging to a living being or a machine, to interpret the different visual information deriving from two eyes/cameras for depth perception. From this perspective, ground-truth information about three-dimensional visual space, which is rarely available, is an ideal tool both for evaluating human performance and for benchmarking machine vision algorithms. In the present work, we implemented a rendering methodology in which the camera pose mimics realistic eye pose for a fixating observer, thus including convergent eye geometry and cyclotorsion. The virtual environment we developed relies on highly accurate 3D virtual models, and its full controllability allows us to obtain the stereoscopic pairs together with the ground-truth depth and camera pose information. We thus created a stereoscopic dataset: GENUA PESTO—GENoa hUman Active fixation database: PEripersonal space STereoscopic images and grOund truth disparity. The dataset aims to provide a unified framework useful for a number of problems relevant to human and computer vision, from scene exploration and eye movement studies to 3D scene reconstruction. PMID:28350382

  13. A Two-Stage Framework for 3D Face Reconstruction from RGBD Images.

    PubMed

    Wang, Kangkan; Wang, Xianwang; Pan, Zhigeng; Liu, Kai

    2014-08-01

    This paper proposes a new approach for 3D face reconstruction with RGBD images from an inexpensive commodity sensor. The challenges we face are: 1) substantial random noise and corruption are present in low-resolution depth maps; and 2) there is a high degree of variability in pose and face expression. We develop a novel two-stage algorithm that effectively maps low-quality depth maps to realistic face models. Each stage is targeted toward a certain type of noise. The first stage extracts sparse errors from depth patches through data-driven local sparse coding, while the second stage smooths noise on the boundaries between patches and reconstructs the global shape by combining local shapes using our template-based surface refinement. Our approach does not require any markers or user interaction. We perform quantitative and qualitative evaluations on both synthetic and real test sets. Experimental results show that the proposed approach is able to produce high-resolution 3D face models with high accuracy, even if inputs are of low quality and have large variations in viewpoint and face expression.

  14. A Bayesian Approach to Genome/Linguistic Relationships in Native South Americans

    PubMed Central

    Amorim, Carlos Eduardo Guerra; Bisso-Machado, Rafael; Ramallo, Virginia; Bortolini, Maria Cátira; Bonatto, Sandro Luis; Salzano, Francisco Mauro; Hünemeier, Tábita

    2013-01-01

    The relationship between the evolution of genes and languages has been studied for over three decades. These studies rely on the assumption that languages, as many other cultural traits, evolve in a gene-like manner, accumulating heritable diversity through time and being subjected to evolutionary mechanisms of change. In the present work we used genetic data to evaluate South American linguistic classifications. We compared discordant models of language classifications to the current Native American genome-wide variation using realistic demographic models analyzed under an Approximate Bayesian Computation (ABC) framework. Data on 381 STRs spread along the autosomes were gathered from the literature for populations representing the five main South Amerindian linguistic groups: Andean, Arawakan, Chibchan-Paezan, Macro-Jê, and Tupí. The results indicated a higher posterior probability for the classification proposed by J.H. Greenberg in 1987, although L. Campbell's 1997 classification cannot be ruled out. Based on Greenberg's classification, it was possible to date the time of Tupí-Arawakan divergence (2.8 kya), and the time of emergence of the structure between present day major language groups in South America (3.1 kya). PMID:23696865

  15. A bayesian approach to genome/linguistic relationships in native South Americans.

    PubMed

    Amorim, Carlos Eduardo Guerra; Bisso-Machado, Rafael; Ramallo, Virginia; Bortolini, Maria Cátira; Bonatto, Sandro Luis; Salzano, Francisco Mauro; Hünemeier, Tábita

    2013-01-01

    The relationship between the evolution of genes and languages has been studied for over three decades. These studies rely on the assumption that languages, as many other cultural traits, evolve in a gene-like manner, accumulating heritable diversity through time and being subjected to evolutionary mechanisms of change. In the present work we used genetic data to evaluate South American linguistic classifications. We compared discordant models of language classifications to the current Native American genome-wide variation using realistic demographic models analyzed under an Approximate Bayesian Computation (ABC) framework. Data on 381 STRs spread along the autosomes were gathered from the literature for populations representing the five main South Amerindian linguistic groups: Andean, Arawakan, Chibchan-Paezan, Macro-Jê, and Tupí. The results indicated a higher posterior probability for the classification proposed by J.H. Greenberg in 1987, although L. Campbell's 1997 classification cannot be ruled out. Based on Greenberg's classification, it was possible to date the time of Tupí-Arawakan divergence (2.8 kya), and the time of emergence of the structure between present day major language groups in South America (3.1 kya).
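
    The model-comparison step in the two records above follows the generic ABC rejection recipe: simulate data under each candidate model, keep draws whose summary statistics fall close to the observed ones, and read posterior model probabilities off the accepted sample. The sketch below uses invented Gaussian stand-ins rather than demographic simulations of STR data.

```python
# ABC rejection sketch with invented stand-ins for the competing
# linguistic classifications; the study simulated realistic demographic
# models of 381 STR loci instead.
import numpy as np

rng = np.random.default_rng(2)
observed = rng.normal(0.5, 0.1, 50)          # stand-in observed summaries

def simulate(model):
    # each competing "classification" implies a different simulated dataset
    mu = {"greenberg": 0.5, "campbell": 0.6}[model]
    return rng.normal(mu, 0.1, 50)

def summary(x):
    return np.array([x.mean(), x.std()])

accepted = []
for _ in range(20000):
    model = rng.choice(["greenberg", "campbell"])
    if np.linalg.norm(summary(simulate(model)) - summary(observed)) < 0.02:
        accepted.append(model)

for m in ("greenberg", "campbell"):          # posterior model probabilities
    print(m, accepted.count(m) / max(len(accepted), 1))
```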

  16. Optimal exposure techniques for iodinated contrast enhanced breast CT

    NASA Astrophysics Data System (ADS)

    Glick, Stephen J.; Makeev, Andrey

    2016-03-01

    Screening for breast cancer using mammography has been very successful in the effort to reduce breast cancer mortality, and its use has largely resulted in the 30% reduction in breast cancer mortality observed since 1990 [1]. However, diagnostic mammography remains an area of breast imaging that is in great need for improvement. One imaging modality proposed for improving the accuracy of diagnostic workup is iodinated contrast-enhanced breast CT [2]. In this study, a mathematical framework is used to evaluate optimal exposure techniques for contrast-enhanced breast CT. The ideal observer signal-to-noise ratio (i.e., d') figure-of-merit is used to provide a task performance based assessment of optimal acquisition parameters under the assumptions of a linear, shift-invariant imaging system. A parallel-cascade model was used to estimate signal and noise propagation through the detector, and a realistic lesion model with iodine uptake was embedded into a structured breast background. Ideal observer performance was investigated across kVp settings, filter materials, and filter thickness. Results indicated many kVp spectra/filter combinations can improve performance over currently used x-ray spectra.
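
    The figure of merit named above has a standard closed form for a known signal in Gaussian noise, d'² = Δsᵀ K⁻¹ Δs, with Δs the expected signal difference and K the noise covariance. The sketch below evaluates it on toy inputs; the paper's covariance comes from a parallel-cascade detector model, which is not reproduced here.

```python
# Toy evaluation of the prewhitening ideal observer SNR, d'^2 = s^T K^{-1} s,
# for a known Gaussian signal profile in correlated Gaussian noise; the
# signal and covariance are invented stand-ins.
import numpy as np

rng = np.random.default_rng(3)
n = 64
signal = np.exp(-0.5 * ((np.arange(n) - n / 2) / 3.0) ** 2)  # lesion profile
A = rng.standard_normal((n, n))
K = A @ A.T / n + 0.1 * np.eye(n)                            # noise covariance
d_prime = np.sqrt(signal @ np.linalg.solve(K, signal))
print(f"d' = {d_prime:.2f}")
```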

  17. Multiscale Modeling of Ultra High Temperature Ceramics (UHTC) ZrB2 and HfB2: Application to Lattice Thermal Conductivity

    NASA Technical Reports Server (NTRS)

    Lawson, John W.; Daw, Murray S.; Squire, Thomas H.; Bauschlicher, Charles W.

    2012-01-01

    We are developing a multiscale framework in computational modeling for the ultra high temperature ceramics (UHTC) ZrB2 and HfB2. These materials are characterized by high melting point, good strength, and reasonable oxidation resistance. They are candidate materials for a number of applications in extreme environments including sharp leading edges of hypersonic aircraft. In particular, we used a combination of ab initio methods, atomistic simulations and continuum computations to obtain insights into fundamental properties of these materials. Ab initio methods were used to compute basic structural, mechanical and thermal properties. From these results, a database was constructed to fit a Tersoff style interatomic potential suitable for atomistic simulations. These potentials were used to evaluate the lattice thermal conductivity of single crystals and the thermal resistance of simple grain boundaries. Finite element method (FEM) computations using atomistic results as inputs were performed with meshes constructed on SEM images thereby modeling the realistic microstructure. These continuum computations showed the reduction in thermal conductivity due to the grain boundary network.

  18. Searching for the mechanisms of change: a protocol for a realist review of batterer treatment programmes.

    PubMed

    Velonis, Alisa J; Cheff, Rebecca; Finn, Debbie; Davloor, Whitney; O'Campo, Patricia

    2016-04-06

    Conflicting results reported by evaluations of typical batterer intervention programmes leave many judicial officials and policymakers uncertain about the best way to respond to domestic violence, and whether to recommend and fund these programmes. Traditional evaluations and systematic reviews tend to focus predominantly on whether the programmes 'worked' (eg, reduced recidivism) often at the exclusion of understanding for whom they may or may not have worked, under what circumstances, and why. We are undertaking a realist review of the batterer treatment programme literature with the aim of addressing this gap. Keeping with the goals of realist review, our primary aims are to identify the theory that underlies these programmes, highlight the mechanisms that trigger changes in participant behaviour and finally explain why these programmes help some individuals reduce their use of violence and under what conditions they are effective or not effective. We begin by describing the process of perpetrator treatment, and by proposing an initial theoretical model of behaviour change that will be tested by our review. We then describe the criteria for inclusion of an evaluation into the review, the search strategy we will use to identify the studies, and the plan for data extraction and analysis. The results of this review will be written up using the RAMESES Guidelines for Realist Synthesis, and disseminated through peer-reviewed publications aimed at the practitioner community as well as presented at community forums, and at violence against women conferences. Ethics approval was not needed. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  19. A Sustainable Evaluation Framework and Its Application

    ERIC Educational Resources Information Center

    Powell, Robert B.; Stern, Marc J.; Ardoin, Nicole

    2006-01-01

    This article presents a framework for developing internally sustainable evaluation systems for environmental education organizations, although the framework can be applied to other types of organizations. The authors developed a sustainable evaluation framework (SEF) with the intent of creating an evaluation system that could be self-administered…

  20. Potential of dynamic spectrum allocation in LTE macro networks

    NASA Astrophysics Data System (ADS)

    Hoffmann, H.; Ramachandra, P.; Kovács, I. Z.; Jorguseski, L.; Gunnarsson, F.; Kürner, T.

    2015-11-01

    In recent years, Mobile Network Operators (MNOs) worldwide have been extensively deploying LTE networks in different spectrum bands and utilising different bandwidth configurations. Initially, the deployment is coverage oriented, with macro cells using the lower LTE spectrum bands. As the offered traffic (i.e. the traffic requested by users) increases, the LTE deployment evolves, with macro cells expanded with additional capacity-boosting LTE carriers in higher frequency bands, complemented with micro or small cells in traffic hotspot areas. For MNOs it is crucial to use the LTE spectrum assets, as well as the installed network infrastructure, in the most cost-efficient way. Dynamic spectrum allocation (DSA) aims at (de)activating the available LTE frequency carriers according to the temporal and spatial traffic variations in order to increase the overall LTE system performance in terms of total network capacity by reducing interference. This paper evaluates the potential of DSA to achieve the envisaged performance improvement and identifies in which system and traffic conditions DSA should be deployed. A self-optimised network (SON) DSA algorithm is also proposed and evaluated. The evaluations have been carried out in a hexagonal and a realistic site-specific urban macro layout, assuming a central traffic hotspot area surrounded by an area of lower traffic with a total size of approximately 8 × 8 km². The results show that possible DSA gains of up to 47 % and up to 40 % are achievable with regard to the carried system load (i.e. used resources) for a homogeneous traffic distribution with hexagonal layout and for the realistic site-specific urban macro layout, respectively. The SON DSA algorithm evaluation in a realistic site-specific urban macro cell deployment scenario including a realistic non-uniform spatial traffic distribution shows insignificant cell throughput (i.e. served traffic) performance gains. Nevertheless, in the SON DSA investigations, a gain of up to 25 % has been observed when analysing the resource utilisation in the non-hotspot cells.
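
    In spirit, a SON DSA rule reduces to load-triggered carrier switching with hysteresis, as in the toy sketch below; the thresholds and the single-carrier boolean state are illustrative simplifications, not the algorithm evaluated in the paper.

```python
# Toy load-triggered carrier (de)activation with hysteresis.
def dsa_update(load, carrier_on, up=0.7, down=0.3):
    if load > up and not carrier_on:
        return True         # activate the capacity carrier in the higher band
    if load < down and carrier_on:
        return False        # deactivate it to reduce inter-cell interference
    return carrier_on       # hysteresis band: keep the current state

state = False
for load in [0.2, 0.5, 0.8, 0.6, 0.25]:     # temporal load variation in a cell
    state = dsa_update(load, state)
    print(f"load {load:.2f} -> carrier {'on' if state else 'off'}")
```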

  1. Adaptive Framework for Classification and Novel Class Detection over Evolving Data Streams with Limited Labeled Data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haque, Ahsanul; Khan, Latifur; Baron, Michael

    2015-09-01

    Most approaches to classifying evolving data streams either divide the stream of data into fixed-size chunks or use gradual forgetting to address the problems of infinite length and concept drift. Finding the fixed size of the chunks or choosing a forgetting rate without prior knowledge about the time-scale of change is not a trivial task. As a result, these approaches suffer from a trade-off between performance and sensitivity. To address this problem, we present a framework which uses change detection techniques on the classifier performance to determine chunk boundaries dynamically. Though this framework exhibits good performance, it is heavily dependent on the availability of true labels of data instances. However, labeled data instances are scarce in realistic settings and not readily available. Therefore, we present a second framework which is unsupervised in nature, and exploits change detection on classifier confidence values to determine chunk boundaries dynamically. In this way, it avoids the use of labeled data while still addressing the problems of infinite length and concept drift. Moreover, both of our proposed frameworks address the concept evolution problem by detecting outliers having similar values for the attributes. We provide theoretical proof that our change detection method works better than other state-of-the-art approaches in this particular scenario. Results from experiments on various benchmark and synthetic data sets also show the efficiency of our proposed frameworks.
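
    The unsupervised variant described above can be caricatured as change detection on a stream of classifier confidence values. The CUSUM-style accumulator below, with invented baseline and threshold constants, shows the general mechanism of dynamically placed chunk boundaries, not the paper's specific detection method.

```python
# CUSUM-flavoured change detection on classifier confidence values, used to
# place chunk boundaries without labels; constants are invented.
import numpy as np

def chunk_boundaries(confidences, drift=0.01, threshold=1.0):
    s, baseline = 0.0, confidences[0]
    for i, c in enumerate(confidences):
        baseline = 0.99 * baseline + 0.01 * c      # slow running baseline
        s = max(0.0, s + (baseline - c) - drift)   # accumulate degradation
        if s > threshold:
            yield i                                # dynamic chunk boundary
            s, baseline = 0.0, c                   # reset (e.g., retrain here)

rng = np.random.default_rng(4)
conf = np.concatenate([rng.normal(0.9, 0.03, 300),   # stable concept
                       rng.normal(0.6, 0.05, 300)])  # after concept drift
print(list(chunk_boundaries(conf)))                  # boundary near index 300
```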

  2. Conceptual framework for development of comprehensive e-health evaluation tool.

    PubMed

    Khoja, Shariq; Durrani, Hammad; Scott, Richard E; Sajwani, Afroz; Piryani, Usha

    2013-01-01

    The main objective of this study was to develop an e-health evaluation tool based on a conceptual framework including relevant theories for evaluating use of technology in health programs. This article presents the development of an evaluation framework for e-health programs. The study was divided into three stages: Stage 1 involved a detailed literature search of different theories and concepts on evaluation of e-health, Stage 2 plotted e-health theories to identify relevant themes, and Stage 3 developed a matrix of evaluation themes and stages of e-health programs. The framework identifies and defines different stages of e-health programs and then applies evaluation theories to each of these stages for development of the evaluation tool. This framework builds on existing theories of health and technology evaluation and presents a conceptual framework for developing an e-health evaluation tool to examine and measure different factors that play a definite role in the success of e-health programs. The framework on the horizontal axis divides e-health into different stages of program implementation, while the vertical axis identifies different themes and areas of consideration for e-health evaluation. The framework helps understand various aspects of e-health programs and their impact that require evaluation at different stages of the life cycle. The study led to the development of a new and comprehensive e-health evaluation tool, named the Khoja-Durrani-Scott Framework for e-Health Evaluation.

  3. Protocol: a realist review of user fee exemption policies for health services in Africa.

    PubMed

    Robert, Emilie; Ridde, Valéry; Marchal, Bruno; Fournier, Pierre

    2012-01-01

    Background: Four years prior to the Millennium Development Goals (MDGs) deadline, low- and middle-income countries and international stakeholders are looking for evidence-based policies to improve access to healthcare for the most vulnerable populations. User fee exemption policies are one of the potential solutions. However, the evidence is disparate, and systematic reviews have failed to provide valuable lessons. The authors propose to produce an innovative synthesis of the available evidence on user fee exemption policies in Africa to feed the policy-making process. Methods: The authors will carry out a realist review to answer the following research questions: What are the outcomes of user fee exemption policies implemented in Africa? Why do they produce such outcomes? And what contextual elements come into play? This type of review aims to understand how contextual elements influence the production of outcomes through the activation of specific mechanisms, in the form of context-mechanism-outcome configurations. The review will be conducted in five steps: (1) identifying with key stakeholders the mechanisms underlying user fee exemption policies to develop the analytical framework, (2) searching for and selecting primary data, (3) assessing the quality of evidence using the Mixed-Method Appraisal Tool, (4) extracting the data using the analytical framework and (5) synthesising the data in the form of context-mechanism-outcome configurations. The output will be a middle-range theory specifying how user fee exemption policies work, for what populations and under what circumstances. Ethics and dissemination: The two main target audiences are researchers who are looking for examples to implement a realist review, and policy-makers and international stakeholders looking for lessons learnt on user fee exemption. For the latter, a knowledge-sharing strategy involving local scientific and policy networks will be implemented. The study has been approved by the ethics committee of the CHUM Research Centre (CR-CHUM). It received funding from the Canadian Institutes of Health Research. The funders will not have any role in study design; collection, management, analysis, and interpretation of data; writing of the report; or the decision to submit the report for publication, including who will have ultimate authority over each of these activities.

  4. A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints.

    PubMed

    Sundharam, Sakthivel Manikandan; Navet, Nicolas; Altmeyer, Sebastian; Havet, Lionel

    2018-02-20

    Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects, or gives less weight to, the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on timing tolerance contracts to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation; software design, verified by a novel schedulability analysis; and run-time verification, by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system.

  5. A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints

    PubMed Central

    Navet, Nicolas; Havet, Lionel

    2018-01-01

    Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects, or gives less weight to, the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on timing tolerance contracts to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation; software design, verified by a novel schedulability analysis; and run-time verification, by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system. PMID:29461489
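
    As a side note on the schedulability step mentioned in both records: a classical sufficient test for fixed-priority rate-monotonic scheduling is the Liu and Layland utilization bound, sketched below with invented task parameters. The paper contributes a novel analysis for model-interpreted CPAL tasks; this sketch only illustrates the kind of check involved.

```python
# Liu-Layland utilization bound for rate-monotonic scheduling: a sufficient
# (not necessary) schedulability condition.
def rm_schedulable(tasks):
    """tasks: list of (worst_case_execution_time, period) pairs."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return utilization <= bound

print(rm_schedulable([(1, 4), (1, 5), (2, 10)]))  # U = 0.65 <= 0.78 -> True
```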

  6. Percolation on shopping and cashback electronic commerce networks

    NASA Astrophysics Data System (ADS)

    Fu, Tao; Chen, Yini; Qin, Zhen; Guo, Liping

    2013-06-01

    Many realistic networks live in the form of multiple networks, including interacting networks and interdependent networks. Here we study percolation properties of a special kind of interacting networks, namely Shopping and Cashback Electronic Commerce Networks (SCECNs). We investigate two actual SCECNs to extract their structural properties, and develop a mathematical framework based on generating functions for analyzing directed interacting networks. We then derive the necessary and sufficient condition for the absence of a system-wide giant in- and out-component, and propose algorithms to calculate the corresponding structural measures in the subcritical and supercritical regimes. We apply our mathematical framework and algorithms to the two actual SCECNs to assess their accuracy, and offer some explanations for the discrepancies. We show that the structural measures derived from our framework are useful for appraising the status of SCECNs. We also find that the supercritical regime of the whole network is maintained mainly by hyperlinks between different kinds of websites, whereas hyperlinks between the same kinds of websites can only enlarge the sizes of in-components and out-components.
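
    For readers unfamiliar with the generating-function machinery, the classical criticality condition for directed random graphs (Newman, Strogatz and Watts) states that a giant in-/out-component appears when the sum of (2jk - j - k) p_jk over the joint in/out-degree distribution p_jk is positive. The sketch below checks this condition for an invented toy distribution; it illustrates the machinery the paper builds on, not its SCECN-specific results.

```python
# Criticality check for directed random graphs: a giant in-/out-component
# exists when sum_{j,k} (2jk - j - k) p_jk > 0 (Newman-Strogatz-Watts).
import numpy as np
from scipy.stats import poisson

def has_giant_component(p_jk):
    """p_jk[j, k]: probability of in-degree j and out-degree k."""
    j = np.arange(p_jk.shape[0])[:, None]
    k = np.arange(p_jk.shape[1])[None, :]
    return float(np.sum((2 * j * k - j - k) * p_jk)) > 0.0

pj = poisson.pmf(np.arange(10), 1.5)           # independent in/out degrees, mean 1.5
print(has_giant_component(np.outer(pj, pj)))   # mean degree > 1 -> True
```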

  7. A FSI computational framework for vascular physiopathology: A novel flow-tissue multiscale strategy.

    PubMed

    Bianchi, Daniele; Monaldo, Elisabetta; Gizzi, Alessio; Marino, Michele; Filippi, Simonetta; Vairo, Giuseppe

    2017-09-01

    A novel fluid-structure computational framework for vascular applications is herein presented. It captures the doubly multiscale nature of vascular physiopathology in terms of both tissue properties and blood flow. Arterial tissues are modelled via a nonlinear multiscale constitutive rationale, based only on parameters with a clear histological and biochemical meaning. Blood flow is described by coupling a three-dimensional fluid domain (undergoing physiological inflow conditions) with a zero-dimensional model, which reproduces the influence of the downstream vasculature and furnishes a realistic description of the proximal outflow pressure. The fluid-structure interaction is managed through an explicit time-marching approach, able to accurately describe tissue nonlinearities within each computational step of the fluid problem. A case study on a patient-specific abdominal aortic aneurysm geometry is numerically investigated, highlighting the advantages gained from the proposed multiscale strategy, as well as the soundness and effectiveness of the established framework for assessing useful clinical quantities and risk indexes. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
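
    A common concrete choice for such a zero-dimensional downstream model is a three-element Windkessel, which turns the computed outflow rate into a proximal pressure. The sketch below, with invented parameter values rather than the paper's patient-specific calibration, shows the quantity a 3D-0D coupling scheme would exchange at each time step.

```python
# Three-element Windkessel as a 0D outflow model: the 3D solver passes the
# outflow rate q(t) and receives a proximal pressure back.
import numpy as np

def windkessel_pressure(q, dt, Rp=0.05, Rd=1.0, C=1.5, p0=80.0):
    """C dp_d/dt = q - p_d/Rd ;  p = p_d + Rp * q  (explicit Euler)."""
    p_d, out = p0, []
    for qi in q:
        p_d += dt * (qi - p_d / Rd) / C   # distal compliance/resistance
        out.append(p_d + Rp * qi)         # add proximal resistive drop
    return np.array(out)

t = np.linspace(0.0, 1.0, 1000)
q = 50.0 * np.clip(np.sin(2 * np.pi * t), 0.0, None) + 5.0  # pulsatile inflow
p = windkessel_pressure(q, dt=t[1] - t[0])
print(f"pressure range: {p.min():.1f} .. {p.max():.1f}")
```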

  8. Integrated presentation of ecological risk from multiple stressors

    NASA Astrophysics Data System (ADS)

    Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman

    2016-10-01

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.

  9. Integrated presentation of ecological risk from multiple stressors.

    PubMed

    Goussen, Benoit; Price, Oliver R; Rendal, Cecilie; Ashauer, Roman

    2016-10-26

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.
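
    Although the framework itself is conceptual, the prevalence quantity it visualises can be illustrated simply: simulate many environmental scenarios, evaluate the modelled endpoint in each, and report the fraction of scenarios in which an effect threshold is exceeded. The variables, effect model, and thresholds below are invented for illustration.

```python
# Invented illustration of a prevalence computation over simulated scenarios
# combining an ecological factor and a chemical stressor.
import numpy as np

rng = np.random.default_rng(5)
n = 10000
temperature = rng.normal(15.0, 4.0, n)             # ecological factor
exposure = rng.lognormal(0.0, 0.5, n)              # chemical stressor
effect = 0.05 * exposure * np.exp(0.1 * (temperature - 15.0))  # toy endpoint

for threshold in (0.05, 0.10, 0.20):
    prevalence = np.mean(effect > threshold)       # fraction of scenarios at risk
    print(f"threshold {threshold:.2f}: prevalence {prevalence:.1%}")
```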

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babun, Leonardo; Aksu, Hidayet; Uluagac, A. Selcuk

    The core vision of the smart grid concept is the realization of reliable two-way communications between smart devices (e.g., IEDs, PLCs, PMUs). The benefits of the smart grid also come with tremendous security risks and new challenges in protecting the smart grid systems from cyber threats. In particular, the use of untrusted counterfeit smart grid devices represents a real problem. The consequences of propagating false or malicious data, as well as of stealing valuable user or smart grid state information from counterfeit devices, are costly. Hence, early detection of counterfeit devices is critical for protecting smart grid components and users. To address these concerns, in this poster, we introduce our initial design of a configurable framework that utilizes system call tracing, library interposition, and statistical techniques for monitoring and detection of counterfeit smart grid devices. In our framework, we consider six different counterfeit device scenarios with different smart grid devices and adversarial settings. Our initial results on a realistic testbed utilizing actual smart grid GOOSE messages with the IEC 61850 communication protocol are very promising. Our framework shows excellent detection rates for counterfeit smart grid devices.
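
    To give the statistical component a concrete shape: one simple detector compares a device's system-call frequency profile against a trusted baseline with a chi-square goodness-of-fit test, as sketched below. The call categories, counts, and significance level are invented; the poster's actual features and thresholds are not described in the record.

```python
# Invented illustration: flag a device whose system-call frequency profile
# deviates from a trusted baseline (chi-square goodness-of-fit test).
import numpy as np
from scipy.stats import chisquare

baseline = np.array([500, 300, 150, 50])    # trusted counts per call category
observed = np.array([420, 280, 160, 140])   # counts observed on the device

expected = baseline / baseline.sum() * observed.sum()  # match total counts
stat, p = chisquare(observed, f_exp=expected)
print("counterfeit suspected" if p < 0.01 else "profile consistent", f"p={p:.2g}")
```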

  11. Crops in silico: A community wide multi-scale computational modeling framework of plant canopies

    NASA Astrophysics Data System (ADS)

    Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.

    2016-12-01

    Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increases are insufficient to meet future projected food demand. Furthermore, with projected reduction in arable land, decrease in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole-plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an interconnected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high-performance and parallel computing. We are currently designing a user-friendly interface that will make this tool equally accessible to biologists and computer scientists. Critically, this framework will provide the community with much-needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole-plant behavior, and improved prediction of plant and ecosystem responses to the environment.

  12. The Cost-Income Component of Program Evaluation.

    ERIC Educational Resources Information Center

    Miner, Norris

    Cost-income studies are designed to serve two functions in instructional program evaluation. First, they act as the indicator of the economic value of a program. This economic value in conjunction with the other educational values needed in program evaluation allow for the most realistic appraisal of program worth. Second, if the studies show a…

  13. Mathematical modeling in realistic mathematics education

    NASA Astrophysics Data System (ADS)

    Riyanto, B.; Zulkardi; Putri, R. I. I.; Darmawijoyo

    2017-12-01

    The purpose of this paper is to produce mathematical modelling in Realistic Mathematics Education for junior high school. This study used development research consisting of three stages, namely analysis, design and evaluation. The success criteria of this study were obtained in the form of a local instruction theory for school mathematical modelling learning which was valid and practical for students. The data were analyzed using a descriptive analysis method as follows: (1) walk-through analysis based on the expert comments in the expert review, to obtain a Hypothetical Learning Trajectory for valid mathematical modelling learning; (2) analysis of the results of the review in one-to-one and small-group sessions, to gauge practicality. Based on the expert validation and the students' opinions and answers, the obtained mathematical modelling problems in Realistic Mathematics Education were valid and practical.

  14. The three stages of building and testing mid-level theories in a realist RCT: a theoretical and methodological case-example.

    PubMed

    Jamal, Farah; Fletcher, Adam; Shackleton, Nichola; Elbourne, Diana; Viner, Russell; Bonell, Chris

    2015-10-15

    Randomised controlled trials (RCTs) of social interventions are often criticised as failing to open the 'black box' whereby they only address questions about 'what works' without explaining the underlying processes of implementation and mechanisms of action, and how these vary by contextual characteristics of person and place. Realist RCTs are proposed as an approach to evaluation science that addresses these gaps while preserving the strengths of RCTs in providing evidence with strong internal validity in estimating effects. In the context of growing interest in designing and conducting realist trials, there is an urgent need for a worked example to provide guidance on how such an approach might be practically taken forward. The aim of this paper is to outline a three-staged theoretical and methodological process of undertaking a realist RCT using the example of the evaluation of a whole-school restorative intervention aiming to reduce aggression and bullying in English secondary schools. First, informed by the findings of our initial pilot trial and sociological theory, we elaborate our theory of change and specify a priori hypotheses about how intervention mechanisms interact with context to produce outcomes. Second, we describe how we will use emerging findings from the integral process evaluation within the RCT to refine, and add to, these a priori hypotheses before the collection of quantitative follow-up data. Third, we will test our hypotheses using a combination of process and outcome data via quantitative analyses of effect mediation (examining mechanisms) and moderation (examining contextual contingencies). The results are then used to refine and further develop the theory of change. The aim of the realist RCT approach is thus not merely to assess whether the intervention is effective or not, but to develop empirically informed mid-range theory through a three-stage process. There are important implications for those involved with reporting and reviewing RCTs, including the use of new, iterative protocols. Current Controlled Trials ISRCTN10751359 (Registered 11 March 2014).
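
    The third-stage moderation analysis mentioned above commonly reduces to testing an intervention-by-context interaction term in a regression on the trial outcome, as in the sketch below; variable names and effect sizes are invented, and the trial's actual models may differ.

```python
# Invented illustration of a moderation test: regress the outcome on trial
# arm, a contextual covariate, and their interaction; the interaction
# coefficient estimates how context moderates the intervention effect.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 800
treat = rng.integers(0, 2, n)                 # trial arm (0/1)
context = rng.standard_normal(n)              # e.g., baseline school climate
outcome = 0.3 * treat + 0.2 * context + 0.25 * treat * context \
          + rng.standard_normal(n)

X = sm.add_constant(np.column_stack([treat, context, treat * context]))
fit = sm.OLS(outcome, X).fit()
print(fit.params)   # last coefficient: estimated treatment-context interaction
```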

  15. Frameworks for evaluating health research capacity strengthening: a qualitative study

    PubMed Central

    2013-01-01

    Background Health research capacity strengthening (RCS) projects are often complex and hard to evaluate. In order to inform health RCS evaluation efforts, we aimed to describe and compare key characteristics of existing health RCS evaluation frameworks: their process of development, purpose, target users, structure, content and coverage of important evaluation issues. A secondary objective was to explore what use had been made of the ESSENCE framework, which attempts to address one such issue: harmonising the evaluation requirements of different funders. Methods We identified and analysed health RCS evaluation frameworks published by seven funding agencies between 2004 and 2012, using a mixed methods approach involving structured qualitative analyses of documents, a stakeholder survey and consultations with key contacts in health RCS funding agencies. Results The frameworks were intended for use predominantly by the organisations themselves, and most were oriented primarily towards funders’ internal organisational performance requirements. The frameworks made limited reference to theories that specifically concern RCS. Generic devices, such as logical frameworks, were typically used to document activities, outputs and outcomes, but with little emphasis on exploring underlying assumptions or contextual constraints. Usage of the ESSENCE framework appeared limited. Conclusions We believe that there is scope for improving frameworks through the incorporation of more accessible information about how to do evaluation in practice; greater involvement of stakeholders, following evaluation capacity building principles; greater emphasis on explaining underlying rationales of frameworks; and structuring frameworks so that they separate generic and project-specific aspects of health RCS evaluation. The third and fourth of these improvements might assist harmonisation. PMID:24330628

  16. Understanding the motivation and performance of community health volunteers involved in the delivery of health programmes in Kampala, Uganda: a realist evaluation protocol.

    PubMed

    Vareilles, Gaëlle; Pommier, Jeanine; Kane, Sumit; Pictet, Gabriel; Marchal, Bruno

    2015-01-28

    The recruitment of community health volunteers to support the delivery of health programmes is a well-established approach in many countries, particularly where health services are not readily available. However, studies on management of volunteers are scarce and current research on human resource management of volunteers faces methodological challenges. This paper presents the protocol of a realist evaluation that aims at identifying the factors influencing the performance of community health volunteers involved in the delivery of a Red Cross immunisation programme in Kampala (Uganda) with a specific focus on motivation. The realist evaluation cycle structures the protocol. To develop the theoretical basis for the evaluation, the authors conducted interviews and reviewed the literature on community health volunteers' performance, management and organisational behaviour. This led to the formulation of the initial programme theory, which links the intervention inputs (capacity-building strategies) to the expected outcomes (positive work behaviour) with mechanisms that point in the direction of drivers of motivation. The contextual elements include components such as organisational culture, resource availability, etc. A case study design will be adopted. We define a case as a Red Cross branch, run by a programme manager, and will select two cases at the district level in Kampala. Mixed methods will be used in data collection, including individual interviews of volunteers, participant observation and document review. The thematic analysis will be based on the initial programme theory and will seek for context-mechanism-outcome configurations. Findings from the two cases will be compared. We discuss the scope for applying realist evaluation and the methodological challenges we encountered in developing this protocol. The study was approved by the Ethical Committee at Rennes University Hospital, France. Results will be published in scientific journals, and communicated to respondents and relevant institutions. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  17. A moist Boussinesq shallow water equations set for testing atmospheric models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zerroukat, M., E-mail: mohamed.zerroukat@metoffice.gov.uk; Allen, T.

    The shallow water equations have long been used as an initial test for numerical methods applied to atmospheric models, with the test suite of Williamson et al. being used extensively for validating new schemes and assessing their accuracy. However, the lack of physics forcing within this simplified framework often requires numerical techniques to be reworked when applied to fully three-dimensional models. In this paper a novel two-dimensional shallow water equations system that retains moist processes is derived. This system is derived from a three-dimensional Boussinesq approximation of the hydrostatic Euler equations where, unlike the classical shallow water set, we allow the density to vary slightly with temperature. This results in extra (or buoyancy) terms for the momentum equations, through which a two-way moist-physics dynamics feedback is achieved. The temperature and moisture variables are advected as separate tracers with sources that interact with the mean flow through a simplified yet realistic bulk moist-thermodynamic phase-change model. This moist shallow water system provides a unique tool to assess the usually complex and highly non-linear dynamics–physics interactions in atmospheric models in a simple yet realistic way. The full non-linear shallow water equations are solved numerically on several case studies and the results suggest quite realistic interaction between the dynamics and physics and in particular the generation of cloud and rain. Highlights:
    • Novel shallow water equations which retain moist processes are derived from the three-dimensional hydrostatic Boussinesq equations.
    • The new shallow water set can be seen as a more general one, where the classical equations are a special case.
    • This moist shallow water system naturally allows a feedback mechanism from the moist physics increments to the momentum via buoyancy.
    • As in full models, temperature and moisture are advected as tracers that interact through a simplified yet realistic phase-change model.
    • This model is a unique tool to test numerical methods for atmospheric models, and physics–dynamics coupling, in a very realistic and simple way.
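
    A schematic of such a system (signs, buoyancy coupling and phase-change closure shown indicatively, not as the paper's exact derivation) can be written as:

```latex
% Indicative moist shallow water system: momentum, mass, and two tracers
% (temperature theta and moisture q) with a relaxation-type condensation term.
\begin{align}
  \frac{D\mathbf{u}}{Dt} &= -g\,\nabla h + \text{(buoyancy terms from } \theta \text{)}, &
  \frac{Dh}{Dt} &= -h\,\nabla\cdot\mathbf{u},\\
  \frac{D\theta}{Dt} &= -L\,S_q, &
  \frac{Dq}{Dt} &= S_q, \qquad
  S_q = -\frac{\max\bigl(0,\; q - q_s(\theta)\bigr)}{\tau},
\end{align}
```

    so that super-saturated moisture condenses over a timescale τ, heating the temperature tracer and feeding back on the momentum through the buoyancy terms.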

  18. The GENIE Neutrino Monte Carlo Generator: Physics and User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andreopoulos, Costas; Barry, Christopher; Dytman, Steve

    2015-10-20

    GENIE is a suite of products for the experimental neutrino physics community. This suite includes i) a modern software framework for implementing neutrino event generators, a state-of-the-art comprehensive physics model and tools to support neutrino interaction simulation for realistic experimental setups (the Generator product), ii) extensive archives of neutrino, charged-lepton and hadron scattering data and software to produce a comprehensive set of data/MC comparisons (the Comparisons product), and iii) a generator tuning framework and fitting applications (the Tuning product). This book provides the definitive guide for the GENIE Generator: it presents the software architecture and a detailed description of its physics model and official tunes. In addition, it provides a rich set of data/MC comparisons that characterise the physics performance of GENIE. Detailed step-by-step instructions on how to install and configure the Generator, run its applications and analyze its outputs are also included.

  19. Integrated Agent-Based and Production Cost Modeling Framework for Renewable Energy Studies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallo, Giulia

    The agent-based framework for renewable energy studies (ARES) is an integrated approach that adds an agent-based model of industry actors to PLEXOS and combines the strengths of the two to overcome their individual shortcomings. It can examine existing and novel wholesale electricity markets under high penetrations of renewables. ARES is demonstrated by studying how increasing levels of wind will impact the operations and the exercise of market power of generation companies that exploit an economic withholding strategy. The analysis is carried out on a test system that represents the Electric Reliability Council of Texas energy-only market in the year 2020. The results more realistically reproduce the operations of an energy market under different and increasing penetrations of wind, and ARES can be extended to address pressing issues in current and future wholesale electricity markets.
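
    Economic withholding means offering capacity at prices above marginal cost (or offering less of it) to raise the clearing price. A toy sketch of such an agent rule follows; the behavioural rule and all numbers are invented for illustration and are not the ARES implementation, which couples such agents to a PLEXOS production cost model:

```python
def withheld_offer(marginal_cost, markup, capacity_mw, withhold_frac):
    """Toy economic-withholding bid: offer only part of the unit's capacity,
    priced above marginal cost. Not the ARES implementation."""
    offered_mw = capacity_mw * (1.0 - withhold_frac)
    offer_price = marginal_cost * (1.0 + markup)
    return offered_mw, offer_price

# Assumed behavioural rule: the markup grows with wind penetration as
# conventional units try to recover revenue lost to zero-marginal-cost wind.
for wind_penetration in (0.1, 0.3, 0.5):
    mw, price = withheld_offer(30.0, 0.2 + wind_penetration, 400.0, 0.25)
    print(f"wind={wind_penetration:.0%}: offer {mw:.0f} MW at ${price:.2f}/MWh")
```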

  20. Integrated Agent-Based and Production Cost Modeling Framework for Renewable Energy Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallo, Giulia

    The agent-based framework for renewable energy studies (ARES) is an integrated approach that adds an agent-based model of industry actors to PLEXOS and combines the strengths of the two to overcome their individual shortcomings. It can examine existing and novel wholesale electricity markets under high penetrations of renewables. ARES is demonstrated by studying how increasing levels of wind will impact the operations and the exercise of market power of generation companies that exploit an economic withholding strategy. The analysis is carried out on a test system that represents the Electric Reliability Council of Texas energy-only market in the year 2020. The results more realistically reproduce the operations of an energy market under different and increasing penetrations of wind, and ARES can be extended to address pressing issues in current and future wholesale electricity markets.

  1. On Market-Based Coordination of Thermostatically Controlled Loads With User Preference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Sen; Zhang, Wei; Lian, Jianming

    2014-12-15

    This paper presents a market-based control framework to coordinate a group of autonomous Thermostatically Controlled Loads (TCLs) to achieve system-level objectives with pricing incentives. The problem is formulated as maximizing the social welfare subject to a feeder power constraint. It allows the coordinator to affect the aggregated power of a group of dynamical systems, and creates an interactive market where the users and the coordinator cooperatively determine the optimal energy allocation and energy price. The optimal pricing strategy is derived, which maximizes social welfare while respecting the feeder power constraint. The bidding strategy is also designed to compute the optimal price in real time (e.g., every 5 minutes) based on local device information. The coordination framework is validated with realistic simulations in GridLab-D. Extensive simulation results demonstrate that the proposed approach effectively maximizes the social welfare and decreases power congestion at key times.
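
    A minimal sketch of the kind of constrained clearing such a coordinator performs each market interval follows. This is a toy greedy auction, not the paper's derivation, and the bid values are invented:

```python
def clear_market(bids, feeder_limit_kw):
    """Toy market clearing: allocate TCL demand bids (price, kW) in
    descending price order until the feeder constraint binds; the
    marginal accepted bid sets the clearing price."""
    allocated, total = [], 0.0
    for price, kw in sorted(bids, reverse=True):
        take = min(kw, feeder_limit_kw - total)
        if take <= 0:
            break
        allocated.append((price, take))
        total += take
    clearing_price = allocated[-1][0] if allocated else 0.0
    return clearing_price, total

# Three devices bid their willingness to pay ($/kWh) and desired power (kW)
price, served = clear_market([(0.12, 5.0), (0.10, 4.0), (0.08, 6.0)],
                             feeder_limit_kw=8.0)
print(price, served)  # -> 0.10 8.0: load capped at the feeder limit
```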

  2. A survey on adaptive engine technology for serious games

    NASA Astrophysics Data System (ADS)

    Rasim; Langi, Armein Z. R.; Munir; Rosmansyah, Yusep

    2016-02-01

    Serious games have become a priceless tool in learning because they can make abstract concepts appear more realistic through simulation. The problem faced is that players differ in their ability to play the games. This causes players to become frustrated if a game is too difficult, or bored if it is too easy. Serious games contain non-player characters (NPCs). The NPCs should be able to adapt to the players so that the players feel comfortable playing the games. Because of this, serious game development must involve an adaptive engine: a learning machine that can adapt to different players. The development of adaptive engines can be viewed in terms of frameworks and algorithms. Frameworks include those based on rules, plans, organisation descriptions, player proficiency, and learning style and cognitive state. Algorithms include agent-based and non-agent-based approaches.

  3. OpenMDAO: Framework for Flexible Multidisciplinary Design, Analysis and Optimization Methods

    NASA Technical Reports Server (NTRS)

    Heath, Christopher M.; Gray, Justin S.

    2012-01-01

    The OpenMDAO project is underway at NASA to develop a framework which simplifies the implementation of state-of-the-art tools and methods for multidisciplinary design, analysis and optimization. Foremost, OpenMDAO has been designed to handle variable problem formulations, encourage reconfigurability, and promote model reuse. This work demonstrates the concept of iteration hierarchies in OpenMDAO to achieve a flexible environment for supporting advanced optimization methods which include adaptive sampling and surrogate modeling techniques. In this effort, two efficient global optimization methods were applied to solve a constrained, single-objective and constrained, multiobjective version of a joint aircraft/engine sizing problem. The aircraft model, NASA's next-generation advanced single-aisle civil transport, is being studied as part of the Subsonic Fixed Wing project to help meet simultaneous program goals for reduced fuel burn, emissions, and noise. This analysis serves as a realistic test problem to demonstrate the flexibility and reconfigurability offered by OpenMDAO.
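
    For readers unfamiliar with the framework, here is a minimal constrained single-objective problem in the current OpenMDAO API. Note the API has evolved considerably since the 2012-era code described in this record, and this generic paraboloid sketch stands in for, and is much simpler than, the aircraft/engine sizing model:

```python
import openmdao.api as om

prob = om.Problem()
# Two simple components sharing promoted inputs x and y
prob.model.add_subsystem(
    'parab', om.ExecComp('f = (x - 3.0)**2 + x*y + (y + 4.0)**2 - 3.0'),
    promotes=['*'])
prob.model.add_subsystem('con', om.ExecComp('g = x + y'), promotes=['*'])

prob.model.add_design_var('x', lower=-50.0, upper=50.0)
prob.model.add_design_var('y', lower=-50.0, upper=50.0)
prob.model.add_objective('f')
prob.model.add_constraint('g', upper=10.0)

prob.driver = om.ScipyOptimizeDriver(optimizer='SLSQP')
prob.setup()
prob.run_driver()
print(prob.get_val('x'), prob.get_val('y'), prob.get_val('f'))
```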

  4. Collapsing radiating stars with various equations of state

    NASA Astrophysics Data System (ADS)

    Brassel, Byron P.; Goswami, Rituparno; Maharaj, Sunil D.

    2017-06-01

    We study the gravitational collapse of radiating stars in the context of the cosmic censorship conjecture. We consider a generalized Vaidya spacetime with three concentric regions. The local internal atmosphere is a two-component system consisting of standard pressure-free null radiation and an additional string fluid with energy density and nonzero pressure obeying all physically realistic energy conditions. The middle region is purely radiative and matches to a third region, the Schwarzschild exterior. We outline the general mathematical framework to study the conditions on the mass function so that future-directed nonspacelike geodesics can terminate at the singularity in the past. Mass functions for several equations of state are analyzed using this framework and it is shown that the collapse in each case terminates at a locally naked central singularity. We calculate the strength of these singularities to show that they are strong curvature singularities, which implies that no extension of spacetime through them is possible.
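
    The generalized Vaidya geometry referred to here is conventionally written with a mass function m(v, r) of the null coordinate v and the areal radius r; a standard form of the line element (sign conventions assumed) is:

```latex
\begin{equation}
  ds^{2} = -\left(1 - \frac{2m(v,r)}{r}\right)dv^{2} + 2\,dv\,dr
           + r^{2}\left(d\theta^{2} + \sin^{2}\theta\,d\phi^{2}\right).
\end{equation}
```

    Choosing an equation of state fixes the form of m(v, r), and the existence of future-directed nonspacelike geodesics emerging from the v = 0, r = 0 singularity is what makes it locally naked.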

  5. Implicit solvation model for density-functional study of nanocrystal surfaces and reaction pathways

    NASA Astrophysics Data System (ADS)

    Mathew, Kiran; Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Arias, T. A.; Hennig, Richard G.

    2014-02-01

    Solid-liquid interfaces are at the heart of many modern-day technologies and provide a challenge to many materials simulation methods. A realistic first-principles computational study of such systems entails the inclusion of solvent effects. In this work, we implement an implicit solvation model that has a firm theoretical foundation into the widely used density-functional code VASP (the Vienna Ab initio Simulation Package). The implicit solvation model follows the framework of joint density functional theory. We describe the framework, our algorithm and implementation, and benchmarks for small molecular systems. We apply the solvation model to study the surface energies of different facets of semiconducting and metallic nanocrystals and the SN2 reaction pathway. We find that solvation reduces the surface energies of the nanocrystals, especially for the semiconducting ones, and increases the energy barrier of the SN2 reaction.
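
    At the core of such implicit solvation models is an electrostatic problem in which the dielectric responds to the solute's electron density. A schematic form, following the general joint density functional picture rather than the paper's exact equations, is:

```latex
\begin{equation}
  \nabla\cdot\big[\,\epsilon\big(n(\mathbf{r})\big)\,\nabla\phi(\mathbf{r})\,\big]
  = -4\pi\,\rho_{\mathrm{solute}}(\mathbf{r}),
\end{equation}
```

    where the local dielectric function ε interpolates smoothly from 1 inside the solute, where the electron density n(r) is high, to the bulk solvent permittivity outside; φ is the resulting electrostatic potential.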

  6. Greenhouse Development Rights. An approach to the global climate regime that takes climate protection seriously while also preserving the right to human development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Athanasiou, T.; Kartha, S.; Baer, P.

    2006-11-15

    This brief paper introduces a new approach to the global climate regime, one designed to recognize the urgency of the climate crisis while at the same time embracing the fundamental right to human development. This 'Greenhouse Development Rights' approach is not primarily defended on ethical grounds. Its core justification, rather, is a realistic one: our claim is that this approach, or something like it, is needed if we're to break the global impasse over developmental equity in a climate-constrained world. We put forward this new approach not because we believe that it will be readily adopted as the foundation of the post-2012 regime. Rather, we intend it as a standard of comparison, a reference framework that marks out the steps that must be part of an effective climate regime, while refusing to prejudge which of them will or will not ultimately be deemed politically acceptable. Against this reference framework, given regime proposals can be measured to determine how realistic they are, from the standpoint of genuinely addressing the North/South impasse and having a chance of preventing a climate catastrophe. The climate crisis, as most everyone in the climate community knows, is upon us. Still, the pace of our response has been profoundly inadequate, so this paper will begin with the blunt truth. The science now tells us that we're pushing beyond mere 'dangerous anthropogenic interference with the climate system', and are rather on the verge of committing to catastrophic interference.

  7. A novel Lagrangian approach for the stable numerical simulation of fault and fracture mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franceschini, Andrea; Ferronato, Massimiliano, E-mail: massimiliano.ferronato@unipd.it; Janna, Carlo

    The simulation of the mechanics of geological faults and fractures is of paramount importance in several applications, such as ensuring the safety of the underground storage of wastes and hydrocarbons or predicting the possible seismicity triggered by the production and injection of subsurface fluids. However, the stable numerical modeling of ground ruptures is still an open issue. The present work introduces a novel formulation based on the use of Lagrange multipliers to prescribe the constraints on the contact surfaces. The variational formulation is modified in order to take into account the frictional work along the activated fault portion according to the principle of maximum plastic dissipation. The numerical model, developed in the framework of the Finite Element method, provides stable solutions with a fast convergence of the non-linear problem. The stabilizing properties of the proposed model are emphasized with the aid of a realistic numerical example dealing with the generation of ground fractures due to groundwater withdrawal in arid regions. - Highlights: • A numerical model is developed for the simulation of fault and fracture mechanics. • The model is implemented in the framework of the Finite Element method and with the aid of Lagrange multipliers. • The proposed formulation introduces a new contribution due to the frictional work on the portion of activated fault. • The resulting algorithm is highly non-linear as the portion of activated fault is itself unknown. • The numerical solution is validated against analytical results and proves to be stable also in realistic applications.
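
    Schematically, a Lagrange multiplier treatment of fault contact augments the elastic energy with a constraint term on the fault surface Γ. The form below is a generic sketch of such a formulation, not the paper's exact functional, with the frictional-work contribution the paper adds shown as W_f (notation assumed):

```latex
\begin{equation}
  \mathcal{L}(\mathbf{u},\boldsymbol{\lambda})
  = \Pi(\mathbf{u})
  + \int_{\Gamma} \boldsymbol{\lambda}\cdot\mathbf{g}(\mathbf{u})\,d\Gamma
  + W_{f}(\mathbf{u}),
\end{equation}
```

    where u is the displacement field, g(u) the relative displacement (gap and slip) across the fault, and the multipliers λ play the role of contact tractions. Stationarity of L enforces the contact constraints, while the friction term dissipates work on the activated portion of the fault, which is itself an unknown of the non-linear problem.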

  8. The effectiveness and cost-effectiveness of shared care: protocol for a realist review.

    PubMed

    Hardwick, Rebecca; Pearson, Mark; Byng, Richard; Anderson, Rob

    2013-02-12

    Shared care (an enhanced information exchange over and above routine outpatient letters) is commonly used to improve care coordination and communication between a specialist and primary care services for people with long-term conditions. Evidence of the effectiveness and cost-effectiveness of shared care is mixed. Informed decision-making for targeting shared care requires a greater understanding of how it works, for whom it works, in what contexts and why. This protocol outlines how realist review methods can be used to synthesise evidence on shared care for long-term conditions. A further aim of the review is to explore economic evaluations of shared care. Economic evaluations are difficult to synthesise due to problems in accounting for contextual differences that impact on resource use and opportunity costs. Realist review methods have been suggested as a way to overcome some of these issues, so this review will also assess whether realist review methods are amenable to synthesising economic evidence. Database and web searching will be carried out in order to find relevant evidence to develop and test programme theories about how shared care works. The review will have two phases. Phase 1 will concentrate on the contextual conditions and mechanisms that influence how shared care works, in order to develop programme theories, which partially explain how it works. Phase 2 will focus on testing these programme theories. A Project Reference Group made up of health service professionals and people with actual experience of long-term conditions will be used to ground the study in real-life experience. Review findings will be disseminated through local and sub-national networks for integrated care and long-term conditions. This realist review will explore why and for whom shared care works, in order to support decision-makers working to improve the effectiveness of care for people outside hospital. The development of realist review methods to take into account cost and cost-effectiveness evidence is particularly innovative and challenging, and if successful will offer a new approach to synthesising economic evidence. This systematic review protocol is registered on the PROSPERO database (registration number: CRD42012002842).

  9. Physics process level discrimination of detections for GATE: assessment of contamination in SPECT and spurious activity in PET.

    PubMed

    De Beenhouwer, Jan; Staelens, Steven; Vandenberghe, Stefaan; Verhaeghe, Jeroen; Van Holen, Roel; Rault, Erwann; Lemahieu, Ignace

    2009-04-01

    The GEANT4 application for tomographic emission (GATE) is one of the most detailed Monte Carlo simulation tools for SPECT and PET. It allows for realistic phantoms, complex decay schemes, and a large variety of detector geometries. However, only a fraction of the information in each particle history is available for postprocessing. In order to extend the analysis capabilities of GATE, a flexible framework was developed. This framework allows all detected events to be subdivided according to their type: in PET, true coincidences from others, and in SPECT, geometrically collimated photons from others. The authors' framework can be applied to any isotope, phantom, and detector geometry available in GATE. It is designed to enhance the usability of GATE for the study of contamination and for the investigation of the properties of current and future prototype detectors. The authors apply the framework to a case study of Bexxar, first assuming labeling with 124I, then with 131I. It is shown that with 124I PET, results with an optimized window improve upon those with the standard window but achieve less than half of the ideal improvement. Nevertheless, 124I PET shows improved resolution compared to 131I SPECT with triple-energy-window scatter correction.
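
    As an illustration of what process-level discrimination means in practice, here is a toy post-processing classifier for PET coincidences. The field names are hypothetical stand-ins for the per-photon history information such a framework exposes, not GATE's actual output schema:

```python
def classify_coincidence(hit1, hit2):
    """Toy process-level classifier for a PET coincidence: a coincidence is
    'true' only if both photons come from the same annihilation event and
    neither Compton-scattered before detection."""
    same_annihilation = hit1["event_id"] == hit2["event_id"]
    unscattered = hit1["n_compton"] == 0 and hit2["n_compton"] == 0
    if same_annihilation and unscattered:
        return "true"
    return "scattered" if same_annihilation else "random"

# Two photons from the same decay, one of which scattered in the phantom
print(classify_coincidence({"event_id": 7, "n_compton": 0},
                           {"event_id": 7, "n_compton": 1}))  # -> "scattered"
```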

  10. Clinical simulation practise framework.

    PubMed

    Khalili, Hossein

    2015-02-01

    Historically, simulation has mainly been used to teach students hands-on skills in a relatively safe environment. With changes in the patient population, professional regulations and clinical environments, clinical simulation practise (CSP) must assist students to integrate and apply their theoretical knowledge and skills with their critical thinking, clinical judgement, prioritisation, problem solving, decision making, and teamwork skills to provide holistic care and treatment to their patients. CSP holds great potential to derive a positive transformation in students' transition into the workplace, by associating and consolidating learning from classrooms to clinical settings, and creating bridges between theory and practice. For CSP to be successful in filling the gap, the design and management of the simulation is crucial. In this article a new framework called 'Clinical simulation practise framework: A knowledge to action strategy in health professional education' is being introduced that aims to assist educators and curriculum developers in designing and managing their simulations. This CSP framework theorises that simulation as an experiential educational tool could improve students' competence, confidence and collaboration in performing professional practice in real settings if the CSP provides the following three dimensions: (1) a safe, positive, reflective and fun simulated learning environment; (2) challenging, but realistic, and integrated simulated scenarios; and (3) interactive, inclusive, interprofessional patient-centred simulated practise. © 2015 John Wiley & Sons Ltd.

  11. Praxis and reflexivity for interprofessional education: towards an inclusive theoretical framework for learning.

    PubMed

    Hutchings, Maggie; Scammell, Janet; Quinney, Anne

    2013-09-01

    While there is growing evidence of theoretical perspectives adopted in interprofessional education, learning theories tend to foreground the individual, focusing on psycho-social aspects of individual differences and professional identity to the detriment of considering social-structural factors at work in social practices. Conversely socially situated practice is criticised for being context-specific, making it difficult to draw generalisable conclusions for improving interprofessional education. This article builds on a theoretical framework derived from earlier research, drawing on the dynamics of Dewey's experiential learning theory and Archer's critical realist social theory, to make a case for a meta-theoretical framework enabling social-constructivist and situated learning theories to be interlinked and integrated through praxis and reflexivity. Our current analysis is grounded in an interprofessional curriculum initiative mediated by a virtual community peopled by health and social care users. Student perceptions, captured through quantitative and qualitative data, suggest three major disruptive themes, creating opportunities for congruence and disjuncture and generating a model of zones of interlinked praxis associated with professional differences and identity, pedagogic strategies and technology-mediated approaches. This model contributes to a framework for understanding the complexity of interprofessional learning and offers bridges between individual and structural factors for engaging with the enablements and constraints at work in communities of practice and networks for interprofessional education.

  12. The nature of nurture and the future of evodevo: toward a theory of developmental evolution.

    PubMed

    Moczek, Armin P

    2012-07-01

    This essay has three parts. First, I posit that much research in contemporary evodevo remains steeped in a traditional framework that views traits and trait differences as being caused by genes and genetic variation, and the environment as providing an external context in which development and evolution unfold. Second, I discuss three attributes of organismal development and evolution, broadly applicable to all organisms and traits that call into question the usefulness of gene- and genome-centric views of development and evolution. I then focus on the third and main aim of this essay and ask: what conceptual and empirical opportunities exist that would permit evodevo research to transcend the traditional boundaries inherited from its parent disciplines and to move toward the development of a more comprehensive and realistic theory of developmental evolution? Here, I focus on three conceptual frameworks, the theory of facilitated variation, the theory of evolution by genetic accommodation, and the theory of niche construction. I conclude that combined they provide a rich, interlocking framework within which to revise existing and develop novel empirical approaches toward a better understanding of the nature of developmental evolution. Examples of such approaches are highlighted, and the consequences of expanding existing frameworks are discussed.

  13. The role of health impact assessment in Phase V of the Healthy Cities European Network.

    PubMed

    Simos, Jean; Spanswick, Lucy; Palmer, Nicola; Christie, Derek

    2015-06-01

    Health impact assessment (HIA) is a prospective decision-making aid tool that aims to improve the quality of policies, programmes or projects through recommendations that promote health. It identifies how and through which pathways a decision can impact a wide range of health determinants and seeks to define the distribution of effects within populations, thereby raising the issue of equity. HIA was introduced to the WHO European Healthy Cities Network as one of its four core themes during the Phase IV (2004-08). Here we present an evaluation of the use of HIA during Phase V (2009-13), where HIA was linked with the overarching theme of health and health equity in all local policies and a requirement regarding capacity building. The evaluation was based on 10 case studies contributed by 9 Healthy Cities in five countries (France, Hungary, Italy, Spain and the UK). A Realist Evaluation framework was used to collect and aggregate data obtained through three methods: an HIA factors analysis, a case-study template analysis using Nvivo software and a detailed questionnaire. The main conclusion is that HIA significantly helps promote Health in All Policies (HiAP) and sustainability in Healthy Cities. It is recommended that all Healthy City candidates to Phase VI (2014-18) of the WHO Healthy Cities European Network effectively adopt HIA and HiAP. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Evaluating Multispectral Snowpack Reflectivity With Changing Snow Correlation Lengths

    NASA Technical Reports Server (NTRS)

    Kang, Do Hyuk; Barros, Ana P.; Kim, Edward J.

    2016-01-01

    This study investigates the sensitivity of multispectral reflectivity to changing snow correlation lengths. Matzler's ice-lamellae radiative transfer model was implemented and tested to evaluate the reflectivity of snow correlation lengths at multiple frequencies from the ultraviolet (UV) to the microwave bands. The model reveals that, in the UV to infrared (IR) frequency range, the reflectivity and correlation length are inversely related, whereas reflectivity increases with snow correlation length in the microwave frequency range. The model further shows that the reflectivity behavior can be mainly attributed to scattering rather than absorption for shallow snowpacks. The largest scattering coefficients and reflectivity occur at very small correlation lengths (approximately 10^-5 m) for frequencies higher than the IR band. In the microwave range, the largest scattering coefficients are found at millimeter wavelengths. For validation purposes, the ice-lamella model is coupled with a multilayer snow physics model to characterize the reflectivity response of realistic snow hydrological processes. The evolution of the coupled model simulated reflectivities in both the visible and the microwave bands is consistent with satellite-based reflectivity observations in the same frequencies. The model results are also compared with colocated in situ snow correlation length measurements (Cold Land Processes Field Experiment 2002-2003). The analysis and evaluation of model results indicate that the coupled multifrequency radiative transfer and snow hydrology modeling system can be used as a forward operator in a data-assimilation framework to predict the status of snow physical properties, including snow correlation length.

  15. Implementation and evaluation of a simulation curriculum for paediatric residency programs including just-in-time in situ mock codes

    PubMed Central

    Sam, Jonathan; Pierse, Michael; Al-Qahtani, Abdullah; Cheng, Adam

    2012-01-01

    OBJECTIVE: To develop, implement and evaluate a simulation-based acute care curriculum in a paediatric residency program using an integrated and longitudinal approach. DESIGN: Curriculum framework consisting of three modular, year-specific courses and longitudinal just-in-time, in situ mock codes. SETTING: Paediatric residency program at BC Children’s Hospital, Vancouver, British Columbia. INTERVENTIONS: The three year-specific courses focused on the critical first 5 min, complex medical management and crisis resource management, respectively. The just-in-time in situ mock codes simulated the acute deterioration of an existing ward patient, prepared the actual multidisciplinary code team, and primed the surrounding crisis support systems. Each curriculum component was evaluated with surveys using a five-point Likert scale. RESULTS: A total of 40 resident surveys were completed after each of the modular courses, and an additional 28 surveys were completed for the overall simulation curriculum. The highest Likert scores were for hands-on skill stations, immersive simulation environment and crisis resource management teaching. Survey results also suggested that just-in-time mock codes were realistic, reinforced learning, and prepared ward teams for patient deterioration. CONCLUSIONS: A simulation-based acute care curriculum was successfully integrated into a paediatric residency program. It provides a model for integrating simulation-based learning into other training programs, as well as a model for any hospital that wishes to improve paediatric resuscitation outcomes using just-in-time in situ mock codes. PMID:23372405

  16. SearchLight: a freely available web-based quantitative spectral analysis tool (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Prabhat, Prashant; Peet, Michael; Erdogan, Turan

    2016-03-01

    In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength dependent variables - even the excitation light sources have been changing. For example, LED Light Engines may have a significantly different spectral response compared to the traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of the system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that can be used to address the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. This tool is available at: http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexing imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
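
    The underlying computation is essentially a set of products and integrals over wavelength. Below is a toy version of such a signal calculation; the spectra here are synthetic Gaussians, and the real tool's mathematical framework is the one described in reference [1]:

```python
import numpy as np

def detected_signal(wl, source, ex_filter, dye_ex, dye_em, em_filter, detector):
    """Toy fluorescence signal estimate: excitation delivered to the dye
    times emission collected through the filter and detector response."""
    dwl = wl[1] - wl[0]
    excitation = np.sum(source * ex_filter * dye_ex) * dwl
    collection = np.sum(dye_em * em_filter * detector) * dwl
    return excitation * collection

wl = np.linspace(400.0, 700.0, 301)                 # wavelength grid in nm
gauss = lambda mu, sigma: np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

# Synthetic spectra: LED source, filters, dye excitation/emission, detector QE
signal = detected_signal(wl, gauss(470, 15), gauss(480, 10),
                         gauss(490, 20), gauss(520, 25),
                         gauss(525, 15), np.full_like(wl, 0.8))
print(f"relative signal: {signal:.3g}")
```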

  17. A mandala of faculty development: using theory-based evaluation to explore contexts, mechanisms and outcomes.

    PubMed

    Onyura, Betty; Ng, Stella L; Baker, Lindsay R; Lieff, Susan; Millar, Barbara-Ann; Mori, Brenda

    2017-03-01

    Demonstrating the impact of faculty development is an increasingly mandated and ever elusive goal. Questions have been raised about the adequacy of current approaches. Here, we integrate realist and theory-driven evaluation approaches to evaluate an intensive longitudinal program. Our aim is to elucidate how faculty development can work to support a range of outcomes among individuals and sub-systems in the academic health sciences. We conducted retrospective framework analysis of qualitative focus group data gathered from 79 program participants (5 cohorts) over a 10-year period. Additionally, we conducted follow-up interviews with 15 alumni. We represent the interactive relationships among contexts, mechanisms, and outcomes as a "mandala" of faculty development. The mandala illustrates the relationship between the immediate program context and the broader institutional context of academic health sciences, and identifies relevant change mechanisms. Four primary mechanisms were collaborative reflection, self-reflection and self-regulation, relationship building, and pedagogical knowledge acquisition. Individual outcomes, including changed teaching practices, are described. Perhaps most interestingly, secondary mechanisms (psychological and structural empowerment) contributed to institutional outcomes through participants' engagement in change leadership in their local contexts. Our theoretically informed evaluation approach models how faculty development, situated in appropriate institutional contexts, can trigger mechanisms that yield a range of benefits for faculty and their institutions. The adopted methods hold potential as a way to demonstrate the often difficult-to-measure outcomes of educational programs, and allow for critical examination as to how and whether faculty development programs can accomplish their espoused goals.

  18. Health services research evaluation principles. Broadening a general framework for evaluating health information technology.

    PubMed

    Sockolow, P S; Crawford, P R; Lehmann, H P

    2012-01-01

    Our forthcoming national experiment in increased health information technology (HIT) adoption, funded by the American Recovery and Reinvestment Act of 2009, will require a comprehensive approach to evaluating HIT. The quality of HIT evaluation studies to date reveals a need for broader evaluation frameworks: the frameworks available limit the generalizability of findings and the depth of lessons learned. We develop an informatics evaluation framework for HIT integrating components of health services research (HSR) evaluation and informatics evaluation to address identified shortcomings in available HIT evaluation frameworks. A systematic literature review updated and expanded the exhaustive review by Ammenwerth and deKeizer (AdK). From retained studies, criteria were elicited and organized into classes within a framework. The resulting Health Information Technology Research-based Evaluation Framework (HITREF) was used to guide clinician satisfaction survey construction, multi-dimensional analysis of data, and interpretation of findings in an evaluation of a vanguard community health care EHR. The updated review identified 128 electronic health record (EHR) evaluation studies and seven evaluation criteria not in AdK: EHR Selection/Development/Training; Patient Privacy Concerns; Unintended Consequences/Benefits; Functionality; Patient Satisfaction with EHR; Barriers/Facilitators to Adoption; and Patient Satisfaction with Care. HITREF was used productively and was a complete evaluation framework which included all themes that emerged. We recommend that future EHR evaluators consider adding a complete, research-based HIT evaluation framework, such as HITREF, to their evaluation tools suite to monitor HIT challenges as the federal government strives to increase HIT adoption.

  19. Matching the Word Processor to the Job.

    ERIC Educational Resources Information Center

    Synder, Carin

    1982-01-01

    The intelligent purchase of school office equipment, specifically word processors, typewriters, calculators, and furniture, requires analysis of present needs and a realistic evaluation of future needs. (MLF)

  20. Analysis of the Impact of Realistic Wind Size Parameter on the Delft3D Model

    NASA Astrophysics Data System (ADS)

    Washington, M. H.; Kumar, S.

    2017-12-01

    The wind size parameter, which is the distance from the center of the storm to the location of the maximum winds, is currently a constant in the Delft3D model. As a result, the Delft3D model's predictions of water levels during a storm surge are inaccurate compared to the observed data. To address this issue, an algorithm to calculate a realistic wind size parameter for a given hurricane was designed and implemented using the observed water-level data for Hurricane Matthew. A performance evaluation experiment was conducted to compare the accuracy of the model's water-level predictions using the realistic wind size input parameter against the default constant wind size parameter for Hurricane Matthew, with the water level data observed from October 4th, 2016 to October 9th, 2016 from the National Oceanic and Atmospheric Administration (NOAA) as a baseline. The experimental results demonstrate that the Delft3D water level output for the realistic wind size parameter matches the NOAA reference water level data more accurately than that for the default constant size parameter.
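
    One common way to estimate a storm-specific radius of maximum winds is to fit a parametric vortex such as the Holland (1980) profile to observations. The sketch below illustrates that idea in a simplified (cyclostrophic, no-Coriolis) form; it is not the authors' algorithm, which calibrates against observed water levels:

```python
import numpy as np

def holland_wind(r_km, r_max_km, v_max, b=1.5):
    """Simplified Holland (1980) profile (no Coriolis term): azimuthal wind
    speed at radius r for a storm with peak wind v_max at radius r_max."""
    x = (r_max_km / np.maximum(r_km, 1e-6)) ** b
    return v_max * np.sqrt(x * np.exp(1.0 - x))

def fit_wind_size(radii_km, obs_speed, v_max, candidates_km):
    """Pick the wind size parameter (radius of maximum winds) whose profile
    best matches observed wind speeds in a least-squares sense."""
    errors = [np.sum((holland_wind(radii_km, r, v_max) - obs_speed) ** 2)
              for r in candidates_km]
    return candidates_km[int(np.argmin(errors))]

# Synthetic observations of a storm whose true wind size is ~40 km
radii = np.array([10.0, 25.0, 40.0, 80.0, 150.0])
obs = holland_wind(radii, 40.0, 55.0) + np.random.default_rng(0).normal(0, 1, 5)
print(fit_wind_size(radii, obs, 55.0, np.arange(10.0, 101.0, 5.0)))  # ~40.0
```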

  1. PROPEL: implementation of an evidence based pelvic floor muscle training intervention for women with pelvic organ prolapse: a realist evaluation and outcomes study protocol.

    PubMed

    Maxwell, Margaret; Semple, Karen; Wane, Sarah; Elders, Andrew; Duncan, Edward; Abhyankar, Purva; Wilkinson, Joyce; Tincello, Douglas; Calveley, Eileen; MacFarlane, Mary; McClurg, Doreen; Guerrero, Karen; Mason, Helen; Hagen, Suzanne

    2017-12-22

    Pelvic Organ Prolapse (POP) is estimated to affect 41%-50% of women aged over 40. Findings from the multi-centre randomised controlled "Pelvic Organ Prolapse PhysiotherapY" (POPPY) trial showed that individualised pelvic floor muscle training (PFMT) was effective in reducing symptoms of prolapse, improved quality of life and showed clear potential to be cost-effective. However, provision of PFMT for prolapse continues to vary across the UK, with limited numbers of women's health physiotherapists specialising in its delivery. Implementation of this robust evidence from the POPPY trial will require attention to different models of delivery (e.g. staff skill mix) to fit with differing care environments. A Realist Evaluation (RE) of implementation and outcomes of PFMT delivery in contrasting NHS settings will be conducted using multiple case study sites. Substantial local stakeholder engagement will permit a detailed exploration of how local sites make decisions on how to deliver PFMT and how these lead to service change. The RE will track how implementation is working; identify what influences outcomes; and, guided by the RE-AIM framework, will collect robust outcomes data. This will require mixed methods data collection and analysis. Qualitative data will be collected at four time-points across each site to understand local contexts and decisions regarding options for intervention delivery and to monitor implementation, uptake, adherence and outcomes. Patient outcome data will be collected at baseline, six months and one year follow-up for 120 women. The primary outcome will be the Pelvic Organ Prolapse Symptom Score (POP-SS). An economic evaluation will assess the costs and benefits associated with different delivery models, taking account of further health care resource use by the women. Cost data will be combined with the primary outcome in a cost-effectiveness analysis, and with the EQ-5D-5L data in a cost-utility analysis, for each of the different models of delivery. Study of the implementation of varying models of service delivery of PFMT across contrasting sites, combined with outcomes data and a cost-effectiveness analysis, will provide insight into the implementation and value of different models of PFMT service delivery and the cost benefits to the NHS in the longer term.
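
    For the cost-effectiveness component, the basic quantity compared across delivery models is the incremental cost-effectiveness ratio (ICER). A worked toy example with invented numbers (not study data) follows:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    effect (e.g. per QALY when effects come from EQ-5D-5L utilities)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical delivery models: specialist physiotherapist vs. mixed skill mix
print(icer(cost_new=420.0, cost_old=300.0, effect_new=0.71, effect_old=0.66))
# -> 2400.0: cost units per QALY gained, compared against a willingness-to-pay
#    threshold to decide which delivery model offers better value
```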

  2. A Mandala of Faculty Development: Using Theory-Based Evaluation to Explore Contexts, Mechanisms and Outcomes

    ERIC Educational Resources Information Center

    Onyura, Betty; Ng, Stella L.; Baker, Lindsay R.; Lieff, Susan; Millar, Barbara-Ann; Mori, Brenda

    2017-01-01

    Demonstrating the impact of faculty development is an increasingly mandated and ever elusive goal. Questions have been raised about the adequacy of current approaches. Here, we integrate realist and theory-driven evaluation approaches to evaluate an intensive longitudinal program. Our aim is to elucidate how faculty development can work to…

  3. Framework for Evaluating the Impact of Advanced Practice Nursing Roles.

    PubMed

    Bryant-Lukosius, Denise; Spichiger, Elisabeth; Martin, Jacqueline; Stoll, Hansruedi; Kellerhals, Sabine Degen; Fliedner, Monica; Grossmann, Florian; Henry, Morag; Herrmann, Luzia; Koller, Antje; Schwendimann, René; Ulrich, Anja; Weibel, Lukas; Callens, Betty; De Geest, Sabina

    2016-03-01

    To address the gap in evidence-based information required to support the development of advanced practice nursing (APN) roles in Switzerland, stakeholders identified the need for guidance to generate strategic evaluation data. This article describes an evaluation framework developed to inform decisions about the effective utilization of APN roles across the country. A participatory approach was used by an international group of stakeholders. Published literature and an evidence-based framework for introducing APN roles were analyzed and applied to define the purpose, target audiences, and essential elements of the evaluation framework. Through subsequent meetings and review by an expert panel, the framework was developed and refined. A framework to evaluate different types of APN roles as they evolve to meet dynamic population health, practice setting, and health system needs was created. It includes a matrix of key concepts to guide evaluations across three stages of APN role development: introduction, implementation, and long-term sustainability. For each stage, evaluation objectives and questions examining APN role structures, processes, and outcomes from different perspectives (e.g., patients, providers, managers, policy-makers) were identified. A practical, robust framework based on well-established evaluation concepts and current understanding of APN roles can be used to conduct systematic evaluations. The evaluation framework is sufficiently generic to allow application in developed countries globally, both for evaluation as well as research purposes. © 2016 Sigma Theta Tau International.

  4. Using simulated fluorescence cell micrographs for the evaluation of cell image segmentation algorithms.

    PubMed

    Wiesmann, Veit; Bergler, Matthias; Palmisano, Ralf; Prinzen, Martin; Franz, Daniela; Wittenberg, Thomas

    2017-03-18

    Manual assessment and evaluation of fluorescent cell micrographs is time-consuming and tedious. Automated segmentation pipelines can ensure efficient and reproducible evaluation and analysis, with constant high quality for all images of an experiment. Such cell segmentation approaches are usually validated and rated in comparison to manually annotated micrographs. Nevertheless, manual annotations are prone to errors and display inter- and intra-observer variability, which influences the validation results of automated cell segmentation pipelines. We present a new approach to simulate fluorescent cell micrographs that provides an objective ground truth for the validation of cell segmentation methods. The cell simulation was evaluated in two ways: (1) an expert observer study shows that the proposed approach generates realistic fluorescent cell micrograph simulations, and (2) an automated segmentation pipeline on the simulated fluorescent cell micrographs reproduces the segmentation performance of that pipeline on real fluorescent cell micrographs. The proposed simulation approach produces realistic fluorescent cell micrographs with corresponding ground truth. The simulated data are suited to evaluate image segmentation pipelines more efficiently and reproducibly than is possible on manually annotated real micrographs.
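
    A drastically simplified sketch of the idea, far cruder than the paper's simulation but showing why simulated micrographs carry an exact ground truth; all shapes and noise parameters here are invented:

```python
import numpy as np

def simulate_micrograph(shape=(256, 256), n_cells=20, radius=9, photons=80.0):
    """Toy fluorescent-micrograph simulator: disc-shaped 'cells' with Poisson
    shot noise over a constant background, plus the exact label mask."""
    rng = np.random.default_rng(0)
    img = np.zeros(shape)
    labels = np.zeros(shape, dtype=int)
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    for k in range(1, n_cells + 1):
        cy = rng.integers(radius, shape[0] - radius)
        cx = rng.integers(radius, shape[1] - radius)
        mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
        img[mask] = 1.0
        labels[mask] = k
    noisy = rng.poisson(img * photons + 5.0)  # 5.0 = background photon level
    return noisy.astype(float), labels

image, ground_truth = simulate_micrograph()
# 'ground_truth' is exact by construction, so any segmentation of 'image'
# can be scored without manual annotation.
```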

  5. Application and Evaluation of MODIS LAI, FPAR, and Albedo ...

    EPA Pesticide Factsheets

    MODIS vegetation and albedo products provide a more realistic representation of surface conditions for input to the WRF/CMAQ modeling system. However, the initial evaluation of ingesting MODIS data into the system showed mixed results, with increased bias and error for 2-m temperature and reduced bias and error for 2-m mixing ratio. Recently, the WRF/CMAQ land surface and boundary layer processes have been updated. In this study, MODIS vegetation and albedo data are input to the updated WRF/CMAQ meteorology and air quality simulations for 2006 over a North American (NA) 12-km domain. The evaluation of the simulation results shows that the updated WRF/CMAQ system improves 2-m temperature estimates over the pre-update base modeling system estimates. The MODIS vegetation input produces a realistic spring green-up that progresses through time from south to north. Overall, MODIS input reduces 2-m mixing ratio bias during the growing season. The NA west shows a larger positive O3 bias during the growing season because of reduced gas-phase deposition resulting from lower O3 deposition velocities driven by reduced vegetation cover. The O3 bias increase associated with the realistic vegetation representation indicates that further improvement may be needed in the WRF/CMAQ system. The National Exposure Research Laboratory's Atmospheric Modeling Division (AMAD) conducts research in support of EPA's mission to protect human health and the environment.

  6. Modeling and simulation of high-speed wake flows

    NASA Astrophysics Data System (ADS)

    Barnhardt, Michael Daniel

    High-speed, unsteady flows represent a unique challenge in computational hypersonics research. They are found in nearly all applications of interest, including the wakes of reentry vehicles, RCS jet interactions, and scramjet combustors. In each of these examples, accurate modeling of the flow dynamics plays a critical role in design performance. Nevertheless, literature surveys reveal that very little modern research effort has been made toward understanding these problems. The objective of this work is to synthesize current computational methods for high-speed flows with ideas commonly used to model low-speed, turbulent flows in order to create a framework by which we may reliably predict unsteady, hypersonic flows. In particular, we wish to validate the new methodology for the case of a turbulent wake flow at reentry conditions. Currently, heat shield designs incur significant mass penalties due to the large margins applied to vehicle afterbodies in lieu of a thorough understanding of the wake aerothermodynamics. Comprehensive validation studies are required to accurately quantify these modeling uncertainties. To this end, we select three candidate experiments against which we evaluate the accuracy of our methodology. The first set of experiments concern the Mars Science Laboratory (MSL) parachute system and serve to demonstrate that our implementation produces results consistent with prior studies at supersonic conditions. Second, we use the Reentry-F flight test to expand the application envelope to realistic flight conditions. Finally, in the last set of experiments, we examine a spherical capsule wind tunnel configuration in order to perform a more detailed analysis of a realistic flight geometry. In each case, we find that current 1st order in time, 2nd order in space upwind numerical methods are sufficiently accurate to predict statistical measurements: mean, RMS, standard deviation, and so forth. Further potential gains in numerical accuracy are demonstrated using a new class of flux evaluation schemes in combination with 2nd order dual-time stepping. For cases with transitional or turbulent Reynolds numbers, we show that the detached eddy simulation (DES) method holds clear advantage over heritage RANS methods. From this, we conclude that the current methodology is sufficient to predict heating of external, reentry-type applications within experimental uncertainty.
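
    The DES approach mentioned here blends RANS and LES behaviour through a single modified length scale; in the classic Spalart et al. (1997) formulation this is:

```latex
\begin{equation}
  \tilde{d} = \min\!\left(d_{w},\; C_{DES}\,\Delta\right),
  \qquad \Delta = \max(\Delta x,\,\Delta y,\,\Delta z),
\end{equation}
```

    where d_w is the wall distance used by the underlying RANS model and Δ the local grid spacing. Attached boundary layers, where d_w is small, are handled in RANS mode, while massively separated regions such as a capsule wake switch to LES-like resolved content, which is consistent with the finding above that DES holds a clear advantage over heritage RANS methods for these flows.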

  7. Investigating the organisational impacts of quality improvement: a protocol for a realist evaluation of improvement approaches drawing on the Resource Based View of the Firm.

    PubMed

    Burton, Christopher R; Rycroft Malone, Jo; Robert, Glenn; Willson, Alan; Hopkins, Angela

    2014-07-31

    Little is understood about the role of quality improvement in enabling health organisations to survive and thrive in the contemporary context of financial and economic challenges. We will draw on the theoretical foundations of the 'Resource Based View of the Firm' (RBV) to develop insights into why health organisations engage in improvement work, how impacts are conceptualised, and 'what works' in delivering these impacts. Specifically, RBV theorises that the mix and use of resources across different organisations may explain differences in performance. Whether improvement work influences these resources is unclear. Case study research will be conducted across health organisations participating in four approaches to improvement: a national improvement programme; a multiorganisational partnership around implementation; an organisational strategy for quality improvement; and a coproduction project designed to enhance the experience of a clinical service from the perspective of patients. Data will comprise in-depth interviews with key informants, observation of key events, and documents, analysed within and then across cases. Adopting a realist perspective, the core tenets of RBV will be evaluated as a programme theory, focusing on the interplay between organisational conditions and behavioural or resource responses that are reported through engagement in improvement. The study has been approved by Bangor University Ethics Committee. The investigation will not judge the relative merits of different approaches to healthcare quality improvement. Rather, we will develop unique insights into the organisational consequences and dependencies of quality improvement, providing an opportunity to add to the explanatory potential of RBV in this and other contexts. In addition to scientific and lay reports of the study findings, research outputs will include a framework for constructing the economic impacts of quality improvement and practical guidance for health service managers that maximises the impacts of investment in quality improvement. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  8. Recognising the differences in the nurse consultant role across context: a study protocol

    PubMed Central

    2014-01-01

    Background The advanced practice role of the Nurse Consultant is unique in its capacity to provide clinical leadership across a range of contexts. However, the Nurse Consultant role has been plagued with confusion due to lack of clarity over function and appropriateness for purpose within health organisations across contexts. Changing health service delivery models are driving the emergence of new nursing roles, further clouding the waters related to role positioning and purpose. There is an urgent need for evidence of impact and demonstration of how Nurse Consultants contribute to health care outcomes. This study aims to gain a clearer understanding of the Nurse Consultant role and its impact in metropolitan and rural New South Wales (NSW) Australia. Design The proposed study employs a sequential mixed method design, underpinned by Realistic Evaluation, to explore how Nurse Consultants contribute to organisational outcomes. The ‘context – mechanism – outcome’ approach of realistic evaluation provides a sound framework to examine the complex, diverse and multifaceted nature of the Nurse Consultant’s role. Method Participants will be stakeholders, recruited across a large Local Health District in NSW, comprising rural and metropolitan services. A modified and previously validated survey will be used providing information related to role characteristics, patterns and differences across health context. Focus groups with Nurse Consultant’s explore issues highlighted in the survey data. Focus groups with other clinicians, policy makers and managers will help to achieve understanding of how the role is viewed and enacted across a range of groups and contexts. Discussion Lack of role clarity is highlighted extensively in international and Australian studies examining the role of the Nurse Consultant. Previous studies failed to adequately examine the role in the context of integrated and complex health services or to examine the role in detail. Such examination is critical in order to understand the significance of the role and to ascertain how Nurse Consultants can be most effective as members of the health care team. This is the first Australian study to include extensive stakeholder perspectives in order to understand the relational and integrated nature and impact of the role across metropolitan and rural context. PMID:25320563

  9. An ice sheet model validation framework for the Greenland ice sheet

    NASA Astrophysics Data System (ADS)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of < 1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
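
    As a flavour of the quantitative metrics involved, here is a toy whole-ice-sheet elevation comparison, a simplified stand-in for the kind of score the CmCt reports; the data below are synthetic:

```python
import numpy as np

def elevation_metrics(model_dem, obs_dem, ice_mask):
    """Whole-ice-sheet style metric: mean and RMS of model-minus-observed
    surface elevation over the ice mask."""
    diff = (model_dem - obs_dem)[ice_mask]
    return diff.mean(), np.sqrt((diff ** 2).mean())

# Synthetic example: a model biased +1 m relative to altimetry on a 100x100 grid
obs = np.random.default_rng(1).normal(1500.0, 200.0, (100, 100))
model = obs + 1.0
mean_diff, rms_diff = elevation_metrics(model, obs,
                                        np.ones_like(obs, dtype=bool))
print(f"mean diff = {mean_diff:.2f} m, RMS diff = {rms_diff:.2f} m")
```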

  10. An ice sheet model validation framework for the Greenland ice sheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.

    We propose a new ice sheet model validation framework, the Cryospheric Model Comparison Tool (CMCT), that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, the model initial condition as well as output from idealized and dynamic models all provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CMCT, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CMCT as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.

  11. An ice sheet model validation framework for the Greenland ice sheet

    DOE PAGES

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; ...

    2017-01-17

We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, the model initial condition as well as output from idealized and dynamic models all provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.

  12. An ice sheet model validation framework for the Greenland ice sheet

    PubMed Central

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.

    2018-01-01

    We propose a new ice sheet model validation framework – the Cryospheric Model Comparison Tool (CmCt) – that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation. PMID:29697704

  13. An Ice Sheet Model Validation Framework for the Greenland Ice Sheet

    NASA Technical Reports Server (NTRS)

Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas A.; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey R.; Chambers, Don P.; Evans, Katherine J.; ...

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of less than 1 meter). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
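
    The basin-scale and whole-ice-sheet-scale elevation metrics described above reduce to simple area statistics once model output and altimetry are mapped onto a common grid. The sketch below illustrates one plausible form of such a metric in Python; it is a minimal illustration under that gridding assumption, and the function name and output layout are ours, not the CmCt interface.

        import numpy as np

        def elevation_metrics(model_elev, obs_elev, basin_ids):
            """Mean and RMS surface-elevation differences (m), per drainage
            basin and for the whole ice sheet, on a common grid.

            model_elev, obs_elev : 2-D elevation arrays, NaN where no data.
            basin_ids            : 2-D integer basin mask (0 = outside ice).
            """
            diff = model_elev - obs_elev
            valid = ~np.isnan(diff) & (basin_ids > 0)
            scores = {}
            for basin in np.unique(basin_ids[valid]):
                d = diff[valid & (basin_ids == basin)]
                scores[int(basin)] = (d.mean(), np.sqrt(np.mean(d**2)))
            d = diff[valid]
            scores["ice_sheet"] = (d.mean(), np.sqrt(np.mean(d**2)))
            return scores

    Under a metric of this kind, simulations whose mean differences all fall below roughly 1 m, as reported above, cannot be ranked against one another, which is the ambiguity the authors attribute to the altimetry comparison.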

  14. Improving Barotropic Tides by Two-way Nesting High and Low Resolution Domains

    NASA Astrophysics Data System (ADS)

    Jeon, C. H.; Buijsman, M. C.; Wallcraft, A. J.; Shriver, J. F.; Hogan, P. J.; Arbic, B. K.; Richman, J. G.

    2017-12-01

In a realistically forced global ocean model, relatively large sea-surface-height root-mean-square (RMS) errors are observed in the North Atlantic near the Hudson Strait. These may be associated with large tidal resonances interacting with coastal bathymetry that are not correctly represented on a low-resolution grid. This issue can be overcome by using high-resolution grids, but at a high computational cost. In this paper we apply two-way nesting as an alternative solution. This approach applies high resolution to the area with large RMS errors and a lower resolution to the rest, and is expected to improve the tidal solution while reducing the computational cost. To minimize modification of the original source code of the ocean circulation model (HYCOM), we apply the coupler OASIS3-MCT, which is used to exchange barotropic pressure and velocity fields between the parent and child components through its Application Programming Interfaces (APIs). The developed two-way nesting framework has been validated with an idealized test case in which the parent and child domains have identical grid resolutions; the result shows very small RMS errors between the child and parent solutions. We plan to show results for a case with realistic tidal forcing in which the resolution of the child grid is three times that of the parent grid. The numerical results of this realistic case are compared to TPXO data.
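
    The identical-resolution test described above has a simple acceptance criterion: after the coupler exchanges fields in both directions, the child and parent solutions should agree to within numerical noise over the overlap region. A minimal sketch of that check follows; it assumes both sea-surface-height fields have already been placed on the same overlap grid, and the function name is illustrative rather than part of HYCOM or OASIS3-MCT.

        import numpy as np

        def nesting_consistency_rms(parent_ssh, child_ssh):
            """RMS sea-surface-height difference (m) between parent and
            child solutions over the overlap region; near-zero values
            indicate the two-way exchange is consistent."""
            diff = parent_ssh - child_ssh
            return float(np.sqrt(np.nanmean(diff ** 2)))

        # Identical solutions give an RMS difference of exactly zero.
        parent = np.random.default_rng(1).random((64, 64))
        assert nesting_consistency_rms(parent, parent.copy()) == 0.0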

  15. Exploring synergistic interactions and catalysts in complex interventions: longitudinal, mixed methods case studies of an optimised multi-level suicide prevention intervention in four european countries (Ospi-Europe).

    PubMed

    Harris, Fiona M; Maxwell, Margaret; O'Connor, Rory; Coyne, James C; Arensman, Ella; Coffey, Claire; Koburger, Nicole; Gusmão, Ricardo; Costa, Susana; Székely, András; Cserhati, Zoltan; McDaid, David; van Audenhove, Chantal; Hegerl, Ulrich

    2016-03-15

The Medical Research Council (MRC) Framework for complex interventions highlights the need to explore interactions between components of complex interventions, but this has not yet been fully explored within complex, non-pharmacological interventions. This paper draws on the process evaluation data of a suicide prevention programme implemented in four European countries to illustrate the synergistic interactions between intervention levels in a complex programme, and to present our method for exploring these. A realist evaluation approach informed the process evaluation, which drew on mixed-methods, longitudinal case studies. Data collection consisted of 47 semi-structured interviews, 12 focus groups, one workshop, field-note observations of six programme meetings and 20 questionnaires (delivered at six-month intervals to each of the four intervention sites). Analysis drew on the framework approach, facilitated by the use of QSR NVivo (v10). Our qualitative approach to exploring synergistic interactions (QuaSIC) also developed a matrix of hypothesised synergies that was explored within one workshop and two waves of data collection. All four implementation countries provided examples of synergistic interactions that added value beyond the sum of individual intervention levels or components in isolation. For instance, the launch ceremony of the public health campaign (a level 3 intervention) in Ireland had an impact on the community-based professional training, increasing uptake and visibility of training for journalists in particular. In turn, this led to increased media reporting of OSPI activities (monitored as part of the public health campaign) and to wider dissemination of editorial guidelines for responsible reporting of suicidal acts. Analysis of the total process evaluation dataset also revealed the new phenomenon of the OSPI programme acting as a catalyst for externally generated (and funded) activity that shared the goals of suicide prevention. The QuaSIC approach enabled us to develop and refine our definition of synergistic interactions and add the innovative concept of catalytic effects. This represents a novel approach to the evaluation of complex interventions. By exploring synergies and catalytic interactions related to a complex intervention or programme, we reveal the added value to planned activities and how it might be maximised.

  16. Improving Forecasts Through Realistic Uncertainty Estimates: A Novel Data Driven Method for Model Uncertainty Quantification in Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Moradkhani, H.; Marshall, L. A.; Sharma, A.; Geenens, G.

    2016-12-01

Effective combination of model simulations and observations through Data Assimilation (DA) depends heavily on uncertainty characterisation. Many traditional methods for quantifying model uncertainty in DA require some level of subjectivity (by way of tuning parameters or by assuming Gaussian statistics). Furthermore, the focus is typically on only estimating the first and second moments. We propose a data-driven methodology to estimate the full distributional form of model uncertainty, i.e. the transition density p(x_t | x_{t-1}). All sources of uncertainty associated with the model simulations are considered collectively, without needing to devise stochastic perturbations for individual components (such as model input, parameter and structural uncertainty). A training period is used to derive the distribution of errors in observed variables conditioned on hidden states. Errors in hidden states are estimated from the conditional distribution of observed variables using non-linear optimization. The theory behind the framework and case study applications are discussed in detail. Results demonstrate improved predictions and more realistic uncertainty bounds compared to a standard perturbation approach.
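
    The core idea, estimating the full error distribution from a training period instead of assuming Gaussian noise, can be sketched with a kernel density estimate over pooled residuals. The toy example below shows only that nonparametric step; the conditioning of errors on hidden states and the non-linear optimization used by the authors are omitted, and all data here are synthetic.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        # Training-period residuals between model predictions and
        # observations; heavy-tailed on purpose, to show that no
        # Gaussian form is imposed.
        residuals = rng.standard_t(df=3, size=500)

        # Nonparametric estimate of the model-error density.
        error_density = gaussian_kde(residuals)

        # Perturb a toy forecast ensemble by sampling from the estimated
        # density rather than adding tuned Gaussian noise.
        ensemble = np.full(100, 2.5)
        perturbed = ensemble + error_density.resample(100)[0]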

  17. Nonhydrostatic icosahedral atmospheric model (NICAM) for global cloud resolving simulations

    NASA Astrophysics Data System (ADS)

    Satoh, M.; Matsuno, T.; Tomita, H.; Miura, H.; Nasuno, T.; Iga, S.

    2008-03-01

A new type of ultra-high-resolution atmospheric global circulation model is developed. The new model is designed to perform "cloud resolving simulations" by directly calculating deep convection and meso-scale circulations, which play key roles not only in tropical circulations but also in the global circulation of the atmosphere. Since cores of deep convection are only a few km across, they have not been directly resolved by existing atmospheric general circulation models (AGCMs). To drastically enhance horizontal resolution, a new framework for a global atmospheric model is required; we adopted nonhydrostatic governing equations and icosahedral grids for the new model, and call it the Nonhydrostatic ICosahedral Atmospheric Model (NICAM). In this article, we review the governing equations and numerical techniques employed, and present results from the unique 3.5-km mesh global experiments, with O(10^9) computational nodes, using realistic topography and land/ocean surface thermal forcing. The results show realistic behaviors of multi-scale convective systems in the tropics, which have not been captured by AGCMs. We also discuss future perspectives on the role of the new model in the next generation of atmospheric sciences.
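
    The resolution quoted above follows from the standard point count for an icosahedral grid refined by recursive edge bisection, 10 * 4**glevel + 2 points after glevel divisions. The short calculation below is generic arithmetic rather than NICAM code; it shows that glevel 11 yields the roughly 3.5 km mean spacing used in the experiments.

        import math

        R_EARTH = 6.371e6  # mean Earth radius (m)

        def icosahedral_grid(glevel):
            """Point count and mean spacing for a recursively bisected
            icosahedral grid (standard count: 10 * 4**glevel + 2)."""
            n_points = 10 * 4 ** glevel + 2
            # Mean spacing ~ sqrt(sphere area per grid point).
            spacing_km = math.sqrt(4 * math.pi * R_EARTH ** 2 / n_points) / 1e3
            return n_points, spacing_km

        for glevel in (9, 10, 11):
            n, dx = icosahedral_grid(glevel)
            print(f"glevel {glevel}: {n:>11,} points, ~{dx:.1f} km spacing")
        # glevel 11: 41,943,042 points, ~3.5 km spacing; multiplied by
        # the number of vertical levels, the total node count plausibly
        # reaches the O(10^9) figure quoted above.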

  18. On the usefulness of the concept of presence in virtual reality applications

    NASA Astrophysics Data System (ADS)

    Mestre, Daniel R.

    2015-03-01

Virtual Reality (VR) leads to realistic experimental situations, while enabling researchers to have deterministic control over these situations and to precisely measure participants' behavior. However, because more realistic and complex situations can be implemented, important questions arise concerning the validity and representativeness of the observed behavior with reference to a real situation. One example is the investigation of a critical (virtually dangerous) situation, in which the participant knows that no actual threat is present in the simulation and might thus exhibit a behavioral response that is far from reality. This poses serious problems, for instance in training, in terms of transfer of learning to a real situation. Facing this difficult question, it seems necessary to study the relationships between three factors: immersion (physical realism), presence (psychological realism) and behavior. We propose a conceptual framework in which presence is a necessary condition for the emergence of behavior that is representative of what is observed in real conditions. Presence itself depends not only on the physical immersive characteristics of the VR setup, but also on contextual and psychological factors.

  19. Differentiability of correlations in realistic quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabrera, Alejandro; Faria, Edson de; Pujals, Enrique

    2015-09-15

We prove a version of Bell’s theorem in which the locality assumption is weakened. We start by assuming theoretical quantum mechanics and weak forms of relativistic causality and of realism (essentially the fact that observable values are well defined independently of whether or not they are measured). Under these hypotheses, we show that only one of the correlation functions that can be formulated in the framework of the usual Bell theorem is unknown. We prove that this unknown function must be differentiable at certain angular configuration points that include the origin. We also prove that, if this correlation is assumed to be twice differentiable at the origin, then we arrive at a version of Bell’s theorem. On the one hand, we are showing that any realistic theory of quantum mechanics which incorporates the kinematic aspects of relativity must lead to this type of rough correlation function that is once but not twice differentiable. On the other hand, this study brings us a single degree of differentiability away from a relativistic von Neumann no-hidden-variables theorem.
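
    For orientation, the textbook quantum prediction for spin correlations of a singlet pair is smooth at the origin, so a correlation that is once but not twice differentiable there marks a genuine departure from it. The contrast below states the standard result alongside one function with exactly the regularity the abstract describes; both formulas are standard illustrations, not taken from the paper.

        % Quantum-mechanical singlet correlation at relative angle \theta:
        % infinitely differentiable, with E_{\mathrm{QM}}''(0) = \cos 0 = 1.
        E_{\mathrm{QM}}(\theta) = -\mathbf{a}\cdot\mathbf{b} = -\cos\theta
        % A correlation that is once but not twice differentiable at the
        % origin, of the rough type the theorem allows (c a nonzero constant):
        E(\theta) = -1 + c\,\theta\,\lvert\theta\rvert,
        \qquad E'(\theta) = 2c\,\lvert\theta\rvert \ \text{(continuous)},
        \qquad E''(0)\ \text{does not exist.}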

  20. Ontological realism: A methodology for coordinated evolution of scientific ontologies.

    PubMed

    Smith, Barry; Ceusters, Werner

    2010-11-15

    Since 2002 we have been testing and refining a methodology for ontology development that is now being used by multiple groups of researchers in different life science domains. Gary Merrill, in a recent paper in this journal, describes some of the reasons why this methodology has been found attractive by researchers in the biological and biomedical sciences. At the same time he assails the methodology on philosophical grounds, focusing specifically on our recommendation that ontologies developed for scientific purposes should be constructed in such a way that their terms are seen as referring to what we call universals or types in reality. As we show, Merrill's critique is of little relevance to the success of our realist project, since it not only reveals no actual errors in our work but also criticizes views on universals that we do not in fact hold. However, it nonetheless provides us with a valuable opportunity to clarify the realist methodology, and to show how some of its principles are being applied, especially within the framework of the OBO (Open Biomedical Ontologies) Foundry initiative.
