Sample records for framework approach results

  1. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, and Database Management Systems (DBMS) in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The requirements and corresponding solutions for two reference MDO frameworks, a general one and an aircraft-oriented one, were identified, organized into a hierarchy, and carefully investigated. The reference frameworks were then quantitatively characterized using AHP and QFD, and an assessment of three in-house frameworks was performed. The results produced clear and useful guidelines for improving the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
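    The AHP step in record 1's method can be illustrated with a minimal numeric sketch. The criteria names and comparison values below are hypothetical, not taken from the paper; priority weights are the normalized principal eigenvector of the pairwise-comparison matrix:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three framework criteria
# (e.g. integration, usability, extensibility); values are illustrative.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP priority weights: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()

# Consistency index CI = (lambda_max - n) / (n - 1); values near 0 mean
# the pairwise judgements are internally consistent.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
print(w.round(3), round(ci, 4))
```

    A QFD step would then map these criterion weights onto candidate framework features via a relationship matrix; the sketch above covers only the AHP weighting.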

  2. Defining an integrative approach for health promotion and disease prevention: A population health equity framework

    PubMed Central

    Trinh-Shevrin, Chau; Nadkarni, Smiti; Park, Rebecca; Islam, Nadia; Kwon, Simona C.

    2015-01-01

    Background: Eliminating health disparities in racial/ethnic minority and underserved populations requires a paradigm shift from disease-focused biomedical approaches to a health equity framework that aims to achieve optimal health for all by targeting social and structural determinants of health. Methods: We describe the concepts and parallel approaches that underpin an integrative population health equity framework. Using a case study approach, we present the experience of the NYU Center for the Study of Asian American Health (CSAAH) in applying the framework to guide its work. Results: This framework is central to CSAAH’s efforts moving towards a population health equity vision for Asian Americans. Discussion: Advancing the health of underserved populations requires community engagement and an understanding of the multilevel contextual factors that influence health. Applying an integrative framework has allowed us to advance health equity for Asian American communities and may serve as a useful framework for other underserved populations. PMID:25981095

  3. The dimensional salience solution to the expectancy-value muddle: an extension.

    PubMed

    Newton, Joshua D; Newton, Fiona J; Ewing, Michael T

    2014-01-01

    The theory of reasoned action (TRA) specifies a set of expectancy-value, belief-based frameworks that underpin attitude (behavioural beliefs × outcome evaluations) and subjective norm (normative beliefs × motivation to comply). Unfortunately, the most common method for analysing these frameworks generates statistically uninterpretable findings, resulting in what has been termed the 'expectancy-value muddle'. Recently, however, a dimensional salience approach was found to resolve this muddle for the belief-based framework underpinning attitude. An online survey of 262 participants was therefore conducted to determine whether the dimensional salience approach could also be applied to the belief-based framework underpinning subjective norm. Results revealed that motivations to comply were greater for salient, as opposed to non-salient, social referents. The belief-based framework underpinning subjective norm was therefore represented by evaluating normative belief ratings for salient social referents. This modified framework was found to predict subjective norm, although predictions were greater when participants were forced to select five salient social referents rather than being free to select any number of social referents. These findings validate the use of the dimensional salience approach for examining the belief-based frameworks underpinning subjective norm. As such, this approach provides a complete solution to addressing the expectancy-value muddle in the TRA.
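    The expectancy-value products described in record 3 can be made concrete with a toy computation. All outcome names, referents, and ratings below are hypothetical, chosen only to illustrate the TRA constructs and the dimensional-salience restriction to salient referents:

```python
# Attitude in the TRA: sum of behavioural beliefs (likelihood ratings,
# here on a 1..7 scale) times outcome evaluations (here -3..+3).
beliefs     = {"saves_time": 6, "costs_money": 5, "improves_health": 4}
evaluations = {"saves_time": 3, "costs_money": -2, "improves_health": 2}
attitude = sum(b * evaluations[k] for k, b in beliefs.items())
print(attitude)  # 6*3 + 5*(-2) + 4*2 = 16

# Dimensional-salience variant for subjective norm: evaluate normative
# belief ratings for salient referents only, instead of multiplying
# every belief by a motivation-to-comply score.
normative_beliefs = {"family": 6, "doctor": 7, "colleagues": 3}
salient_referents = {"family", "doctor"}
subjective_norm = sum(v for k, v in normative_beliefs.items()
                      if k in salient_referents)
print(subjective_norm)  # 6 + 7 = 13
```

    The point of the salience approach is visible in the second half: non-salient referents drop out entirely rather than being down-weighted by a multiplicative term, which avoids the scaling ambiguity behind the expectancy-value muddle.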

  4. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  5. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  6. A Multi-Agent Framework for Packet Routing in Wireless Sensor Networks

    PubMed Central

    Ye, Dayon; Zhang, Minji; Yang, Yu

    2015-01-01

    Wireless sensor networks (WSNs) have been widely investigated in recent years. One of the fundamental issues in WSNs is packet routing, because in many application domains, packets have to be routed from source nodes to destination nodes as quickly and as energy-efficiently as possible. To address this issue, a large number of routing approaches have been proposed. Although every existing routing approach has advantages, each also has disadvantages. In this paper, a multi-agent framework is proposed that can assist existing routing approaches to improve their routing performance. This framework enables each sensor node to build a cooperative neighbour set based on past routing experience. Such cooperative neighbours, in turn, can help the sensor node to relay packets effectively in the future. The framework is independent of, and can be used to assist, many existing routing approaches. Simulation results demonstrate the good performance of this framework in terms of four metrics: average delivery latency, successful delivery ratio, number of live nodes and total sensing coverage. PMID:25928063
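    One plausible reading of the cooperative-neighbour idea in record 6 is a per-node score over past relay outcomes. The sketch below is written under that assumption and is not the authors' algorithm; the node IDs, set size, and scoring rule are made up for illustration:

```python
from collections import defaultdict

class SensorNode:
    """Toy node that ranks neighbours by past relay success."""

    def __init__(self, node_id, set_size=3):
        self.node_id = node_id
        self.set_size = set_size
        self.success = defaultdict(int)   # packets successfully relayed
        self.attempts = defaultdict(int)  # relay attempts per neighbour

    def record(self, neighbour, delivered):
        # Update routing experience after each relay attempt.
        self.attempts[neighbour] += 1
        if delivered:
            self.success[neighbour] += 1

    def cooperative_set(self):
        # Keep the neighbours with the best empirical delivery ratio.
        ratio = lambda n: self.success[n] / self.attempts[n]
        return sorted(self.attempts, key=ratio, reverse=True)[: self.set_size]

node = SensorNode("s1")
for nb, ok in [("a", True), ("a", True), ("b", False), ("b", True), ("c", False)]:
    node.record(nb, ok)
print(node.cooperative_set())  # ['a', 'b', 'c'] by delivery ratio
```

    In the paper's setting, a ranking like this would sit alongside an existing routing protocol and bias next-hop selection, which is why the framework can be layered on top of many routing approaches.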

  7. Framework for Considering Productive Aging and Work.

    PubMed

    Schulte, Paul A; Grosch, James; Scholl, Juliann C; Tamers, Sara L

    2018-05-01

    The U.S. population is experiencing a demographic transition resulting in an aging workforce. The objective of this article is to elucidate and expand an approach to keep that workforce safe, healthy, and productive. This article elucidates the framework for the National Center for Productive Aging at Work of the National Institute for Occupational Safety and Health. Subject matter experts used a snowball method to review published literature to substantiate elements in the framework. Evidence-based literature supports a productive aging framework for the workforce involving the following elements: 1) life span perspective; 2) comprehensive and integrated approaches to occupational safety and health; 3) emphasis on positive outcomes for both workers and organizations; and 4) supportive work culture for multigenerational issues. The productive aging framework provides a foundational and comprehensive approach for addressing the aging workforce.

  8. An iterative consensus-building approach to revising a genetics/genomics competency framework for nurse education in the UK

    PubMed Central

    Kirk, Maggie; Tonkin, Emma; Skirton, Heather

    2014-01-01

    KIRK M., TONKIN E. & SKIRTON H. (2014) An iterative consensus-building approach to revising a genetics/genomics competency framework for nurse education in the UK. Journal of Advanced Nursing 70(2), 405–420. doi: 10.1111/jan.12207 Aim: To report a review of a genetics education framework using a consensus approach to agree on a contemporary and comprehensive revised framework. Background: Advances in genomic health care have been significant since the first genetics education framework for nurses was developed in 2003. These, coupled with developments in policy and international efforts to promote nursing competence in genetics, indicated that review was timely. Design: A structured, iterative, primarily qualitative approach, based on a nominal group technique. Method: A meeting convened in 2010 involved stakeholders in UK nursing education, practice and management, including patient representatives (n = 30). A consensus approach was used to solicit participants' views on the individual/family needs identified from real-life stories of people affected by genetic conditions and the nurses' knowledge, skills and attitudes needed to meet those needs. Five groups considered the stories in iterative rounds, reviewing comments from previous groups. Omissions and deficiencies were identified by mapping resulting themes to the original framework. Anonymous voting captured views. Educators at a second meeting developed learning outcomes for the final framework. Findings: Deficiencies in relation to Advocacy, Information management and Ongoing care were identified. All competencies of the original framework were revised, adding an eighth competency to make explicit the need for ongoing care of the individual/family. Conclusion: Modifications to the framework reflect individual/family needs and are relevant to the nursing role. The approach promoted engagement in a complex issue and provides a framework to guide nurse education in genetics/genomics; however, nursing leadership is crucial to successful implementation. PMID:23879662

  9. A Holistic Framework to Improve the Uptake and Impact of eHealth Technologies

    PubMed Central

    van Limburg, Maarten; Ossebaard, Hans C; Kelders, Saskia M; Eysenbach, Gunther; Seydel, Erwin R

    2011-01-01

    Background: Many eHealth technologies are not successful in realizing sustainable innovations in health care practices. One of the reasons for this is that the current development of eHealth technology often disregards the interdependencies between technology, human characteristics, and the socioeconomic environment, resulting in technology that has a low impact in health care practices. To overcome the hurdles with eHealth design and implementation, a new, holistic approach to the development of eHealth technologies is needed, one that takes into account the complexity of health care and the rituals and habits of patients and other stakeholders. Objective: The aim of this viewpoint paper is to improve the uptake and impact of eHealth technologies by advocating a holistic approach toward their development and eventual integration in the health sector. Methods: To identify the potential and limitations of current eHealth frameworks (1999–2009), we carried out a literature search in the following electronic databases: PubMed, ScienceDirect, Web of Knowledge, PiCarta, and Google Scholar. Of the 60 papers that were identified, 44 were selected for full review. We excluded those papers that did not describe hands-on guidelines or quality criteria for the design, implementation, and evaluation of eHealth technologies (28 papers). From the results retrieved, we identified 16 eHealth frameworks that matched the inclusion criteria. The outcomes were used to posit strategies and principles for a holistic approach toward the development of eHealth technologies; these principles underpin our holistic eHealth framework. Results: A total of 16 frameworks qualified for a final analysis, based on their theoretical backgrounds and visions on eHealth, and the strategies and conditions for the research and development of eHealth technologies. Despite their potential, the relationship between the visions on eHealth, proposed strategies, and research methods is obscure, perhaps due to a rather conceptual approach that focuses on the rationale behind the frameworks rather than on practical guidelines. In addition, the Web 2.0 technologies that call for a more stakeholder-driven approach are beyond the scope of current frameworks. To overcome these limitations, we composed a holistic framework based on a participatory development approach, persuasive design techniques, and business modeling. Conclusions: To demonstrate the impact of eHealth technologies more effectively, a fresh way of thinking is required about how technology can be used to innovate health care. It also requires new concepts and instruments to develop and implement technologies in practice. The proposed framework serves as an evidence-based roadmap. PMID:22155738

  10. An iterative consensus-building approach to revising a genetics/genomics competency framework for nurse education in the UK.

    PubMed

    Kirk, Maggie; Tonkin, Emma; Skirton, Heather

    2014-02-01

    To report a review of a genetics education framework using a consensus approach to agree on a contemporary and comprehensive revised framework. Advances in genomic health care have been significant since the first genetics education framework for nurses was developed in 2003. These, coupled with developments in policy and international efforts to promote nursing competence in genetics, indicated that review was timely. A structured, iterative, primarily qualitative approach, based on a nominal group technique. A meeting convened in 2010 involved stakeholders in UK nursing education, practice and management, including patient representatives (n = 30). A consensus approach was used to solicit participants' views on the individual/family needs identified from real-life stories of people affected by genetic conditions and the nurses' knowledge, skills and attitudes needed to meet those needs. Five groups considered the stories in iterative rounds, reviewing comments from previous groups. Omissions and deficiencies were identified by mapping resulting themes to the original framework. Anonymous voting captured views. Educators at a second meeting developed learning outcomes for the final framework. Deficiencies in relation to Advocacy, Information management and Ongoing care were identified. All competencies of the original framework were revised, adding an eighth competency to make explicit the need for ongoing care of the individual/family. Modifications to the framework reflect individual/family needs and are relevant to the nursing role. The approach promoted engagement in a complex issue and provides a framework to guide nurse education in genetics/genomics; however, nursing leadership is crucial to successful implementation. © 2013 The Authors. Journal of Advanced Nursing published by John Wiley & Sons Ltd.

  11. Evidence-Based Leadership Development: The 4L Framework

    ERIC Educational Resources Information Center

    Scott, Shelleyann; Webber, Charles F.

    2008-01-01

    Purpose: This paper aims to use the results of three research initiatives to present the life-long learning leader 4L framework, a model for leadership development intended for use by designers and providers of leadership development programming. Design/methodology/approach: The 4L model is a conceptual framework that emerged from the analysis of…

  12. Toward a bioethical framework for antibiotic use, antimicrobial resistance and for empirically designing ethically robust strategies to protect human health: a research protocol

    PubMed Central

    Martins Pereira, Sandra; de Sá Brandão, Patrícia Joana; Araújo, Joana; Carvalho, Ana Sofia

    2017-01-01

    Introduction: Antimicrobial resistance (AMR) is a challenging global and public health issue, raising bioethical challenges, considerations and strategies. Objectives: This research protocol presents a conceptual model leading to formulating an empirically based bioethics framework for antibiotic use, AMR and designing ethically robust strategies to protect human health. Methods: Mixed methods research will be used and operationalized into five substudies. The bioethical framework will encompass and integrate two theoretical models: global bioethics and ethical decision-making. Results: Being a study protocol, this article reports on planned and ongoing research. Conclusions: Based on data collection, future findings and using a comprehensive, integrative, evidence-based approach, a step-by-step bioethical framework will be developed for (i) responsible use of antibiotics in healthcare and (ii) design of strategies to decrease AMR. This will entail the analysis and interpretation of approaches from several bioethical theories, including deontological and consequentialist approaches, and the implications of uncertainty to these approaches. PMID:28459355

  13. Testing the suitability of geologic frameworks for extrapolating hydraulic properties across regional scales

    DOE PAGES

    Mirus, Benjamin B.; Halford, Keith J.; Sweetkind, Donald; ...

    2016-02-18

    The suitability of geologic frameworks for extrapolating hydraulic conductivity (K) to length scales commensurate with hydraulic data is difficult to assess. A novel method is presented for evaluating assumed relations between K and geologic interpretations for regional-scale groundwater modeling. The approach relies on simultaneous interpretation of multiple aquifer tests using alternative geologic frameworks of variable complexity, where each framework is incorporated as prior information that assumes homogeneous K within each model unit. This approach is tested at Pahute Mesa within the Nevada National Security Site (USA), where observed drawdowns from eight aquifer tests in complex, highly faulted volcanic rocks provide the necessary hydraulic constraints. The investigated volume encompasses 40 mi³ (167 km³) where drawdowns traversed major fault structures and were detected more than 2 mi (3.2 km) from pumping wells. Complexity of the five frameworks assessed ranges from an undifferentiated mass of rock with a single unit to 14 distinct geologic units. Results show that only four geologic units can be justified as hydraulically unique for this location. The approach qualitatively evaluates the consistency of hydraulic property estimates within extents of investigation and effects of geologic frameworks on extrapolation. Distributions of transmissivity are similar within the investigated extents irrespective of the geologic framework. In contrast, the extrapolation of hydraulic properties beyond the volume investigated with interfering aquifer tests is strongly affected by the complexity of a given framework. As a result, testing at Pahute Mesa illustrates how this method can be employed to determine the appropriate level of geologic complexity for large-scale groundwater modeling.

  14. Testing the suitability of geologic frameworks for extrapolating hydraulic properties across regional scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mirus, Benjamin B.; Halford, Keith J.; Sweetkind, Donald

    The suitability of geologic frameworks for extrapolating hydraulic conductivity (K) to length scales commensurate with hydraulic data is difficult to assess. A novel method is presented for evaluating assumed relations between K and geologic interpretations for regional-scale groundwater modeling. The approach relies on simultaneous interpretation of multiple aquifer tests using alternative geologic frameworks of variable complexity, where each framework is incorporated as prior information that assumes homogeneous K within each model unit. This approach is tested at Pahute Mesa within the Nevada National Security Site (USA), where observed drawdowns from eight aquifer tests in complex, highly faulted volcanic rocks provide the necessary hydraulic constraints. The investigated volume encompasses 40 mi³ (167 km³) where drawdowns traversed major fault structures and were detected more than 2 mi (3.2 km) from pumping wells. Complexity of the five frameworks assessed ranges from an undifferentiated mass of rock with a single unit to 14 distinct geologic units. Results show that only four geologic units can be justified as hydraulically unique for this location. The approach qualitatively evaluates the consistency of hydraulic property estimates within extents of investigation and effects of geologic frameworks on extrapolation. Distributions of transmissivity are similar within the investigated extents irrespective of the geologic framework. In contrast, the extrapolation of hydraulic properties beyond the volume investigated with interfering aquifer tests is strongly affected by the complexity of a given framework. As a result, testing at Pahute Mesa illustrates how this method can be employed to determine the appropriate level of geologic complexity for large-scale groundwater modeling.

  15. A framework for learning and planning against switching strategies in repeated games

    NASA Astrophysics Data System (ADS)

    Hernandez-Leal, Pablo; Munoz de Cote, Enrique; Sucar, L. Enrique

    2014-04-01

    Intelligent agents, human or artificial, often change their behaviour as they interact with other agents. For an agent to optimise its performance when interacting with such agents, it must be capable of detecting such changes and adapting to them. This work presents an approach for effectively dealing with non-stationary switching opponents in a repeated game context. Our main contribution is a framework for online learning and planning against opponents that switch strategies. We present how two opponent modelling techniques work within the framework and demonstrate the usefulness of the approach experimentally in the iterated prisoner's dilemma, when the opponent is modelled as an agent that switches between different strategies (e.g. TFT, Pavlov and Bully). The two models were compared against each other and against a state-of-the-art non-stationary reinforcement learning technique. The experiments show that our approach obtains competitive results without needing an offline training phase, in contrast to the state-of-the-art techniques.
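    The opponent-modelling idea in record 15 can be sketched as scoring candidate strategy models against the opponent's observed moves; a change in the best-scoring model then signals a strategy switch. This is our reading of the abstract, not the authors' implementation; the strategy definitions are the standard ones from the iterated prisoner's dilemma literature:

```python
C, D = "C", "D"

def tft(my_prev, opp_prev):
    # Tit-for-Tat: copy the other player's last move; cooperate first.
    return opp_prev if opp_prev else C

def pavlov(my_prev, opp_prev):
    # Win-Stay, Lose-Shift: cooperate iff both made the same move last round.
    if my_prev is None:
        return C
    return C if my_prev == opp_prev else D

def bully(my_prev, opp_prev):
    # Bully: always defect.
    return D

MODELS = {"TFT": tft, "Pavlov": pavlov, "Bully": bully}

def best_model(opp_moves, my_moves):
    """Score each model by how often it predicts the opponent's observed moves."""
    scores = {}
    for name, model in MODELS.items():
        hits = 0
        for t, actual in enumerate(opp_moves):
            # From the opponent's viewpoint, its own previous move is
            # opp_moves[t-1] and ours is my_moves[t-1].
            pred = model(opp_moves[t - 1] if t else None,
                         my_moves[t - 1] if t else None)
            hits += (pred == actual)
        scores[name] = hits / len(opp_moves)
    return max(scores, key=scores.get), scores

name, scores = best_model([D, D, D, D], [C, C, C, C])
print(name)  # Bully: predicts every observed defection
```

    Running the scorer over a sliding window of recent rounds, rather than the whole history, is one simple way to detect a switch online without an offline training phase.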

  16. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting

    PubMed Central

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen; Wald, Lawrence L.

    2017-01-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization. PMID:26915119
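    The "conventional MR fingerprinting reconstruction" that record 16 compares against is dictionary matching: each voxel's measured signal evolution is matched to the closest entry of a precomputed dictionary of simulated fingerprints. The toy sketch below uses simple exponential decays over a hypothetical T2 grid rather than a Bloch-simulated dictionary, so the numbers are illustrative only:

```python
import numpy as np

t = np.linspace(0.01, 0.3, 50)           # sampling times (s)
t2_grid = np.linspace(0.02, 0.2, 100)    # candidate T2 values (s)

# Dictionary of normalized fingerprints, one row per candidate T2.
dictionary = np.exp(-t[None, :] / t2_grid[:, None])
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

def match(signal):
    """Return the T2 whose fingerprint best correlates with the signal."""
    s = signal / np.linalg.norm(signal)
    return t2_grid[np.argmax(dictionary @ s)]

true_t2 = 0.08
rng = np.random.default_rng(0)
noisy = np.exp(-t / true_t2) + 0.01 * rng.standard_normal(t.size)
print(match(noisy))  # close to 0.08
```

    The paper's analytical result is that this matching step coincides with the first iteration of their ML algorithm when initialized with a gridding reconstruction, which is what frames dictionary matching as an approximation to the ML estimate.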

  17. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting.

    PubMed

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen F; Wald, Lawrence L

    2016-08-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple MR tissue parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization.

  18. A hybrid framework of first principles molecular orbital calculations and a three-dimensional integral equation theory for molecular liquids: Multi-center molecular Ornstein-Zernike self-consistent field approach

    NASA Astrophysics Data System (ADS)

    Kido, Kentaro; Kasahara, Kento; Yokogawa, Daisuke; Sato, Hirofumi

    2015-07-01

    In this study, we report the development of a new quantum mechanics/molecular mechanics (QM/MM)-type framework to describe chemical processes in solution by combining standard molecular-orbital calculations with a three-dimensional formalism of integral equation theory for molecular liquids (the multi-center molecular Ornstein-Zernike (MC-MOZ) method). The theoretical procedure is very similar to the 3D-reference interaction site model self-consistent field (RISM-SCF) approach. Since the MC-MOZ method is highly parallelized for computation, the present approach has the potential to be one of the most efficient procedures for treating chemical processes in solution. Benchmark tests to check the validity of this approach were performed for two solute systems (water and formaldehyde) and a simple SN2 reaction (Cl⁻ + CH₃Cl → ClCH₃ + Cl⁻) in aqueous solution. The results for solute molecular properties and solvation structures obtained by the present approach were in reasonable agreement with those obtained by other hybrid frameworks and experiments. In particular, the results of the proposed approach are in excellent agreement with those of 3D-RISM-SCF.

  19. A hybrid framework of first principles molecular orbital calculations and a three-dimensional integral equation theory for molecular liquids: multi-center molecular Ornstein-Zernike self-consistent field approach.

    PubMed

    Kido, Kentaro; Kasahara, Kento; Yokogawa, Daisuke; Sato, Hirofumi

    2015-07-07

    In this study, we report the development of a new quantum mechanics/molecular mechanics (QM/MM)-type framework to describe chemical processes in solution by combining standard molecular-orbital calculations with a three-dimensional formalism of integral equation theory for molecular liquids (the multi-center molecular Ornstein-Zernike (MC-MOZ) method). The theoretical procedure is very similar to the 3D-reference interaction site model self-consistent field (RISM-SCF) approach. Since the MC-MOZ method is highly parallelized for computation, the present approach has the potential to be one of the most efficient procedures for treating chemical processes in solution. Benchmark tests to check the validity of this approach were performed for two solute systems (water and formaldehyde) and a simple SN2 reaction (Cl⁻ + CH₃Cl → ClCH₃ + Cl⁻) in aqueous solution. The results for solute molecular properties and solvation structures obtained by the present approach were in reasonable agreement with those obtained by other hybrid frameworks and experiments. In particular, the results of the proposed approach are in excellent agreement with those of 3D-RISM-SCF.

  20. Theory of Change: a theory-driven approach to enhance the Medical Research Council's framework for complex interventions

    PubMed Central

    2014-01-01

    Background The Medical Research Councils’ framework for complex interventions has been criticized for not including theory-driven approaches to evaluation. Although the framework does include broad guidance on the use of theory, it contains little practical guidance for implementers and there have been calls to develop a more comprehensive approach. A prospective, theory-driven process of intervention design and evaluation is required to develop complex healthcare interventions which are more likely to be effective, sustainable and scalable. Methods We propose a theory-driven approach to the design and evaluation of complex interventions by adapting and integrating a programmatic design and evaluation tool, Theory of Change (ToC), into the MRC framework for complex interventions. We provide a guide to what ToC is, how to construct one, and how to integrate its use into research projects seeking to design, implement and evaluate complex interventions using the MRC framework. We test this approach by using ToC within two randomized controlled trials and one non-randomized evaluation of complex interventions. Results Our application of ToC in three research projects has shown that ToC can strengthen key stages of the MRC framework. It can aid the development of interventions by providing a framework for enhanced stakeholder engagement and by explicitly designing an intervention that is embedded in the local context. For the feasibility and piloting stage, ToC enables the systematic identification of knowledge gaps to generate research questions that strengthen intervention design. ToC may improve the evaluation of interventions by providing a comprehensive set of indicators to evaluate all stages of the causal pathway through which an intervention achieves impact, combining evaluations of intervention effectiveness with detailed process evaluations into one theoretical framework. 
Conclusions Incorporating a ToC approach into the MRC framework holds promise for improving the design and evaluation of complex interventions, thereby increasing the likelihood that the intervention will be ultimately effective, sustainable and scalable. We urge researchers developing and evaluating complex interventions to consider using this approach, to evaluate its usefulness and to build an evidence base to further refine the methodology. Trial registration ClinicalTrials.gov: NCT02160249 PMID:24996765

  1. Awareness, adoption, and application of the Association of College & Research Libraries (ACRL) Framework for Information Literacy in health sciences libraries*

    PubMed Central

    Schulte, Stephanie J.; Knapp, Maureen

    2017-01-01

    Objective: In early 2016, the Association of College & Research Libraries (ACRL) officially adopted a conceptual Framework for Information Literacy (Framework) that was a significant shift away from the previous standards-based approach. This study sought to determine (1) if health sciences librarians are aware of the recent Framework for Information Literacy; (2) if they have used the Framework to change their instruction or communication with faculty, and if so, what changes have taken place; and (3) if certain librarian characteristics are associated with the likelihood of adopting the Framework. Methods: This study utilized a descriptive electronic survey. Results: Half of all respondents were aware of and were using or had plans to use the Framework. Academic health sciences librarians and general academic librarians were more likely than hospital librarians to be aware of the Framework. Those using the Framework were mostly revising and creating content, revising their teaching approach, and learning more about the Framework. Framework users commented that it was influencing how they thought about and discussed information literacy with faculty and students. Most hospital librarians and half the academic health sciences librarians were not using and had no plans to use the Framework. Librarians with more than twenty years of experience were less likely to be aware of the Framework and more likely to have no plans to use it. Common reasons for not using the Framework were lack of awareness of a new version and lack of involvement in formal instruction. Conclusion: The results suggest that there is room to improve awareness and application of the Framework among health sciences librarians. PMID:28983198

  2. Design and synthesis of polyoxometalate-framework materials from cluster precursors

    NASA Astrophysics Data System (ADS)

    Vilà-Nadal, Laia; Cronin, Leroy

    2017-10-01

    Inorganic oxide materials are used in semiconductor electronics, ion exchange, catalysis, coatings, gas sensors and as separation materials. Although their synthesis is well understood, the scope for new materials is reduced because of the stability limits imposed by high-temperature processing and top-down synthetic approaches. In this Review, we describe the derivatization of polyoxometalate (POM) clusters, which enables their assembly into a range of frameworks by use of organic or inorganic linkers. Additionally, bottom-up synthetic approaches can be used to make metal oxide framework materials, and the features of the molecular POM precursors are retained in these structures. Highly robust all-inorganic frameworks can be made using metal-ion linkers, which combine molecular synthetic control without the need for organic components. The resulting frameworks have high stability, and high catalytic, photochemical and electrochemical activity. Conceptually, these inorganic oxide materials bridge the gap between zeolites and metal-organic frameworks (MOFs) and establish a new class of all-inorganic POM frameworks that can be designed using topological and reactivity principles similar to MOFs.

  3. ICADx: interpretable computer aided diagnosis of breast masses

    NASA Astrophysics Data System (ADS)

    Kim, Seong Tae; Lee, Hakmin; Kim, Hak Gu; Ro, Yong Man

    2018-02-01

    In this study, a novel computer aided diagnosis (CADx) framework is devised to investigate interpretability in classifying breast masses. Recently, deep learning technology has been successfully applied to medical image analysis, including CADx. Existing deep learning based CADx approaches, however, have a limitation in explaining the diagnostic decision. In real clinical practice, clinical decisions should be accompanied by a reasonable explanation, so current deep learning approaches to CADx are limited for real-world deployment. In this paper, we investigate interpretability in CADx with the proposed interpretable CADx (ICADx) framework. The proposed framework is devised with a generative adversarial network, which consists of an interpretable diagnosis network and a synthetic lesion generative network that learn the relationship between malignancy and a standardized description (BI-RADS). The lesion generative network and the interpretable diagnosis network compete in adversarial learning so that both networks improve. The effectiveness of the proposed method was validated on a public mammogram database. Experimental results showed that the proposed ICADx framework could provide interpretability of masses as well as mass classification. This was mainly attributed to the fact that the proposed method was effectively trained to find the relationship between malignancy and interpretations via adversarial learning. These results imply that the proposed ICADx framework could be a promising approach to developing CADx systems.

  4. A Health Systems Approach to Integrated Community Case Management of Childhood Illness: Methods and Tools

    PubMed Central

    McGorman, Laura; Marsh, David R.; Guenther, Tanya; Gilroy, Kate; Barat, Lawrence M.; Hammamy, Diaa; Wansi, Emmanuel; Peterson, Stefan; Hamer, Davidson H.; George, Asha

    2012-01-01

    Integrated community case management (iCCM) of childhood illness is an increasingly popular strategy to expand life-saving health services to underserved communities. However, community health approaches vary widely across countries and do not always distribute resources evenly across local health systems. We present a harmonized framework, developed through interagency consultation and review, which supports the design of CCM by using a systems approach. To verify that the framework produces results, we also suggest a list of complementary indicators, including nine global metrics, and a menu of 39 country-specific measures. When used by program managers and evaluators, we propose that the framework and indicators can facilitate the design, implementation, and evaluation of community case management. PMID:23136280

  5. Strengths, Opportunities, Aspirations, and Results: An Emerging Approach to Organization Development

    ERIC Educational Resources Information Center

    Zarestky, Jill; Cole, Catherine S.

    2017-01-01

    Organization development (OD) interventions have typically relied on the strengths, weaknesses, opportunities, and threats (SWOT) framework for strategic planning. The strengths, opportunities, aspirations, and results (SOAR) framework is a relatively new innovation in OD that may serve as a viable alternative to SWOT for those who wish to apply…

  6. An Approach to Information Management for AIR7000 with Metadata and Ontologies

    DTIC Science & Technology

    2009-10-01

    ...metadata. We then propose an approach based on Semantic Technologies, including the Resource Description Framework (RDF) and Upper Ontologies, for the... mandating specific metadata schemas can result in interoperability problems. For example, many standards within the ADO mandate the use of XML for metadata... such problems, we propose an architecture in which different metadata schemes can interoperate. By using RDF (Resource Description Framework) as a...

  7. Vulnerable Populations in Hospital and Health Care Emergency Preparedness Planning: A Comprehensive Framework for Inclusion.

    PubMed

    Kreisberg, Debra; Thomas, Deborah S K; Valley, Morgan; Newell, Shannon; Janes, Enessa; Little, Charles

    2016-04-01

    As attention to emergency preparedness becomes a critical element of health care facility operations planning, efforts to recognize and integrate the needs of vulnerable populations in a comprehensive manner have lagged. This not only results in decreased levels of equitable service, but also affects the functioning of the health care system in disasters. While this report emphasizes the United States context, the concepts and approaches apply beyond this setting. This report: (1) describes a conceptual framework that provides a model for the inclusion of vulnerable populations into integrated health care and public health preparedness; and (2) applies this model to a pilot study. The framework is derived from literature, hospital regulatory policy, and health care standards, laying out the communication and relational interfaces that must occur at the systems, organizational, and community levels for a successful multi-level health care systems response that is explicitly inclusive of diverse populations. The pilot study illustrates the application of key elements of the framework, using a four-pronged approach that incorporates both quantitative and qualitative methods for deriving information that can inform hospital and health facility preparedness planning. The conceptual framework and model, applied to a pilot project, guide expanded work that ultimately can result in methodologically robust approaches to comprehensively incorporating vulnerable populations into the fabric of hospital disaster preparedness at levels from local to national, thus supporting best practices for a community resilience approach to disaster preparedness.

  8. Climate change adaptation frameworks: an evaluation of plans for coastal Suffolk, UK

    NASA Astrophysics Data System (ADS)

    Armstrong, J.; Wilby, R.; Nicholls, R. J.

    2015-11-01

    This paper asserts that three principal frameworks for climate change adaptation can be recognised in the literature: scenario-led (SL), vulnerability-led (VL) and decision-centric (DC) frameworks. A criterion is developed to differentiate these frameworks in recent adaptation projects. The criterion features six key hallmarks as follows: (1) use of climate model information; (2) analysis of metrics/units; (3) socio-economic knowledge; (4) stakeholder engagement; (5) adaptation of implementation mechanisms; (6) tier of adaptation implementation. The paper then tests the validity of this approach using adaptation projects on the Suffolk coast, UK. Fourteen adaptation plans were identified in an online survey. They were analysed in relation to the hallmarks outlined above and assigned to an adaptation framework. The results show that while some adaptation plans are primarily SL, VL or DC, the majority are hybrid, showing a mixture of DC/VL and DC/SL characteristics. Interestingly, the SL/VL combination is not observed, perhaps because the DC framework is intermediate and attempts to overcome weaknesses of both SL and VL approaches. The majority (57 %) of adaptation projects generated a risk assessment or advice notes. Further development of this type of framework analysis would allow better guidance on approaches for organisations when implementing climate change adaptation initiatives, and other similar proactive long-term planning.

  9. Climate change adaptation frameworks: an evaluation of plans for coastal Suffolk, UK

    NASA Astrophysics Data System (ADS)

    Armstrong, J.; Wilby, R.; Nicholls, R. J.

    2015-06-01

    This paper asserts that three principal frameworks for climate change adaptation can be recognised in the literature: Scenario-Led (SL), Vulnerability-Led (VL) and Decision-Centric (DC) frameworks. A criterion is developed to differentiate these frameworks in recent adaptation projects. The criterion features six key hallmarks as follows: (1) use of climate model information; (2) analysis metrics/units; (3) socio-economic knowledge; (4) stakeholder engagement; (5) adaptation implementation mechanisms; (6) tier of adaptation implementation. The paper then tests the validity of this approach using adaptation projects on the Suffolk coast, UK. Fourteen adaptation plans were identified in an online survey. They were analysed in relation to the hallmarks outlined above and assigned to an adaptation framework. The results show that while some adaptation plans are primarily SL, VL or DC, the majority are hybrid showing a mixture of DC/VL and DC/SL characteristics. Interestingly, the SL/VL combination is not observed, perhaps because the DC framework is intermediate and attempts to overcome weaknesses of both SL and VL approaches. The majority (57 %) of adaptation projects generated a risk assessment or advice notes. Further development of this type of framework analysis would allow better guidance on approaches for organisations when implementing climate change adaptation initiatives, and other similar proactive long-term planning.

  10. A Bayesian framework to estimate diversification rates and their variation through time and space

    PubMed Central

    2011-01-01

    Background Patterns of species diversity are the result of speciation and extinction processes, and molecular phylogenetic data can provide valuable information to derive their variability through time and across clades. Bayesian Markov chain Monte Carlo methods offer a promising framework to incorporate phylogenetic uncertainty when estimating rates of diversification. Results We introduce a new approach to estimate diversification rates in a Bayesian framework over a distribution of trees under various constant and variable rate birth-death and pure-birth models, and test it on simulated phylogenies. Furthermore, speciation and extinction rates and their posterior credibility intervals can be estimated while accounting for non-random taxon sampling. The framework is particularly suitable for hypothesis testing using Bayes factors, as we demonstrate analyzing dated phylogenies of Chondrostoma (Cyprinidae) and Lupinus (Fabaceae). In addition, we develop a model that extends the rate estimation to a meta-analysis framework in which different data sets are combined in a single analysis to detect general temporal and spatial trends in diversification. Conclusions Our approach provides a flexible framework for the estimation of diversification parameters and hypothesis testing while simultaneously accounting for uncertainties in the divergence times and incomplete taxon sampling. PMID:22013891
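The likelihood at the heart of such rate estimation can be sketched in miniature. The snippet below is a hypothetical illustration, not the authors' MCMC implementation: under a pure-birth model, the waiting time between the k-th and (k+1)-th speciation event is exponential with rate kλ, so a grid approximation of the posterior over λ (flat prior) takes only a few lines.

```python
import math
import random

random.seed(1)
true_lam = 2.0

# Simulate waiting times of a pure-birth process: with k lineages alive,
# the time to the next speciation event is Exponential(k * lambda).
waits = []
k = 2
for _ in range(200):
    waits.append(random.expovariate(k * true_lam))
    k += 1

def log_lik(lam):
    """Log-likelihood of the waiting times under speciation rate lam."""
    ll, k = 0.0, 2
    for t in waits:
        ll += math.log(k * lam) - k * lam * t
        k += 1
    return ll

# Grid posterior over lambda with a flat prior on (0, 10].
grid = [0.05 * i for i in range(1, 201)]
logs = [log_lik(lam) for lam in grid]
m = max(logs)
weights = [math.exp(v - m) for v in logs]
post_mean = sum(l * w for l, w in zip(grid, weights)) / sum(weights)
print(round(post_mean, 2))  # close to the simulating rate of 2.0
```

In the actual method this kind of likelihood is embedded in a Markov chain Monte Carlo sampler over a distribution of trees, with extinction and taxon-sampling terms added; the grid version above conveys only the core idea.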

  11. A Stochastic Simulation Framework for the Prediction of Strategic Noise Mapping and Occupational Noise Exposure Using the Random Walk Approach

    PubMed Central

    Haron, Zaiton; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri

    2015-01-01

    Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels. Also, it assesses the impact of noise on the workers and the surrounding environment. For data validation, three case studies were conducted to check the accuracy of the prediction data and to determine the efficiency and effectiveness of this approach. The results showed high prediction accuracy, with the majority of absolute differences less than 2 dBA; the predicted noise doses were also mostly within the measured range. Therefore, the random walk approach is effective in dealing with environmental noise and can predict strategic noise mapping to facilitate noise monitoring and noise control in workplaces. PMID:25875019
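The random-walk idea can be conveyed with a toy sketch (all values hypothetical, not taken from RW-eNMS): a noise source wanders on a plane, and the equivalent continuous level Leq at a fixed receiver is obtained by energy-averaging the instantaneous levels predicted by free-field spherical spreading.

```python
import math
import random

random.seed(0)
Lw = 95.0                 # assumed sound power level of the source, dB
rx, ry = 0.0, 0.0         # fixed receiver position
x, y = 5.0, 5.0           # source start position, metres

levels = []
for _ in range(1000):
    # Random-walk step of the moving noise source (0.5 m per axis).
    x += random.choice([-0.5, 0.5])
    y += random.choice([-0.5, 0.5])
    r = max(1.0, math.hypot(x - rx, y - ry))
    # Free-field spherical spreading: Lp = Lw - 20*log10(r) - 11.
    levels.append(Lw - 20.0 * math.log10(r) - 11.0)

# Energy-average the instantaneous levels to get Leq at the receiver.
leq = 10.0 * math.log10(sum(10.0 ** (L / 10.0) for L in levels) / len(levels))
print(round(leq, 1))
```

A full implementation would repeat this over a grid of receivers to produce the noise map and convert exposure time into a noise dose; the snippet shows only the stochastic-source-plus-propagation core.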

  12. A holistic framework to improve the uptake and impact of eHealth technologies.

    PubMed

    van Gemert-Pijnen, Julia E W C; Nijland, Nicol; van Limburg, Maarten; Ossebaard, Hans C; Kelders, Saskia M; Eysenbach, Gunther; Seydel, Erwin R

    2011-12-05

    Many eHealth technologies are not successful in realizing sustainable innovations in health care practices. One of the reasons for this is that the current development of eHealth technology often disregards the interdependencies between technology, human characteristics, and the socioeconomic environment, resulting in technology that has a low impact in health care practices. To overcome the hurdles with eHealth design and implementation, a new, holistic approach to the development of eHealth technologies is needed, one that takes into account the complexity of health care and the rituals and habits of patients and other stakeholders. The aim of this viewpoint paper is to improve the uptake and impact of eHealth technologies by advocating a holistic approach toward their development and eventual integration in the health sector. To identify the potential and limitations of current eHealth frameworks (1999-2009), we carried out a literature search in the following electronic databases: PubMed, ScienceDirect, Web of Knowledge, PiCarta, and Google Scholar. Of the 60 papers that were identified, 44 were selected for full review. We excluded those papers that did not describe hands-on guidelines or quality criteria for the design, implementation, and evaluation of eHealth technologies (28 papers). From the results retrieved, we identified 16 eHealth frameworks that matched the inclusion criteria. The outcomes were used to posit strategies and principles for a holistic approach toward the development of eHealth technologies; these principles underpin our holistic eHealth framework. A total of 16 frameworks qualified for a final analysis, based on their theoretical backgrounds and visions on eHealth, and the strategies and conditions for the research and development of eHealth technologies. 
Despite their potential, the relationship between the visions on eHealth, proposed strategies, and research methods is obscure, perhaps due to a rather conceptual approach that focuses on the rationale behind the frameworks rather than on practical guidelines. In addition, the Web 2.0 technologies that call for a more stakeholder-driven approach are beyond the scope of current frameworks. To overcome these limitations, we composed a holistic framework based on a participatory development approach, persuasive design techniques, and business modeling. To demonstrate the impact of eHealth technologies more effectively, a fresh way of thinking is required about how technology can be used to innovate health care. It also requires new concepts and instruments to develop and implement technologies in practice. The proposed framework serves as an evidence-based roadmap.

  13. General Theory of Absorption in Porous Materials: Restricted Multilayer Theory.

    PubMed

    Aduenko, Alexander A; Murray, Andy; Mendoza-Cortes, Jose L

    2018-04-18

    In this article, we present an approach for generalizing the adsorption of light gases in porous materials. This new theory goes beyond the Langmuir and Brunauer-Emmett-Teller theories, the standard approaches whose application to crystalline porous materials is limited by their unphysical assumptions about the number of possible adsorption layers. We present the derivation of a more general equation for any crystalline porous framework: restricted multilayer theory. Our approach allows the determination of gas uptake considering only the geometrical constraints of the porous framework and the interaction energy of the guest molecule with the framework. On the basis of this theory, we calculated optimal values for the adsorption enthalpy at different temperatures and pressures. We also present the use of this theory to determine the optimal linker length for a topologically equivalent framework series. We validate this theoretical approach by applying it to metal-organic frameworks (MOFs) and show that it reproduces the experimental results for seven different reported materials. We obtained a universal equation for the optimal linker length, given the topology of a porous framework. This work applied the general equation to MOFs and H2 for energy-storage materials; however, the theory can be applied to other crystalline porous materials and light gases, which opens the possibility of designing the next generation of energy-storage materials by first considering only the geometrical constraints of the porous materials.
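The idea of restricting the layer count can be illustrated with the classical finite-n BET isotherm, which caps adsorption at n layers and reduces to the Langmuir (monolayer) form at n = 1. This is textbook material shown for context, not the paper's more general equation.

```python
def bet_n_layers(x, c, n):
    """Finite-n BET coverage v/v_m; x = P/P0, c = BET constant, n = layer cap."""
    num = 1.0 - (n + 1) * x ** n + n * x ** (n + 1)
    den = 1.0 + (c - 1.0) * x - c * x ** (n + 1)
    return (c * x / (1.0 - x)) * num / den

def langmuir(x, c):
    """Monolayer (single-site) isotherm."""
    return c * x / (1.0 + c * x)

x, c = 0.3, 50.0
print(round(bet_n_layers(x, c, 1), 4))  # one allowed layer -> 0.9375
print(round(langmuir(x, c), 4))         # identical to Langmuir -> 0.9375
print(round(bet_n_layers(x, c, 5), 4))  # more layers allowed, higher uptake
```

The geometric pore constraints in the paper play the role of n here: the framework's pore width fixes how many layers can physically fit.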

  14. Adaptive Framework for Classification and Novel Class Detection over Evolving Data Streams with Limited Labeled Data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haque, Ahsanul; Khan, Latifur; Baron, Michael

    2015-09-01

    Most approaches to classifying evolving data streams either divide the stream of data into fixed-size chunks or use gradual forgetting to address the problems of infinite length and concept drift. Finding the fixed size of the chunks or choosing a forgetting rate without prior knowledge about time-scale of change is not a trivial task. As a result, these approaches suffer from a trade-off between performance and sensitivity. To address this problem, we present a framework which uses change detection techniques on the classifier performance to determine chunk boundaries dynamically. Though this framework exhibits good performance, it is heavily dependent on the availability of true labels of data instances. However, labeled data instances are scarce in realistic settings and not readily available. Therefore, we present a second framework which is unsupervised in nature, and exploits change detection on classifier confidence values to determine chunk boundaries dynamically. In this way, it avoids the use of labeled data while still addressing the problems of infinite length and concept drift. Moreover, both of our proposed frameworks address the concept evolution problem by detecting outliers having similar values for the attributes. We provide theoretical proof that our change detection method works better than other state-of-the-art approaches in this particular scenario. Results from experiments on various benchmark and synthetic data sets also show the efficiency of our proposed frameworks.
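As a minimal sketch of the unsupervised variant's core idea (not the authors' algorithm), a one-sided CUSUM on classifier confidence can flag a dynamic chunk boundary when confidence drops after concept drift; all parameter values below are hypothetical.

```python
import random

random.seed(42)
# Synthetic stream of classifier confidence values: stable around 0.9,
# then concept drift drops confidence to about 0.6 at t = 300.
stream = ([random.gauss(0.9, 0.03) for _ in range(300)]
          + [random.gauss(0.6, 0.03) for _ in range(300)])

def detect_change(values, drift=0.02, threshold=0.5, warmup=50):
    """One-sided CUSUM on falling confidence; returns a chunk boundary index."""
    mean0 = sum(values[:warmup]) / warmup   # baseline confidence level
    s = 0.0
    for i, v in enumerate(values):
        s = max(0.0, s + (mean0 - v) - drift)
        if s > threshold:
            return i                        # dynamic chunk boundary
    return None

boundary = detect_change(stream)
print(boundary)  # shortly after the drift at t = 300
```

Because it watches confidence rather than accuracy, no true labels are needed, which is the property the second framework exploits.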

  15. A KPI-based process monitoring and fault detection framework for large-scale processes.

    PubMed

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, and their performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrumental variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
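The static case can be sketched with a hypothetical one-variable example: fit a least-squares model from a process variable to the KPI, then flag a fault when the KPI residual exceeds a threshold derived from training data. This illustrates the residual-monitoring idea only, not the paper's multivariate formulation.

```python
import random

random.seed(0)
# Training data: the KPI y depends (statically) on a process variable u.
u_train = [random.uniform(0.0, 10.0) for _ in range(200)]
y_train = [2.0 * u + 1.0 + random.gauss(0.0, 0.1) for u in u_train]

# Ordinary least-squares fit: y ~ a*u + b.
n = len(u_train)
mu_u = sum(u_train) / n
mu_y = sum(y_train) / n
a = (sum((u - mu_u) * (y - mu_y) for u, y in zip(u_train, y_train))
     / sum((u - mu_u) ** 2 for u in u_train))
b = mu_y - a * mu_u

# Fault-detection threshold from the training residuals (3-sigma rule).
residuals = [y - (a * u + b) for u, y in zip(u_train, y_train)]
sigma = (sum(r * r for r in residuals) / n) ** 0.5
threshold = 3.0 * sigma

def kpi_fault(u, y):
    """Flag a fault when the observed KPI deviates from its prediction."""
    return abs(y - (a * u + b)) > threshold

print(kpi_fault(5.0, 11.0))  # nominal: y close to 2*5 + 1, no alarm
print(kpi_fault(5.0, 13.0))  # KPI offset of +2, well beyond 3 sigma
```

The dynamic case in the paper reduces to this static picture after filtering each subprocess through its kernel representation.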

  16. A super resolution framework for low resolution document image OCR

    NASA Astrophysics Data System (ADS)

    Ma, Di; Agam, Gady

    2013-01-01

    Optical character recognition (OCR) is widely used for converting document images into digital media. Existing OCR algorithms and tools produce good results from high-resolution, good-quality document images. In this paper, we propose a machine learning based super resolution framework for low-resolution document image OCR. Two main techniques are used in our proposed approach: a document page segmentation algorithm and a modified K-means clustering algorithm. By exploiting coherence within the document, our approach reconstructs a higher-resolution image from a low-resolution document image and improves OCR results. Experimental results show substantial gains on low-resolution documents such as those captured from video.
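The clustering step can be sketched with plain (unmodified) K-means on a one-dimensional stand-in for patch features; the paper's modified variant and its page segmentation stage are not reproduced here.

```python
import random

random.seed(0)
# One-dimensional stand-in for patch features: two glyph "classes"
# of normalized grey levels around 0.2 (ink) and 0.8 (background).
data = ([random.gauss(0.2, 0.05) for _ in range(100)]
        + [random.gauss(0.8, 0.05) for _ in range(100)])

def kmeans(xs, k=2, iters=20):
    """Plain Lloyd's algorithm; returns the sorted cluster centres."""
    centers = random.sample(xs, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in xs:
            j = min(range(k), key=lambda idx: (x - centers[idx]) ** 2)
            clusters[j].append(x)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)

c_ink, c_bg = kmeans(data)
print(round(c_ink, 2), round(c_bg, 2))  # near 0.2 and 0.8
```

In the super-resolution setting, clustering similar low-resolution patches lets the repeated occurrences of the same glyph reinforce one another when reconstructing the high-resolution estimate.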

  17. Comparison of hand-craft feature based SVM and CNN based deep learning framework for automatic polyp classification.

    PubMed

    Younghak Shin; Balasingham, Ilangko

    2017-07-01

    Colonoscopy is a standard method for screening polyps by highly trained physicians. Miss-detected polyps in colonoscopy are a potential risk factor for colorectal cancer. In this study, we investigate an automatic polyp classification framework, comparing two approaches: a hand-crafted feature method and a convolutional neural network (CNN) based deep learning method. Combined shape and color features are used for hand-crafted feature extraction, and a support vector machine (SVM) is adopted for classification. For the CNN approach, a deep learning framework with three convolution and pooling layers is used for classification. The proposed framework is evaluated using three public polyp databases. The experimental results show that the CNN based deep learning framework achieves better classification performance than the hand-crafted feature based methods, with over 90% classification accuracy, sensitivity, specificity and precision.

  18. Assessing the economic benefits of vaccines based on the health investment life course framework: a review of a broader approach to evaluate malaria vaccination.

    PubMed

    Constenla, Dagna

    2015-03-24

    Economic evaluations have routinely understated the net benefits of vaccination by not including the full range of economic benefits that accrue over the lifetime of a vaccinated person. Broader approaches for evaluating benefits of vaccination can be used to more accurately calculate the value of vaccination. This paper reflects on the methodology of one such approach, the health investment life course approach, which looks at the impact of vaccine investment on lifetime returns. The role of this approach in vaccine decision-making will be assessed using the malaria health investment life course model example. We describe a framework that measures the impact of a health policy decision on government accounts over many generations. The methodological issues emerging from this approach are illustrated with an example from a recently completed health investment life course analysis of malaria vaccination in Ghana. Beyond the results, various conceptual and practical challenges of applying this framework to Ghana are discussed in this paper. The current framework seeks to understand how disease and available technologies can impact a range of economic parameters such as labour force participation, education, healthcare consumption, productivity, wages or economic growth, and taxation following their introduction. The framework is unique amongst previous economic models in malaria because it considers future tax revenue for governments. The framework is complementary to cost-effectiveness and budget impact analysis. The intent of this paper is to stimulate discussion on how existing and new methodology can add to knowledge regarding the benefits from investing in new and underutilized vaccines. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. A Standards-Based Approach for Reporting Assessment Results in South Africa

    ERIC Educational Resources Information Center

    Kanjee, Anil; Moloi, Qetelo

    2016-01-01

    This article proposes the use of a standards-based approach to reporting results from large-scale assessment surveys in South Africa. The use of this approach is intended to address the key shortcomings observed in the current reporting framework prescribed in the national curriculum documents. Using the Angoff method and data from the Annual…

  20. Fuzzy entropy thresholding and multi-scale morphological approach for microscopic image enhancement

    NASA Astrophysics Data System (ADS)

    Zhou, Jiancan; Li, Yuexiang; Shen, Linlin

    2017-07-01

    Microscopic images provide a wealth of useful information for modern diagnosis and biological research. However, due to unstable lighting conditions during image capture, two main problems, high noise levels and low image contrast, occur in the resulting cell images. In this paper, a simple but efficient enhancement framework is proposed to address these problems. The framework removes image noise using a hybrid method based on wavelet transform and fuzzy entropy, and enhances image contrast with an adaptive morphological approach. Experiments on a real cell dataset were conducted to assess the performance of the proposed framework. The experimental results demonstrate that the proposed enhancement framework increases cell tracking accuracy to an average of 74.49%, outperforming the benchmark algorithm (46.18%).
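A closely related classical technique, maximum-entropy (Kapur) thresholding on a grey-level histogram, illustrates the entropy-based segmentation idea; the paper's fuzzy-entropy formulation differs in detail, and the histogram below is synthetic.

```python
import math

# Synthetic 8-bit histogram: a dark "cell" peak and a bright background peak.
hist = [0] * 256
for grey, count in [(40, 500), (50, 800), (60, 500),
                    (180, 300), (190, 600), (200, 300)]:
    hist[grey] = count

total = sum(hist)
p = [h / total for h in hist]

def entropy(probs):
    """Shannon entropy of a (renormalized) histogram segment."""
    s = sum(probs)
    if s <= 0.0:
        return 0.0
    return -sum(q / s * math.log(q / s) for q in probs if q > 0.0)

# Kapur's criterion: pick t maximizing H(background) + H(foreground).
best_t = max(range(1, 255), key=lambda t: entropy(p[:t]) + entropy(p[t:]))
print(best_t)  # lands between the two histogram peaks
```

Thresholds chosen this way adapt to each image's histogram, which is what makes entropy criteria attractive under the unstable lighting the abstract describes.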

  1. HIGHLIGHTING DIFFERENCES BETWEEN CONDITIONAL AND UNCONDITIONAL QUANTILE REGRESSION APPROACHES THROUGH AN APPLICATION TO ASSESS MEDICATION ADHERENCE

    PubMed Central

    BORAH, BIJAN J.; BASU, ANIRBAN

    2014-01-01

    The quantile regression (QR) framework provides a pragmatic approach to understanding the differential impacts of covariates along the distribution of an outcome. However, the QR framework that has pervaded the applied economics literature is based on the conditional quantile regression method. It is used to assess the impact of a covariate on a quantile of the outcome conditional on specific values of other covariates. In many cases, conditional quantile regression generates results that are not generalizable or interpretable in a policy or population context. In contrast, the unconditional quantile regression method provides more interpretable results as it marginalizes the effect over the distributions of other covariates in the model. In this paper, the differences between these two regression frameworks are highlighted, both conceptually and econometrically. Additionally, using real-world claims data from a large US health insurer, alternative QR frameworks are implemented to assess the differential impacts of covariates along the distribution of medication adherence among elderly patients with Alzheimer’s disease. PMID:23616446
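The building block shared by both QR frameworks is the check (pinball) loss, whose minimizer is the τ-th quantile. A minimal stand-alone sketch on synthetic data (not the claims analysis):

```python
import random

random.seed(3)
# Skewed synthetic "adherence" outcome (exponential): the mean is a
# poor summary, so quantiles are the targets of interest.
data = sorted(random.expovariate(1.0) for _ in range(5000))

def pinball(q, tau, xs):
    """Check-function (pinball) loss; minimized at the tau-th quantile."""
    return sum(tau * (x - q) if x >= q else (1.0 - tau) * (q - x) for x in xs)

# Grid search for the 0.75 quantile of the outcome distribution.
grid = [i * 0.01 for i in range(400)]
q75 = min(grid, key=lambda q: pinball(q, 0.75, data))

emp = data[int(0.75 * len(data))]       # empirical quantile, for comparison
print(round(q75, 2), round(emp, 2))     # both near ln(4), about 1.39
```

Conditional QR minimizes this loss over a regression function of the covariates, while unconditional QR (via recentered influence functions) targets the marginal quantile shown here; the loss itself is common to both.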

  2. When the Mannequin Dies, Creation and Exploration of a Theoretical Framework Using a Mixed Methods Approach.

    PubMed

    Tripathy, Shreepada; Miller, Karen H; Berkenbosch, John W; McKinley, Tara F; Boland, Kimberly A; Brown, Seth A; Calhoun, Aaron W

    2016-06-01

    Controversy exists in the simulation community as to the emotional and educational ramifications of mannequin death due to learner action or inaction. No theoretical framework to guide future investigations of learner actions currently exists. The purpose of our study was to generate a model of the learner experience of mannequin death using a mixed methods approach. The study consisted of an initial focus group phase composed of 11 learners who had previously experienced mannequin death due to action or inaction on the part of learners as defined by Leighton (Clin Simul Nurs. 2009;5(2):e59-e62). Transcripts were analyzed using grounded theory to generate a list of relevant themes that were further organized into a theoretical framework. With the use of this framework, a survey was generated and distributed to additional learners who had experienced mannequin death due to action or inaction. Results were analyzed using a mixed methods approach. Forty-one clinicians completed the survey. A correlation was found between the emotional experience of mannequin death and degree of presession anxiety (P < 0.001). Debriefing was found to significantly reduce negative emotion and enhance satisfaction. Sixty-nine percent of respondents indicated that mannequin death enhanced learning. These results were used to modify our framework. Using the previous approach, we created a model of the effect of mannequin death on the educational and psychological state of learners. We offer the final model as a guide to future research regarding the learner experience of mannequin death.

  3. Cross-scale integration of knowledge for predicting species ranges: a metamodeling framework

    PubMed Central

    Talluto, Matthew V.; Boulangeat, Isabelle; Ameztegui, Aitor; Aubin, Isabelle; Berteaux, Dominique; Butler, Alyssa; Doyon, Frédérik; Drever, C. Ronnie; Fortin, Marie-Josée; Franceschini, Tony; Liénard, Jean; McKenney, Dan; Solarik, Kevin A.; Strigul, Nikolay; Thuiller, Wilfried; Gravel, Dominique

    2016-01-01

    Aim Current interest in forecasting changes to species ranges has resulted in a multitude of approaches to species distribution models (SDMs). However, most approaches include only a small subset of the available information, and many ignore smaller-scale processes such as growth, fecundity, and dispersal. Furthermore, different approaches often produce divergent predictions with no simple method to reconcile them. Here, we present a flexible framework for integrating models at multiple scales using hierarchical Bayesian methods. Location Eastern North America (as an example). Methods Our framework builds a metamodel that is constrained by the results of multiple sub-models and provides probabilistic estimates of species presence. We applied our approach to a simulated dataset to demonstrate the integration of a correlative SDM with a theoretical model. In a second example, we built an integrated model combining the results of a physiological model with presence-absence data for sugar maple (Acer saccharum), an abundant tree native to eastern North America. Results For both examples, the integrated models successfully included information from all data sources and substantially improved the characterization of uncertainty. For the second example, the integrated model outperformed the source models with respect to uncertainty when modelling the present range of the species. When projecting into the future, the model provided a consensus view of two models that differed substantially in their predictions. Uncertainty was reduced where the models agreed and was greater where they diverged, providing a more realistic view of the state of knowledge than either source model. Main conclusions We conclude by discussing the potential applications of our method and its accessibility to applied ecologists. In ideal cases, our framework can be easily implemented using off-the-shelf software.
The framework has wide potential for use in species distribution modelling and can drive better integration of multi-source and multi-scale data into ecological decision-making. PMID:27499698
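
    The hierarchical Bayesian machinery of the metamodel is beyond a short example, but the core idea of constraining one estimate with several sub-models can be illustrated with a much simpler stand-in: weighted log-odds pooling of the presence probabilities produced by two sources. The probabilities and equal weights below are invented for illustration and are not the paper's method.

```python
import math

def pool_log_odds(probs, weights=None):
    """Combine presence probabilities from several sub-models by weighted log-odds pooling."""
    weights = weights or [1.0] * len(probs)
    # Average the logits, then map back to a probability
    z = sum(w * math.log(p / (1 - p)) for p, w in zip(probs, weights))
    return 1 / (1 + math.exp(-z / sum(weights)))

# A correlative SDM predicting P(presence) = 0.8 and a physiological model predicting 0.6
print(pool_log_odds([0.8, 0.6]))
```

    Pooling on the log-odds scale keeps the combined value inside (0, 1) and reduces to the sub-model value when all sources agree; a full metamodel would additionally propagate each source's uncertainty.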

  4. A framework for organizing and selecting quantitative approaches for benefit-harm assessment

    PubMed Central

    2012-01-01

    Background Several quantitative approaches for benefit-harm assessment of health care interventions exist, but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decision-making context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and into those that do or do not use a benefit-harm comparison metric. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provide a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences.
Conclusion The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches. PMID:23163976
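
    A minimal instance of a benefit and harm comparison metric with an accompanying measure of uncertainty, of the kind the review tallies, is the risk difference with a Wald confidence interval. The trial counts below are invented, and the simple net-benefit sum assumes the benefit and harm outcomes are valued equally, which is exactly the kind of single-scale choice the framework asks investigators to make explicit.

```python
import math

def risk_difference(events_t, n_t, events_c, n_c, z=1.96):
    """Risk difference between treatment and control with a Wald 95% CI."""
    p_t, p_c = events_t / n_t, events_c / n_c
    rd = p_t - p_c
    se = math.sqrt(p_t * (1 - p_t) / n_t + p_c * (1 - p_c) / n_c)
    return rd, (rd - z * se, rd + z * se)

# Invented counts: treatment prevents the primary outcome but causes an adverse event
benefit, benefit_ci = risk_difference(30, 500, 60, 500)   # primary outcome (fewer is better)
harm, harm_ci = risk_difference(25, 500, 10, 500)         # adverse event (fewer is better)

net_benefit = -benefit - harm     # events avoided minus events caused, per patient
nnt = 1 / -benefit                # number needed to treat
nnh = 1 / harm                    # number needed to harm
print(net_benefit, nnt, nnh)
```

    A joint uncertainty statement about the net benefit would require modelling the correlation between the two outcomes, which, as the review notes, almost none of the surveyed approaches does.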

  5. The nature and outcomes of work: a replication and extension of interdisciplinary work-design research.

    PubMed

    Edwards, J R; Scully, J A; Brtek, M D

    2000-12-01

    Research into the changing nature of work requires comprehensive models of work design. One such model is the interdisciplinary framework (M. A. Campion, 1988), which integrates 4 work-design approaches (motivational, mechanistic, biological, perceptual-motor) and links each approach to specific outcomes. Unfortunately, studies of this framework have used methods that disregard measurement error, overlook dimensions within each work-design approach, and treat each approach and outcome separately. This study reanalyzes data from M. A. Campion (1988), using structural equation models that incorporate measurement error, specify multiple dimensions for each work-design approach, and examine the work-design approaches and outcomes jointly. Results show that previous studies underestimate relationships between work-design approaches and outcomes and that dimensions within each approach exhibit relationships with outcomes that differ in magnitude and direction.

  6. Doctor coach: a deliberate practice approach to teaching and learning clinical skills.

    PubMed

    Gifford, Kimberly A; Fall, Leslie H

    2014-02-01

    The rapidly evolving medical education landscape requires restructuring the approach to teaching and learning across the continuum of medical education. The deliberate practice strategies used to coach learners in disciplines beyond medicine can also be used to train medical learners. However, these deliberate practice strategies are not explicitly taught in most medical schools or residencies. The authors designed the Doctor Coach framework and competencies in 2007-2008 to serve as the foundation for new faculty development and resident-as-teacher programs. In addition to teaching deliberate practice strategies, the programs model a deliberate practice approach that promotes the continuous integration of newly developed coaching competencies by participants into their daily teaching practice. Early evaluation demonstrated the feasibility and efficacy of implementing the Doctor Coach framework across the continuum of medical education. Additionally, the Doctor Coach framework has been disseminated through national workshops, which have resulted in additional institutions applying the framework and competencies to develop their own coaching programs. Design of a multisource evaluation tool based on the coaching competencies will enable more rigorous study of the Doctor Coach framework and training programs and provide a richer feedback mechanism for participants. The framework will also facilitate the faculty development needed to implement the milestones and entrustable professional activities in medical education.

  7. A locally p-adaptive approach for Large Eddy Simulation of compressible flows in a DG framework

    NASA Astrophysics Data System (ADS)

    Tugnoli, Matteo; Abbà, Antonella; Bonaventura, Luca; Restelli, Marco

    2017-11-01

    We investigate the possibility of reducing the computational burden of LES models by employing local polynomial degree adaptivity in the framework of a high-order DG method. A novel degree adaptation technique specifically designed to be effective for LES applications is proposed, and its effectiveness is compared to that of other criteria already employed in the literature. The resulting locally adaptive approach achieves significant reductions in the computational cost of representative LES computations.

  8. Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework

    NASA Astrophysics Data System (ADS)

    Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.

    2016-03-01

    A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast of Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on the assessment of linguistic labels, which is able to address uncertainty and deal with different levels of precision. The method rests on qualitative reasoning, an artificial intelligence technique for assessing and ranking multi-attribute alternatives described by linguistic labels. It is suitable for problems in a social setting, such as energy planning, that require the construction of a dialogue process among many social actors under high levels of complexity and uncertainty. The method is compared with an existing approach, an outranking method based on Condorcet's original method, which has previously been applied to the wind farm location problem. The results obtained by the two approaches are analysed, and their performance in the selection of the wind farm location is compared across aggregation procedures. Although the results show that both methods lead to similar rankings of the alternatives, the study highlights the advantages and drawbacks of each.
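
    Condorcet-style aggregation of criterion-wise comparisons can be sketched with a Copeland-type tally of pairwise majority wins. This is an illustrative stand-in, not the exact outranking procedure compared in the paper, and the sites and ordinal scores below are invented.

```python
from itertools import combinations

# Invented ordinal scores (1 = worst, 4 = best) for three candidate sites on four criteria
scores = {
    "site_A": [3, 2, 4, 3],
    "site_B": [2, 3, 3, 3],
    "site_C": [1, 1, 2, 2],
}

def pairwise_ranking(scores):
    """Rank alternatives by counting pairwise majority wins across criteria."""
    wins = {a: 0 for a in scores}
    for a, b in combinations(scores, 2):
        a_pref = sum(x > y for x, y in zip(scores[a], scores[b]))
        b_pref = sum(y > x for x, y in zip(scores[a], scores[b]))
        if a_pref > b_pref:
            wins[a] += 1
        elif b_pref > a_pref:
            wins[b] += 1
    return sorted(wins, key=wins.get, reverse=True)

print(pairwise_ranking(scores))   # site_A beats site_B, which beats site_C
```

    Because only ordinal comparisons enter the tally, the method tolerates imprecise linguistic assessments, at the cost of discarding information about the size of the differences.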

  9. Developing a curriculum framework for global health in family medicine: emerging principles, competencies, and educational approaches

    PubMed Central

    2011-01-01

    Background Recognizing the growing demand from medical students and residents for more comprehensive global health training, and the paucity of explicit curricula on such issues, global health and curriculum experts from the six Ontario Family Medicine Residency Programs worked together to design a framework for global health curricula in family medicine training programs. Methods A working group comprising global health educators from Ontario's six medical schools conducted a scoping review of global health curricula, competencies, and pedagogical approaches. The working group then hosted a full-day meeting, inviting experts in education, clinical care, family medicine and public health, and developed a consensus process and draft framework to design global health curricula. Through a series of weekly teleconferences over the next six months, the framework was revised and used to guide the identification of enabling global health competencies (behaviours, skills and attitudes) for Canadian Family Medicine training. Results The main outcome was an evidence-informed interactive framework http://globalhealth.ennovativesolution.com/ to provide a shared foundation to guide the design, delivery and evaluation of global health education programs for Ontario's family medicine residency programs. The curriculum framework blended a definition and mission for global health training, core values and principles, global health competencies aligning with the Canadian Medical Education Directives for Specialists (CanMEDS) competencies, and key learning approaches. The framework guided the development of subsequent enabling competencies. Conclusions The shared curriculum framework can support the design, delivery and evaluation of global health curriculum in Canada and around the world, lay the foundation for research and development, provide consistency across programmes, and support the creation of learning and evaluation tools to align with the framework.
The process used to develop this framework can be applied to other aspects of residency curriculum development. PMID:21781319

  10. Finding the Intersection of the Learning Organization and Learning Transfer: The Significance of Leadership

    ERIC Educational Resources Information Center

    Kim, Jun Hee; Callahan, Jamie L.

    2013-01-01

    Purpose: This article aims to develop a conceptual framework delineating the key dimension of the learning organization which significantly influences learning transfer. Design/methodology/approach: The conceptual framework was developed by analyzing previous studies and synthesizing the results associated with the following four relationships:…

  11. The Scientist, Philosopher, and Rhetorician: The Three Dimensions of Technical Communication and Technology

    ERIC Educational Resources Information Center

    Garrison, Kevin

    2014-01-01

    Technical communication's attempt to prioritize theories of scholarship and pedagogy has resulted in several authors contributing a three-dimensional framework to approach technology: the instrumental perspective, the critical humanist perspective, and the user-centered perspective [1-3]. This article traces connections between this framework for…

  12. Assessing Quality of Critical Thought in Online Discussion

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa; Baltes, Beate; Lynn, Laura Knight

    2009-01-01

    Purpose: The purpose of this paper is to describe a theoretically based coding framework for an integrated analysis and assessment of critical thinking in online discussion. Design/methodology/approach: The critical thinking assessment framework (TAF) is developed through review of theory and previous research, verified by comparing results to…

  13. Towards a Transferable UAV-Based Framework for River Hydromorphological Characterization

    PubMed Central

    González, Rocío Ballesteros; Leinster, Paul; Wright, Ros

    2017-01-01

    The multiple protocols that have been developed to characterize river hydromorphology, partly in response to legislative drivers such as the European Union Water Framework Directive (EU WFD), make the comparison of results obtained in different countries challenging. Recent studies have analyzed the comparability of existing methods, with remote sensing based approaches being proposed as a potential means of harmonizing hydromorphological characterization protocols. However, the resolution achieved by remote sensing products may not be sufficient to assess some of the key hydromorphological features that are required to allow an accurate characterization. Methodologies based on high resolution aerial photography taken from Unmanned Aerial Vehicles (UAVs) have been proposed by several authors as potential approaches to overcome these limitations. Here, we explore the applicability of an existing UAV based framework for hydromorphological characterization to three different fluvial settings representing some of the distinct ecoregions defined by the WFD geographical intercalibration groups (GIGs). The framework is based on the automated recognition of hydromorphological features via tested and validated Artificial Neural Networks (ANNs). Results show that the framework is transferable to the Central-Baltic and Mediterranean GIGs with accuracies in feature identification above 70%. Accuracies of 50% are achieved when the framework is implemented in the Very Large Rivers GIG. The framework successfully identified vegetation, deep water, shallow water, riffles, side bars and shadows for the majority of the reaches. However, further algorithm development is required to ensure a wider range of features (e.g., chutes, structures and erosion) are accurately identified. This study also highlights the need to develop an objective and fit for purpose hydromorphological characterization framework to be adopted within all EU member states to facilitate comparison of results. 
PMID:28954434

  14. Towards a Transferable UAV-Based Framework for River Hydromorphological Characterization.

    PubMed

    Rivas Casado, Mónica; González, Rocío Ballesteros; Ortega, José Fernando; Leinster, Paul; Wright, Ros

    2017-09-26

    The multiple protocols that have been developed to characterize river hydromorphology, partly in response to legislative drivers such as the European Union Water Framework Directive (EU WFD), make the comparison of results obtained in different countries challenging. Recent studies have analyzed the comparability of existing methods, with remote sensing based approaches being proposed as a potential means of harmonizing hydromorphological characterization protocols. However, the resolution achieved by remote sensing products may not be sufficient to assess some of the key hydromorphological features that are required to allow an accurate characterization. Methodologies based on high resolution aerial photography taken from Unmanned Aerial Vehicles (UAVs) have been proposed by several authors as potential approaches to overcome these limitations. Here, we explore the applicability of an existing UAV based framework for hydromorphological characterization to three different fluvial settings representing some of the distinct ecoregions defined by the WFD geographical intercalibration groups (GIGs). The framework is based on the automated recognition of hydromorphological features via tested and validated Artificial Neural Networks (ANNs). Results show that the framework is transferable to the Central-Baltic and Mediterranean GIGs with accuracies in feature identification above 70%. Accuracies of 50% are achieved when the framework is implemented in the Very Large Rivers GIG. The framework successfully identified vegetation, deep water, shallow water, riffles, side bars and shadows for the majority of the reaches. However, further algorithm development is required to ensure a wider range of features (e.g., chutes, structures and erosion) are accurately identified. This study also highlights the need to develop an objective and fit for purpose hydromorphological characterization framework to be adopted within all EU member states to facilitate comparison of results.

  15. Modeling framework for representing long-term effectiveness of best management practices in addressing hydrology and water quality problems: Framework development and demonstration using a Bayesian method

    NASA Astrophysics Data System (ADS)

    Liu, Yaoze; Engel, Bernard A.; Flanagan, Dennis C.; Gitau, Margaret W.; McMillan, Sara K.; Chaubey, Indrajeet; Singh, Shweta

    2018-05-01

    Best management practices (BMPs) are popular approaches used to improve hydrology and water quality. Uncertainties in BMP effectiveness over time may result in overestimating long-term efficiency in watershed planning strategies. To represent varying long-term BMP effectiveness in hydrologic/water quality models, a high level and forward-looking modeling framework was developed. The components in the framework consist of establishment period efficiency, starting efficiency, efficiency for each storm event, efficiency between maintenance, and efficiency over the life cycle. Combined, they represent long-term efficiency for a specific type of practice and specific environmental concern (runoff/pollutant). An approach for possible implementation of the framework was discussed. The long-term impacts of grass buffer strips (agricultural BMP) and bioretention systems (urban BMP) in reducing total phosphorus were simulated to demonstrate the framework. Data gaps were captured in estimating the long-term performance of the BMPs. A Bayesian method was used to match the simulated distribution of long-term BMP efficiencies with the observed distribution with the assumption that the observed data represented long-term BMP efficiencies. The simulated distribution matched the observed distribution well with only small total predictive uncertainties. With additional data, the same method can be used to further improve the simulation results. The modeling framework and results of this study, which can be adopted in hydrologic/water quality models to better represent long-term BMP effectiveness, can help improve decision support systems for creating long-term stormwater management strategies for watershed management projects.
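
    The Bayesian matching of simulated to observed efficiency distributions in the paper involves a full BMP representation; a toy version of the underlying update, estimating a mean long-term removal efficiency from observed values with a grid-approximated posterior, looks like the sketch below. The efficiency values and the known-sigma Gaussian likelihood are assumptions made only for illustration.

```python
import numpy as np

# Invented long-term total-phosphorus removal efficiencies observed for a practice (fractions)
observed = np.array([0.55, 0.48, 0.62, 0.40, 0.58, 0.51])

# Grid-approximation Bayesian update of the mean efficiency mu:
# uniform prior on [0, 1], Gaussian likelihood with an assumed known sigma
grid = np.linspace(0.0, 1.0, 1001)
sigma = 0.08
log_like = -0.5 * ((observed[:, None] - grid[None, :]) / sigma) ** 2
posterior = np.exp(log_like.sum(axis=0))
posterior /= posterior.sum()

post_mean = float((grid * posterior).sum())
print(post_mean)   # pulled toward the sample mean of the observations
```

    With a flat prior the posterior mean essentially reproduces the sample mean; the value of the Bayesian formulation is that additional observations, informative priors, or hierarchical structure over storm events and maintenance cycles can be folded in with the same update.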

  16. Definition and use of Solution-focused Sustainability Assessment: A novel approach to generate, explore and decide on sustainable solutions for wicked problems.

    PubMed

    Zijp, Michiel C; Posthuma, Leo; Wintersen, Arjen; Devilee, Jeroen; Swartjes, Frank A

    2016-05-01

    This paper introduces Solution-focused Sustainability Assessment (SfSA), provides practical guidance formatted as a versatile process framework, and illustrates its utility for solving a wicked environmental management problem. Society faces complex and increasingly wicked environmental problems for which sustainable solutions are sought. Wicked problems are multi-faceted, and deriving a management solution requires an approach that is participative, iterative, innovative, and transparent in its definition of sustainability and translation to sustainability metrics. We suggest adding a solution-focused approach. The SfSA framework is collated from elements of risk assessment, risk governance, adaptive management and sustainability assessment frameworks, expanded with the 'solution-focused' paradigm as recently proposed in the context of risk assessment. The main innovation of this approach is the broad exploration of solutions upfront in assessment projects. The case study concerns the sustainable management of slightly contaminated sediments continuously formed in ditches in rural, agricultural areas. This problem is wicked, as disposal of contaminated sediment on adjacent land is potentially hazardous to humans, ecosystems and agricultural products. Non-removal, however, would reduce drainage capacity and increase the risk of flooding, while contaminated sediment removal followed by offsite treatment implies high budget costs and soil subsidence. Applying the steps of the SfSA framework helped solve this problem. Important elements were early exploration of a wide 'solution-space', stakeholder involvement from the onset of the assessment, clear agreements on the risk and sustainability metrics of the problem and on the interpretation and decision procedures, and adaptive management. Application of the key elements of the SfSA approach eventually resulted in adoption of a novel sediment management policy.
The stakeholder participation and the intensive communication throughout the project resulted in broad support for both the scientific approaches and results, as well as for policy implementation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Towards a Collaborative Filtering Approach to Medication Reconciliation

    PubMed Central

    Hasan, Sharique; Duncan, George T.; Neill, Daniel B.; Padman, Rema

    2008-01-01

    A physician’s prescribing decisions depend on knowledge of the patient’s medication list. This knowledge is often incomplete, and errors or omissions could result in adverse outcomes. To address this problem, the Joint Commission recommends medication reconciliation for creating a more accurate list of a patient’s medications. In this paper, we develop techniques for automatic detection of omissions in medication lists, identifying drugs that the patient may be taking but are not on the patient’s medication list. Our key insight is that this problem is analogous to the collaborative filtering framework increasingly used by online retailers to recommend relevant products to customers. The collaborative filtering approach enables a variety of solution techniques, including nearest neighbor and co-occurrence approaches. We evaluate the effectiveness of these approaches using medication data from a long-term care center in the Eastern US. Preliminary results suggest that this framework may become a valuable tool for medication reconciliation. PMID:18998834

  18. Towards a collaborative filtering approach to medication reconciliation.

    PubMed

    Hasan, Sharique; Duncan, George T; Neill, Daniel B; Padman, Rema

    2008-11-06

    A physician's prescribing decisions depend on knowledge of the patient's medication list. This knowledge is often incomplete, and errors or omissions could result in adverse outcomes. To address this problem, the Joint Commission recommends medication reconciliation for creating a more accurate list of a patient's medications. In this paper, we develop techniques for automatic detection of omissions in medication lists, identifying drugs that the patient may be taking but are not on the patient's medication list. Our key insight is that this problem is analogous to the collaborative filtering framework increasingly used by online retailers to recommend relevant products to customers. The collaborative filtering approach enables a variety of solution techniques, including nearest neighbor and co-occurrence approaches. We evaluate the effectiveness of these approaches using medication data from a long-term care center in the Eastern US. Preliminary results suggest that this framework may become a valuable tool for medication reconciliation.
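
    The co-occurrence variant of the collaborative filtering idea can be sketched as follows. The medication lists are invented, and the scoring rule (pair counts normalized by each drug's overall frequency) is one simple choice for illustration rather than the authors' exact method.

```python
from collections import Counter
from itertools import combinations

# Invented medication lists from a patient population (not the paper's data)
lists = [
    {"metformin", "lisinopril", "atorvastatin"},
    {"metformin", "atorvastatin"},
    {"lisinopril", "atorvastatin", "aspirin"},
    {"metformin", "lisinopril", "atorvastatin", "aspirin"},
    {"metformin", "lisinopril", "atorvastatin"},
]

drug_counts = Counter()
pair_counts = Counter()
for meds in lists:
    drug_counts.update(meds)
    pair_counts.update(frozenset(p) for p in combinations(meds, 2))

def omission_scores(current_list):
    """Score drugs absent from a list by their co-occurrence with the drugs present."""
    scores = {}
    for drug in drug_counts:
        if drug in current_list:
            continue
        co = sum(pair_counts[frozenset((drug, d))] for d in current_list)
        scores[drug] = co / drug_counts[drug]   # normalize by the drug's overall frequency
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(omission_scores({"metformin", "lisinopril"}))   # atorvastatin ranks first
```

    High-scoring candidates are drugs that usually accompany the patient's current medications in the wider population, which is exactly the signal used to flag possible omissions for clinician review.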

  19. Towards a Framework for Evaluating and Comparing Diagnosis Algorithms

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia,David; Kuhn, Lukas; deKleer, Johan; vanGemund, Arjan; Feldman, Alexander

    2009-01-01

    Diagnostic inference involves the detection of anomalous system behavior and the identification of its cause, possibly down to a failed unit or to a parameter of a failed unit. Traditional approaches to solving this problem include expert/rule-based, model-based, and data-driven methods. Each approach (and various techniques within each approach) uses different representations of the knowledge required to perform the diagnosis. The sensor data is expected to be combined with these internal representations to produce the diagnosis result. In spite of the availability of various diagnosis technologies, there have been only minimal efforts to develop a standardized software framework to run, evaluate, and compare different diagnosis technologies on the same system. This paper presents a framework that defines a standardized representation of the system knowledge, the sensor data, and the form of the diagnosis results and provides a run-time architecture that can execute diagnosis algorithms, send sensor data to the algorithms at appropriate time steps from a variety of sources (including the actual physical system), and collect resulting diagnoses. We also define a set of metrics that can be used to evaluate and compare the performance of the algorithms, and provide software to calculate the metrics.
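
    Metrics of the kind such a framework computes can be illustrated with a small scorer over hypothetical scenario results. The scenario tuples and the particular metrics shown here (fault-isolation precision/recall and mean detection latency) are assumptions for illustration, not the framework's defined metric set.

```python
# Invented scenario results: (true faults, diagnosed faults, injection time, detection time)
scenarios = [
    ({"valve1"}, {"valve1"}, 10.0, 12.5),
    ({"pump2"}, {"pump2", "sensor3"}, 5.0, 5.8),   # spurious extra fault
    ({"sensor3"}, set(), 7.0, None),               # missed detection
]

def isolation_metrics(scenarios):
    """Fault-isolation precision/recall and mean detection latency over scored scenarios."""
    tp = fp = fn = 0
    latencies = []
    for truth, diagnosed, t_inject, t_detect in scenarios:
        tp += len(truth & diagnosed)      # correctly isolated faults
        fp += len(diagnosed - truth)      # spurious diagnoses
        fn += len(truth - diagnosed)      # missed faults
        if t_detect is not None:
            latencies.append(t_detect - t_inject)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    mean_latency = sum(latencies) / len(latencies)
    return precision, recall, mean_latency

print(isolation_metrics(scenarios))
```

    Standardizing the scenario format is what makes such scores comparable across expert, model-based, and data-driven diagnosis algorithms run on the same system.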

  20. A framework for self-experimentation in personalized health.

    PubMed

    Karkar, Ravi; Zia, Jasmine; Vilardaga, Roger; Mishra, Sonali R; Fogarty, James; Munson, Sean A; Kientz, Julie A

    2016-05-01

    To describe an interdisciplinary and methodological framework for applying single case study designs to self-experimentation in personalized health. The authors examine the framework's applicability to various health conditions and present an initial case study with irritable bowel syndrome (IBS). An in-depth literature review was performed to develop the framework and to identify absolute and desired health condition requirements for the application of this framework. The authors developed mobile application prototypes, storyboards, and process flows of the framework using IBS as the case study. The authors conducted three focus groups and an online survey using a human-centered design approach for assessing the framework's feasibility. All 6 focus group participants had a positive view about our framework and volunteered to participate in future studies. Most stated they would trust the results because it was their own data being analyzed. They were most concerned about confounds, nonmeaningful measures, and erroneous assumptions on the timing of trigger effects. Survey respondents (N = 60) were more likely to be adherent to an 8- vs 12-day study length even if it meant lower confidence results. Implementation of the self-experimentation framework in a mobile application appears to be feasible for people with IBS. This framework can likely be applied to other health conditions. Considerations include the learning curve for teaching self-experimentation to non-experts and the challenges involved in operationalizing and customizing study designs. Using mobile technology to guide people through self-experimentation to investigate health questions is a feasible and promising approach to advancing personalized health. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
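
    A single-case self-experiment of the kind the framework guides can be analyzed with a simple permutation test comparing outcomes on exposure versus baseline days. The symptom scores below are invented and the test is a generic sketch under that assumption, not the framework's prescribed analysis.

```python
import random

# Invented single-case data: daily symptom scores on suspected-trigger vs. baseline days
trigger = [6, 7, 5, 8, 6, 7]
baseline = [3, 4, 5, 3, 4, 2]

def randomization_test(a, b, n_iter=10000, seed=0):
    """One-sided permutation test of mean(a) > mean(b) for a single-case comparison."""
    rng = random.Random(seed)
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)   # re-assign days to the two conditions at random
        diff = sum(pooled[:len(a)]) / len(a) - sum(pooled[len(a):]) / len(b)
        if diff >= observed:
            hits += 1
    return observed, hits / n_iter

obs, p = randomization_test(trigger, baseline)
print(obs, p)   # a large observed difference with a small permutation p-value
```

    Randomizing which days carry the exposure is also what guards against the confounds and mistimed trigger effects that the focus group participants worried about.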

  1. Non-extensitivity vs. informative moments for financial models —A unifying framework and empirical results

    NASA Astrophysics Data System (ADS)

    Herrmann, K.

    2009-11-01

    Information-theoretic approaches still play a minor role in financial market analysis. Nonetheless, two very similar approaches have evolved in recent years, one in so-called econophysics and the other in econometrics. Both generalize the notion of GARCH processes in an information-theoretic sense and are able to capture kurtosis better than traditional models. In this article we present both approaches within a more general framework, which allows the derivation of a wide range of new models. We choose a third model using an entropy measure suggested by Kapur. In an application to financial market data, we find that all considered models, with similar flexibility in terms of skewness and kurtosis, lead to very similar results.

  2. Putting Public Health Ethics into Practice: A Systematic Framework

    PubMed Central

    Marckmann, Georg; Schmidt, Harald; Sofaer, Neema; Strech, Daniel

    2015-01-01

    It is widely acknowledged that public health practice raises ethical issues that require a different approach than traditional biomedical ethics. Several frameworks for public health ethics (PHE) have been proposed; however, none of them provides a practice-oriented combination of the two necessary components: (1) a set of normative criteria based on an explicit ethical justification and (2) a structured methodological approach for applying the resulting normative criteria to concrete public health (PH) issues. Building on prior work in the field and integrating valuable elements of other approaches to PHE, we present a systematic ethical framework that shall guide professionals in planning, conducting, and evaluating PH interventions. Based on a coherentist model of ethical justification, the proposed framework contains (1) an explicit normative foundation with five substantive criteria and seven procedural conditions to guarantee a fair decision process, and (2) a six-step methodological approach for applying the criteria and conditions to the practice of PH and health policy. The framework explicitly ties together ethical analysis and empirical evidence, thus striving for evidence-based PHE. It can provide normative guidance to those who analyze the ethical implications of PH practice including academic ethicists, health policy makers, health technology assessment bodies, and PH professionals. It will enable those who implement a PH intervention and those affected by it (i.e., the target population) to critically assess whether and how the required ethical considerations have been taken into account. Thereby, the framework can contribute to assuring the quality of ethical analysis in PH. Whether the presented framework will be able to achieve its goals has to be determined by evaluating its practical application. PMID:25705615

  3. Towards a Framework for Developing Semantic Relatedness Reference Standards

    PubMed Central

    Pakhomov, Serguei V.S.; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B.; Ruggieri, Alexander; Chute, Christopher G.

    2010-01-01

Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available, and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the “moderate” range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide as open-source resources both the reference standard containing individual ratings and the R program used to analyze the ratings. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics, including automatic classification, information retrieval from medical records, and vocabulary/ontology development. PMID:21044697
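As an illustration of the kind of data-driven rater screening the abstract describes, the sketch below flags raters whose scores disagree systematically with the rest of the panel. It is a minimal stand-in (mean pairwise correlation rather than the clustering or factor analysis the authors use), and all rater names and scores are invented:

```python
# Hypothetical illustration of outlier-rater screening on a matrix of
# relatedness ratings (raters x term pairs). Names and scores are invented.
from statistics import mean

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

ratings = {                      # rater -> scores for five term pairs (1-4 scale)
    "r1": [4, 3, 1, 2, 4],
    "r2": [4, 2, 1, 2, 3],
    "r3": [3, 3, 1, 1, 4],
    "r4": [1, 4, 4, 4, 1],       # systematically reversed rater
}

# Mean correlation of each rater with every other rater
agreement = {
    r: mean(pearson(ratings[r], ratings[o]) for o in ratings if o != r)
    for r in ratings
}
outliers = [r for r, a in agreement.items() if a < 0.0]
print(outliers)
```

A rater like `r4`, whose ratings correlate negatively with every colleague's, is exactly the sort of case that clustering or factor analysis would isolate for follow-up.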

  4. A Psychometric Framework for the Evaluation of Instructional Sensitivity

    ERIC Educational Resources Information Center

    Naumann, Alexander; Hochweber, Jan; Klieme, Eckhard

    2016-01-01

    Although there is a common understanding of instructional sensitivity, it lacks a common operationalization. Various approaches have been proposed, some focusing on item responses, others on test scores. As approaches often do not produce consistent results, previous research has created the impression that approaches to instructional sensitivity…

  5. Approaches to Learning and Study Orchestrations in High School Students

    ERIC Educational Resources Information Center

    Cano, Francisco

    2007-01-01

    In the framework of the SAL (Students' approaches to learning) position, the learning experience (approaches to learning and study orchestrations) of 572 high school students was explored, examining its interrelationships with some personal and familial variables. Three major results emerged. First, links were found between family's intellectual…

  6. Reaping the benefits of an open systems approach: getting the commercial approach right

    NASA Astrophysics Data System (ADS)

    Pearson, Gavin; Dawe, Tony; Stubbs, Peter; Worthington, Olwen

    2016-05-01

    Critical to reaping the benefits of an Open System Approach within Defence, or any other sector, is the ability to design the appropriate commercial model (or framework). This paper reports on the development and testing of a commercial strategy decision support tool. The tool set comprises a number of elements, including a process model, and provides business intelligence insights into likely supplier behaviour. The tool has been developed by subject matter experts and has been tested with a number of UK Defence procurement teams. The paper will present the commercial model framework, the elements of the toolset and the results of testing.

  7. Evaluation Framework for Telemedicine Using the Logical Framework Approach and a Fishbone Diagram

    PubMed Central

    2015-01-01

Objectives Technological advances in telemedicine and telehealth are growing in healthcare fields, but the evaluation frameworks for them are inconsistent and limited. This paper suggests a comprehensive evaluation framework for telemedicine system implementation that will support related stakeholders' decision-making by promoting general understanding and resolving arguments and controversies. Methods This study developed a comprehensive evaluation framework by summarizing themes across the range of evaluation techniques and by organizing foundational evaluation frameworks that are generally applicable across diverse telemedicine studies and cases. Evaluation factors related to aspects of information technology, including the satisfaction of service providers and consumers, cost, quality, and information security, are organized using a fishbone diagram. Results It was not easy to develop a monitoring and evaluation framework for telemedicine, since such frameworks are very complex, with many potential inputs, activities, outputs, outcomes, and stakeholders. A conceptual framework was developed that incorporates the key dimensions that need to be considered in the evaluation of telehealth implementation, providing a formal structured approach to the evaluation of a service. The suggested framework consists of six major dimensions and subsequent branches for each dimension. Conclusions To implement telemedicine and telehealth services, stakeholders should make decisions based on sufficient evidence of quality and safety as measured by the comprehensive evaluation framework. Further work would be valuable in applying more comprehensive evaluations to verify and improve the framework across a variety of contexts with more factors and participant group dimensions. PMID:26618028

  8. Cross-scale integration of knowledge for predicting species ranges: a metamodeling framework.

    PubMed

    Talluto, Matthew V; Boulangeat, Isabelle; Ameztegui, Aitor; Aubin, Isabelle; Berteaux, Dominique; Butler, Alyssa; Doyon, Frédérik; Drever, C Ronnie; Fortin, Marie-Josée; Franceschini, Tony; Liénard, Jean; McKenney, Dan; Solarik, Kevin A; Strigul, Nikolay; Thuiller, Wilfried; Gravel, Dominique

    2016-02-01

Current interest in forecasting changes to species ranges has resulted in a multitude of approaches to species distribution models (SDMs). However, most approaches include only a small subset of the available information, and many ignore smaller-scale processes such as growth, fecundity, and dispersal. Furthermore, different approaches often produce divergent predictions with no simple method to reconcile them. Here, we present a flexible framework for integrating models at multiple scales using hierarchical Bayesian methods, with eastern North America as an example. Our framework builds a metamodel that is constrained by the results of multiple sub-models and provides probabilistic estimates of species presence. We applied our approach to a simulated dataset to demonstrate the integration of a correlative SDM with a theoretical model. In a second example, we built an integrated model combining the results of a physiological model with presence-absence data for sugar maple (Acer saccharum), an abundant tree native to eastern North America. For both examples, the integrated models successfully included information from all data sources and substantially improved the characterization of uncertainty. For the second example, the integrated model outperformed the source models with respect to uncertainty when modelling the present range of the species. When projecting into the future, the model provided a consensus view of two models that differed substantially in their predictions. Uncertainty was reduced where the models agreed and was greater where they diverged, providing a more realistic view of the state of knowledge than either source model. We conclude by discussing the potential applications of our method and its accessibility to applied ecologists. In ideal cases, our framework can be easily implemented using off-the-shelf software. The framework has wide potential for use in species distribution modelling and can drive better integration of multi-source and multi-scale data into ecological decision-making.
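As a loose illustration of how several sub-models can constrain one probabilistic estimate of presence, the sketch below pools two invented sub-model probabilities with a product-of-experts rule (adding logits relative to a shared prior). This is a deliberately simplified stand-in for the hierarchical Bayesian metamodel the abstract describes, not the authors' actual model:

```python
# Toy product-of-experts pooling of two sub-models' presence probabilities.
# All site probabilities and the prior are invented for illustration.
import math

def logit(p):
    return math.log(p / (1 - p))

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

sdm = [0.9, 0.5, 0.2]        # correlative SDM output at three sites
phys = [0.8, 0.3, 0.3]       # physiological model output at the same sites
prior = 0.5                  # shared baseline presence probability

# Combine by summing logits and subtracting the shared prior's logit
combined = [
    sigmoid(logit(a) + logit(b) - logit(prior))
    for a, b in zip(sdm, phys)
]
print([round(p, 2) for p in combined])
```

Note how agreement sharpens the estimate: where both sub-models lean toward presence, the pooled probability exceeds either input, mirroring the reduced uncertainty the abstract reports where models agree.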

  9. A Generalized Mixture Framework for Multi-label Classification

    PubMed Central

    Hong, Charmgil; Batal, Iyad; Hauskrecht, Milos

    2015-01-01

    We develop a novel probabilistic ensemble framework for multi-label classification that is based on the mixtures-of-experts architecture. In this framework, we combine multi-label classification models in the classifier chains family that decompose the class posterior distribution P(Y1, …, Yd|X) using a product of posterior distributions over components of the output space. Our approach captures different input–output and output–output relations that tend to change across data. As a result, we can recover a rich set of dependency relations among inputs and outputs that a single multi-label classification model cannot capture due to its modeling simplifications. We develop and present algorithms for learning the mixtures-of-experts models from data and for performing multi-label predictions on unseen data instances. Experiments on multiple benchmark datasets demonstrate that our approach achieves highly competitive results and outperforms the existing state-of-the-art multi-label classification methods. PMID:26613069
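The decomposition described above can be made concrete with a tiny numeric example: each expert is a two-label classifier chain whose joint posterior factorizes as P(y1, y2 | x) = P(y1 | x) P(y2 | x, y1), and the ensemble mixes the experts' joint posteriors with gate weights. All probabilities below are invented for illustration:

```python
# Minimal sketch of a mixtures-of-experts combination of classifier chains.
# Numbers are invented; real gates and conditionals would be learned from data.

def chain_posterior(p_y1, p_y2_given):
    """Joint posterior over (y1, y2) from one chain's conditionals."""
    joint = {}
    for y1 in (0, 1):
        p1 = p_y1 if y1 == 1 else 1 - p_y1
        for y2 in (0, 1):
            p2 = p_y2_given[y1] if y2 == 1 else 1 - p_y2_given[y1]
            joint[(y1, y2)] = p1 * p2
    return joint

# Two experts with different parameters for the same input x
expert_a = chain_posterior(0.9, {0: 0.2, 1: 0.8})   # {y1: P(y2=1 | x, y1)}
expert_b = chain_posterior(0.6, {0: 0.1, 1: 0.5})
gates = [0.7, 0.3]                                   # gate weights for x

mixture = {
    yy: gates[0] * expert_a[yy] + gates[1] * expert_b[yy]
    for yy in expert_a
}
best = max(mixture, key=mixture.get)   # MAP joint label assignment
print(best, round(mixture[best], 3))
```

Because each expert models a full joint distribution, the mixture can represent output-output dependencies that a single set of independent per-label classifiers cannot.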

  10. PathoScope 2.0: a complete computational framework for strain identification in environmental or clinical sequencing samples

    PubMed Central

    2014-01-01

Background Recent innovations in sequencing technologies have provided researchers with the ability to rapidly characterize the microbial content of an environmental or clinical sample with unprecedented resolution. These approaches are producing a wealth of information that is providing novel insights into the microbial ecology of the environment and human health. However, these sequencing-based approaches produce large and complex datasets that require efficient and sensitive computational analysis workflows. Many recent tools for analyzing metagenomic sequencing data have emerged; however, these approaches often suffer from issues of specificity and efficiency, and they typically do not include a complete metagenomic analysis framework. Results We present PathoScope 2.0, a complete bioinformatics framework for rapidly and accurately quantifying the proportions of reads from individual microbial strains present in metagenomic sequencing data from environmental or clinical samples. The pipeline performs all necessary computational analysis steps, including reference genome library extraction and indexing, read quality control and alignment, strain identification, and summarization and annotation of results. We rigorously evaluated PathoScope 2.0 using simulated data and data from the 2011 outbreak of Shiga-toxigenic Escherichia coli O104:H4. Conclusions The results show that PathoScope 2.0 is a complete, highly sensitive, and efficient approach for metagenomic analysis that outperforms alternative approaches in scope, speed, and accuracy. The PathoScope 2.0 pipeline software is freely available for download at: http://sourceforge.net/projects/pathoscope/. PMID:25225611
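The proportion-estimation step at the heart of such pipelines can be illustrated with a tiny expectation-maximization loop that reapportions ambiguous reads among candidate genomes. This is a generic sketch of the idea, not PathoScope's actual model; genome names and alignment likelihoods are invented:

```python
# Toy EM for strain proportions: ambiguous reads are softly assigned to
# genomes (E-step), and proportions are re-estimated from the soft counts
# (M-step). All alignment likelihoods below are invented.
reads = [  # per-read alignment likelihoods to genomes g1, g2
    {"g1": 0.9, "g2": 0.1},
    {"g1": 0.8, "g2": 0.3},
    {"g1": 0.5, "g2": 0.5},
    {"g1": 0.1, "g2": 0.9},
]
pi = {"g1": 0.5, "g2": 0.5}          # initial strain proportions

for _ in range(50):                   # EM iterations
    counts = {g: 0.0 for g in pi}
    for lik in reads:                 # E-step: posterior read assignment
        z = sum(pi[g] * lik[g] for g in pi)
        for g in pi:
            counts[g] += pi[g] * lik[g] / z
    total = sum(counts.values())      # M-step: re-estimate proportions
    pi = {g: c / total for g, c in counts.items()}

print({g: round(p, 2) for g, p in pi.items()})
```

Reads that align well to only one genome anchor the estimate, while genuinely ambiguous reads (like the 0.5/0.5 read) are split according to the current proportions.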

  11. Grounding Robot Autonomy in Emotion and Self-awareness

    NASA Astrophysics Data System (ADS)

    Sanz, Ricardo; Hernández, Carlos; Hernando, Adolfo; Gómez, Jaime; Bermejo, Julita

Much is being done in an attempt to transfer emotional mechanisms from reverse-engineered biology into social robots. There are two basic approaches: the imitative display of emotion (e.g., to make robots appear more human-like) and the provision of architectures with intrinsic emotion (in the hope of enhancing behavioral capabilities). This paper focuses on the second approach, describing a core vision regarding the integration of cognitive, emotional and autonomic aspects in social robot systems. This vision has evolved as a result of the efforts in consolidating the models extracted from rat emotion research and their implementation in technical use cases, based on a general systemic analysis in the framework of the ICEA and C3 projects. The generality of the approach is intended to yield universal theories of integrated (autonomic, emotional, cognitive) behavior. The proposed conceptualizations and architectural principles are then captured in a theoretical framework: ASys, the Autonomous Systems Framework.

  12. [Towards a social determinants-oriented approach to public health: workshop report].

    PubMed

    Rojo, Elena González; Álvarez-Dardet, Carlos; Fernández, Luis Andrés López

    2017-12-01

This article is the result of a workshop with public health experts held in Granada (Spain) in October 2015 to reflect upon the components of a framework for a public health approach based on the social determinants of health. Advocacy, and the training of professionals in health advocacy where needed, were identified as key elements. During the workshop, it was agreed that the gender perspective, the salutogenic approach, interdisciplinary work and particular attention to disadvantaged groups are crucial. The importance of working from a human rights framework and promoting legislative changes were also mentioned. Moreover, the group noted that even though much progress has been made in identifying social determinants of health and creating conceptual frameworks, there is limited knowledge about how to intervene to reduce health inequality gaps in our societies. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  13. Testing the suitability of geologic frameworks for extrapolating hydraulic properties across regional scales

    USGS Publications Warehouse

    Mirus, Benjamin B.; Halford, Keith J.; Sweetkind, Donald; Fenelon, Joseph M.

    2016-01-01

The suitability of geologic frameworks for extrapolating hydraulic conductivity (K) to length scales commensurate with hydraulic data is difficult to assess. A novel method is presented for evaluating assumed relations between K and geologic interpretations for regional-scale groundwater modeling. The approach relies on simultaneous interpretation of multiple aquifer tests using alternative geologic frameworks of variable complexity, where each framework is incorporated as prior information that assumes homogeneous K within each model unit. This approach is tested at Pahute Mesa within the Nevada National Security Site (USA), where observed drawdowns from eight aquifer tests in complex, highly faulted volcanic rocks provide the necessary hydraulic constraints. The investigated volume encompasses 40 mi³ (167 km³) where drawdowns traversed major fault structures and were detected more than 2 mi (3.2 km) from pumping wells. Complexity of the five frameworks assessed ranges from an undifferentiated mass of rock with a single unit to 14 distinct geologic units. Results show that only four geologic units can be justified as hydraulically unique for this location. The approach qualitatively evaluates the consistency of hydraulic property estimates within extents of investigation and effects of geologic frameworks on extrapolation. Distributions of transmissivity are similar within the investigated extents irrespective of the geologic framework. In contrast, the extrapolation of hydraulic properties beyond the volume investigated with interfering aquifer tests is strongly affected by the complexity of a given framework. Testing at Pahute Mesa illustrates how this method can be employed to determine the appropriate level of geologic complexity for large-scale groundwater modeling.

  14. [Development of a Conceptual Framework for the Assessment of Chronic Care in the Spanish National Health System].

    PubMed

    Espallargues, Mireia; Serra-Sutton, Vicky; Solans-Domènech, Maite; Torrente, Elena; Moharra, Montse; Benítez, Dolors; Robles, Noemí; Domíngo, Laia; Escarrabill Sanglas, Joan

    2016-07-07

The aim was to develop a conceptual framework for the assessment of new healthcare initiatives on chronic diseases within the Spanish National Health System. A comprehensive literature review between 2002 and 2013, including systematic reviews, meta-analyses, and reports with evaluation frameworks and/or assessments of initiatives, was carried out; integrated care initiatives established in Catalonia were studied and described; and semistructured interviews with key stakeholders were performed. The scope and conceptual framework were defined using the brainstorming approach. Of 910 abstracts identified, a total of 116 studies were included. They referred to several conceptual frameworks and/or assessment indicators at a national and international level. A total of 24 established chronic care initiatives were identified (9 integrated care initiatives); 10 in-depth interviews were carried out. The proposed conceptual framework envisages: 1) the target population according to complexity levels; 2) an evaluation approach of the structure, processes, and outcomes considering the health status achieved, the recovery process and the maintenance of health; and 3) the dimensions or attributes to be assessed. The proposed conceptual framework has been useful for developing indicators and implementing them with a community-based, result-oriented approach and a territorial or population-based perspective within the Spanish Health System. This will be essential for determining which strategies are most effective, which key elements determine greater success, and which groups of patients can benefit most.

  15. A novel performance monitoring framework for health research systems: experiences of the National Institute for Health Research in England

    PubMed Central

    2011-01-01

    Background The National Institute for Health Research (NIHR) was established in 2006 with the aim of creating an applied health research system embedded within the English National Health Service (NHS). NIHR sought to implement an approach for monitoring its performance that effectively linked early indicators of performance with longer-term research impacts. We attempted to develop and apply a conceptual framework for defining appropriate key performance indicators for NIHR. Method Following a review of relevant literature, a conceptual framework for defining performance indicators for NIHR was developed, based on a hybridisation of the logic model and balanced scorecard approaches. This framework was validated through interviews with key NIHR stakeholders and a pilot in one division of NIHR, before being refined and applied more widely. Indicators were then selected and aggregated to create a basket of indicators aligned to NIHR's strategic goals, which could be reported to NIHR's leadership team on a quarterly basis via an oversight dashboard. Results Senior health research system managers and practitioners endorsed the conceptual framework developed and reported satisfaction with the breadth and balance of indicators selected for reporting. Conclusions The use of the hybrid conceptual framework provides a pragmatic approach to defining performance indicators that are aligned to the strategic aims of a health research system. The particular strength of this framework is its capacity to provide an empirical link, over time, between upstream activities of a health research system and its long-term strategic objectives. PMID:21435265

  16. A New Framework for Effective and Efficient Global Sensitivity Analysis of Earth and Environmental Systems Models

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin

    2015-04-01

Earth and Environmental Systems (EES) models are essential components of research, development, and decision-making in science and engineering disciplines. With continuous advances in understanding and computing power, such models are becoming more complex, with ever more factors to be specified (model parameters, forcings, boundary conditions, etc.). To facilitate better understanding of the role and importance of different factors in producing the model responses, the procedure known as 'Sensitivity Analysis' (SA) can be very helpful. Despite the availability of a large body of literature on the development and application of various SA approaches, two issues continue to pose major challenges: (1) Ambiguous Definition of Sensitivity - Different SA methods are based in different philosophies and theoretical definitions of sensitivity, and can result in different, even conflicting, assessments of the underlying sensitivities for a given problem; (2) Computational Cost - The cost of carrying out SA can be large, even excessive, for high-dimensional problems and/or computationally intensive models. In this presentation, we propose a new approach to sensitivity analysis that addresses the dual aspects of 'effectiveness' and 'efficiency'. By effective, we mean achieving an assessment that is both meaningful and clearly reflective of the objective of the analysis (the first challenge above), while by efficient we mean achieving statistically robust results with minimal computational cost (the second challenge above). Based on this approach, we develop a 'global' sensitivity analysis framework that efficiently generates a newly-defined set of sensitivity indices that characterize a range of important properties of metric 'response surfaces' encountered when performing SA on EES models.
Further, we show how this framework embraces, and is consistent with, a spectrum of different concepts regarding 'sensitivity', and that commonly-used SA approaches (e.g., Sobol, Morris, etc.) are actually limiting cases of our approach under specific conditions. Multiple case studies are used to demonstrate the value of the new framework. The results show that the new framework provides a fundamental understanding of the underlying sensitivities for any given problem, while requiring orders of magnitude fewer model runs.
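As a reminder of one of the "limiting case" methods the authors mention, the sketch below computes Morris-style elementary effects (the mu* screening measure) for an invented three-parameter toy model. It illustrates classical screening SA, not the proposed framework itself:

```python
# Morris elementary effects on a toy model: EE_i = |f(x + delta*e_i) - f(x)| / delta,
# averaged over random base points to give the mu* screening measure.
# The model and parameter ranges are invented for illustration.
import random

def model(x1, x2, x3):                 # toy "EES model"
    return x1 ** 2 + 2 * x2 + 0.1 * x3

def elementary_effects(f, n_traj=200, delta=0.1, seed=0):
    rng = random.Random(seed)
    effects = [[], [], []]
    for _ in range(n_traj):
        x = [rng.random() for _ in range(3)]   # base point in the unit cube
        base = f(*x)
        for i in range(3):
            xp = list(x)
            xp[i] += delta                     # perturb one factor at a time
            effects[i].append(abs(f(*xp) - base) / delta)
    return [sum(e) / len(e) for e in effects]  # mu* per parameter

mu_star = elementary_effects(model)
print([round(m, 2) for m in mu_star])
```

For this linear-plus-quadratic toy model, mu* correctly ranks x2 as most influential and x3 as least, which is the kind of screening result a global framework must reproduce as a special case.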

  17. Clinical Decision Support-based Quality Measurement (CDS-QM) Framework: Prototype Implementation, Evaluation, and Future Directions

    PubMed Central

    Kukhareva, Polina V; Kawamoto, Kensaku; Shields, David E; Barfuss, Darryl T; Halley, Anne M; Tippetts, Tyler J; Warner, Phillip B; Bray, Bruce E; Staes, Catherine J

    2014-01-01

    Electronic quality measurement (QM) and clinical decision support (CDS) are closely related but are typically implemented independently, resulting in significant duplication of effort. While it seems intuitive that technical approaches could be re-used across these two related use cases, such reuse is seldom reported in the literature, especially for standards-based approaches. Therefore, we evaluated the feasibility of using a standards-based CDS framework aligned with anticipated EHR certification criteria to implement electronic QM. The CDS-QM framework was used to automate a complex national quality measure (SCIP-VTE-2) at an academic healthcare system which had previously relied on time-consuming manual chart abstractions. Compared with 305 manually-reviewed reference cases, the recall of automated measurement was 100%. The precision was 96.3% (CI:92.6%-98.5%) for ascertaining the denominator and 96.2% (CI:92.3%-98.4%) for the numerator. We therefore validated that a standards-based CDS-QM framework can successfully enable automated QM, and we identified benefits and challenges with this approach. PMID:25954389
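The validation arithmetic reported above (100% recall, roughly 96% precision) reduces to simple confusion-matrix ratios. The sketch below uses invented counts chosen only to produce figures of the same order as those reported, not the study's actual data:

```python
# Precision/recall of an automated measure against manually-reviewed reference
# cases. The counts are invented; recall of 100% corresponds to fn == 0.
def precision_recall(tp, fp, fn):
    precision = tp / (tp + fp)   # fraction of flagged cases that are correct
    recall = tp / (tp + fn)      # fraction of true cases that were flagged
    return precision, recall

p, r = precision_recall(tp=155, fp=6, fn=0)
print(round(p, 3), round(r, 3))
```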

  18. Highlighting differences between conditional and unconditional quantile regression approaches through an application to assess medication adherence.

    PubMed

    Borah, Bijan J; Basu, Anirban

    2013-09-01

The quantile regression (QR) framework provides a pragmatic approach to understanding the differential impacts of covariates along the distribution of an outcome. However, the QR framework that has pervaded the applied economics literature is based on the conditional quantile regression method, which assesses the impact of a covariate on a quantile of the outcome conditional on specific values of the other covariates. Conditional quantile regression may therefore generate results that are not generalizable or interpretable in a policy or population context. In contrast, the unconditional quantile regression method provides more interpretable results, as it marginalizes the effect over the distributions of the other covariates in the model. In this paper, the differences between these two regression frameworks are highlighted, both conceptually and econometrically. Additionally, using real-world claims data from a large US health insurer, alternative QR frameworks are implemented to assess the differential impacts of covariates along the distribution of medication adherence among elderly patients with Alzheimer's disease. Copyright © 2013 John Wiley & Sons, Ltd.
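Both QR variants rest on the same "check" (pinball) loss, which the tau-th quantile minimizes. The toy sketch below, with invented data, finds the loss-minimizing constant by brute force to show that it recovers the sample median (tau = 0.5) and an upper quantile (tau = 0.9):

```python
# The check (pinball) loss underlying quantile regression: the tau-th sample
# quantile minimizes its average. Data values are invented for illustration.
def check_loss(u, tau):
    return tau * u if u >= 0 else (tau - 1) * u

def best_constant(ys, tau):
    # Brute force over observed values: the minimizer is an order statistic
    return min(ys, key=lambda c: sum(check_loss(y - c, tau) for y in ys))

ys = [2, 4, 4, 5, 7, 9, 20]
print(best_constant(ys, 0.5), best_constant(ys, 0.9))
```

Replacing the constant with a linear function of covariates gives conditional QR; unconditional QR instead works with the marginal quantile of the outcome, which is why its estimates carry a population-level interpretation.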

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marquez, Andres; Manzano Franco, Joseph B.; Song, Shuaiwen

With Exascale performance and its challenges in mind, one ubiquitous concern among architects is energy efficiency. Petascale systems projected to Exascale systems are unsustainable at current power consumption rates. One major contributor to system-wide power consumption is the number of memory operations leading to data movement and management techniques applied by the runtime system. To address this problem, we present the concept of the Architected Composite Data Types (ACDT) framework. The framework is made aware of data composites, assigning them a specific layout, transformations, and operators. Data manipulation overhead is amortized over a larger number of elements, and program performance and power efficiency can be significantly improved. We developed the fundamentals of an ACDT framework on a massively multithreaded adaptive runtime system geared towards Exascale clusters. Showcasing the capability of ACDT, we exercised the framework with two representative processing kernels, Matrix Vector Multiply and Cholesky Decomposition, applied to sparse matrices. As transformation modules, we applied optimized compress/decompress engines and configured invariant operators for maximum energy/performance efficiency. Additionally, we explored two different approaches based on transformation opaqueness in relation to the application. Under the first approach, the application is agnostic to compression and decompression activity. This approach entails minimal changes to the original application code, but leaves out potential application-specific optimizations. The second approach exposes the decompression process to the application, thereby exposing optimization opportunities that can only be exploited with application knowledge. The experimental results show that the two approaches have their strengths in HW and SW respectively, where the SW approach can yield performance and power improvements that are an order of magnitude better than ACDT-oblivious, hand-optimized implementations. We consider the ACDT runtime framework an important component of compute nodes that will lead towards power-efficient Exascale clusters.
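The amortization idea can be loosely illustrated with stdlib compression standing in for the framework's optimized compress/decompress engines: a sparse composite is stored in a transformed layout, and the transformation cost is paid once per block rather than per element. Everything here is an invented stand-in, not ACDT code:

```python
# Loose illustration of composite-level transformation: a mostly-zero
# (sparse) vector is stored compressed, cutting the bytes that must be
# moved, then restored losslessly on access. zlib is a stand-in for the
# framework's optimized engines; sizes and data are invented.
import array
import zlib

dense = array.array("d", [0.0] * 10_000)
for i in range(0, 10_000, 500):       # sparse: one nonzero per 500 entries
    dense[i] = 1.0

raw = dense.tobytes()
packed = zlib.compress(raw, level=6)  # transform applied to the whole composite
ratio = len(raw) / len(packed)        # data-movement savings

restored = array.array("d")
restored.frombytes(zlib.decompress(packed))
print(ratio > 10, restored == dense)
```

The per-call overhead of `compress`/`decompress` is incurred once for 10,000 elements, which is the amortization the abstract describes.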

  20. Market-Based Coordination of Thermostatically Controlled Loads—Part II: Unknown Parameters and Case Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Sen; Zhang, Wei; Lian, Jianming

This two-part paper considers the coordination of a population of Thermostatically Controlled Loads (TCLs) with unknown parameters to achieve group objectives. The problem involves designing the bidding and market clearing strategy to motivate self-interested users to realize efficient energy allocation subject to a peak power constraint. The companion paper (Part I) formulates the problem and proposes a load coordination framework using the mechanism design approach. To address the unknown parameters, Part II of this paper presents a joint state and parameter estimation framework based on the expectation maximization algorithm. The overall framework is then validated using real-world weather data and price data, and is compared with other approaches in terms of aggregated power response. Simulation results indicate that our coordination framework can effectively improve the efficiency of power grid operations and reduce power congestion at key times.

  1. An appraisal of theoretical approaches to examining behaviours in relation to Human Papillomavirus (HPV) vaccination of young women

    PubMed Central

    Batista Ferrer, Harriet; Audrey, Suzanne; Trotter, Caroline; Hickman, Matthew

    2015-01-01

Background Interventions to increase uptake of Human Papillomavirus (HPV) vaccination by young women may be more effective if they are underpinned by an appropriate theoretical model or framework. The aims of this review were to describe the theoretical models or frameworks used to explain behaviours in relation to HPV vaccination of young women, and to consider the appropriateness of the theoretical models or frameworks used for informing the development of interventions to increase uptake. Methods Primary studies were identified through a comprehensive search of databases from inception to December 2013. Results Thirty-four relevant studies were identified, of which 31 incorporated psychological health behaviour models or frameworks and three used socio-cultural models or theories. The primary studies used a variety of approaches to measure a diverse range of outcomes in relation to the behaviours of professionals, parents, and young women. The majority appeared to use theory appropriately throughout. About half of the quantitative studies presented data in relation to goodness-of-fit tests and the proportion of the variability in the data. Conclusion Due to diverse approaches and inconsistent findings across studies, the current contribution of theory to understanding and promoting HPV vaccination uptake is difficult to assess. Ecological frameworks encourage the integration of individual and social approaches by encouraging exploration of the intrapersonal, interpersonal, organisational, community and policy levels when examining public health issues. Given the small number of studies using such an approach, combined with the importance of these factors in predicting behaviour, more research in this area is warranted. PMID:26314783

  2. Evaluating the substantive effectiveness of SEA: Towards a better understanding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doren, D. van; Driessen, P.P.J., E-mail: p.driessen@uu.nl; Schijf, B.

Evaluating the substantive effectiveness of strategic environmental assessment (SEA) is vital in order to know to what extent the tool fulfills its purposes and produces expected results. However, the studies that have evaluated the substantive effectiveness of SEA produce varying outcomes as regards the tool's contribution to decision-making and have used a variety of approaches to appraise its effectiveness. The aim of this article is to discuss the theoretical concept of SEA substantive effectiveness and to present a new approach that can be applied for evaluation studies. The SEA effectiveness evaluation framework that will be presented is composed of concepts of, and approaches to, SEA effectiveness derived from SEA literature and planning theory. Lessons for evaluation can be learned from planning theory in particular, given its long history of analyzing and understanding how sources of information and decisions affect (subsequent) decision-making. Key concepts of this new approach are 'conformance' and 'performance'. In addition, this article presents a systematic overview of process and context factors that can explain SEA effectiveness, derived from SEA literature. To illustrate the practical value of our framework for the assessment and understanding of the substantive effectiveness of SEA, three Dutch SEA case studies are examined. The case studies have confirmed the usefulness of the SEA effectiveness assessment framework. The framework proved helpful in describing the cumulative influence of the three SEAs on decision-making and the ultimate plan. Highlights: A new framework to evaluate the substantive effectiveness of SEA is presented. The framework is based on two key concepts: 'conformance' and 'performance'. The practical applicability of the framework is demonstrated by three Dutch cases. The framework allows for a more systematic understanding of SEA effectiveness. Finally, this paper presents explanations for SEA effectiveness.

  3. A framework for conducting mechanistic based reliability assessments of components operating in complex systems

    NASA Astrophysics Data System (ADS)

    Wallace, Jon Michael

    2003-10-01

    Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The following two steps are the most unique. They involve a step to efficiently characterize and quantify the system-driven local parameter space and a subsequent step using this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Where existing joint probability models require preliminary distribution and correlation information of the responses, these models combine statistical information of the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. 
The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank ordered with respect to their contribution to not just one response, but the entire vector of component responses simultaneously. The final step of the framework is the actual probabilistic assessment of the component. Although the same multivariate probability tools employed in the characterization step can be used for the component probability assessment, variations of this final step are given to allow for the utilization of existing probabilistic methods such as response surface Monte Carlo and Fast Probability Integration. The overall framework developed in this study is implemented to assess the finite-element based reliability prediction of a gas turbine airfoil involving several failure responses. Results of this implementation are compared to results generated using the conventional 'isolated' approach as well as a validation approach conducted through large sample Monte Carlo simulations. The framework resulted in a considerable improvement to the accuracy of the part reliability assessment and an improved understanding of the component failure behavior. Considerable statistical complexity in the form of joint non-normal behavior was found and accounted for using the framework. Future applications of the framework elements are discussed.
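    The validation step described above relies on large-sample Monte Carlo simulation. A minimal sketch of that step, using a hypothetical limit-state function and an illustrative correlated non-normal input distribution (neither stands for the study's actual airfoil responses):

```python
import numpy as np

rng = np.random.default_rng(0)

def limit_state(x):
    # Hypothetical failure criterion: the component fails when g < 0.
    # Stands in for a finite-element response; not the study's model.
    return 1.5 - (x[:, 0] ** 2 + 0.5 * x[:, 1])

# Joint non-normal inputs: correlated normals pushed through exp() to
# mimic the skewed, correlated behavior the abstract reports.
cov = np.array([[1.0, 0.4], [0.4, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)
x = np.exp(0.2 * z)

g = limit_state(x)
p_fail = float(np.mean(g < 0.0))
print(f"estimated failure probability: {p_fail:.4f}")
```

    With enough samples the estimate stabilizes, and the same sampler can serve as a reference when checking faster approximations such as the response surface Monte Carlo or Fast Probability Integration methods mentioned in the abstract.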

  4. Awareness, adoption, and application of the Association of College & Research Libraries (ACRL) Framework for Information Literacy in health sciences libraries.

    PubMed

    Schulte, Stephanie J; Knapp, Maureen

    2017-10-01

    In early 2016, the Association of College & Research Libraries (ACRL) officially adopted a conceptual Framework for Information Literacy (Framework) that was a significant shift away from the previous standards-based approach. This study sought to determine (1) if health sciences librarians are aware of the recent Framework for Information Literacy; (2) if they have used the Framework to change their instruction or communication with faculty, and if so, what changes have taken place; and (3) if certain librarian characteristics are associated with the likelihood of adopting the Framework. This study utilized a descriptive electronic survey. Half of all respondents were aware of and were using or had plans to use the Framework. Academic health sciences librarians and general academic librarians were more likely than hospital librarians to be aware of the Framework. Those using the Framework were mostly revising and creating content, revising their teaching approach, and learning more about the Framework. Framework users commented that it was influencing how they thought about and discussed information literacy with faculty and students. Most hospital librarians and half the academic health sciences librarians were not using and had no plans to use the Framework. Librarians with more than twenty years of experience were less likely to be aware of the Framework and more likely to have no plans to use it. Common reasons for not using the Framework were lack of awareness of a new version and lack of involvement in formal instruction. The results suggest that there is room to improve awareness and application of the Framework among health sciences librarians.

  5. Primary Care Performance Measurement and Reporting at a Regional Level: Could a Matrix Approach Provide Actionable Information for Policy Makers and Clinicians?

    PubMed Central

    Langton, Julia M.; Wong, Sabrina T.; Johnston, Sharon; Abelson, Julia; Ammi, Mehdi; Burge, Fred; Campbell, John; Haggerty, Jeannie; Hogg, William; Wodchis, Walter P.

    2016-01-01

    Objective: Primary care services form the foundation of modern healthcare systems, yet the breadth and complexity of services and diversity of patient populations may present challenges for creating comprehensive primary care information systems. Our objective is to develop regional-level information on the performance of primary care in Canada. Methods: A scoping review was conducted to identify existing initiatives in primary care performance measurement and reporting across 11 countries. The results of this review were used by our international team of primary care researchers and clinicians to propose an approach for regional-level primary care reporting. Results: We found a gap between conceptual primary care performance measurement frameworks in the peer-reviewed literature and real-world primary care performance measurement and reporting activities. We did not find a conceptual framework or analytic approach that could readily form the foundation of a regional-level primary care information system. Therefore, we propose an approach to reporting comprehensive and actionable performance information according to widely accepted core domains of primary care as well as different patient population groups. Conclusions: An approach that bridges the gap between conceptual frameworks and real-world performance measurement and reporting initiatives could address some of the potential pitfalls of existing ways of presenting performance information (i.e., by single diseases or by age). This approach could produce meaningful and actionable information on the quality of primary care services. PMID:28032823

  6. Spatio-temporal Granger causality: a new framework

    PubMed Central

    Luo, Qiang; Lu, Wenlian; Cheng, Wei; Valdes-Sosa, Pedro A.; Wen, Xiaotong; Ding, Mingzhou; Feng, Jianfeng

    2015-01-01

    That physiological oscillations of various frequencies are present in fMRI signals is the rule, not the exception. Herein, we propose a novel theoretical framework, spatio-temporal Granger causality, which allows us to more reliably and precisely estimate the Granger causality from experimental datasets possessing time-varying properties caused by physiological oscillations. Within this framework, Granger causality is redefined as a global index measuring the directed information flow between two time series with time-varying properties. Both theoretical analyses and numerical examples demonstrate that Granger causality is a monotonically increasing function of the temporal resolution used in the estimation. This is consistent with the general principle of coarse graining, which causes information loss by smoothing out very fine-scale details in time and space. Our results confirm that the Granger causality at the finer spatio-temporal scales considerably outperforms the traditional approach in terms of an improved consistency between two resting-state scans of the same subject. To optimally estimate the Granger causality, the proposed theoretical framework is implemented through a combination of several approaches, such as dividing the optimal time window and estimating the parameters at the fine temporal and spatial scales. Taken together, our approach provides a novel and robust framework for estimating the Granger causality from fMRI, EEG, and other related data. PMID:23643924
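    The quantity being redefined here builds on classical time-domain Granger causality: the log-ratio of prediction-error variances with and without the other series' past. A compact sketch of that baseline (the lag order and the simulated coupling are illustrative, not the paper's spatio-temporal estimator):

```python
import numpy as np

def granger_causality(x, y, p=2):
    """Does the past of x improve prediction of y?  Returns
    log(var(restricted residual) / var(unrestricted residual))."""
    n = len(y)
    target = y[p:]
    # Restricted model: y regressed on its own p lags.
    Yr = np.column_stack([y[p - k : n - k] for k in range(1, p + 1)])
    # Unrestricted model: add p lags of x.
    Yu = np.column_stack([Yr] + [x[p - k : n - k] for k in range(1, p + 1)])
    res_r = target - Yr @ np.linalg.lstsq(Yr, target, rcond=None)[0]
    res_u = target - Yu @ np.linalg.lstsq(Yu, target, rcond=None)[0]
    return float(np.log(np.var(res_r) / np.var(res_u)))

# Simulated pair where x drives y but not vice versa.
rng = np.random.default_rng(1)
n = 2000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_causality(x, y), granger_causality(y, x))  # large vs. near zero
```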

  7. A framework for scalable parameter estimation of gene circuit models using structural information.

    PubMed

    Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin

    2013-07-01

    Systematic and scalable parameter estimation is a key to construct complex gene regulatory models and to ultimately facilitate an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied for modeling of gene circuits, our results suggest that the use of more tailored approaches to use domain-specific information may be a key to reverse engineering of complex biological systems. The software is available at http://sfb.kaust.edu.sa/Pages/Software.aspx, and supplementary data are available at Bioinformatics online.
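    The decomposition idea (fit each rate equation against the reconstructed mean time course of its own gene product) can be illustrated on a hypothetical one-gene model; the rates, noise level, and grid search below are ours for illustration, not the paper's method or data:

```python
import numpy as np

# Hypothetical one-gene circuit: dx/dt = k_syn - k_deg * x, x(0) = 0,
# whose mean time course has the closed form below.
def trajectory(t, k_syn, k_deg):
    return (k_syn / k_deg) * (1.0 - np.exp(-k_deg * t))

# Noisy "observed" time course generated with k_syn = 2.0, k_deg = 0.5.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 50)
data = trajectory(t, 2.0, 0.5) + 0.05 * rng.standard_normal(t.size)

# Per-equation least-squares fit by brute force over a parameter grid.
grid = np.linspace(0.1, 3.0, 60)
best = min(
    ((ks, kd) for ks in grid for kd in grid),
    key=lambda p: float(np.sum((trajectory(t, *p) - data) ** 2)),
)
print(best)
```

    Because each equation is fitted on its own, the cost grows with the number of genes rather than with the size of the coupled system, which is the scalability point the abstract makes.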

  8. Classifying indicators of quality: a collaboration between Dutch and English regulators.

    PubMed

    Mears, Alex; Vesseur, Jan; Hamblin, Richard; Long, Paul; Den Ouden, Lya

    2011-12-01

    Many approaches to measuring quality in healthcare exist, generally employing indicators or metrics. While there are important differences, most of these approaches share three key areas of measurement: safety, effectiveness and patient experience. The European Partnership for Supervisory Organisations in Health Services and Social Care (EPSO) exists as a working group and discussion forum for European regulators. This group undertook to identify a common framework within which European approaches to indicators could be compared. A framework was developed to classify indicators, using four sets of criteria: conceptualization of quality, Donabedian definition (structure, process, outcome), data type (derivable, collectable from routine sources, special collections, samples) and data use (judgement (singular or part of a framework), benchmarking, risk assessment). Indicators from English and Dutch hospital measurement programmes were put into the framework, showing areas of agreement and levels of comparability. In the first instance, results are only illustrative. The EPSO has been a powerful driver for undertaking cross-European research, and this project is the first of many to take advantage of the access to international expertise. It has shown that through development of a framework that deconstructs national indicators, commonalities can be identified. Future work will attempt to incorporate other nations' indicators, and attempt cross-national comparison.

  9. Fault Injection Campaign for a Fault Tolerant Duplex Framework

    NASA Technical Reports Server (NTRS)

    Sacco, Gian Franco; Ferraro, Robert D.; von Allmen, Paul; Rennels, Dave A.

    2007-01-01

    Fault tolerance is an efficient approach adopted to avoid or reduce the damage of a system failure. In this work we present the results of a fault injection campaign we conducted on the Duplex Framework (DF). The DF is software developed by the UCLA group [1, 2] that uses a fault-tolerant approach, running two replicas of the same process on two different nodes of a commercial off-the-shelf (COTS) computer cluster. A third process, running on a different node, constantly monitors the results computed by the two replicas and restarts them if an inconsistency in their computation is detected. This approach is very cost efficient and can be adopted to control processes on spacecraft, where the fault rate produced by cosmic rays is not very high.
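    The monitor-and-restart logic is simple enough to sketch in a few lines. The single-process simulation below is illustrative only (the real DF runs replicas on separate cluster nodes); the fault model, rate, and function names are ours:

```python
import random

def replica(task, arg, fault_rate):
    """Run the task; with probability fault_rate, inject a transient fault."""
    result = task(arg)
    if random.random() < fault_rate:
        result += 1  # simulated single-event upset
    return result

def monitor(task, arg, fault_rate, max_restarts=10):
    """Third-process logic: rerun both replicas until their outputs agree.
    Note: identical faults in both replicas would escape detection, a
    known limitation of any duplex (as opposed to voting triplex) scheme."""
    for attempt in range(max_restarts):
        a = replica(task, arg, fault_rate)
        b = replica(task, arg, fault_rate)
        if a == b:
            return a, attempt
    raise RuntimeError("replicas never agreed")

random.seed(42)
value, restarts = monitor(lambda s: s * s, 7, fault_rate=0.1)
print(value, restarts)
```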

  10. Word-level language modeling for P300 spellers based on discriminative graphical models

    NASA Astrophysics Data System (ADS)

    Delgado Saa, Jaime F.; de Pesters, Adriana; McFarland, Dennis; Çetin, Müjdat

    2015-04-01

    Objective. In this work we propose a probabilistic graphical model framework that uses language priors at the level of words as a mechanism to increase the performance of P300-based spellers. Approach. This paper is concerned with brain-computer interfaces based on P300 spellers. Motivated by P300 spelling scenarios involving communication based on a limited vocabulary, we propose a probabilistic graphical model framework and an associated classification algorithm that uses learned statistical models of language at the level of words. Exploiting such high-level contextual information helps reduce the error rate of the speller. Main results. Our experimental results demonstrate that the proposed approach offers several advantages over existing methods. Most importantly, it increases the classification accuracy while reducing the number of times the letters need to be flashed, increasing the communication rate of the system. Significance. The proposed approach models all the variables in the P300 speller in a unified framework and has the capability to correct errors in previous letters in a word, given the data for the current one. The structure of the model we propose allows the use of efficient inference algorithms, which in turn makes it possible to use this approach in real-time applications.
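    The core idea (let a word-level prior rescore per-letter evidence, so later letters can correct earlier ones) can be sketched with a toy vocabulary; the words, scores, and scoring rule below are illustrative, not the paper's graphical model:

```python
import numpy as np

# Hypothetical limited vocabulary, as in the communication scenario above.
vocab = ["yes", "no", "help", "stop"]

def classify_word(letter_scores):
    """letter_scores: one dict per flashed letter, mapping letter ->
    log-likelihood from the P300 classifier.  Scoring whole words lets
    strong evidence for later letters correct a weak earlier letter."""
    def word_score(word):
        if len(word) != len(letter_scores):
            return -np.inf
        # Unseen letters get a heavy penalty (hypothetical floor value).
        return sum(s.get(c, -10.0) for s, c in zip(letter_scores, word))
    return max(vocab, key=word_score)

# The first letter weakly favours 'n', but only "stop" is consistent with
# all four positions, so the word-level model recovers it.
scores = [
    {"n": -1.0, "s": -1.2},
    {"t": -0.5, "o": -2.0},
    {"o": -0.4},
    {"p": -0.3},
]
print(classify_word(scores))
```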

  11. A Rigorous Framework for Optimization of Expensive Functions by Surrogates

    NASA Technical Reports Server (NTRS)

    Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.

    1998-01-01

    The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which design application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is to obtain convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
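    The manage-a-surrogate loop can be sketched in a few lines: fit a cheap model to all expensive evaluations so far, optimize the cheap model, then spend one expensive evaluation at its minimizer. The objective function and the quadratic surrogate below are illustrative; the paper's framework couples surrogates with pattern search and convergence safeguards this sketch omits:

```python
import numpy as np

def expensive(x):
    # Stand-in for an expensive simulation (e.g., one rotor-blade analysis).
    return (x - 1.7) ** 2 + 0.3 * np.sin(5 * x)

# Initial design: a handful of expensive evaluations.
X = list(np.linspace(-2.0, 4.0, 5))
Y = [expensive(x) for x in X]

for _ in range(15):
    # Fit a cheap quadratic surrogate to everything evaluated so far...
    coeffs = np.polyfit(X, Y, 2)
    # ...minimize the surrogate (cheap), then evaluate the true function there.
    cand = np.linspace(-2.0, 4.0, 400)
    x_new = cand[np.argmin(np.polyval(coeffs, cand))]
    X.append(float(x_new))
    Y.append(expensive(x_new))

best_y, best_x = min(zip(Y, X))
print(best_x, best_y)
```

    Note that the expensive function is never differentiated; only its values at sampled points are used, which is the derivative-free property the abstract emphasizes.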

  12. Proscene: A feature-rich framework for interactive environments

    NASA Astrophysics Data System (ADS)

    Charalambos, Jean Pierre

    We introduce Proscene, a feature-rich, open-source framework for interactive environments. The design of Proscene comprises a three-layered onion-like software architecture, promoting different possible development scenarios. The framework innermost layer decouples user gesture parsing from user-defined actions. The in-between layer implements a feature-rich set of widely-used motion actions allowing the selection and manipulation of objects, including the scene viewpoint. The outermost layer exposes those features as a Processing library. The results have shown the feasibility of our approach together with the simplicity and flexibility of the Proscene framework API.

  13. A comparison of item response models for accuracy and speed of item responses with applications to adaptive testing.

    PubMed

    van Rijn, Peter W; Ali, Usama S

    2017-05-01

    We compare three modelling frameworks for accuracy and speed of item responses in the context of adaptive testing. The first framework is based on modelling scores that result from a scoring rule that incorporates both accuracy and speed. The second framework is the hierarchical modelling approach developed by van der Linden (2007, Psychometrika, 72, 287) in which a regular item response model is specified for accuracy and a log-normal model for speed. The third framework is the diffusion framework in which the response is assumed to be the result of a Wiener process. Although the three frameworks differ in the relation between accuracy and speed, one commonality is that the marginal model for accuracy can be simplified to the two-parameter logistic model. We discuss both conditional and marginal estimation of model parameters. Models from all three frameworks were fitted to data from a mathematics and spelling test. Furthermore, we applied a linear and adaptive testing mode to the data off-line in order to determine differences between modelling frameworks. It was found that a model from the scoring rule framework outperformed a hierarchical model in terms of model-based reliability, but the results were mixed with respect to correlations with external measures. © 2017 The British Psychological Society.
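    The commonality noted above is that accuracy marginally follows the two-parameter logistic (2PL) model. A minimal sketch of the 2PL and a grid-based maximum-likelihood ability estimate (the item parameters and response pattern are made up for illustration):

```python
import numpy as np

def p_correct(theta, a, b):
    """Two-parameter logistic (2PL) model: probability of a correct
    response given ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def estimate_theta(responses, a, b):
    """Grid-based maximum-likelihood ability estimate from one
    accuracy-only response pattern (speed ignored, as in the marginal model)."""
    grid = np.linspace(-4.0, 4.0, 801)
    loglik = sum(
        r * np.log(p_correct(grid, ai, bi))
        + (1 - r) * np.log(1.0 - p_correct(grid, ai, bi))
        for r, ai, bi in zip(responses, a, b)
    )
    return float(grid[np.argmax(loglik)])

# Illustrative item parameters: four items of increasing difficulty.
a = [1.0, 1.2, 0.8, 1.5]
b = [-1.0, 0.0, 0.5, 1.0]
print(estimate_theta([1, 1, 1, 0], a, b))
```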

  14. Selective gas capture via kinetic trapping

    DOE PAGES

    Kundu, Joyjit; Pascal, Tod; Prendergast, David; ...

    2016-07-13

    Conventional approaches to the capture of CO2 by metal-organic frameworks focus on equilibrium conditions, and frameworks that contain little CO2 in equilibrium are often rejected as carbon-capture materials. Here we use a statistical mechanical model, parameterized by quantum mechanical data, to suggest that metal-organic frameworks can be used to separate CO2 from a typical flue gas mixture when used under nonequilibrium conditions. The origin of this selectivity is an emergent gas-separation mechanism that results from the acquisition by different gas types of different mobilities within a crowded framework. The resulting distribution of gas types within the framework is in general spatially and dynamically heterogeneous. Our results suggest that relaxing the requirement of equilibrium can substantially increase the parameter space of conditions and materials for which selective gas capture can be effected.

  15. Using a Critical Reflection Framework and Collaborative Inquiry to Improve Teaching Practice: An Action Research Project

    ERIC Educational Resources Information Center

    Briscoe, Patricia

    2017-01-01

    This action research reports on a three-year collaborative learning process among three teachers. We used current literature and a critical reflection framework to understand why our teaching approaches were not resulting in increased student learning. This allowed us to examine our previously unrecognized and uninterrupted--and often,…

  16. A regional classification of unregulated stream flows: spatial resolution and hierarchical frameworks.

    Treesearch

    Ryan A. McManamay; Donald J. Orth; Charles A. Dolloff; Emmanuel A. Frimpong

    2012-01-01

    River regulation has resulted in substantial losses in habitat connectivity, biodiversity and ecosystem services. River managers are faced with a growing need to protect the key aspects of the natural flow regime. A practical approach to providing environmental flow standards is to create a regional framework by classifying unregulated streams into groups of similar...

  17. Verbal Working Memory and Language Production: Common Approaches to the Serial Ordering of Verbal Information

    ERIC Educational Resources Information Center

    Acheson, Daniel J.; MacDonald, Maryellen C.

    2009-01-01

    Verbal working memory (WM) tasks typically involve the language production architecture for recall; however, language production processes have had a minimal role in theorizing about WM. A framework for understanding verbal WM results is presented here. In this framework, domain-specific mechanisms for serial ordering in verbal WM are provided by…

  18. A quasi-likelihood approach to non-negative matrix factorization

    PubMed Central

    Devarajan, Karthik; Cheung, Vincent C.K.

    2017-01-01

    A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
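    Under the quasi-likelihood view, the classical Lee-Seung multiplicative updates for generalized Kullback-Leibler divergence correspond to the Poisson member of the family. A minimal sketch of those standard updates (the rank, matrix sizes, and data are illustrative):

```python
import numpy as np

def nmf_kl(V, r, n_iter=300, seed=0):
    """Lee-Seung multiplicative updates minimizing the generalized
    KL divergence between V and W @ H (the Poisson quasi-likelihood case)."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + 1e-3
    H = rng.random((r, m)) + 1e-3
    for _ in range(n_iter):
        H *= (W.T @ (V / (W @ H))) / W.sum(axis=0)[:, None]
        W *= ((V / (W @ H)) @ H.T) / H.sum(axis=1)[None, :]
    return W, H

# Exactly rank-3 non-negative data, so a good fit is achievable.
rng = np.random.default_rng(1)
V = rng.random((20, 3)) @ rng.random((3, 12))
W, H = nmf_kl(V, 3)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(f"relative reconstruction error: {rel_err:.4f}")
```

    The updates stay non-negative by construction because each factor is only ever multiplied by non-negative ratios, which is the property the multiplicative scheme is chosen for.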

  19. Towards a framework for developing semantic relatedness reference standards.

    PubMed

    Pakhomov, Serguei V S; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B; Ruggieri, Alexander; Chute, Christopher G

    2011-04-01

    Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the "moderate" range; we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. Copyright © 2010 Elsevier Inc. All rights reserved.

  20. A stochastically fully connected conditional random field framework for super resolution OCT

    NASA Astrophysics Data System (ADS)

    Boroomand, A.; Tan, B.; Wong, A.; Bizheva, K.

    2017-02-01

    A number of factors can degrade the resolution and contrast of OCT images, such as: (1) changes of the OCT point-spread function (PSF) resulting from wavelength-dependent scattering and absorption of light along the imaging depth; (2) speckle noise; and (3) motion artifacts. We propose a new Super Resolution OCT (SR OCT) imaging framework that takes advantage of a Stochastically Fully Connected Conditional Random Field (SF-CRF) model to generate a super-resolved OCT image of higher quality from a set of Low-Resolution OCT (LR OCT) images. The proposed SF-CRF SR OCT imaging is able to simultaneously compensate for all of the factors mentioned above that degrade the OCT image quality, using a unified computational framework. The proposed SF-CRF SR OCT imaging framework was tested on a set of simulated LR human retinal OCT images generated from a high-resolution, high-contrast retinal image, and on a set of in-vivo, high-resolution, high-contrast rat retinal OCT images. The reconstructed SR OCT images show considerably higher spatial resolution, less speckle noise and higher contrast compared to other tested methods. Visual assessment of the results demonstrated the usefulness of the proposed approach in better preserving fine details and structures of the imaged sample, retaining biological tissue boundaries while reducing speckle noise using a unified computational framework. Quantitative evaluation using both the Contrast-to-Noise Ratio (CNR) and the Edge Preservation (EP) parameter also showed superior performance of the proposed SF-CRF SR OCT approach compared to other image processing approaches.

  1. Advances in Landslide Nowcasting: Evaluation of a Global and Regional Modeling Approach

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia Bach; Peters-Lidard, Christa; Adler, Robert; Hong, Yang; Kumar, Sujay; Lerner-Lam, Arthur

    2011-01-01

    The increasing availability of remotely sensed data offers a new opportunity to address landslide hazard assessment at larger spatial scales. A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that may experience landslide activity. This system combines a calculation of static landslide susceptibility with satellite-derived rainfall estimates and uses a threshold approach to generate a set of nowcasts that classify potentially hazardous areas. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale near real-time landslide hazard assessment efforts, it requires several modifications before it can be fully realized as an operational tool. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and hazard at the regional scale. This case study calculates a regional susceptibility map using remotely sensed and in situ information and a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America. The susceptibility map is evaluated with a regional rainfall intensity-duration triggering threshold and results are compared with the global algorithm framework for the same event. Evaluation of this regional system suggests that this empirically based approach provides one plausible way to approach some of the data and resolution issues identified in the global assessment. The presented methodology is straightforward to implement, improves upon the global approach, and allows for results to be transferable between regions. The results also highlight several remaining challenges, including the empirical nature of the algorithm framework and adequate information for algorithm validation. Conclusions suggest that integrating additional triggering factors such as soil moisture may help to improve algorithm performance accuracy. 
The regional algorithm scenario represents an important step forward in advancing regional and global-scale landslide hazard assessment.
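    The regional system's core decision rule (susceptible terrain plus rainfall exceeding an intensity-duration threshold) can be sketched as follows; the threshold coefficients and toy grids are illustrative, not the study's calibrated values:

```python
import numpy as np

def nowcast(susceptibility, rain_intensity, duration_h, alpha=12.0, beta=0.6):
    """Flag cells that are both susceptible and above a Caine-style
    rainfall intensity-duration threshold I = alpha * D**(-beta).
    alpha and beta here are illustrative, not the study's calibrated values."""
    threshold = alpha * duration_h ** (-beta)
    return (susceptibility > 0.5) & (rain_intensity > threshold)

# Toy 2x2 region: susceptibility map and rainfall intensities (mm/h).
susceptibility = np.array([[0.2, 0.8], [0.9, 0.7]])
rain = np.array([[30.0, 30.0], [2.0, 8.0]])
print(nowcast(susceptibility, rain, duration_h=6.0))
```

    Longer storm durations lower the intensity threshold, so sustained moderate rain can trigger a nowcast that a short burst of the same intensity would not.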

  2. A Multivariate Distance-Based Analytic Framework for Connectome-Wide Association Studies

    PubMed Central

    Shehzad, Zarrar; Kelly, Clare; Reiss, Philip T.; Craddock, R. Cameron; Emerson, John W.; McMahon, Katie; Copland, David A.; Castellanos, F. Xavier; Milham, Michael P.

    2014-01-01

    The identification of phenotypic associations in high-dimensional brain connectivity data represents the next frontier in the neuroimaging connectomics era. Exploration of brain-phenotype relationships remains limited by statistical approaches that are computationally intensive, depend on a priori hypotheses, or require stringent correction for multiple comparisons. Here, we propose a computationally efficient, data-driven technique for connectome-wide association studies (CWAS) that provides a comprehensive voxel-wise survey of brain-behavior relationships across the connectome; the approach identifies voxels whose whole-brain connectivity patterns vary significantly with a phenotypic variable. Using resting state fMRI data, we demonstrate the utility of our analytic framework by identifying significant connectivity-phenotype relationships for full-scale IQ and assessing their overlap with existing neuroimaging findings, as synthesized by openly available automated meta-analysis (www.neurosynth.org). The results appeared to be robust to the removal of nuisance covariates (i.e., mean connectivity, global signal, and motion) and varying brain resolution (i.e., voxelwise results are highly similar to results using 800 parcellations). We show that CWAS findings can be used to guide subsequent seed-based correlation analyses. Finally, we demonstrate the applicability of the approach by examining CWAS for three additional datasets, each encompassing a distinct phenotypic variable: neurotypical development, Attention-Deficit/Hyperactivity Disorder diagnostic status, and L-dopa pharmacological manipulation. For each phenotype, our approach to CWAS identified distinct connectome-wide association profiles, not previously attainable in a single study utilizing traditional univariate approaches. As a computationally efficient, extensible, and scalable method, our CWAS framework can accelerate the discovery of brain-behavior relationships in the connectome. PMID:24583255

  3. A Component-Based FPGA Design Framework for Neuronal Ion Channel Dynamics Simulations

    PubMed Central

    Mak, Terrence S. T.; Rachmuth, Guy; Lam, Kai-Pui; Poon, Chi-Sang

    2008-01-01

    Neuron-machine interfaces such as dynamic clamp and brain-implantable neuroprosthetic devices require real-time simulations of neuronal ion channel dynamics. Field Programmable Gate Array (FPGA) has emerged as a high-speed digital platform ideal for such application-specific computations. We propose an efficient and flexible component-based FPGA design framework for neuronal ion channel dynamics simulations, which overcomes certain limitations of the recently proposed memory-based approach. A parallel processing strategy is used to minimize computational delay, and a hardware-efficient factoring approach for calculating exponential and division functions in neuronal ion channel models is used to conserve resource consumption. Performances of the various FPGA design approaches are compared theoretically and experimentally in corresponding implementations of the AMPA and NMDA synaptic ion channel models. Our results suggest that the component-based design framework provides a more memory economic solution as well as more efficient logic utilization for large word lengths, whereas the memory-based approach may be suitable for time-critical applications where a higher throughput rate is desired. PMID:17190033

  4. A discrete mechanics framework for real time virtual surgical simulations with application to virtual laparoscopic nephrectomy.

    PubMed

    Zhou, Xiangmin; Zhang, Nan; Sha, Desong; Shen, Yunhe; Tamma, Kumar K; Sweet, Robert

    2009-01-01

    The inability to render realistic soft-tissue behavior in real time has remained a barrier to face and content aspects of validity for many virtual reality surgical training systems. Biophysically based models are not only suitable for training purposes but also for patient-specific clinical applications, physiological modeling and surgical planning. When considering the existing approaches for modeling soft tissue for virtual reality surgical simulation, the computer graphics-based approach lacks predictive capability; the mass-spring model (MSM) based approach lacks biophysically realistic soft-tissue dynamic behavior; and the finite element method (FEM) approaches fail to meet the real-time requirement. The present development stems from the first law of thermodynamics: for a space-discrete dynamic system, it directly formulates the space-discrete but time-continuous governing equation with an embedded material constitutive relation, resulting in a discrete mechanics framework that strikes a unique balance between computational effort and physically realistic soft-tissue dynamic behavior. We describe the development of the discrete mechanics framework with focused attention towards a virtual laparoscopic nephrectomy application.

  5. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    PubMed

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework can feasibly learn the relationships between biochemical reactants qualitatively and make the models replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
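
    The quantitative stage, fitting kinetic rates by simulated annealing, can be sketched as follows. The one-parameter A -> B toy system, the perturbation size, and the cooling schedule are all illustrative assumptions, not the authors' setup.

```python
import math
import random

def simulate(k, a0=1.0, dt=0.1, steps=50):
    """Euler trajectory of [A] for the toy reaction A -> B with rate k."""
    a, traj = a0, []
    for _ in range(steps):
        a += dt * (-k * a)
        traj.append(a)
    return traj

def cost(k, target):
    """Squared error between candidate and target trajectories."""
    return sum((x - y) ** 2 for x, y in zip(simulate(k), target))

def anneal_rate(target, k0=1.0, t0=1.0, cooling=0.95, iters=500, seed=1):
    """Simulated annealing over a single kinetic rate constant."""
    rng = random.Random(seed)
    k, c, t = k0, cost(k0, target), t0
    for _ in range(iters):
        cand = abs(k + rng.gauss(0.0, 0.1))   # perturb; keep the rate positive
        cc = cost(cand, target)
        # always accept improvements; accept worse moves with Boltzmann prob.
        if cc < c or rng.random() < math.exp(-(cc - c) / max(t, 1e-12)):
            k, c = cand, cc
        t *= cooling
    return k

target = simulate(0.3)        # synthetic "observed" behaviour with k = 0.3
k_fit = anneal_rate(target)
```

In the paper's framework the candidate model structure comes from the qualitative evolution-strategy stage; the annealing step above only tunes rates within a fixed structure.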

  6. Vulnerability assessment of water resources - Translating a theoretical concept to an operational framework using systems thinking approach in a changing climate: Case study in Ogallala Aquifer

    NASA Astrophysics Data System (ADS)

    Anandhi, Aavudai; Kannan, Narayanan

    2018-02-01

    Water is an essential natural resource. Among many stressors, altered climate is exerting pressure on water resource systems, increasing water demand and creating a need for vulnerability assessments. The overall objective of this study was to develop a novel tool that can translate a theoretical concept (vulnerability of water resources (VWR)) to an operational framework mainly under altered temperature and precipitation, as well as, to a smaller extent, for population change. The developed tool had three stages and utilized a novel systems thinking approach. Stage-1: Translating the theoretical concept to characteristics identified from studies; Stage-2: Operationalizing characteristics to methodology in VWR; Stage-3: Utilizing the methodology for development of a conceptual modeling tool for VWR: WR-VISTA (Water Resource Vulnerability assessment conceptual model using Indicators selected by System's Thinking Approach). The specific novelties were: 1) the important characteristics in VWR were identified in Stage-1 (target system, system components, scale, level of detail, data source, frameworks, and indicator); 2) WR-VISTA combined two vulnerability assessment frameworks, the European Driver-Pressure-State-Impact-Response (DPSIR) framework and the Intergovernmental Panel on Climate Change (IPCC) framework; and 3) systems thinking approaches were used in VWR for indicator selection. The developed application was demonstrated in Kansas (overlying the High Plains region/Ogallala Aquifer, considered the "breadbasket of the world"), using 26 indicators with an intermediate level of detail. Our results indicate that the western part of the state is vulnerable due to agricultural water use and the eastern part due to urban water use. The developed tool can be easily replicated in other regions within and outside the US.
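
    Indicator-based vulnerability scores of this kind are typically built by normalising each indicator across regions and aggregating into a composite index. A minimal sketch follows; the indicator names, values, and equal weighting are hypothetical and not the WR-VISTA implementation.

```python
def minmax(values):
    """Min-max normalise a list of indicator values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def vulnerability_index(indicators, weights=None):
    """indicators: dict {name: [value per region]}.
    Returns one composite score per region; higher means more vulnerable."""
    names = list(indicators)
    weights = weights or {n: 1.0 for n in names}
    normed = {n: minmax(indicators[n]) for n in names}
    n_regions = len(next(iter(indicators.values())))
    total_w = sum(weights[n] for n in names)
    return [
        sum(weights[n] * normed[n][r] for n in names) / total_w
        for r in range(n_regions)
    ]

# Hypothetical example: two regions, two pressure indicators
scores = vulnerability_index({
    "irrigation_withdrawal": [90.0, 20.0],
    "population_growth": [1.2, 0.4],
})
```
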

  7. A path integral approach to closed-form pricing formulas in the Heston framework.

    NASA Astrophysics Data System (ADS)

    Lemmens, Damiaan; Wouters, Michiel; Tempere, Jacques; Foulon, Sven

    2008-03-01

    We present a path integral approach for finding closed-form formulas for option prices in the framework of the Heston model. The first model for determining option prices was the Black-Scholes model, which assumed that the logreturn followed a Wiener process with a given drift and constant volatility. To provide a realistic description of the market, the Black-Scholes results must be extended to include stochastic volatility. This is achieved by the Heston model, which assumes that the volatility follows a mean reverting square root process. Current applications of the Heston model are hampered by the unavailability of fast numerical methods, due to a lack of closed-form formulas. Therefore the search for closed-form solutions is an essential step before the qualitatively better stochastic volatility models will be used in practice. To attain this goal we outline a simplified path integral approach yielding straightforward results for vanilla Heston options with correlation. Extensions to barrier options and other path-dependent options are discussed, and the new derivation is compared to existing results obtained from alternative path-integral approaches (Dragulescu, Kleinert).
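
    As a numerical baseline against which any closed-form Heston price can be checked, one can Monte Carlo simulate the model's two coupled stochastic differential equations. A hedged sketch with a full-truncation Euler scheme for the square-root variance process follows; the parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def heston_call_mc(s0=100.0, k=100.0, r=0.02, t=1.0, v0=0.04,
                   kappa=2.0, theta=0.04, xi=0.3, rho=-0.7,
                   n_paths=20000, n_steps=200, seed=0):
    """Monte Carlo price of a vanilla call under the Heston model,
    using the full-truncation Euler scheme for the variance process."""
    rng = np.random.default_rng(seed)
    dt = t / n_steps
    s = np.full(n_paths, s0)
    v = np.full(n_paths, v0)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        # correlate the volatility shocks with the price shocks
        z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)                   # full truncation
        s *= np.exp((r - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
        v += kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z2
    return np.exp(-r * t) * np.mean(np.maximum(s - k, 0.0))

price = heston_call_mc()
```

The closed-form results the paper derives make exactly this kind of simulation unnecessary for vanilla options, which is the practical point of the work.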

  8. A Comparative Study of Strategic HRD Approaches for Workforce Planning in the Tourism Industry

    ERIC Educational Resources Information Center

    Bartlett, Kenneth; Johnson, Karen; Schneider, Ingrid E.

    2006-01-01

    This study compares the outcomes of two often used approaches for strategic HRD planning. Using methods framed within a strategic HRD planning framework the outcomes of a qualitative primary data approach are examined against quantitative labor market projections in a study of the future Minnesota tourism workforce. Results show each planning…

  9. Work-Centered Approach to Insurgency Campaign Analysis

    DTIC Science & Technology

    2007-06-01

    a constructivist or sensemaking philosophy by defining data, information, situation awareness, and situation understanding in the following manner... present paper explores a new approach to understanding transnational insurgency movements – an approach based on a fundamental analysis of the knowledge... country or region. By focusing on the fundamental level of knowledge creation, the resulting framework allows an understanding of insurgency

  10. Sampling-based ensemble segmentation against inter-operator variability

    NASA Astrophysics Data System (ADS)

    Huo, Jing; Okada, Kazunori; Pope, Whitney; Brown, Matthew

    2011-03-01

    Inconsistency and a lack of reproducibility are commonly associated with semi-automated segmentation methods. In this study, we developed an ensemble approach to improve reproducibility and applied it to glioblastoma multiforme (GBM) brain tumor segmentation on T1-weighted contrast enhanced MR volumes. The proposed approach combines sampling-based simulations and ensemble segmentation into a single framework; it generates a set of segmentations by perturbing user initialization and user-specified internal parameters, then fuses the set of segmentations into a single consensus result. Three combination algorithms were applied: majority voting, averaging and expectation-maximization (EM). The reproducibility of the proposed framework was evaluated by a controlled experiment on 16 tumor cases from a multicenter drug trial. The ensemble framework had significantly better reproducibility than the individual base Otsu thresholding method (p<.001).
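
    Of the three combination algorithms, majority voting is the simplest to illustrate: a voxel is kept in the consensus mask when more than half of the perturbed runs label it foreground. A minimal sketch on toy binary masks (not the study's GBM data):

```python
import numpy as np

def majority_vote(masks):
    """Fuse binary segmentation masks of shape (k, H, W) into a consensus:
    a voxel is foreground when more than half the runs label it so."""
    masks = np.asarray(masks)
    votes = masks.sum(axis=0)
    return (votes > masks.shape[0] / 2).astype(np.uint8)

# Three perturbed runs of a segmentation; the consensus keeps stable voxels.
runs = [
    np.array([[1, 1, 0], [0, 1, 0]]),
    np.array([[1, 0, 0], [0, 1, 1]]),
    np.array([[1, 1, 0], [0, 0, 0]]),
]
consensus = majority_vote(runs)
```

Averaging and EM-based fusion replace the hard vote with soft weights, but the consensus idea is the same.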

  11. SysSon - A Framework for Systematic Sonification Design

    NASA Astrophysics Data System (ADS)

    Vogt, Katharina; Goudarzi, Visda; Holger Rutz, Hanns

    2015-04-01

    SysSon is a research approach for systematically introducing sonification to a scientific community where it is not yet commonly used - e.g., in climate science. Both technical and socio-cultural barriers have to be overcome. The approach was further developed with climate scientists, who participated in contextual inquiries, usability tests and a collaborative design workshop. These extensive user tests informed our final software framework. As a frontend, a graphical user interface allows climate scientists to parametrize standard sonifications with their own data sets. Additionally, an interactive shell allows users competent in sound design to code new sonifications. The framework is a standalone desktop application, available as open source (for details see http://sysson.kug.ac.at/), and works with data in NetCDF format.

  12. Designing effective human-automation-plant interfaces: a control-theoretic perspective.

    PubMed

    Jamieson, Greg A; Vicente, Kim J

    2005-01-01

    In this article, we propose the application of a control-theoretic framework to human-automation interaction. The framework consists of a set of conceptual distinctions that should be respected in automation research and design. We demonstrate how existing automation interface designs in some nuclear plants fail to recognize these distinctions. We further show the value of the approach by applying it to modes of automation. The design guidelines that have been proposed in the automation literature are evaluated from the perspective of the framework. This comparison shows that the framework reveals insights that are frequently overlooked in this literature. A new set of design guidelines is introduced that builds upon the contributions of previous research and draws complementary insights from the control-theoretic framework. The result is a coherent and systematic approach to the design of human-automation-plant interfaces that will yield more concrete design criteria and a broader set of design tools. Applications of this research include improving the effectiveness of human-automation interaction design and the relevance of human-automation interaction research.

  13. A framework for automatic information quality ranking of diabetes websites.

    PubMed

    Belen Sağlam, Rahime; Taskaya Temizel, Tugba

    2015-01-01

    Objective: When searching for particular medical information on the internet, the challenge lies in distinguishing the websites that are relevant to the topic and contain accurate information. In this article, we propose a framework that automatically identifies and ranks diabetes websites according to their relevance and information quality based on the website content. Design: The proposed framework ranks diabetes websites according to their content quality, relevance and evidence-based medicine. The framework combines information retrieval techniques with a lexical resource based on SentiWordNet, making it possible to work with biased and untrusted websites while ensuring content relevance. Measurement: The evaluation measurements used were Pearson correlation, true positives, false positives and accuracy. We tested the framework with a benchmark data set consisting of 55 websites with varying degrees of information quality problems. Results: The proposed framework gives good results that are comparable with the non-automated information quality measuring approaches in the literature. The correlation between the results of the proposed automated framework and ground truth is 0.68 on average (p < 0.001), which is greater than that of the other automated methods proposed in the literature (average r score of 0.33).
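
    The ranking step, combining a per-site quality score with a relevance score, can be sketched as a weighted combination. The site names, scores, and equal weights below are hypothetical; the paper's actual scoring is built from information retrieval features and SentiWordNet, which this sketch does not reproduce.

```python
def rank_sites(sites, w_quality=0.5, w_relevance=0.5):
    """sites: dict {url: (quality, relevance)} with scores in [0, 1].
    Returns URLs ordered by a weighted combination of the two scores."""
    combined = {
        url: w_quality * q + w_relevance * rel
        for url, (q, rel) in sites.items()
    }
    return sorted(combined, key=combined.get, reverse=True)

ranking = rank_sites({
    "site-a.example": (0.9, 0.8),   # accurate and on-topic
    "site-b.example": (0.3, 0.9),   # on-topic but low quality
    "site-c.example": (0.8, 0.2),   # trustworthy but off-topic
})
```
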

  14. Examining the Value of a Scaffolded Critique Framework to Promote Argumentative and Explanatory Writings Within an Argument-Based Inquiry Approach

    NASA Astrophysics Data System (ADS)

    Jang, Jeong-yoon; Hand, Brian

    2017-12-01

    This study investigated the value of using a scaffolded critique framework to promote two different types of writing—argumentative writing and explanatory writing—with different purposes within an argument-based inquiry approach known as the Science Writing Heuristic (SWH) approach. A quasi-experimental design with sixth and seventh grade students taught by two teachers was used. A total of 170 students participated in the study, with 87 in the control group (four classes) and 83 in the treatment group (four classes). All students used the SWH templates as an argumentative writing task to guide their written work and completed these templates during the SWH investigations of each unit. After completing the SWH investigations, both groups of students were asked to complete the summary writing task as an explanatory writing task at the end of each unit. All students' writing samples were scored using analytical frameworks developed for the study. The results indicated that the treatment group performed significantly better on the explanatory writing task than the control group. In addition, the results of the partial correlation suggested that there is a very strong, significantly positive relationship between the argumentative writing and the explanatory writing.

  15. Causal nexus between energy consumption and carbon dioxide emission for Malaysia using maximum entropy bootstrap approach.

    PubMed

    Gul, Sehrish; Zou, Xiang; Hassan, Che Hashim; Azam, Muhammad; Zaman, Khalid

    2015-12-01

    This study investigates the relationship between energy consumption and carbon dioxide emission in a causal framework, as the direction of causality has significant policy implications for developed and developing countries. The study employed the maximum entropy bootstrap (Meboot) approach to examine the causal nexus between energy consumption and carbon dioxide emission using a bivariate as well as a multivariate framework for Malaysia, over the period 1975-2013. This is a unified approach that does not require the use of conventional techniques based on asymptotic theory, such as testing for possible unit roots and cointegration. In addition, it can be applied in the presence of non-stationarity of any type, including structural breaks, without any data transformation to achieve stationarity. Thus, it provides more reliable and robust inferences which are insensitive to the time span as well as the lag length used. The empirical results show that there is a unidirectional causality running from energy consumption to carbon emission both in the bivariate model and the multivariate framework, while controlling for broad money supply and population density. The results indicate that Malaysia is an energy-dependent country and hence energy consumption is a stimulus to carbon emissions.
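
    The Meboot idea, resampling in a way that preserves the rank structure (and hence the time-dependence pattern) of the original series, can be sketched in a greatly simplified form. This omits several refinements of Vinod's meboot algorithm, and the series values are hypothetical.

```python
import random

def meboot_replicate(series, rng):
    """One maximum-entropy bootstrap replicate (greatly simplified):
    resample from intervals around the order statistics, then restore
    the original time ordering via the original ranks."""
    n = len(series)
    order = sorted(range(n), key=lambda i: series[i])
    x = [series[i] for i in order]                 # order statistics
    # interval boundaries: midpoints between successive order statistics,
    # reflected at the two ends
    mids = [(x[i] + x[i + 1]) / 2 for i in range(n - 1)]
    lo = [x[0] - (mids[0] - x[0])] + mids
    hi = mids + [x[-1] + (x[-1] - mids[-1])]
    draws = sorted(rng.uniform(lo[i], hi[i]) for i in range(n))
    # put the resampled values back in the original time positions
    rep = [0.0] * n
    for rank, t in enumerate(order):
        rep[t] = draws[rank]
    return rep

rng = random.Random(7)
energy = [10.0, 11.5, 11.0, 12.5, 13.0, 12.0]   # toy annual series
replicate = meboot_replicate(energy, rng)
```

Because each replicate keeps the original ranks, dependence between series is preserved across resamples, which is what allows causality statistics to be bootstrapped without differencing or detrending.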

  16. Modelling Lean and Green Supply Chain

    NASA Astrophysics Data System (ADS)

    Duarte, Susana Carla Vieira Lino Medina

    The success of an organization depends on the effective control of its supply chain. It is important to recognize new opportunities for the organization and its supply chain. In the last few years the lean, agile, resilient and green supply chain paradigms have been addressed in the scientific literature. Research in this field shows that the integration of these concepts revealed some contradictions among so many paradigms. This thesis is mainly focused on the lean and green approaches. Thirteen different management frameworks, embodied in awards, standards and tools, were studied to understand if they could contribute to the modelling process of a lean and green approach. The study reveals a number of categories that are common to most management frameworks, providing adequate conditions for a lean and green supply chain transformation. A conceptual framework for the evaluation of a lean and green organization's supply chain was proposed. The framework considers six key criteria, namely leadership, people, strategic planning, stakeholders, processes and results. An assessment method was proposed, with a score for each criterion. The purpose is to understand how lean and green supply chains can be compatible, using principles, practices, techniques or tools (i.e. elements) that support both a lean and a green approach in all key criteria. A case study in the automotive upstream supply chain was performed to understand more deeply whether the elements proposed for the conceptual framework could be implemented in a real scenario. Based on the conceptual framework and the case study, a roadmap to achieve a lean-green transformation is presented. The proposed roadmap contributes to the understanding of how and when an organization's supply chain should apply the lean and green elements. This study is relevant to practice, as it may assist managers in the adoption of a lean and green supply chain approach, giving insights for the implementation of a hybrid supply chain.

  17. A highly efficient approach to protein interactome mapping based on collaborative filtering framework.

    PubMed

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-09

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient for identifying numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, where the feature-vectors of involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among involved proteins, and use it to carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly.
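
    The rescaled cosine coefficient is the authors' design and is not reproduced here; the following hedged sketch shows the general neighbourhood-based CF scoring on a toy interactome weight matrix, with plain cosine similarity standing in for the paper's rescaled variant. The matrix values are hypothetical.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two weight vectors (0 if either is zero)."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def predict_interaction(w, i, j, top_k=2):
    """Neighbourhood-based CF score for a candidate interaction (i, j):
    similarity-weighted average of protein j's links with the proteins
    most similar to i. w is a protein-protein interactome weight matrix."""
    sims = [(cosine_sim(w[i], w[p]), p)
            for p in range(len(w)) if p not in (i, j)]
    sims.sort(reverse=True)
    neigh = sims[:top_k]
    total = sum(s for s, _ in neigh)
    if total == 0:
        return 0.0
    return sum(s * w[p][j] for s, p in neigh) / total

# Toy weight matrix: proteins 0 and 1 share a partner (protein 2), so the
# edge observed between 1 and 3 lends support to a candidate edge 0-3.
w = np.array([
    [0.0, 0.0, 1.0, 0.0],   # protein 0 interacts with 2
    [0.0, 0.0, 1.0, 1.0],   # protein 1 interacts with 2 and 3
    [1.0, 1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
], dtype=float)
score = predict_interaction(w, 0, 3)    # does protein 0 interact with 3?
```
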

  18. A Highly Efficient Approach to Protein Interactome Mapping Based on Collaborative Filtering Framework

    PubMed Central

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-01

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient for identifying numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, where the feature-vectors of involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among involved proteins, and use it to carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly. PMID:25572661

  19. A Highly Efficient Approach to Protein Interactome Mapping Based on Collaborative Filtering Framework

    NASA Astrophysics Data System (ADS)

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-01

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient for identifying numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, where the feature-vectors of involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among involved proteins, and use it to carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly.

  20. Using expert knowledge to incorporate uncertainty in cause-of-death assignments for modeling of cause-specific mortality

    USGS Publications Warehouse

    Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.

    2018-01-01

    Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. 
We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
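
    The elicitation step described above can be illustrated with a simple Monte Carlo sketch: treating each observer's stated probabilities as a prior over the latent cause-of-death and sampling from it propagates assignment uncertainty into cause-specific totals. This is a stand-in for the full Bayesian hierarchical model, and the causes and probabilities below are hypothetical.

```python
import random

def augment_causes(elicited, n_draws=5000, seed=3):
    """For each mortality, sample a latent cause from the observer's
    elicited probabilities and tally; returns expected deaths per cause.

    elicited: list of dicts {cause: probability}, each summing to 1."""
    rng = random.Random(seed)
    causes = sorted({c for e in elicited for c in e})
    totals = {c: 0 for c in causes}
    for _ in range(n_draws):
        for e in elicited:
            cs, ps = zip(*e.items())
            totals[rng.choices(cs, weights=ps)[0]] += 1
    return {c: totals[c] / n_draws for c in causes}

# Two deaths: one certain predation, one split between predation and disease.
expected = augment_causes([
    {"predation": 1.0},
    {"predation": 0.3, "disease": 0.7},
])
```

In the full model these sampled assignments would be updated jointly with the survival parameters within the MCMC, rather than tallied independently as here.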

  1. How to evaluate population management? Transforming the Care Continuum Alliance population health guide toward a broadly applicable analytical framework.

    PubMed

    Struijs, Jeroen N; Drewes, Hanneke W; Heijink, Richard; Baan, Caroline A

    2015-04-01

    Many countries face the persistent twin challenge of providing high-quality care while keeping health systems affordable and accessible. As a result, interest in more efficient strategies to stimulate population health is increasing. A possibly successful strategy is population management (PM). PM strives to address the health needs of the at-risk population and the chronically ill at all points along the health continuum by integrating services across health care, prevention, social care and welfare. The population health guide of the Care Continuum Alliance (CCA, which recently changed its name to the Population Health Alliance, PHA) provides a useful instrument for implementing and evaluating such innovative approaches. This framework was developed for PM specifically and describes the core elements of the PM concept on the basis of six subsequent interrelated steps. The aim of this article is to transform the CCA framework into an analytical framework. We refined quantitative methods and operationalized a set of indicators to measure the impact of PM in terms of the Triple Aim (population health, quality of care and cost per capita). Additionally, we added a qualitative part to gain insight into the implementation process of PM. This resulted in a broadly applicable analytical framework based on a mixed-methods approach. In the coming years, the analytical framework will be applied within the Dutch Monitor Population Management to derive transferable 'lessons learned' and to methodologically underpin the concept of PM. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. Modeling framework for representing long-term effectiveness of best management practices in addressing hydrology and water quality problems: Framework development and demonstration using a Bayesian method

    USDA-ARS?s Scientific Manuscript database

    Best management practices (BMPs) are popular approaches used to improve hydrology and water quality. Uncertainties in BMP effectiveness over time may result in overestimating long-term efficiency in watershed planning strategies. To represent varying long-term BMP effectiveness in hydrologic/water q...

  3. Towards a Research Framework for Race in Education: Critical Race Theory and Judith Butler

    ERIC Educational Resources Information Center

    Chadderton, Charlotte

    2013-01-01

    There has been much debate around the extent to which post-structuralist theory can be applied to critical research. In this article, it is argued that aspects of the two approaches can be combined, resulting in productive tensions that point towards a possible new framework for researching race and racism in education in the UK. The article…

  4. Conceptual Change from the Framework Theory Side of the Fence

    ERIC Educational Resources Information Center

    Vosniadou, Stella; Skopeliti, Irini

    2014-01-01

    We describe the main principles of the framework theory approach to conceptual change and briefly report on the results of a text comprehension study that investigated some of the hypotheses that derive from it. We claim that children construct a naive physics which is based on observation in the context of lay culture and which forms a relatively…

  5. A Comparative Investigation of TPB and Altruism Frameworks for an Empirically Based Communication Approach to Enhance Paper Recycling

    ERIC Educational Resources Information Center

    Chaisamrej, Rungrat; Zimmerman, Rick S.

    2014-01-01

    This research compared the ability of the theory of planned behavior (TPB) and the altruism framework (AM) to predict paper-recycling behavior. It was comprised of formative research and a major survey. Data collected from 628 undergraduate students in Thailand were analyzed using structural equation modeling. Results showed that TPB was superior…

  6. From a Politics of Dilemmas to a Politics of Paradoxes: Feminism, Pedagogy, and Women's Leadership for Social Change

    ERIC Educational Resources Information Center

    Kark, Ronit; Preser, Ruth; Zion-Waldoks, Tanya

    2016-01-01

    Transformational learning is a process resulting in deep and significant change in habitual patterns of identity, thought, emotion, and action, enabling new approaches to role enactment. This article explores how moving from a framework of dilemmas, which require solutions and one-sided choices, to a framework of paradoxes that embraces tensions…

  7. A Converse Approach to NMR Chemical Shifts for Norm-Conserving Pseudopotentials

    NASA Astrophysics Data System (ADS)

    Lopez, Graham; Ceresoli, Davide; Marzari, Nicola; Thonhauser, Timo

    2010-03-01

    Building on the recently developed converse approach for the ab-initio calculation of NMR chemical shifts [1], we present a corresponding framework that is suitable in connection with norm-conserving pseudopotentials. Our approach uses the GIPAW transformation [2] to set up a formalism where the derivative of the orbital magnetization [3] is taken with respect to a microscopic, localized magnetic dipole in the presence of pseudopotentials. The advantages of our method are that it is conceptually simple, the need for a linear-response framework is avoided, and it is applicable to large systems. We present results for calculations of several well-studied systems, including the carbon, hydrogen, fluorine, and phosphorus shifts in various molecules and solids. Our results are in very good agreement with both linear-response calculations and experimental results. [1] T. Thonhauser et al., J. Chem. Phys. 131, 101101 (2009). [2] C. J. Pickard and F. Mauri, Phys. Rev. B 63, 245101 (2001). [3] T. Thonhauser et al., Phys. Rev. Lett. 95, 137205 (2005).

  8. The Effect of Visual Information on the Manual Approach and Landing

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1982-01-01

    The effect of visual information, in combination with basic display information, on manual approach performance was investigated. A pre-experimental model analysis was performed in terms of the optimal control model. The resulting aircraft approach performance predictions were compared with the results of a moving-base simulator program. The results illustrate that the model provides a meaningful description of the visual (scene) perception process involved in the complex (multi-variable, time-varying) manual approach task, with a useful predictive capability. The theoretical framework was shown to allow a straightforward investigation of the complex interaction of a variety of task variables.

  9. Parenting Styles and Adolescents’ School Adjustment: Investigating the Mediating Role of Achievement Goals within the 2 × 2 Framework

    PubMed Central

    Xiang, Shiyuan; Liu, Yan; Bai, Lu

    2017-01-01

    This study examines the multiple mediating roles of achievement goals based on a 2 × 2 framework of the relationships between parenting styles and adolescents’ school adjustment. The study sample included 1061 Chinese adolescent students (50.4% girls) between the ages of 12 and 19, who completed questionnaires regarding parenting styles (parental autonomy support and psychological control), achievement goals (mastery approach, mastery avoidance, performance approach, and performance avoidance goals) and school adjustment variables (emotion, students’ life satisfaction, school self-esteem, problem behavior, academic achievement, and self-determination in school). A structural equation modeling (SEM) approach was used to test our hypotheses. The results indicated that parental autonomy support was associated with adolescents’ school adjustment in an adaptive manner, both directly and through its positive relationship with both mastery and performance approach goals; however, parental psychological control was associated with adolescents’ school adjustment in a maladaptive manner, both directly and through its positive relationship with both mastery and performance avoidance goals. In addition, the results indicated that mastery avoidance goals suppressed the relationship between parental autonomy support and adolescents’ school adjustment, and performance approach goals suppressed the relationship between this adjustment and parental psychological control. These findings extend the limited literature regarding the 2 × 2 framework of achievement goals and enable us to evidence the mediating and suppressing effects of achievement goals. This study highlights the importance of parenting in adolescents’ school adjustment through the cultivation of different achievement goals. PMID:29085321

  10. Parenting Styles and Adolescents' School Adjustment: Investigating the Mediating Role of Achievement Goals within the 2 × 2 Framework.

    PubMed

    Xiang, Shiyuan; Liu, Yan; Bai, Lu

    2017-01-01

    This study examines the multiple mediating roles of achievement goals based on a 2 × 2 framework of the relationships between parenting styles and adolescents' school adjustment. The study sample included 1061 Chinese adolescent students (50.4% girls) between the ages of 12 and 19, who completed questionnaires regarding parenting styles (parental autonomy support and psychological control), achievement goals (mastery approach, mastery avoidance, performance approach, and performance avoidance goals) and school adjustment variables (emotion, students' life satisfaction, school self-esteem, problem behavior, academic achievement, and self-determination in school). A structural equation modeling (SEM) approach was used to test our hypotheses. The results indicated that parental autonomy support was associated with adolescents' school adjustment in an adaptive manner, both directly and through its positive relationship with both mastery and performance approach goals; however, parental psychological control was associated with adolescents' school adjustment in a maladaptive manner, both directly and through its positive relationship with both mastery and performance avoidance goals. In addition, the results indicated that mastery avoidance goals suppressed the relationship between parental autonomy support and adolescents' school adjustment, and performance approach goals suppressed the relationship between this adjustment and parental psychological control. These findings extend the limited literature regarding the 2 × 2 framework of achievement goals and enable us to evidence the mediating and suppressing effects of achievement goals. This study highlights the importance of parenting in adolescents' school adjustment through the cultivation of different achievement goals.

  11. Combining machine learning and matching techniques to improve causal inference in program evaluation.

    PubMed

    Linden, Ariel; Yarnold, Paul R

    2016-12-01

    Program evaluations often utilize various matching approaches to emulate the randomization process for group assignment in experimental studies. Typically, the matching strategy is implemented, and then covariate balance is assessed before estimating treatment effects. This paper introduces a novel analytic framework utilizing a machine learning algorithm called optimal discriminant analysis (ODA) for assessing covariate balance and estimating treatment effects, once the matching strategy has been implemented. This framework holds several key advantages over the conventional approach: application to any variable metric and number of groups; insensitivity to skewed data or outliers; and use of accuracy measures applicable to all prognostic analyses. Moreover, ODA accepts analytic weights, thereby extending the methodology to any study design where weights are used for covariate adjustment or more precise (differential) outcome measurement. One-to-one matching on the propensity score was used as the matching strategy. Covariate balance was assessed using standardized difference in means (conventional approach) and measures of classification accuracy (ODA). Treatment effects were estimated using ordinary least squares regression and ODA. Using empirical data, ODA produced results highly consistent with those obtained via the conventional methodology for assessing covariate balance and estimating treatment effects. When ODA is combined with matching techniques within a treatment effects framework, the results are consistent with conventional approaches. However, given that it provides additional dimensions and robustness to the analysis versus what can currently be achieved using conventional approaches, ODA offers an appealing alternative. © 2016 John Wiley & Sons, Ltd.
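    The covariate-balance check described above is easy to make concrete. The sketch below computes the standardized difference in means, the conventional balance diagnostic the authors contrast with ODA, for one covariate after 1:1 matching; the data are a toy illustration, not the paper's empirical dataset:

    ```python
    import math

    def standardized_difference(treated, control):
        """Standardized difference in means for one covariate:
        (mean_t - mean_c) / sqrt((var_t + var_c) / 2).
        Values below ~0.10 are conventionally taken as acceptable balance."""
        def mean(xs):
            return sum(xs) / len(xs)
        def var(xs):
            m = mean(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
        pooled_sd = math.sqrt((var(treated) + var(control)) / 2)
        return (mean(treated) - mean(control)) / pooled_sd

    # Toy matched samples: ages of treated units vs. their matched controls.
    treated_age = [34, 40, 29, 51, 45, 38]
    control_age = [33, 41, 30, 50, 46, 37]
    d = standardized_difference(treated_age, control_age)
    print(round(d, 3))  # → 0.0 (the toy groups have identical means)
    ```

    In practice this statistic would be computed for every covariate used in the propensity-score model, before any treatment effects are estimated.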

  12. A unified framework for mesh refinement in random and physical space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jing; Stinis, Panos

    In recent work we have shown how an accurate reduced model can be utilized to perform mesh refinement in random space. That work relied on explicit knowledge of an accurate reduced model, which is used to monitor the transfer of activity from the large to the small scales of the solution. Since such a model is not always available, we present in the current work a framework which shares the merits and basic idea of the previous approach but does not require explicit knowledge of a reduced model. Moreover, the current framework can be applied for refinement in both random and physical space. In this manuscript we focus on the application to random space mesh refinement. We study examples of increasing difficulty (from ordinary to partial differential equations) which demonstrate the efficiency and versatility of our approach. We also provide some results from the application of the new framework to physical space mesh refinement.

  13. A Columnar Storage Strategy with Spatiotemporal Index for Big Climate Data

    NASA Astrophysics Data System (ADS)

    Hu, F.; Bowen, M. K.; Li, Z.; Schnase, J. L.; Duffy, D.; Lee, T. J.; Yang, C. P.

    2015-12-01

    Large collections of observational, reanalysis, and climate model output data may grow to as large as 100 PB in the coming years, placing climate data squarely in the Big Data domain, and various distributed computing frameworks have been utilized to address the challenges of big climate data analysis. However, due to the binary data formats (NetCDF, HDF) with high spatial and temporal dimensions, the computing frameworks in the Apache Hadoop ecosystem are not originally suited for big climate data. In order to make the computing frameworks in the Hadoop ecosystem directly support big climate data, we propose a columnar storage format with a spatiotemporal index to store climate data, which will support any project in the Apache Hadoop ecosystem (e.g., MapReduce, Spark, Hive, Impala). With this approach, the climate data are converted into the Parquet data format, a columnar storage format, and a spatial and temporal index is built and appended to the end of each Parquet file to enable real-time data query. Such climate data in the Parquet format then become available to any computing framework in the Hadoop ecosystem. The proposed approach is evaluated using the NASA Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. Experimental results show that this approach can efficiently bridge the gap between big climate data and distributed computing frameworks, and that the spatiotemporal index significantly accelerates data querying and processing.
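    The indexing idea behind this approach can be illustrated without Parquet itself: rows are grouped by time and coarse spatial tile, and queries consult the index so that only the relevant row group is touched. The sketch below uses plain Python dictionaries as a stand-in (pyarrow or Spark would be used in practice; the 10-degree tile size and field names are assumptions for illustration):

    ```python
    from collections import defaultdict

    def tile(lat, lon, size=10):
        """Map a coordinate to a coarse spatial tile (assumed 10-degree grid)."""
        return (int(lat // size), int(lon // size))

    def build_index(records):
        """Group row ids by (date, spatial tile), mimicking the row-group-level
        spatiotemporal index appended to each Parquet file."""
        index = defaultdict(list)
        for row_id, (date, lat, lon, value) in enumerate(records):
            index[(date, tile(lat, lon))].append(row_id)
        return index

    def query(index, records, date, lat, lon):
        """Answer a point query by touching only the indexed row group."""
        return [records[i] for i in index.get((date, tile(lat, lon)), [])]

    records = [
        ("2015-07-01", 38.9, -77.0, 301.2),   # temperature-like values
        ("2015-07-01", 39.1, -76.6, 300.8),
        ("2015-07-02", 38.9, -77.0, 299.5),
        ("2015-07-01", 51.5,  -0.1, 289.4),
    ]
    idx = build_index(records)
    hits = query(idx, records, "2015-07-01", 38.9, -77.0)
    print(len(hits))  # → 2 (the two 2015-07-01 rows in the same 10-degree tile)
    ```

    The point of placing such an index in the file footer, as the abstract describes, is that a reader can skip whole row groups without scanning the columnar data.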

  14. A framework for organizing and selecting quantitative approaches for benefit-harm assessment.

    PubMed

    Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M

    2012-11-19

    Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decision-making context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provide a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences.
The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.

  15. Knowledge brokering for healthy aging: a scoping review of potential approaches.

    PubMed

    Van Eerd, Dwayne; Newman, Kristine; DeForge, Ryan; Urquhart, Robin; Cornelissen, Evelyn; Dainty, Katie N

    2016-10-19

    Developing a healthcare delivery system that is more responsive to the future challenges of an aging population is a priority in Canada. The World Health Organization acknowledges the need for knowledge translation frameworks in aging and health. Knowledge brokering (KB) is a specific knowledge translation approach that includes making connections between people to facilitate the use of evidence. Knowledge gaps exist about KB roles, approaches, and guiding frameworks. The objective of the scoping review is to identify and describe KB approaches and the underlying conceptual frameworks (models, theories) used to guide the approaches that could support healthy aging. Literature searches were done in PubMed, EMBASE, PsycINFO, EBM reviews (Cochrane Database of systematic reviews), CINAHL, and SCOPUS, as well as Google and Google Scholar using terms related to knowledge brokering. Titles, abstracts, and full reports were reviewed independently by two reviewers who came to consensus on all screening criteria. Documents were included if they described a KB approach and details about the underlying conceptual basis. Data about KB approach, target stakeholders, KB outcomes, and context were extracted independently by two reviewers. Searches identified 248 unique references. Screening for inclusion revealed 19 documents that described 15 accounts of knowledge brokering and details about conceptual guidance and could be applied in healthy aging contexts. Eight KB elements were detected in the approaches though not all approaches incorporated all elements. The underlying conceptual guidance for KB approaches varied. Specific KB frameworks were referenced or developed for nine KB approaches while the remaining six cited more general KT frameworks (or multiple frameworks) as guidance. The KB approaches that we found varied greatly depending on the context and stakeholders involved. Three of the approaches were explicitly employed in the context of healthy aging.
Common elements of KB approaches that could be conducted in healthy aging contexts focussed on acquiring, adapting, and disseminating knowledge and networking (linkage). The descriptions of the guiding conceptual frameworks (theories, models) focussed on linkage and exchange but varied across approaches. Future research should gather KB practitioner and stakeholder perspectives on effective practices to develop KB approaches for healthy aging.

  16. Conceptual framework for drought phenotyping during molecular breeding.

    PubMed

    Salekdeh, Ghasem Hosseini; Reynolds, Matthew; Bennett, John; Boyer, John

    2009-09-01

    Drought is a major threat to agricultural production and drought tolerance is a prime target for molecular approaches to crop improvement. To achieve meaningful results, these approaches must be linked with suitable phenotyping protocols at all stages, such as the screening of germplasm collections, mutant libraries, mapping populations, transgenic lines and breeding materials and the design of OMICS and quantitative trait loci (QTLs) experiments. Here we present a conceptual framework for molecular breeding for drought tolerance based on the Passioura equation of expressing yield as the product of water use (WU), water use efficiency (WUE) and harvest index (HI). We identify phenotyping protocols that address each of these factors, describe their key features and illustrate their integration with different molecular approaches.
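    The Passioura identity the framework is organized around is simple to state as code; the unit choices and numbers below are illustrative assumptions, not values from the paper:

    ```python
    def passioura_yield(wu_mm, wue_kg_per_ha_mm, hi):
        """Yield = water use (WU) x water-use efficiency (WUE) x harvest index (HI).
        Assumed units: WU in mm of water, WUE in kg biomass per ha per mm,
        HI dimensionless (grain as a fraction of above-ground biomass)."""
        biomass = wu_mm * wue_kg_per_ha_mm   # above-ground biomass, kg/ha
        return biomass * hi                  # grain yield, kg/ha

    print(passioura_yield(300, 20, 0.45))  # → 2700.0 kg/ha
    ```

    The multiplicative form is what makes the framework useful for phenotyping: an improvement in any one factor (WU, WUE, or HI) translates directly into yield, so each factor can be screened with its own protocol.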

  17. Adapting a Framework for Assessing Students' Approaches to Modeling

    ERIC Educational Resources Information Center

    Bennett, Steven Carl

    2017-01-01

    We used an "approach to learning" theoretical framework to explicate the ways students engage in scientific modeling. Approach to learning theory suggests that when students approach learning deeply, they link science concepts with prior knowledge and experiences. Conversely, when students engage in a surface approach to learning, they…

  18. Use of Annotations for Component and Framework Interoperability

    NASA Astrophysics Data System (ADS)

    David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.

    2009-12-01

    The popular programming languages Java and C# provide annotations, a form of meta-data construct. Software frameworks for web integration, web services, database access, and unit testing now take advantage of annotations to reduce the complexity of APIs and the quantity of integration code between the application and framework infrastructure. Adopting annotation features in frameworks has been observed to lead to cleaner and leaner application code. The USDA Object Modeling System (OMS) version 3.0 fully embraces the annotation approach and additionally defines a meta-data standard for components and models. In version 3.0, framework/model integration previously accomplished using API calls is achieved using descriptive annotations. This enables the framework to provide additional functionality non-invasively, such as implicit multithreading and auto-documentation, while achieving a significant reduction in the size of the model source code. Using a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework. Since models and modeling components are not directly bound to the framework by the use of specific APIs and/or data types, they can more easily be reused both within the framework and outside of it. To compare the effectiveness of an annotation-based framework approach with other modeling frameworks, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A monthly water balance model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures.
In a next step, the PRMS model was implemented in OMS 3.0 and is currently being implemented for water supply forecasting in the western United States at the USDA NRCS National Water and Climate Center. PRMS is a component based modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of using the new framework’s annotation based approach. The fully annotated components are now providing information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
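    As a loose analogue of the annotation approach described above, the sketch below uses Python decorators and class attributes as stand-ins for Java annotations: the component declares metadata only, and the "framework" discovers inputs and outputs by introspection rather than through API calls. All names here are hypothetical illustrations, not the OMS 3.0 API:

    ```python
    def component(name, doc):
        """Decorator standing in for a component annotation: it attaches
        metadata instead of requiring the class to call framework APIs."""
        def wrap(cls):
            cls._meta = {"name": name, "doc": doc}
            return cls
        return wrap

    def In(desc):   # marks an input field (hypothetical, not an OMS name)
        return {"role": "in", "doc": desc}

    def Out(desc):  # marks an output field
        return {"role": "out", "doc": desc}

    @component("WaterBalance", "Monthly water balance component")
    class WaterBalance:
        precip = In("precipitation, mm")
        runoff = Out("simulated runoff, mm")

    def describe(cls):
        """The framework side: discover declared fields by introspection alone,
        with no framework-specific calls inside the component."""
        fields = {k: v for k, v in vars(cls).items()
                  if isinstance(v, dict) and "role" in v}
        ins = sorted(k for k, v in fields.items() if v["role"] == "in")
        return cls._meta["name"], ins

    name, inputs = describe(WaterBalance)
    print(name, inputs)  # → WaterBalance ['precip']
    ```

    The design point mirrors the abstract: because the component carries only declarative metadata, the same class could be wired for multithreaded dataflow, documentation generation, or testing without any change to its source.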

  19. An eHealth Capabilities Framework for Graduates and Health Professionals: Mixed-Methods Study

    PubMed Central

    McGregor, Deborah; Keep, Melanie; Janssen, Anna; Spallek, Heiko; Quinn, Deleana; Jones, Aaron; Tseris, Emma; Yeung, Wilson; Togher, Leanne; Solman, Annette; Shaw, Tim

    2018-01-01

    Background The demand for an eHealth-ready and adaptable workforce is placing increasing pressure on universities to deliver eHealth education. At present, eHealth education is largely focused on components of eHealth rather than considering a curriculum-wide approach. Objective This study aimed to develop a framework that could be used to guide health curriculum design based on current evidence, and stakeholder perceptions of eHealth capabilities expected of tertiary health graduates. Methods A 3-phase, mixed-methods approach incorporated the results of a literature review, focus groups, and a Delphi process to develop a framework of eHealth capability statements. Results Participants (N=39) with expertise or experience in eHealth education, practice, or policy provided feedback on the proposed framework, and following the fourth iteration of this process, consensus was achieved. The final framework consisted of 4 higher-level capability statements that describe the learning outcomes expected of university graduates across the domains of (1) digital health technologies, systems, and policies; (2) clinical practice; (3) data analysis and knowledge creation; and (4) technology implementation and codesign. Across the capability statements are 40 performance cues that provide examples of how these capabilities might be demonstrated. Conclusions The results of this study inform a cross-faculty eHealth curriculum that aligns with workforce expectations. There is a need for educational curriculum to reinforce existing eHealth capabilities, adapt existing capabilities to make them transferable to novel eHealth contexts, and introduce new learning opportunities for interactions with technologies within education and practice encounters. As such, the capability framework developed may assist in the application of eHealth by emerging and existing health care professionals. 
Future research needs to explore the potential for integration of findings into workforce development programs. PMID:29764794

  20. A dynamic multiarmed bandit-gene expression programming hyper-heuristic for combinatorial optimization problems.

    PubMed

    Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong

    2015-02-01

    Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high-level strategy (heuristic selection mechanism and acceptance criterion) and the low-level heuristics (a set of problem-specific heuristics). Due to the different landscape structures of different problem instances, the high-level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high-level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit-extreme value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to be applied at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing), are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework generalizes well across both domains. We obtain competitive, if not better, results when compared to the best known results obtained from other methods presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite, again demonstrating the generality of our approach against other methods that have utilized the same six benchmark datasets from this test suite.
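    A standard upper-confidence-bound selector conveys the flavor of online heuristic selection in such a framework. This is a generic UCB1-style sketch, not the paper's dynamic multiarmed bandit-extreme value reward mechanism, and the three heuristics' mean rewards are invented for the demonstration:

    ```python
    import math
    import random

    def ucb_select(counts, rewards, t, c=1.4):
        """Pick the low-level heuristic with the best upper-confidence-bound
        score: average reward plus an exploration bonus that shrinks as the
        heuristic is tried more often. Untried heuristics go first."""
        for h, n in enumerate(counts):
            if n == 0:
                return h
        return max(range(len(counts)),
                   key=lambda h: rewards[h] / counts[h]
                                 + c * math.sqrt(math.log(t) / counts[h]))

    random.seed(0)
    # Three toy low-level heuristics with hidden mean rewards (assumed values).
    true_means = [0.2, 0.5, 0.8]
    counts = [0, 0, 0]
    rewards = [0.0, 0.0, 0.0]
    for t in range(1, 501):
        h = ucb_select(counts, rewards, t)
        counts[h] += 1
        rewards[h] += random.gauss(true_means[h], 0.1)  # noisy reward signal

    best = max(range(3), key=lambda h: counts[h])
    print(best)  # → 2: trials concentrate on the best-performing heuristic
    ```

    In a full hyper-heuristic, the chosen heuristic's move would then pass through an acceptance criterion; the paper's contribution is to evolve that criterion automatically via gene expression programming rather than hand-designing it.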

  1. Building capacity for evidence generation, synthesis and implementation to improve the care of mothers and babies in South East Asia: methods and design of the SEA-ORCHID Project using a logical framework approach

    PubMed Central

    2010-01-01

    Background Rates of maternal and perinatal mortality remain high in developing countries despite the existence of effective interventions. Efforts to strengthen evidence-based approaches to improve health in these settings are partly hindered by restricted access to the best available evidence, limited training in evidence-based practice and concerns about the relevance of existing evidence. South East Asia - Optimising Reproductive and Child Health in Developing Countries (SEA-ORCHID) was a five-year project that aimed to determine whether a multifaceted intervention designed to strengthen the capacity for research synthesis, evidence-based care and knowledge implementation improved clinical practice and led to better health outcomes for mothers and babies. This paper describes the development and design of the SEA-ORCHID intervention plan using a logical framework approach. Methods SEA-ORCHID used a before-and-after design to evaluate the impact of a multifaceted tailored intervention at nine sites across Thailand, Malaysia, Philippines and Indonesia, supported by three centres in Australia. We used a logical framework approach to systematically prepare and summarise the project plan in a clear and logical way. The development and design of the SEA-ORCHID project was based around the three components of a logical framework (problem analysis, project plan and evaluation strategy). Results The SEA-ORCHID logical framework defined the project's goal and purpose (To improve the health of mothers and babies in South East Asia and To improve clinical practice in reproductive health in South East Asia), and outlined a series of project objectives and activities designed to achieve these. The logical framework also established outcome and process measures appropriate to each level of the project plan, and guided project work in each of the participating countries and hospitals. 
Conclusions Development of a logical framework in the SEA-ORCHID project enabled a reasoned, logical approach to the project design that ensured the project activities would achieve the desired outcomes and that the evaluation plan would assess both the process and outcome of the project. The logical framework was also valuable over the course of the project to facilitate communication, assess progress and build a shared understanding of the project activities, purpose and goal. PMID:20594325

  2. Predicting who will major in a science discipline: Expectancy-value theory as part of an ecological model for studying academic communities

    NASA Astrophysics Data System (ADS)

    Sullins, Ellen S.; Hernandez, Delia; Fuller, Carol; Shiro Tashiro, Jay

    Research on factors that shape recruitment and retention in undergraduate science majors currently is highly fragmented and in need of an integrative research framework. Such a framework should incorporate analyses of the various levels of organization that characterize academic communities (i.e., the broad institutional level, the departmental level, and the student level), and should also provide ways to study the interactions occurring within and between these structural levels. We propose that academic communities are analogous to ecosystems, and that the research paradigms of modern community ecology can provide the necessary framework, as well as new and innovative approaches to a very complex area. This article also presents the results of a pilot study that demonstrates the promise of this approach at the student level. We administered a questionnaire based on expectancy-value theory to undergraduates enrolled in introductory biology courses. Itself an integrative approach, expectancy-value theory views achievement-related behavior as a joint function of the person's expectancy of success in the behavior and the subjective value placed on such success. Our results indicated: (a) significant gender differences in the underlying factor structures of expectations and values related to the discipline of biology, (b) expectancy-value factors significantly distinguished biology majors from nonmajors, and (c) expectancy-value factors significantly predicted students' intent to enroll in future biology courses. We explore the expectancy-value framework as an operationally integrative framework in our ecological model for studying academic communities, especially in the context of assessing the underrepresentation of women and minorities in the sciences. Future research directions as well as practical implications are also discussed.

  3. A hierarchical approach for simulating northern forest dynamics

    Treesearch

    Don C. Bragg; David W. Roberts; Thomas R. Crow

    2004-01-01

    Complexity in ecological systems has challenged forest simulation modelers for years, resulting in a number of approaches with varying degrees of success. Arguments in favor of hierarchical modeling are made, especially for considering a complex environmental issue like widespread eastern hemlock regeneration failure. We present the philosophy and basic framework for...

  4. Progress and Accountability in Family Literacy: Lessons from a Collaborative Approach.

    ERIC Educational Resources Information Center

    Ryan, Katherine E.; And Others

    1996-01-01

    The implementation of a collaborative approach to evaluating family literacy programs was studied using a conceptual framework and applied to 36 family literacy programs from a midwestern state. Evaluation participants learned how results could be used to develop curriculum in addition to reporting to funding agencies. (SLD)

  5. Decerns: A framework for multi-criteria decision analysis

    DOE PAGES

    Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...

    2015-02-27

    A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical problems in risk management is introduced. The Decerns framework contains a library of modules that are the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on a multicriteria location problem.
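    A weighted-sum ranking is the simplest of the MCDA methods a framework of this kind would include. The sketch below uses hypothetical criteria, weights, and options, and omits the uncertainty treatment (probabilistic approaches, fuzzy numbers) that Decerns provides:

    ```python
    def weighted_sum_rank(options, weights):
        """Rank options by the weighted sum of min-max-normalized criterion
        scores, treating every criterion as a benefit (higher is better)."""
        cols = list(zip(*options.values()))          # one tuple per criterion
        lo = [min(c) for c in cols]
        hi = [max(c) for c in cols]
        def score(vals):
            return sum(w * (v - l) / (h - l) if h > l else 0.0
                       for w, v, l, h in zip(weights, vals, lo, hi))
        return sorted(options, key=lambda k: score(options[k]), reverse=True)

    # Hypothetical siting problem: criteria = (safety, cost savings, access).
    sites = {"A": (7, 3, 9), "B": (9, 6, 4), "C": (5, 8, 6)}
    ranking = weighted_sum_rank(sites, weights=(0.5, 0.3, 0.2))
    print(ranking)  # → ['B', 'A', 'C']
    ```

    Real MCDA tools mostly differ in what replaces this scoring step (outranking relations, utility functions, fuzzy arithmetic); the normalize-weigh-aggregate skeleton stays the same.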

  6. Generic framework for the secure Yuen 2000 quantum-encryption protocol employing the wire-tap channel approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mihaljevic, Miodrag J.

    2007-05-15

    It is shown that the security, against known-plaintext attacks, of the Yuen 2000 (Y00) quantum-encryption protocol can be considered via the wire-tap channel model assuming that the heterodyne measurement yields the sample for security evaluation. Employing the results reported on the wire-tap channel, a generic framework is proposed for developing secure Y00 instantiations. The proposed framework employs a dedicated encoding which together with inherent quantum noise at the attacker's side provides Y00 security.

  7. Adapting the balanced scorecard for mental health and addictions: an inpatient example.

    PubMed

    Lin, Elizabeth; Durbin, Janet

    2008-05-01

    The Balanced Scorecard (BSC) is a performance-monitoring framework that originated in the business sector but has more recently been applied to health services. The province of Ontario is using the BSC approach to monitor quality of inpatient care in five service areas. Feasibility of the scorecard framework for each area has been assessed using a standard approach. This paper reports results of the feasibility study for the mental health sector, focusing on three issues: framework relevance, underlying strategic goals and indicator selection. Based on a literature review and extensive stakeholder input, the BSC quadrant structure was recommended with some modifications, and indicators were selected that aligned with provincial mental health reform policy goals. The mental health report has completed two cycles of reporting, and has received good support from the field. Copyright © 2008 Longwoods Publishing.

  8. A Kernel-Based Low-Rank (KLR) Model for Low-Dimensional Manifold Recovery in Highly Accelerated Dynamic MRI.

    PubMed

    Nakarmi, Ukash; Wang, Yanhua; Lyu, Jingyuan; Liang, Dong; Ying, Leslie

    2017-11-01

    While many low-rank and sparsity-based approaches have been developed for accelerated dynamic magnetic resonance imaging (dMRI), they all exploit low rankness or sparsity in the input space, overlooking the intrinsic nonlinear correlation in most dMRI data. In this paper, we propose a kernel-based framework that allows nonlinear manifold models in reconstruction from sub-Nyquist data. Within this framework, many existing algorithms can be extended to kernel versions with nonlinear models. In particular, we have developed a novel algorithm with a kernel-based low-rank model generalizing the conventional low-rank formulation. The algorithm consists of manifold learning using kernels, low-rank enforcement in feature space, and preimaging with data consistency. Extensive simulation and experimental results show that the proposed method surpasses conventional low-rank-modeled approaches for dMRI.

  9. A Conceptual Framework for Indoor Mapping by Using Grammars

    NASA Astrophysics Data System (ADS)

    Hu, X.; Fan, H.; Zipf, A.; Shang, J.; Gu, F.

    2017-09-01

    Maps are the foundation of indoor location-based services. Many automatic indoor mapping approaches have been proposed, but they rely heavily on sensor data such as point clouds and users' location traces. To address this issue, this paper presents a conceptual framework that represents the layout principles of research buildings using grammars. This framework can benefit the indoor mapping process by improving the accuracy of generated maps and by dramatically reducing the volume of sensor data required by traditional reconstruction approaches. In addition, we present details of several core modules of the framework. An example using the proposed framework is given to show the generation process of a semantic map. This framework is part of ongoing research towards an approach for reconstructing semantic maps.

  10. The Principle of the Fermionic Projector: An Approach for Quantum Gravity?

    NASA Astrophysics Data System (ADS)

    Finster, Felix

    In this short article we introduce the mathematical framework of the principle of the fermionic projector and set up a variational principle in discrete space-time. The underlying physical principles are discussed. We outline the connection to the continuum theory and state recent results. In the last two sections, we speculate on how it might be possible to describe quantum gravity within this framework.

  11. Wave particle duality, the observer and retrocausality

    NASA Astrophysics Data System (ADS)

    Narasimhan, Ashok; Kafatos, Menas C.

    2017-05-01

    We approach wave-particle duality, the role of the observer and the implications for retrocausality by starting with the results of a well-verified quantum experiment. We analyze how some current theoretical approaches interpret these results. We then provide an alternative theoretical framework that is consistent with the observations and in many ways simpler than the usual attempts to account for retrocausality, involving a non-local conscious Observer.

  12. A Framework of Working Across Disciplines in Early Design and R&D of Large Complex Engineered Systems

    NASA Technical Reports Server (NTRS)

    McGowan, Anna-Maria Rivas; Papalambros, Panos Y.; Baker, Wayne E.

    2015-01-01

    This paper examines four primary methods of working across disciplines during R&D and early design of large-scale complex engineered systems such as aerospace systems. A conceptualized framework, called the Combining System Elements framework, is presented to delineate several aspects of cross-discipline and system integration practice. The framework is derived from a theoretical and empirical analysis of current work practices in actual operational settings and is informed by theories from organization science and engineering. The explanatory framework may be used by teams to clarify assumptions and associated work practices, which may reduce ambiguity in understanding diverse approaches to early systems research, development and design. The framework also highlights that very different engineering results may be obtained depending on work practices, even when the goals for the engineered system are the same.

  13. Comparability of outcome frameworks in medical education: Implications for framework development.

    PubMed

    Hautz, Stefanie C; Hautz, Wolf E; Feufel, Markus A; Spies, Claudia D

    2015-01-01

    Given the increasing mobility of medical students and practitioners, there is a growing need for harmonization of medical education and qualifications. Although several initiatives have sought to compare national outcome frameworks, this task has proven a challenge. Drawing on an analysis of existing outcome frameworks, we identify factors that hinder comparability and suggest ways of facilitating comparability during framework development and revisions. We searched MedLine, EmBase and the Internet for outcome frameworks in medical education published by national or governmental organizations. We analyzed these frameworks for differences and similarities that influence comparability. Of 1816 search results, 13 outcome frameworks met our inclusion criteria. These frameworks differ in five core features: history and origins, formal structure, medical education system, target audience and key terms. Many frameworks reference other frameworks without acknowledging these differences. Importantly, the level of detail of the outcomes specified differs both within and between frameworks. The differences identified explain some of the challenges involved in comparing outcome frameworks and medical qualifications. We propose a two-level model distinguishing between "core" competencies and culture-specific "secondary" competencies. This approach could strike a balance between local specifics and cross-national comparability of outcome frameworks and medical education.

  14. Multi-atlas learner fusion: An efficient segmentation approach for large-scale data.

    PubMed

    Asman, Andrew J; Huo, Yuankai; Plassard, Andrew J; Landman, Bennett A

    2015-12-01

    We propose multi-atlas learner fusion (MLF), a framework for rapidly and accurately replicating the highly accurate, yet computationally expensive, multi-atlas segmentation framework based on fusing local learners. In the largest whole-brain multi-atlas study yet reported, multi-atlas segmentations are estimated for a training set of 3464 MR brain images. Using these multi-atlas estimates we (1) estimate a low-dimensional representation for selecting locally appropriate example images, and (2) build AdaBoost learners that map a weak initial segmentation to the multi-atlas segmentation result. Thus, to segment a new target image we project the image into the low-dimensional space, construct a weak initial segmentation, and fuse the trained, locally selected, learners. The MLF framework cuts the runtime on a modern computer from 36 h down to 3-8 min - a 270× speedup - by completely bypassing the need for deformable atlas-target registrations. Additionally, we (1) describe a technique for optimizing the weak initial segmentation and the AdaBoost learning parameters, (2) quantify the ability to replicate the multi-atlas result with mean accuracies approaching the multi-atlas intra-subject reproducibility on a testing set of 380 images, (3) demonstrate significant increases in the reproducibility of intra-subject segmentations when compared to a state-of-the-art multi-atlas framework on a separate reproducibility dataset, (4) show that under the MLF framework the large-scale data model significantly improves segmentation over the small-scale model, and (5) indicate that the MLF framework performs comparably to state-of-the-art multi-atlas segmentation algorithms without using non-local information. Copyright © 2015 Elsevier B.V. All rights reserved.
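The fusion step that multi-atlas methods build on can be illustrated in its simplest form, voxel-wise majority voting over candidate labelings. MLF fuses trained AdaBoost learners rather than raw atlas labels, so this is only a toy sketch of the fusion idea; the label arrays below are hypothetical.

```python
from collections import Counter

def majority_fuse(labelings):
    """Fuse per-voxel labels from several atlases/learners by majority vote."""
    fused = []
    for voxel_labels in zip(*labelings):
        # Counter.most_common breaks ties by first occurrence (Python 3.7+).
        fused.append(Counter(voxel_labels).most_common(1)[0][0])
    return fused

# Three hypothetical candidate labelings of the same four voxels
# (labels 0 = background, 1 and 2 = structures).
atlas_votes = [[1, 1, 0, 2],
               [1, 0, 0, 2],
               [1, 1, 2, 2]]
fused = majority_fuse(atlas_votes)   # -> [1, 1, 0, 2]
```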

  15. Selecting Indicator Portfolios for Marine Species and Food Webs: A Puget Sound Case Study

    PubMed Central

    Kershner, Jessi; Samhouri, Jameal F.; James, C. Andrew; Levin, Phillip S.

    2011-01-01

    Ecosystem-based management (EBM) has emerged as a promising approach for maintaining the benefits humans want and need from the ocean, yet concrete approaches for implementing EBM remain scarce. A key challenge lies in the development of indicators that can provide useful information on ecosystem status and trends, and assess progress towards management goals. In this paper, we describe a generalized framework for the methodical and transparent selection of ecosystem indicators. We apply the framework to the second largest estuary in the United States – Puget Sound, Washington – where one of the most advanced EBM processes is currently underway. Rather than introduce a new method, this paper integrates a variety of familiar approaches into one step-by-step approach that will lead to more consistent and reliable reporting on ecosystem condition. Importantly, we demonstrate how a framework linking indicators to policy goals, as well as a clearly defined indicator evaluation and scoring process, can result in a portfolio of useful and complementary indicators based on the needs of different users (e.g., policy makers and scientists). Although the set of indicators described in this paper is specific to marine species and food webs, we provide a general approach that could be applied to any set of management objectives or ecological system. PMID:21991305

  16. A quantitative evaluation of a qualitative risk assessment framework: Examining the assumptions and predictions of the Productivity Susceptibility Analysis (PSA)

    PubMed Central

    2018-01-01

    Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluate the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869

  17. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. In particular, there is strong disagreement about whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive Metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effects of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understanding and predicting the flow of water through catchments.
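The formal Bayesian route described above can be sketched with a plain Metropolis sampler, of which DREAM is an adaptive multi-chain refinement. Everything below is a toy under stated assumptions: a hypothetical one-parameter rainfall-runoff "model" (flow = k × rainfall) with a Gaussian error model, not the study's watershed model or likelihood.

```python
import math
import random

random.seed(0)  # deterministic for reproducibility

def log_likelihood(k, obs, rain, sigma=0.5):
    """Gaussian error model: simulated flow = k * rainfall."""
    return sum(-0.5 * ((q - k * r) / sigma) ** 2 for q, r in zip(obs, rain))

rain = [1.0, 2.0, 3.0, 4.0, 5.0]
obs  = [0.9, 2.1, 2.8, 4.2, 5.1]   # hypothetical data, consistent with k near 1

k, ll = 0.5, log_likelihood(0.5, obs, rain)
samples = []
for _ in range(20000):
    k_new = k + random.gauss(0, 0.1)             # symmetric random-walk proposal
    ll_new = log_likelihood(k_new, obs, rain)
    if math.log(random.random()) < ll_new - ll:  # Metropolis acceptance rule
        k, ll = k_new, ll_new
    samples.append(k)

burned = samples[5000:]                          # discard burn-in
posterior_mean = sum(burned) / len(burned)       # summarizes parameter uncertainty
```

The retained samples approximate the posterior of k; their spread is the formal counterpart of the uncertainty bounds that GLUE derives from informal likelihood weights.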

  19. High Resolution, Large Deformation 3D Traction Force Microscopy

    PubMed Central

    López-Fagundo, Cristina; Reichner, Jonathan; Hoffman-Kim, Diane; Franck, Christian

    2014-01-01

    Traction Force Microscopy (TFM) is a powerful approach for quantifying cell-material interactions that over the last two decades has contributed significantly to our understanding of cellular mechanosensing and mechanotransduction. In addition, recent advances in three-dimensional (3D) imaging and traction force analysis (3D TFM) have highlighted the significance of the third dimension in influencing various cellular processes. Yet irrespective of dimensionality, almost all TFM approaches have relied on a linear elastic theory framework to calculate cell surface tractions. Here we present a new high resolution 3D TFM algorithm which utilizes a large deformation formulation to quantify cellular displacement fields with unprecedented resolution. The results feature some of the first experimental evidence that cells are indeed capable of exerting large material deformations, which require the formulation of a new theoretical TFM framework to accurately calculate the traction forces. Based on our previous 3D TFM technique, we reformulate our approach to accurately account for large material deformation and quantitatively contrast and compare both linear and large deformation frameworks as a function of the applied cell deformation. Particular attention is paid in estimating the accuracy penalty associated with utilizing a traditional linear elastic approach in the presence of large deformation gradients. PMID:24740435

  20. An approach to addressing governance from a health system framework perspective

    PubMed Central

    2011-01-01

    As countries strive to strengthen their health systems in resource-constrained contexts, policy makers need to know how best to improve the performance of their health systems. To aid these decisions, health system stewards should have a good understanding of how health systems operate in order to govern them appropriately. While a number of frameworks for assessing governance in the health sector have been proposed, their application is often hindered by unrealistic indicators or by excessive complexity, resulting in limited empirical work on governance in health systems. This paper reviews contemporary health sector frameworks which have focused on defining and developing indicators to assess governance in the health sector. Based on these, we propose a simplified approach to look at governance within a common health system framework which encourages stewards to take a systematic perspective when assessing governance. Although systems thinking is not unique to health, examples of its application within health systems have been limited. We also provide an example of how this approach could be applied to illuminate areas of governance weakness which are potentially addressable by targeted interventions and policies. This approach is built largely on prior literature, but is original in that it is problem-driven and promotes an outward application that takes into consideration the major health system building blocks at various levels, in order to ensure a more complete assessment of a governance issue rather than a simple input-output approach. Based on an assessment of the contemporary literature, we propose a practical approach which we believe will facilitate a more comprehensive assessment of governance in health systems, leading to the development of governance interventions to strengthen system performance and improve health as a basic human right. PMID:22136318

  1. The Open-Ended Approach Framework

    ERIC Educational Resources Information Center

    Munroe, Lloyd

    2015-01-01

    This paper describes a pedagogical framework that teachers can use to support students who are engaged in solving open-ended problems, by explaining how two Japanese expert teachers successfully apply open-ended problems in their mathematics class. The Open-Ended Approach (OPA) framework consists of two main sections: Understanding Mathematical…

  2. An Instructional Design Framework for Fostering Student Engagement in Online Learning Environments

    ERIC Educational Resources Information Center

    Czerkawski, Betul C.; Lyman, Eugene W.

    2016-01-01

    Many approaches, models and frameworks exist when designing quality online learning environments. These approaches assist and guide instructional designers through the process of analysis, design, development, implementation and evaluation of instructional processes. Some of these frameworks are concerned with student participation, some with…

  3. An approach to multiscale modelling with graph grammars

    PubMed Central

    Ong, Yongzhi; Streit, Katarína; Henke, Michael; Kurth, Winfried

    2014-01-01

    Background and Aims Functional–structural plant models (FSPMs) simulate biological processes at different spatial scales. Methods exist for multiscale data representation and modification, but the advantages of using multiple scales in the dynamic aspects of FSPMs remain unclear. Results from multiscale models in various other areas of science that share fundamental modelling issues with FSPMs suggest that potential advantages do exist, and this study therefore aims to introduce an approach to multiscale modelling in FSPMs. Methods A three-part graph data structure and grammar is revisited, and presented with a conceptual framework for multiscale modelling. The framework is used for identifying roles, categorizing and describing scale-to-scale interactions, thus allowing alternative approaches to model development as opposed to correlation-based modelling at a single scale. Reverse information flow (from macro- to micro-scale) is catered for in the framework. The methods are implemented within the programming language XL. Key Results Three example models are implemented using the proposed multiscale graph model and framework. The first illustrates the fundamental usage of the graph data structure and grammar, the second uses probabilistic modelling for organs at the fine scale in order to derive crown growth, and the third combines multiscale plant topology with ozone trends and metabolic network simulations in order to model juvenile beech stands under exposure to a toxic trace gas. Conclusions The graph data structure supports data representation and grammar operations at multiple scales. The results demonstrate that multiscale modelling is a viable method in FSPM and an alternative to correlation-based modelling. Advantages and disadvantages of multiscale modelling are illustrated by comparisons with single-scale implementations, leading to motivations for further research in sensitivity analysis and run-time efficiency for these models. PMID:25134929

  4. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin L.; Bolisetti, Chandu; Veeraraghavan, Swetha

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  5. Systematic narrative review of decision frameworks to select the appropriate modelling approaches for health economic evaluations.

    PubMed

    Tsoi, B; O'Reilly, D; Jegathisawaran, J; Tarride, J-E; Blackhouse, G; Goeree, R

    2015-06-17

    In constructing or appraising a health economic model, an early consideration is whether the modelling approach selected is appropriate for the given decision problem. Frameworks and taxonomies that distinguish between modelling approaches can help make this decision more systematic, and this study aims to identify and compare the decision frameworks proposed to date on this topic area. A systematic review was conducted to identify frameworks from peer-reviewed and grey literature sources. The following databases were searched: OVID Medline and EMBASE; Wiley's Cochrane Library and Health Economic Evaluation Database; PubMed; and ProQuest. Eight decision frameworks were identified, each focused on a different set of modelling approaches and employing a different collection of selection criteria. The selection criteria can be categorized as either: (i) structural features (i.e. technical elements that are factual in nature) or (ii) practical considerations (i.e. context-dependent attributes). The most commonly mentioned structural features were population resolution (i.e. aggregate vs. individual) and interactivity (i.e. static vs. dynamic). Furthermore, understanding the needs of the end-users and stakeholders was frequently incorporated as a criterion within these frameworks. There is presently no universally-accepted framework for selecting an economic modelling approach. Rather, each highlights different criteria that may be of importance when determining whether a modelling approach is appropriate. Further discussion is thus necessary as the modelling approach selected will impact the validity of the underlying economic model and have downstream implications on its efficiency, transparency and relevance to decision-makers.

  6. Making sense in a complex landscape: how the Cynefin Framework from Complex Adaptive Systems Theory can inform health promotion practice.

    PubMed

    Van Beurden, Eric K; Kia, Annie M; Zask, Avigdor; Dietrich, Uta; Rose, Lauren

    2013-03-01

    Health promotion addresses issues from the simple (with well-known cause/effect links) to the highly complex (webs and loops of cause/effect with unpredictable, emergent properties). Yet there is no conceptual framework within its theory base to help identify approaches appropriate to the level of complexity. The default approach favours reductionism--the assumption that reducing a system to its parts will inform whole system behaviour. Such an approach can yield useful knowledge, yet is inadequate where issues have multiple interacting causes, such as social determinants of health. To address complex issues, there is a need for a conceptual framework that helps choose action that is appropriate to context. This paper presents the Cynefin Framework, informed by complexity science--the study of Complex Adaptive Systems (CAS). It introduces key CAS concepts and reviews the emergence and implications of 'complex' approaches within health promotion. It explains the framework and its use with examples from contemporary practice, and sets it within the context of related bodies of health promotion theory. The Cynefin Framework, especially when used as a sense-making tool, can help practitioners understand the complexity of issues, identify appropriate strategies and avoid the pitfalls of applying reductionist approaches to complex situations. The urgency to address critical issues such as climate change and the social determinants of health calls for us to engage with complexity science. The Cynefin Framework helps practitioners make the shift, and enables those already engaged in complex approaches to communicate the value and meaning of their work in a system that privileges reductionist approaches.

  7. Semantic Web Services Challenge, Results from the First Year. Series: Semantic Web And Beyond, Volume 8.

    NASA Astrophysics Data System (ADS)

    Petrie, C.; Margaria, T.; Lausen, H.; Zaremba, M.

    Explores trade-offs among existing approaches. Reveals strengths and weaknesses of proposed approaches, as well as which aspects of the problem are not yet covered. Introduces a software engineering approach to evaluating semantic web services. Service-Oriented Computing is one of the most promising software engineering trends because of the potential to reduce the programming effort for future distributed industrial systems. However, only a small part of this potential rests on the standardization of tools offered by the web services stack. The larger part of this potential rests upon the development of sufficient semantics to automate service orchestration. Currently there are many different approaches to semantic web service descriptions and many frameworks built around them. A common understanding, evaluation scheme, and test bed to compare and classify these frameworks in terms of their capabilities and shortcomings are necessary to make progress in developing the full potential of Service-Oriented Computing. The Semantic Web Services Challenge is an open source initiative that provides a public evaluation and certification of multiple frameworks on common industrially-relevant problem sets. This edited volume reports on the first results in developing a common understanding of the various technologies intended to facilitate the automation of mediation, choreography and discovery for Web Services using semantic annotations. Semantic Web Services Challenge: Results from the First Year is designed for a professional audience composed of practitioners and researchers in industry. Professionals can use this book to evaluate SWS technology for their potential practical use. The book is also suitable for advanced-level students in computer science.

  8. Incorporating equity considerations in transport infrastructure evaluation: Current practice and a proposed methodology.

    PubMed

    Thomopoulos, N; Grant-Muller, S; Tight, M R

    2009-11-01

    Interest has re-emerged in how to incorporate equity considerations in the appraisal of transport projects, and of large road infrastructure projects in particular. This paper offers a way forward in addressing some of the theoretical and practical concerns that have made it difficult to date to incorporate equity concerns in the appraisal of such projects. An overview of current European practice in appraising equity considerations in transport is first offered, based on an extensive literature review. Acknowledging the value of a framework approach, research towards introducing a theoretical framework is then presented. The proposed framework is based on the well-established MCA Analytic Hierarchy Process and is also contrasted with a CBA-based approach. The framework outlined here offers an additional support tool to decision makers, who will be able to differentiate choices based on their views on specific equity principles and equity types. It also holds the potential to become a valuable tool for evaluators, as it allows predefined equity perspectives of decision makers to be assessed against both the project objectives and the estimated project impacts. This framework may also be of value to evaluators outside transport.
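The AHP step on which such a framework rests reduces, in its simplest approximation, to deriving priority weights from a pairwise-comparison matrix by row geometric means. This is only a sketch of that step; the equity criteria and the pairwise judgments below are hypothetical.

```python
import math

def ahp_priorities(pairwise):
    """Approximate AHP priority weights by normalized row geometric means."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise judgments on Saaty's 1-9 scale for three equity
# criteria, e.g. spatial equity vs. income equity vs. intergenerational
# equity: A[i][j] states how much criterion i is preferred over j.
A = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   2.0],
     [1/5.0, 1/2.0, 1.0]]
weights = ahp_priorities(A)   # weights sum to 1; first criterion dominates
```

Reciprocal consistency (A[j][i] = 1/A[i][j]) is assumed; a full AHP application would also check the consistency ratio before trusting the weights.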

  9. A superpixel-based framework for automatic tumor segmentation on breast DCE-MRI

    NASA Astrophysics Data System (ADS)

    Yu, Ning; Wu, Jia; Weinstein, Susan P.; Gaonkar, Bilwaj; Keller, Brad M.; Ashraf, Ahmed B.; Jiang, YunQing; Davatzikos, Christos; Conant, Emily F.; Kontos, Despina

    2015-03-01

    Accurate and efficient automated tumor segmentation in breast dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is highly desirable for computer-aided tumor diagnosis. We propose a novel automatic segmentation framework which incorporates mean-shift smoothing, superpixel-wise classification, pixel-wise graph-cuts partitioning, and morphological refinement. A set of 15 breast DCE-MR images, obtained from the American College of Radiology Imaging Network (ACRIN) 6657 I-SPY trial, was manually segmented to generate tumor masks (as ground truth) and breast masks (as regions of interest). Four state-of-the-art segmentation approaches based on diverse models were also utilized for comparison. Based on five standard evaluation metrics for segmentation, the proposed framework consistently outperformed all other approaches. The performance of the proposed framework was: 1) 0.83 for Dice similarity coefficient, 2) 0.96 for pixel-wise accuracy, 3) 0.72 for VOC score, 4) 0.79 mm for mean absolute difference, and 5) 11.71 mm for maximum Hausdorff distance, which surpassed the second best method (i.e., adaptive geodesic transformation), a semi-automatic algorithm depending on precise initialization. Our results suggest promising potential applications of our segmentation framework in assisting analysis of breast carcinomas.
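Two of the reported metrics are straightforward to make concrete. A minimal sketch of the Dice similarity coefficient and pixel-wise accuracy, computed on tiny made-up binary masks represented as flattened pixel lists:

```python
# Dice similarity coefficient and pixel-wise accuracy for binary masks.
# The tiny masks below are invented for demonstration only.

def dice(a, b):
    """2 * |A ∩ B| / (|A| + |B|) for flattened 0/1 masks."""
    inter = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    return 2.0 * inter / (sum(a) + sum(b))

def pixel_accuracy(a, b):
    """Fraction of pixels on which the two masks agree."""
    return sum(1 for x, y in zip(a, b) if x == y) / len(a)

truth = [0, 1, 1, 1, 0, 0, 1, 0]
pred  = [0, 1, 1, 0, 0, 0, 1, 1]
print(dice(truth, pred), pixel_accuracy(truth, pred))  # → 0.75 0.75
```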

  10. A process dissociation approach to objective-projective test score interrelationships.

    PubMed

    Bornstein, Robert F

    2002-02-01

    Even when self-report and projective measures of a given trait or motive both predict theoretically related features of behavior, scores on the 2 tests correlate modestly with each other. This article describes a process dissociation framework for personality assessment, derived from research on implicit memory and learning, which can resolve these ostensibly conflicting results. Research on interpersonal dependency is used to illustrate 3 key steps in the process dissociation approach: (a) converging behavioral predictions, (b) modest test score intercorrelations, and (c) delineation of variables that differentially affect self-report and projective test scores. Implications of the process dissociation framework for personality assessment and test development are discussed.

  11. Composable Framework Support for Software-FMEA Through Model Execution

    NASA Astrophysics Data System (ADS)

    Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco

    2016-08-01

    Performing Failure Modes and Effect Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.

  12. A Layered Approach for Robust Spatial Virtual Human Pose Reconstruction Using a Still Image

    PubMed Central

    Guo, Chengyu; Ruan, Songsong; Liang, Xiaohui; Zhao, Qinping

    2016-01-01

    Pedestrian detection and human pose estimation are instructive for reconstructing a three-dimensional scenario and for robot navigation, particularly when large amounts of vision data are captured using various data-recording techniques. Under an unrestricted capture scheme, which produces occlusions or breezing, the information describing each part of a human body and the relationship between each part, or even between different pedestrians, must be present in a still image. To address this, a multi-layered, spatial, virtual human pose reconstruction framework is presented in this study to recover any deficient information in planar images. In this framework, a hierarchical parts-based deep model is used to detect body parts using the available restricted information in a still image and is then combined with spatial Markov random fields to re-estimate the accurate joint positions in the deep network. Then, the planar estimation results are mapped onto a virtual three-dimensional space using multiple constraints to recover any deficient spatial information. The proposed approach can be viewed as a general pre-processing method to guide the generation of continuous, three-dimensional motion data. The experimental results demonstrate the effectiveness and usability of the proposed approach. PMID:26907289

  13. Face liveness detection using shearlet-based feature descriptors

    NASA Astrophysics Data System (ADS)

    Feng, Litong; Po, Lai-Man; Li, Yuming; Yuan, Fang

    2016-07-01

    Face recognition is a widely used biometric technology due to its convenience but it is vulnerable to spoofing attacks made by nonreal faces such as photographs or videos of valid users. The antispoof problem must be well resolved before widely applying face recognition in our daily life. Face liveness detection is a core technology to make sure that the input face is a live person. However, this is still very challenging using conventional liveness detection approaches of texture analysis and motion detection. The aim of this paper is to propose a feature descriptor and an efficient framework that can be used to effectively deal with the face liveness detection problem. In this framework, new feature descriptors are defined using a multiscale directional transform (shearlet transform). Then, stacked autoencoders and a softmax classifier are concatenated to detect face liveness. We evaluated this approach using the CASIA Face antispoofing database and replay-attack database. The experimental results show that our approach performs better than the state-of-the-art techniques following the provided protocols of these databases, and it is possible to significantly enhance the security of the face recognition biometric system. In addition, the experimental results also demonstrate that this framework can be easily extended to classify different spoofing attacks.

  14. A framework for multivariate data-based at-site flood frequency analysis: Essentiality of the conjugal application of parametric and nonparametric approaches

    NASA Astrophysics Data System (ADS)

    Vittal, H.; Singh, Jitendra; Kumar, Pankaj; Karmakar, Subhankar

    2015-06-01

    In watershed management, flood frequency analysis (FFA) is performed to quantify the risk of flooding at different spatial locations and also to provide guidelines for determining the design periods of flood control structures. Traditional FFA was extensively performed under a univariate scenario for both at-site and regional estimation of return periods. However, due to the inherent mutual dependence of the flood variables or characteristics [i.e., peak flow (P), flood volume (V) and flood duration (D), which are random in nature], analysis has been further extended to the multivariate scenario, with some restrictive assumptions. To overcome the assumption of the same family of marginal density function for all flood variables, the concept of the copula has been introduced. Although the advancement from univariate to multivariate analyses drew considerable attention from the FFA research community, the basic limitation was that the analyses were performed with only parametric families of distributions. The aim of the current study is to emphasize the importance of nonparametric approaches in the field of multivariate FFA; a nonparametric distribution may not always be a good fit or capable of replacing well-implemented multivariate parametric and copula-based applications. Nevertheless, nonparametric distributions have the potential to provide the best fit, because such distributions reproduce the sample's characteristics, resulting in more accurate estimations of the multivariate return period. Hence, the current study shows the importance of conjugating the multivariate nonparametric approach with multivariate parametric and copula-based approaches, thereby resulting in a comprehensive framework for complete at-site FFA. Although the proposed framework is designed for at-site FFA, the approach can also be applied to regional FFA because regional estimations ideally include at-site estimations. The framework is based on the following steps: (i) comprehensive trend analysis to assess nonstationarity in the observed data; (ii) selection of the best-fit univariate marginal distribution from a comprehensive set of parametric and nonparametric distributions for the flood variables; (iii) multivariate frequency analyses with parametric, copula-based and nonparametric approaches; and (iv) estimation of joint and various conditional return periods. The proposed framework is demonstrated using 110 years of observed data from the Allegheny River at Salamanca, New York, USA. The results show that for both the univariate and multivariate cases, the nonparametric Gaussian kernel provides the best estimate. Further, we perform FFA for twenty major rivers over the continental USA, which shows that for seven rivers all the flood variables follow the nonparametric Gaussian kernel, whereas for the other rivers parametric distributions provide the best fit for one or two flood variables. The results thus show that the nonparametric method cannot substitute for the parametric and copula-based approaches, but should be considered during any at-site FFA to provide the broadest choice for best estimation of the flood return periods.
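The nonparametric estimation the study advocates can be sketched with a 1-D Gaussian kernel density estimate. The bandwidth below uses Silverman's rule of thumb, and the peak-flow sample is synthetic, not the Allegheny River record:

```python
# 1-D Gaussian kernel density estimate with Silverman's rule-of-thumb
# bandwidth, evaluated at a single point. Sample values are synthetic.
import math

def gaussian_kde(sample, x):
    n = len(sample)
    mean = sum(sample) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in sample) / (n - 1))
    h = 1.06 * sd * n ** (-0.2)  # Silverman's rule of thumb
    return sum(
        math.exp(-0.5 * ((x - v) / h) ** 2) for v in sample
    ) / (n * h * math.sqrt(2 * math.pi))

peaks = [120.0, 95.0, 140.0, 110.0, 180.0, 130.0, 100.0, 150.0]  # synthetic
print(gaussian_kde(peaks, 125.0))
```

A multivariate FFA would extend this to joint densities of (P, V, D) with a product or multivariate kernel, but the univariate case shows the core idea: the estimate is built directly from the sample, with no assumed parametric family.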

  15. Lattice enumeration for inverse molecular design using the signature descriptor.

    PubMed

    Martin, Shawn

    2012-07-23

    We describe an inverse quantitative structure-activity relationship (QSAR) framework developed for the design of molecular structures with desired properties. This framework uses chemical fragments encoded with a molecular descriptor known as a signature. It solves a system of linear constrained Diophantine equations to reorganize the fragments into novel molecular structures. The method has been previously applied to problems in drug and materials design but has inherent computational limitations due to the necessity of solving the Diophantine constraints. We propose a new approach to overcome these limitations using the Fincke-Pohst algorithm for lattice enumeration. We benchmark the new approach against previous results on LFA-1/ICAM-1 inhibitory peptides, linear homopolymers, and hydrofluoroether foam blowing agents. Software implementing the new approach is available at www.cs.otago.ac.nz/homepages/smartin.
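The constrained Diophantine systems at the core of the method can be illustrated with a naive brute-force enumeration of nonnegative integer solutions to a single linear constraint; the Fincke-Pohst lattice enumeration the paper proposes addresses exactly the combinatorial blow-up this naive search suffers from. The coefficients and target below are invented:

```python
# Brute-force enumeration of nonnegative integer solutions to a linear
# Diophantine constraint sum(c_i * x_i) == target. The coefficients are
# invented placeholders for fragment "valences", not the paper's data.

def solve_diophantine(coeffs, target, bound):
    """Enumerate nonnegative integer vectors x with sum(c*x) == target."""
    solutions = []
    def recurse(i, remaining, partial):
        if i == len(coeffs):
            if remaining == 0:
                solutions.append(tuple(partial))
            return
        for count in range(0, min(bound, remaining // coeffs[i]) + 1):
            recurse(i + 1, remaining - coeffs[i] * count, partial + [count])
    recurse(0, target, [])
    return solutions

# e.g. three fragment types with coefficients 2, 3, 5 combining to total 10
print(solve_diophantine([2, 3, 5], 10, 10))
# → [(0, 0, 2), (1, 1, 1), (2, 2, 0), (5, 0, 0)]
```

Each solution vector is a candidate fragment count assignment; the real inverse-QSAR problem layers many such constraints, which is why smarter lattice enumeration matters.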

  16. Revisiting the Concepts "Approach", "Design" and "Procedure" According to the Richards and Rodgers (2011) Framework

    ERIC Educational Resources Information Center

    Cumming, Brett

    2012-01-01

    The three concepts of Approach, Design and Procedure, as proposed in the Richards and Rodgers framework, are considered particularly effective as a framework for second language teaching, with the specific aim of developing communication as well as of better understanding methodology in communicative language use.

  17. Australian Recognition Framework Arrangements. Australia's National Training Framework.

    ERIC Educational Resources Information Center

    Australian National Training Authority, Brisbane.

    This document explains the objectives, principles, standards, and protocols of the Australian Recognition Framework (ARF), which is a comprehensive approach to national recognition of vocational education and training (VET) that is based on a quality-assured approach to the registration of training organizations seeking to deliver training, assess…

  18. A Competency Approach to Developing Leaders--Is This Approach Effective?

    ERIC Educational Resources Information Center

    Richards, Patricia

    2008-01-01

    This paper examines the underlying assumptions that competency-based frameworks are based upon in relation to leadership development. It examines the impetus for this framework becoming the prevailing theoretical base for developing leaders and tracks the historical path to this phenomenon. Research suggests that a competency-based framework may…

  19. A Mixed Integer Efficient Global Optimization Framework: Applied to the Simultaneous Aircraft Design, Airline Allocation and Revenue Management Problem

    NASA Astrophysics Data System (ADS)

    Roy, Satadru

    Traditional approaches to designing and optimizing a new system often use a system-centric objective and do not take into consideration how the operator will use the new system alongside other existing systems. This "hand-off" between the design of the new system and how the new system operates alongside other systems might lead to sub-optimal performance with respect to the operator-level objective. In other words, the system that is optimal for its system-level objective might not be best for the system-of-systems level objective of the operator. Among the few available references that describe attempts to address this hand-off, most follow an MDO-motivated subspace decomposition approach of first designing a very good system and then providing this system to the operator, who decides the best way to use it along with the existing systems. The motivating example in this dissertation presents one such problem that includes aircraft design, airline operations and revenue management "subspaces". The research here develops an approach that can simultaneously solve these subspaces posed as a monolithic optimization problem. The monolithic approach makes the problem a Mixed Integer/Discrete Non-Linear Programming (MINLP/MDNLP) problem, a class that is extremely difficult to solve. The presence of expensive, sophisticated engineering analyses further aggravates the problem. To tackle this challenge problem, the work here presents a new optimization framework that simultaneously solves the subspaces to capture the "synergism" in the problem that previous decomposition approaches may not have exploited, addresses mixed-integer/discrete design variables in an efficient manner, and accounts for computationally expensive analysis tools. The framework combines concepts from efficient global optimization, Kriging partial least squares, and gradient-based optimization. The approach is then demonstrated on an 11-route airline network problem consisting of 94 decision variables, including 33 integer and 61 continuous variables. This application problem is representative of an interacting group of systems and provides key challenges to the optimization framework, as reflected by the presence of a moderate number of integer and continuous design variables and expensive analysis tools. The results indicate that simultaneously solving the subspaces can lead to significant improvement in the fleet-level objective of the airline when compared to the previously developed sequential subspace decomposition approach. In developing the approach to solve the MINLP/MDNLP challenge problem, several test problems provided the ability to explore the performance of the framework. While solving these test problems, the framework showed that it could solve other MDNLP problems, including those with categorically discrete variables, indicating that the framework could have broader application than the new aircraft design-fleet allocation-revenue management problem.

  20. Dimensionality and R4P: A Health Equity Framework for Research Planning and Evaluation in African American Populations.

    PubMed

    Hogan, Vijaya; Rowley, Diane L; White, Stephanie Baker; Faustin, Yanica

    2018-02-01

    Introduction Existing health disparities frameworks do not adequately incorporate the unique, interacting contributing factors that lead to health inequities among African Americans, resulting in public health stakeholders' inability to translate these frameworks into practice. Methods We developed dimensionality and R4P to integrate multiple theoretical perspectives into a framework of action to eliminate health inequities experienced by African Americans. Results The dimensional framework incorporates Critical Race Theory and intersectionality, and includes dimensions of time: past, present and future. Dimensionality captures the complex linear and non-linear array of influences that cause health inequities, but these pathways do not readily lend themselves to developing empirically derived programs, policies and interventions to promote health equity. R4P provides a framework for addressing the scope of actions needed. The five components of R4P are (1) Remove, (2) Repair, (3) Remediate, (4) Restructure and (5) Provide. Conclusion R4P is designed to translate complex causality into a public health equity planning, assessment, evaluation and research tool.

  1. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    ERIC Educational Resources Information Center

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…

  2. Online anomaly detection in wireless body area networks for reliable healthcare monitoring.

    PubMed

    Salem, Osman; Liu, Yaning; Mehaoua, Ahmed; Boutaba, Raouf

    2014-09-01

    In this paper, we propose a lightweight approach for online detection of faulty measurements by analyzing the data collected from medical wireless body area networks. The proposed framework performs sequential data analysis using a smart phone as a base station, and takes into account the constrained resources of the smart phone, such as processing power and storage capacity. The main objective is to raise alarms only when patients enter an emergency situation, and to discard false alarms triggered by faulty measurements or ill-behaved sensors. The proposed approach is based on the Haar wavelet decomposition, nonseasonal Holt-Winters forecasting, and the Hampel filter for spatial analysis, and on for temporal analysis. Our objective is to reduce false alarms resulting from unreliable measurements and to reduce unnecessary healthcare intervention. We apply our proposed approach to a real physiological dataset. Our experimental results prove the effectiveness of our approach in achieving good detection accuracy with a low false alarm rate. The simplicity and processing speed of the proposed framework make it useful and efficient for real-time diagnosis.
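Of the named components, the Hampel filter is the simplest to sketch: a sample is flagged as an outlier when it deviates from its window median by more than k times the scaled median absolute deviation. The window size, threshold and readings below are illustrative, not the paper's parameters:

```python
# Minimal Hampel filter sketch: flag samples far from the local median.
# Window size, threshold k, and the heart-rate readings are all invented.
import statistics

def hampel(series, window=3, k=3.0):
    flags = []
    for i, x in enumerate(series):
        lo, hi = max(0, i - window), min(len(series), i + window + 1)
        neighbourhood = series[lo:hi]
        med = statistics.median(neighbourhood)
        mad = statistics.median([abs(v - med) for v in neighbourhood])
        sigma = 1.4826 * mad  # MAD scaled to estimate a Gaussian std dev
        flags.append(sigma > 0 and abs(x - med) > k * sigma)
    return flags

heart_rate = [72, 71, 73, 72, 190, 72, 74, 73]  # one faulty reading
print(hampel(heart_rate))
```

Only the implausible 190 bpm reading is flagged; a framework like the one described would then suppress the alarm that single faulty sample would otherwise trigger.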

  3. A Decision Support Framework for Feasibility Analysis of International Space Station (ISS) Research Capability Enhancing Options

    NASA Technical Reports Server (NTRS)

    Ortiz, James N.; Scott, Kelly; Smith, Harold

    2004-01-01

    The assembly and operation of the ISS has generated significant challenges that have ultimately impacted resources available to the program's primary mission: research. To address this, program personnel routinely perform trade-off studies on alternative options to enhance research. However, the approach, level of analysis, and resulting outputs of these studies vary due to many factors, complicating the Program Manager's job of selecting the best option. The program therefore requested that a framework be developed to evaluate multiple research-enhancing options in a thorough, disciplined and repeatable manner, and to identify the best option on the basis of cost, benefit and risk. The resulting framework consisted of a systematic methodology and a decision-support toolset. The framework provides a quantifiable and repeatable means of ranking research-enhancing options for the complex and multiple-constraint domain of the space research laboratory. This paper describes the development, verification and validation of this framework and provides observations on its operational use.

  4. Comparing health system performance assessment and management approaches in the Netherlands and Ontario, Canada

    PubMed Central

    Tawfik-Shukor, Ali R; Klazinga, Niek S; Arah, Onyebuchi A

    2007-01-01

    Background Given the proliferation and growing complexity of performance measurement initiatives in many health systems, the Netherlands and Ontario, Canada expressed interest in cross-national comparisons in an effort to promote knowledge transfer and best practice. To support this cross-national learning, a study was undertaken to compare health system performance approaches in the Netherlands with those in Ontario, Canada. Methods We explored the performance assessment framework and system of each constituency, the embeddedness of performance data in management and policy processes, and the interrelationships between the frameworks. Methods used included analysing governmental strategic planning and policy documents, literature and internet searches, comparative descriptive tables, and schematics. Data collection and analysis took place in Ontario and the Netherlands. A workshop to validate and discuss the findings was conducted in Toronto, adding important insights to the study. Results Both Ontario and the Netherlands conceive health system performance within supportive frameworks. However, they differ in their assessment approaches. Ontario's Scorecard links performance measurement with strategy, aimed at health system integration. The Dutch Health Care Performance Report (Zorgbalans) does not explicitly link performance with strategy, and focuses on the technical quality of healthcare by measuring dimensions of quality, access, and cost against healthcare needs. A backbone 'five diamond' framework maps both frameworks and articulates the interrelations and overlap between their goals, themes, dimensions and indicators. The workshop yielded more contextual insights and further validated the comparative values of each constituency's performance assessment system.
Conclusion To compare the health system performance approaches between The Netherlands and Ontario, Canada, several important conceptual and contextual issues must be addressed, before even attempting any future content comparisons and benchmarking. Such issues would lend relevant interpretational credibility to international comparative assessments of the two health systems. PMID:17319947

  5. A Mathematical Framework for Image Analysis

    DTIC Science & Technology

    1991-08-01

    The results reported here were derived from the research project 'A Mathematical Framework for Image Analysis' supported by the Office of Naval Research, contract N00014-88-K-0289 to Brown University. A common theme for the work reported is the use of probabilistic methods for problems in image analysis and image reconstruction. Five areas of research are described: rigid body recognition using a decision tree/combinatorial approach; nonrigid

  6. Modern vitiligo genetics sheds new light on an ancient disease

    PubMed Central

    SPRITZ, Richard A.

    2013-01-01

    Vitiligo is a complex disorder in which autoimmune destruction of melanocytes results in white patches of skin and overlying hair. Over the past several years, extensive genetic studies have outlined a biological framework of vitiligo pathobiology that underscores its relationship to other autoimmune diseases. This biological framework offers insight into both vitiligo pathogenesis and perhaps avenues towards more effective approaches to treatment and even disease prevention. PMID:23668538

  7. A Temporal Mining Framework for Classifying Un-Evenly Spaced Clinical Data: An Approach for Building Effective Clinical Decision-Making System.

    PubMed

    Jane, Nancy Yesudhas; Nehemiah, Khanna Harichandran; Arputharaj, Kannan

    2016-01-01

    Clinical time-series data acquired from electronic health records (EHR) are liable to temporal complexities such as irregular observations, missing values and time-constrained attributes that make the knowledge discovery process challenging. This paper presents a temporal rough set induced neuro-fuzzy (TRiNF) mining framework that handles these complexities and builds an effective clinical decision-making system. TRiNF provides two functionalities, namely temporal data acquisition (TDA) and temporal classification. In TDA, a time-series forecasting model is constructed by adopting an improved double exponential smoothing method. The forecasting model is used in missing value imputation and temporal pattern extraction. The relevant attributes are selected using a temporal pattern based rough set approach. In temporal classification, a classification model is built with the selected attributes using a temporal pattern induced neuro-fuzzy classifier. For experimentation, this work uses two clinical time-series datasets of hepatitis and thrombosis patients. The experimental results show that with the proposed TRiNF framework there is a significant reduction in the error rate, yielding an average classification accuracy of 92.59% for the hepatitis dataset and 91.69% for the thrombosis dataset. The obtained classification results prove the efficiency of the proposed framework in terms of its improved classification accuracy.
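The baseline that the paper's improved method builds on, standard double exponential smoothing (Holt's linear method), can be sketched as coupled level-and-trend updates; the smoothing parameters and readings below are illustrative:

```python
# Standard double exponential smoothing (Holt's linear method), the baseline
# the paper's improved variant builds on. Alpha, beta and the readings are
# illustrative, not the paper's values.

def holt_forecast(series, alpha=0.5, beta=0.3, steps=1):
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)  # level update
        trend = beta * (level - prev_level) + (1 - beta) * trend  # trend update
    return level + steps * trend

readings = [10.0, 12.0, 14.0, 16.0, 18.0]
print(holt_forecast(readings))  # a perfectly linear series forecasts ≈ 20
```

In a TDA-style pipeline, the same forecast would stand in for a missing observation (imputation) rather than a future one.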

  8. Humanization of the anti-CD18 antibody 6.7: an unexpected effect of a framework residue in binding to antigen.

    PubMed

    Caldas, Cristina; Coelho, Verônica; Kalil, Jorge; Moro, Ana Maria; Maranhão, Andrea Q; Brígido, Marcelo M

    2003-05-01

    Humanization of monoclonal antibodies by complementarity-determining region (CDR) grafting has become a standard procedure to improve the clinical usage of animal antibodies. However, antibody humanization may result in loss of activity, which has been attributed to structural constraints in the framework structure. In this paper, we report the complete humanization of the 6.7 anti-human CD18 monoclonal antibody in a scFv form. We used a germline-based approach to design a humanized VL gene fragment and expressed it together with a previously described humanized VH. The designed humanized VL has only 14 mutations compared to the closest human germline sequence. The resulting humanized scFv maintained the binding capacity and specificity for human CD18 expressed on the cell surface of peripheral blood mononuclear cells (PBMC), and showed the same pattern of staining of T-lymphocyte sub-populations as the original monoclonal antibody. We observed an unexpected effect of a conserved mouse-human framework position (L37) that hinders the binding of the humanized scFv to antigen. This paper reveals a new framework residue that interferes with the paratope and antigen binding, and also reinforces the germline approach as a successful strategy for humanizing antibodies.

  9. Fundamental Studies of Crystal Growth of Microporous Materials

    NASA Technical Reports Server (NTRS)

    Singh, Ramsharan; Doolittle, John, Jr.; Payra, Pramatha; Dutta, Prabir K.; George, Michael A.; Ramachandran, Narayanan; Schoeman, Brian J.

    2003-01-01

    Microporous materials are framework structures with well-defined porosity, often of molecular dimensions. Zeolites contain aluminum and silicon atoms in their framework and are the most extensively studied amongst all microporous materials. Framework structures with P, Ga, Fe, Co, Zn, B, Ti and a host of other elements have also been made. Typical synthesis of microporous materials involve mixing the framework elements (or compounds, thereof) in a basic solution, followed by aging in some cases and then heating at elevated temperatures. This process is termed hydrothermal synthesis, and involves complex chemical and physical changes. Because of a limited understanding of this process, most synthesis advancements happen by a trial and error approach. There is considerable interest in understanding the synthesis process at a molecular level with the expectation that eventually new framework structures will be built by design. The basic issues in the microporous materials crystallization process include: (a) Nature of the molecular units responsible for the crystal nuclei formation; (b) Nature of the nuclei and nucleation process; (c) Growth process of the nuclei into crystal; (d) Morphological control and size of the resulting crystal; (e) Surface structure of the resulting crystals; and (f) Transformation of frameworks into other frameworks or condensed structures.

  10. Mathematical Problem Solving Ability of Junior High School Students through Ang’s Framework for Mathematical Modelling Instruction

    NASA Astrophysics Data System (ADS)

    Fasni, N.; Turmudi, T.; Kusnandi, K.

    2017-09-01

    The background of this research is the importance of students' mathematical problem-solving abilities. The purpose of this study is to find out whether there are differences in the ability to solve mathematical problems between students who have learned mathematics using Ang's Framework for Mathematical Modelling Instruction (AFFMMI) and students who have learned using a scientific approach (SA). The method used in this research is a quasi-experimental method with a pretest-posttest control group design. Mathematical problem-solving ability was analysed using an Independent Samples t-Test. The results showed that there was a difference in the ability to solve mathematical problems between students who received learning with Ang's Framework for Mathematical Modelling Instruction and students who received learning with a scientific approach. AFFMMI focuses on mathematical modelling. This modelling allows students to solve problems. The use of AFFMMI improved students' problem-solving ability.

  11. When procedures discourage insight: epistemological consequences of prompting novice physics students to construct force diagrams

    NASA Astrophysics Data System (ADS)

    Kuo, Eric; Hallinen, Nicole R.; Conlin, Luke D.

    2017-05-01

    One aim of school science instruction is to help students become adaptive problem solvers. Though successful at structuring novice problem solving, step-by-step problem-solving frameworks may also constrain students' thinking. This study utilises a paradigm established by Heckler [(2010). Some consequences of prompting novice physics students to construct force diagrams. International Journal of Science Education, 32(14), 1829-1851] to test how cuing the first step in a standard framework affects undergraduate students' approaches and evaluation of solutions in physics problem solving. Specifically, prompting the construction of a standard diagram before problem solving increases the use of standard procedures, decreasing the use of a conceptual shortcut. Providing a diagram prompt also lowers students' ratings of informal approaches to similar problems. These results suggest that reminding students to follow typical problem-solving frameworks limits their views of what counts as good problem solving.

  12. Artificial intelligence framework for simulating clinical decision-making: a Markov decision process approach.

    PubMed

    Bennett, Casey C; Hauser, Kris

    2013-01-01

    In the modern healthcare system, rapidly expanding costs/complexity, the growing myriad of treatment options, and exploding information streams that often do not effectively reach the front lines hinder the ability to choose optimal treatment decisions over time. The goal in this paper is to develop a general purpose (non-disease-specific) computational/artificial intelligence (AI) framework to address these challenges. This framework serves two potential functions: (1) a simulation environment for exploring various healthcare policies, payment methodologies, etc., and (2) the basis for clinical artificial intelligence - an AI that can "think like a doctor". This approach combines Markov decision processes and dynamic decision networks to learn from clinical data and develop complex plans via simulation of alternative sequential decision paths while capturing the sometimes conflicting, sometimes synergistic interactions of various components in the healthcare system. It can operate in partially observable environments (in the case of missing observations or data) by maintaining belief states about patient health status and functions as an online agent that plans and re-plans as actions are performed and new observations are obtained. This framework was evaluated using real patient data from an electronic health record. The results demonstrate the feasibility of this approach; such an AI framework easily outperforms the current treatment-as-usual (TAU) case-rate/fee-for-service models of healthcare. The cost per unit of outcome change (CPUC) was $189 vs. $497 for AI vs. TAU (where lower is considered optimal) - while at the same time the AI approach could obtain a 30-35% increase in patient outcomes. Tweaking certain AI model parameters could further enhance this advantage, obtaining approximately 50% more improvement (outcome change) for roughly half the costs. 
Given careful design and problem formulation, an AI simulation framework can approximate optimal decisions even in complex and uncertain environments. Future work is described that outlines potential lines of research and integration of machine learning algorithms for personalized medicine. Copyright © 2012 Elsevier B.V. All rights reserved.
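
    The sequential decision-making core of such a framework can be sketched as a small Markov decision process solved by value iteration. The states, transition probabilities, outcome values and costs below are hypothetical toy numbers, not the paper's clinical model, and belief-state tracking over partial observations is omitted for brevity.

```python
# Toy MDP sketch of the clinical decision idea: choose treatment actions that
# maximize expected outcome minus treatment cost over time.
# All states, probabilities, outcomes, and costs are hypothetical.

STATES = ["moderate", "severe"]
ACTIONS = ["treat", "wait"]

# P[s][a] -> list of (next_state, probability)
P = {
    "moderate": {"treat": [("moderate", 0.9), ("severe", 0.1)],
                 "wait":  [("moderate", 0.6), ("severe", 0.4)]},
    "severe":   {"treat": [("moderate", 0.5), ("severe", 0.5)],
                 "wait":  [("moderate", 0.1), ("severe", 0.9)]},
}
OUTCOME = {"moderate": 10.0, "severe": 0.0}   # value of landing in a state
COST = {"treat": 2.0, "wait": 0.0}            # immediate cost of an action

def value_iteration(gamma=0.95, tol=1e-8):
    """Iterate the Bellman optimality update until the values converge."""
    V = {s: 0.0 for s in STATES}
    while True:
        V_new = {}
        for s in STATES:
            V_new[s] = max(
                sum(p * (OUTCOME[s2] - COST[a] + gamma * V[s2])
                    for s2, p in P[s][a])
                for a in ACTIONS)
        if max(abs(V_new[s] - V[s]) for s in STATES) < tol:
            return V_new
        V = V_new

def greedy_policy(V, gamma=0.95):
    """Extract the action with the highest expected value in each state."""
    return {s: max(ACTIONS,
                   key=lambda a: sum(p * (OUTCOME[s2] - COST[a] + gamma * V[s2])
                                     for s2, p in P[s][a]))
            for s in STATES}

V = value_iteration()
policy = greedy_policy(V)
```

    Extending this toy to the partially observable setting described above would replace the known state with a belief distribution that is updated as new observations arrive.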

  13. A structured framework for assessing sensitivity to missing data assumptions in longitudinal clinical trials.

    PubMed

    Mallinckrodt, C H; Lin, Q; Molenberghs, M

    2013-01-01

    The objective of this research was to demonstrate a framework for drawing inference from sensitivity analyses of incomplete longitudinal clinical trial data via a re-analysis of data from a confirmatory clinical trial in depression. A likelihood-based approach that assumed missing at random (MAR) was the primary analysis. Robustness to departure from MAR was assessed by comparing the primary result to those from a series of analyses that employed varying missing not at random (MNAR) assumptions (selection models, pattern mixture models and shared parameter models) and to MAR methods that used inclusive models. The key sensitivity analysis used multiple imputation assuming that after dropout the trajectory of drug-treated patients was that of placebo-treated patients with a similar outcome history (placebo multiple imputation). This result was used as the worst reasonable case to define the lower limit of plausible values for the treatment contrast. The endpoint contrast from the primary analysis was -2.79 (p = .013). In placebo multiple imputation, the result was -2.17. Results from the other sensitivity analyses ranged from -2.21 to -3.87 and were symmetrically distributed around the primary result. Hence, no clear evidence of bias from missing not at random data was found. In the worst reasonable case scenario, the treatment effect was 80% of the magnitude of the primary result. Therefore, it was concluded that a treatment effect existed. The structured sensitivity framework of using a worst reasonable case result based on a controlled imputation approach with transparent and debatable assumptions, supplemented by a series of plausible alternative analyses under varying assumptions, was useful in this specific situation and holds promise as a generally useful framework. Copyright © 2012 John Wiley & Sons, Ltd.
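
    The placebo multiple imputation used as the worst reasonable case can be sketched as follows. The endpoint values, dropout pattern, and the simple normal imputation model below are illustrative stand-ins, not the trial's data or the authors' exact procedure.

```python
# Sketch of placebo-based multiple imputation: after dropout, missing drug-arm
# endpoints are drawn from a model fit to placebo patients, then the treatment
# contrast is averaged over the imputed datasets.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical endpoint changes; np.nan marks dropouts in the drug arm.
placebo = np.array([-4.0, -6.0, -5.0, -3.0, -7.0, -5.5])
drug = np.array([-8.0, -7.5, np.nan, -9.0, np.nan, -8.5])

mu_p, sd_p = placebo.mean(), placebo.std(ddof=1)
missing = np.isnan(drug)

M = 200  # number of imputations
contrasts = []
for _ in range(M):
    completed = drug.copy()
    # Impute dropouts from the placebo-arm distribution (worst reasonable case).
    completed[missing] = rng.normal(mu_p, sd_p, missing.sum())
    contrasts.append(completed.mean() - placebo.mean())

contrast = float(np.mean(contrasts))
```

    Averaging the contrast over imputations gives the point estimate; the full procedure would also combine within- and between-imputation variances (Rubin's rules) to obtain its standard error.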

  14. A Framework for Final Drive Simultaneous Failure Diagnosis Based on Fuzzy Entropy and Sparse Bayesian Extreme Learning Machine

    PubMed Central

    Ye, Qing; Pan, Hao; Liu, Changhua

    2015-01-01

    This research proposes a novel framework for final drive simultaneous failure diagnosis comprising feature extraction, training of paired diagnostic models, generation of a decision threshold, and recognition of simultaneous failure modes. In the feature extraction module, wavelet packet transform and fuzzy entropy are adopted to reduce noise interference and extract representative features of each failure mode. Single-failure samples are used to construct probability classifiers based on paired sparse Bayesian extreme learning machines, which are trained only on single failure modes and inherit the high generalization ability and sparsity of the sparse Bayesian learning approach. To generate the optimal decision threshold that converts the probability outputs of the classifiers into the final simultaneous failure modes, this research uses samples containing both single and simultaneous failure modes together with a grid search method, which is superior to traditional techniques in global optimization. Compared with other frequently used diagnostic approaches based on support vector machines and probabilistic neural networks, experimental results based on the F1-measure verify that the diagnostic accuracy and efficiency of the proposed framework, which are crucial for simultaneous failure diagnosis, are superior to those of the existing approaches. PMID:25722717
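
    Fuzzy entropy, the feature extracted here from the wavelet-packet sub-bands, can be sketched compactly. The parameter choices (m = 2, tolerance r = 0.2 times the signal's standard deviation, fuzzy power n = 2) are common defaults assumed for illustration, and the test signals are synthetic.

```python
# Compact sketch of fuzzy entropy (FuzzyEn): embed the signal in m and m+1
# dimensions, measure pairwise vector similarity with an exponential fuzzy
# membership function, and take the log-ratio of the average similarities.
import numpy as np

def fuzzy_entropy(x, m=2, r=0.2, n=2):
    x = np.asarray(x, dtype=float)
    r = r * x.std()

    def phi(m):
        # Embedded vectors with their own mean removed (local baseline).
        X = np.array([x[i:i + m] for i in range(len(x) - m)])
        X = X - X.mean(axis=1, keepdims=True)
        # Chebyshev distances between all pairs of embedded vectors.
        d = np.abs(X[:, None, :] - X[None, :, :]).max(axis=2)
        D = np.exp(-(d ** n) / r)        # fuzzy membership (similarity) degree
        np.fill_diagonal(D, 0.0)         # exclude self-matches
        return D.sum() / (len(X) * (len(X) - 1))

    return np.log(phi(m)) - np.log(phi(m + 1))

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 200)
fe_regular = fuzzy_entropy(np.sin(t))            # regular signal: low complexity
fe_noisy = fuzzy_entropy(rng.normal(size=200))   # broadband noise: high complexity
```

    A regular (healthy) vibration signature yields a lower fuzzy entropy than an irregular one, which is why it serves as a discriminative failure feature.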

  15. Bottom-Up Catalytic Approach towards Nitrogen-Enriched Mesoporous Carbons/Sulfur Composites for Superior Li-S Cathodes

    PubMed Central

    Sun, Fugen; Wang, Jitong; Chen, Huichao; Qiao, Wenming; Ling, Licheng; Long, Donghui

    2013-01-01

    We demonstrate a sustainable and efficient approach to producing high-performance sulfur/carbon composite cathodes via a bottom-up catalytic route. The selective oxidation of H2S by a nitrogen-enriched mesoporous carbon catalyst produces elemental sulfur as a by-product, which deposits in situ onto the carbon framework. Owing to the metal-free catalytic character and high catalytic selectivity, the resulting sulfur/carbon composites contain almost no impurities and can thus be used as cathode materials without compromising battery performance. The layer-by-layer sulfur deposition allows atomic sulfur to bind strongly with the carbon framework, providing efficient immobilization of sulfur. The nitrogen atoms doped into the carbon framework increase the surface interactions with polysulfides, leading to improved trapping of polysulfides. Thus, the composites exhibit a reversible capacity of 939 mAh g−1 after 100 cycles at 0.2 C and an excellent rate capability of 527 mAh g−1 at 5 C after 70 cycles. PMID:24084754

  16. Intelligent Control of a Sensor-Actuator System via Kernelized Least-Squares Policy Iteration

    PubMed Central

    Liu, Bo; Chen, Sanfeng; Li, Shuai; Liang, Yongsheng

    2012-01-01

    In this paper a new framework, called Compressive Kernelized Reinforcement Learning (CKRL), for computing near-optimal policies in sequential decision making under uncertainty is proposed by incorporating non-adaptive, data-independent Random Projections and nonparametric Kernelized Least-Squares Policy Iteration (KLSPI). Random Projections are a fast, non-adaptive dimensionality reduction framework in which high-dimensional data are projected onto a random lower-dimensional subspace via spherically random rotation and coordinate sampling. KLSPI introduces the kernel trick into the LSPI framework for Reinforcement Learning, often achieving faster convergence and providing automatic feature selection via various kernel sparsification approaches. In this approach, policies are computed in a low-dimensional subspace generated by projecting the high-dimensional features onto a set of random bases. We first show how Random Projections constitute an efficient sparsification technique and how our method often converges faster than regular LSPI, at lower computational cost. The theoretical foundation underlying this approach is a fast approximation of Singular Value Decomposition (SVD). Finally, simulation results are exhibited on benchmark MDP domains, which confirm gains both in computation time and in performance in large feature spaces. PMID:22736969
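
    The Random Projections step can be illustrated directly: a Gaussian projection matrix maps features to a much lower dimension while approximately preserving pairwise distances (the Johnson-Lindenstrauss property). The dimensions and data below are illustrative, not the paper's benchmark settings.

```python
# Sketch of the Random Projections step: project high-dimensional feature
# vectors onto a random low-dimensional subspace and check that a pairwise
# distance is approximately preserved.
import numpy as np

rng = np.random.default_rng(42)
d, k, n = 1000, 100, 50          # original dim, projected dim, number of samples
X = rng.normal(size=(n, d))      # high-dimensional feature vectors

# Gaussian random projection, scaled so distances are preserved in expectation.
R = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ R                        # low-dimensional representation

# Distortion of one pairwise distance (close to 1 for large enough k).
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
ratio = proj / orig
```

    In CKRL, policy iteration then runs entirely in the k-dimensional space, which is where the computational savings come from.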

  17. Unified framework for automated iris segmentation using distantly acquired face images.

    PubMed

    Tan, Chun-Wei; Kumar, Ajay

    2012-09-01

    Remote human identification using iris biometrics has important civilian and surveillance applications, and its success requires the development of robust segmentation algorithms to automatically extract the iris region. This paper presents a new iris segmentation framework which can robustly segment iris images acquired under near-infrared or visible illumination. The proposed approach exploits multiple higher-order local pixel dependencies to robustly classify the eye-region pixels into iris or noniris regions. Face and eye detection modules have been incorporated into the unified framework to automatically provide the localized eye region from the facial image for iris segmentation. We develop robust postprocessing operations to effectively mitigate noisy pixels caused by misclassification. Experimental results presented in this paper suggest significant improvement in the average segmentation errors over previously proposed approaches, i.e., 47.5%, 34.1%, and 32.6% on the UBIRIS.v2, FRGC, and CASIA.v4 at-a-distance databases, respectively. The usefulness of the proposed approach is also ascertained from recognition experiments on three different publicly available databases.

  18. Automatic segmentation of 4D cardiac MR images for extraction of ventricular chambers using a spatio-temporal approach

    NASA Astrophysics Data System (ADS)

    Atehortúa, Angélica; Zuluaga, Maria A.; Ourselin, Sébastien; Giraldo, Diana; Romero, Eduardo

    2016-03-01

    Accurate ventricular function quantification is important to support the evaluation, diagnosis and prognosis of several cardiac pathologies. However, expert heart delineation, specifically for the right ventricle, is a time-consuming task with high inter- and intra-observer variability. A fully automatic 3D+time heart segmentation framework is herein proposed for short-axis cardiac MRI sequences. This approach estimates the heart using exclusively information from the sequence itself, without tuning any parameters. The proposed framework uses a coarse-to-fine approach, which starts by localizing the heart via spatio-temporal analysis, followed by a segmentation of the basal heart that is then propagated to the apex using a non-rigid registration strategy. The obtained volume is then refined by estimating the ventricular muscle through a local search for a prior endocardium-pericardium intensity pattern. The proposed framework was applied to 48 patient datasets supplied by the organizers of the MICCAI 2012 Right Ventricle segmentation challenge. Results show the robustness, efficiency and competitiveness of the proposed method in terms of both accuracy and computational load.

  19. Development of a theoretical framework for analyzing cerebrospinal fluid dynamics

    PubMed Central

    Cohen, Benjamin; Voorhees, Abram; Vedel, Søren; Wei, Timothy

    2009-01-01

    Background To date, hydrocephalus researchers acknowledge the need for rigorous but utilitarian fluid mechanics understanding and methodologies in studying normal and hydrocephalic intracranial dynamics. Pressure-volume models and electric circuit analogs introduced pressure into volume conservation, but control volume analysis enforces independent conditions on pressure and volume. Previously, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these qualitative approaches lacked a clear framework for meaningful quantitative comparison. Methods Control volume analysis is presented to introduce the reader to the theoretical background of this foundational fluid mechanics technique for application to general control volumes. This approach is able to directly incorporate the diverse measurements obtained by clinicians to better elucidate intracranial dynamics and progression to disorder. Results Several examples of meaningful intracranial control volumes and the particular measurement sets needed for the analysis are discussed. Conclusion Control volume analysis provides a framework to guide the type and location of measurements and also a way to interpret the resulting data within a fundamental fluid physics analysis. PMID:19772652
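
    The control-volume statement at the heart of the analysis is volume conservation, dV/dt = Q_in - Q_out: integrating measured flow waveforms yields a volume waveform for the chosen region. The waveforms below are idealized stand-ins for clinical (e.g., phase-contrast MRI) measurements, not patient data.

```python
# Sketch of control-volume analysis for an intracranial region: integrate the
# difference between inflow and outflow waveforms over one cardiac cycle to
# obtain the volume excursion. Waveforms are idealized illustrations.
import numpy as np

T = 1.0                                                # cardiac period [s]
t = np.linspace(0.0, T, 1000)
q_in = 10.0 + 2.0 * np.sin(2 * np.pi * t / T)          # arterial inflow [mL/s]
q_out = 10.0 + 2.0 * np.sin(2 * np.pi * t / T - 0.4)   # lagged venous + CSF outflow

# Integrate dV/dt = q_in - q_out to get the volume waveform over one beat.
dV = np.cumsum(q_in - q_out) * (t[1] - t[0])
stroke_volume_change = dV.max() - dV.min()   # peak-to-peak volume excursion [mL]
net_change_per_beat = dV[-1]                 # ~0 for a periodic, conserving cycle
```

    The same bookkeeping applies to any control volume for which the bounding flows are measured, which is how the framework guides where measurements are needed.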

  20. A continuum dislocation dynamics framework for plasticity of polycrystalline materials

    NASA Astrophysics Data System (ADS)

    Askari, Hesam Aldin

    The objective of this research is to investigate the mechanical response of polycrystals in different settings to identify the mechanisms that give rise to specific response observed in the deformation process. Particularly the large deformation of magnesium alloys and yield properties of copper in small scales are investigated. We develop a continuum dislocation dynamics framework based on dislocation mechanisms and interaction laws and implement this formulation in a viscoplastic self-consistent scheme to obtain the mechanical response in a polycrystalline system. The versatility of this method allows various applications in the study of problems involving large deformation, study of microstructure and its evolution, superplasticity, study of size effect in polycrystals and stochastic plasticity. The findings from the numerical solution are compared to the experimental results to validate the simulation results. We apply this framework to study the deformation mechanisms in magnesium alloys at moderate to fast strain rates and room temperature to 450 °C. Experiments for the same range of strain rates and temperatures were carried out to obtain the mechanical and material properties, and to compare with the numerical results. The numerical approach for magnesium is divided into four main steps; 1) room temperature unidirectional loading 2) high temperature deformation without grain boundary sliding 3) high temperature with grain boundary sliding mechanism 4) room temperature cyclic loading. We demonstrate the capability of our modeling approach in prediction of mechanical properties and texture evolution and discuss the improvement obtained by using the continuum dislocation dynamics method. The framework was also applied to nano-sized copper polycrystals to study the yield properties at small scales and address the observed yield scatter. 
By combining our developed method with a Monte Carlo simulation approach, the stochastic plasticity at small length scales was studied and the sources of uncertainty in the polycrystalline structure are discussed. Our results suggest that the stochastic response arises mainly from a) stochastic plasticity due to the dislocation substructure inside crystals and b) the microstructure of the polycrystalline material. The extent of the uncertainty is correlated with the "effective cell length" in the sampling procedure, whether using simulation or experimental approaches.
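
    The Monte Carlo portion of such a study can be sketched as follows: each simulated specimen aggregates randomly drawn grain strengths, so yield scatter shrinks as the number of grains per specimen grows. The per-grain strength distribution and the aggregation rule below are illustrative assumptions, not calibrated to copper.

```python
# Monte Carlo sketch of stochastic yield at small scales: samples with few
# grains show larger yield scatter than bulk-like samples with many grains.
import numpy as np

rng = np.random.default_rng(1)

def sample_yield(n_grains, n_samples=2000):
    # Hypothetical per-grain critical stress: lognormal scatter about 100 MPa.
    grain_strength = rng.lognormal(mean=np.log(100.0), sigma=0.3,
                                   size=(n_samples, n_grains))
    # Toy aggregation rule: the sample yields at the mean grain strength.
    return grain_strength.mean(axis=1)

small = sample_yield(n_grains=8)     # nano-sized sample, few grains
large = sample_yield(n_grains=512)   # bulk-like sample, many grains

scatter_small = small.std()          # large scatter: stochastic response
scatter_large = large.std()          # small scatter: deterministic-looking yield
```

    With this averaging rule the scatter falls roughly as one over the square root of the grain count, a simple analog of the size-dependent yield scatter discussed above.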

  1. Towards a bulk approach to local interactions of hydrometeors

    NASA Astrophysics Data System (ADS)

    Baumgartner, Manuel; Spichtinger, Peter

    2018-02-01

    The growth of small cloud droplets and ice crystals is dominated by the diffusion of water vapor. Usually, Maxwell's approach to growth for isolated particles is used in describing this process. However, recent investigations show that local interactions between particles can change diffusion properties of cloud particles. In this study we develop an approach for including these local interactions into a bulk model approach. For this purpose, a simplified framework of local interaction is proposed and governing equations are derived from this setup. The new model is tested against direct simulations and incorporated into a parcel model framework. Using the parcel model, possible implications of the new model approach for clouds are investigated. The results indicate that for specific scenarios the lifetime of cloud droplets in subsaturated air may be longer (e.g., for an initially water supersaturated air parcel within a downdraft). These effects might have an impact on mixed-phase clouds, for example in terms of riming efficiencies.
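
    Maxwell's growth law for an isolated droplet, which the local-interaction model above modifies, reads r dr/dt = G S and integrates in closed form, giving the droplet lifetime in subsaturated air directly. The growth parameter and supersaturation below are order-of-magnitude placeholders, not values from the study.

```python
# Sketch of Maxwell's diffusional growth law for an isolated droplet:
#   r * dr/dt = G * S  =>  r(t)^2 = r0^2 + 2*G*S*t  (while the argument is >= 0)
import numpy as np

G = 1e-10        # growth parameter [m^2/s], order-of-magnitude placeholder
S = -0.05        # supersaturation; negative = subsaturated air (evaporation)
r0 = 10e-6       # initial droplet radius [m]

def radius(t):
    # Closed-form solution; clipped at zero once the droplet has evaporated.
    return np.sqrt(np.maximum(r0**2 + 2.0 * G * S * t, 0.0))

# Lifetime of the droplet in subsaturated air: solve r(t) = 0.
t_evap = -r0**2 / (2.0 * G * S)
```

    Local interactions between particles effectively change G and the vapor field each droplet sees, which is how they can lengthen droplet lifetimes in the scenarios described above.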

  2. SmartMal: a service-oriented behavioral malware detection framework for mobile devices.

    PubMed

    Wang, Chao; Wu, Zhizhong; Li, Xi; Zhou, Xuehai; Wang, Aili; Hung, Patrick C K

    2014-01-01

    This paper presents SmartMal--a novel service-oriented behavioral malware detection framework for vehicular and mobile devices. The highlight of SmartMal is to introduce service-oriented architecture (SOA) concepts and behavior analysis into the malware detection paradigms. The proposed framework relies on a client-server architecture: the client continuously extracts various features and transfers them to the server, whose main task is to detect anomalies using state-of-the-art detection algorithms. Multiple distributed servers simultaneously analyze the feature vector using various detectors, and information fusion is used to combine the detectors' results. We also propose a cycle-based statistical approach for mobile device anomaly detection, accomplished by analyzing users' regular usage patterns. Empirical results suggest that the proposed framework and the novel anomaly detection algorithm are highly effective in detecting malware on Android devices.

  3. SmartMal: A Service-Oriented Behavioral Malware Detection Framework for Mobile Devices

    PubMed Central

    Wu, Zhizhong; Li, Xi; Zhou, Xuehai; Wang, Aili; Hung, Patrick C. K.

    2014-01-01

    This paper presents SmartMal—a novel service-oriented behavioral malware detection framework for vehicular and mobile devices. The highlight of SmartMal is to introduce service-oriented architecture (SOA) concepts and behavior analysis into the malware detection paradigms. The proposed framework relies on a client-server architecture: the client continuously extracts various features and transfers them to the server, whose main task is to detect anomalies using state-of-the-art detection algorithms. Multiple distributed servers simultaneously analyze the feature vector using various detectors, and information fusion is used to combine the detectors' results. We also propose a cycle-based statistical approach for mobile device anomaly detection, accomplished by analyzing users' regular usage patterns. Empirical results suggest that the proposed framework and the novel anomaly detection algorithm are highly effective in detecting malware on Android devices. PMID:25165729
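
    The cycle-based statistical approach can be sketched as a per-cycle-position baseline with an outlier test. The usage data, daily cycle, and z-score threshold below are illustrative assumptions, not SmartMal's actual feature set or detector.

```python
# Sketch of cycle-based anomaly detection: learn a user's regular per-hour
# usage pattern, then flag readings far outside the learned baseline.
import numpy as np

rng = np.random.default_rng(7)

# Training: 30 days of hourly CPU-usage features with a daily (24 h) cycle.
hours = np.tile(np.arange(24), 30)
baseline_pattern = 10 + 8 * np.sin(2 * np.pi * hours / 24) ** 2
usage = baseline_pattern + rng.normal(0, 1.0, size=hours.size)

# Per-hour mean/std define the user's regular pattern.
mu = np.array([usage[hours == h].mean() for h in range(24)])
sd = np.array([usage[hours == h].std(ddof=1) for h in range(24)])

def is_anomalous(hour, value, z_threshold=4.0):
    # Flag readings more than z_threshold standard deviations off baseline.
    return abs(value - mu[hour]) / sd[hour] > z_threshold

normal_reading = is_anomalous(3, mu[3] + 1.0)     # within the usual band
malware_reading = is_anomalous(3, mu[3] + 20.0)   # sustained abnormal load
```

    In the full framework, many such feature streams would be scored by multiple detectors on the server side and fused into one verdict.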

  4. The Aeroacoustics of Turbulent Flows

    NASA Technical Reports Server (NTRS)

    Goldstein, M. E.

    2008-01-01

    Aerodynamic noise prediction has been an important and challenging research area since James Lighthill first introduced his Acoustic Analogy approach over fifty years ago. This talk attempts to provide a unified framework for the subsequent theoretical developments in this field. It assumes that there is no single approach that is optimal in all situations and uses the framework as a basis for discussing the strengths and weaknesses of the various approaches to this topic. The emphasis here, however, is on the important problem of predicting the noise from high-speed air jets. Specific results will be presented for round jets in the 0.5 to 1.4 Mach number range and compared with experimental data taken on the Glenn SHAR rig. It is demonstrated that nonparallel mean flow effects play an important role in predicting the noise at the supersonic Mach numbers. The results explain the failure of previous attempts based on the parallel-flow Lilley model (which has served as the foundation for most jet noise analyses during the past two decades).
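
    Lighthill's acoustic analogy, the starting point for the developments surveyed above, rearranges the exact equations of motion into a wave equation for the density fluctuation driven by a quadrupole source; in its standard textbook form:

```latex
\frac{\partial^2 \rho'}{\partial t^2} - c_0^2\,\nabla^2 \rho'
  = \frac{\partial^2 T_{ij}}{\partial x_i\,\partial x_j},
\qquad
T_{ij} = \rho u_i u_j + \left(p' - c_0^2 \rho'\right)\delta_{ij} - \tau_{ij},
```

    where c_0 is the ambient sound speed and T_{ij} is the Lighthill stress tensor. Subsequent approaches differ mainly in how the source term and mean-flow (e.g., nonparallel) effects are modeled.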

  5. A proposed approach to systematically identify and monitor the corporate political activity of the food industry with respect to public health using publicly available information.

    PubMed

    Mialon, M; Swinburn, B; Sacks, G

    2015-07-01

    Unhealthy diets represent one of the major risk factors for non-communicable diseases. There is currently a risk that the political influence of the food industry results in public health policies that do not adequately balance public and commercial interests. This paper aims to develop a framework for categorizing the corporate political activity of the food industry with respect to public health and proposes an approach to systematically identify and monitor it. The proposed framework includes six strategies used by the food industry to influence public health policies and outcomes: information and messaging; financial incentive; constituency building; legal; policy substitution; opposition fragmentation and destabilization. The corporate political activity of the food industry could be identified and monitored through publicly available data sourced from the industry itself, governments, the media and other sources. Steps for country-level monitoring include identification of key food industry actors and related sources of information, followed by systematic data collection and analysis of relevant documents, using the proposed framework as a basis for classification of results. The proposed monitoring approach should be pilot tested in different countries as part of efforts to increase the transparency and accountability of the food industry. This approach has the potential to help redress any imbalance of interests and thereby contribute to the prevention and control of non-communicable diseases. © 2015 World Obesity.

  6. A patient-specific segmentation framework for longitudinal MR images of traumatic brain injury

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Prastawa, Marcel; Irimia, Andrei; Chambers, Micah C.; Vespa, Paul M.; Van Horn, John D.; Gerig, Guido

    2012-02-01

    Traumatic brain injury (TBI) is a major cause of death and disability worldwide. Robust, reproducible segmentations of MR images with TBI are crucial for quantitative analysis of recovery and treatment efficacy. However, this is a significant challenge due to severe anatomy changes caused by edema (swelling), bleeding, tissue deformation, skull fracture, and other effects related to head injury. In this paper, we introduce a multi-modal image segmentation framework for longitudinal TBI images. The framework is initialized through manual input of primary lesion sites at each time point, which are then refined by a joint approach composed of Bayesian segmentation and construction of a personalized atlas. The personalized atlas construction estimates the average of the posteriors of the Bayesian segmentation at each time point and warps the average back to each time point to provide the updated priors for Bayesian segmentation. The difference between our approach and segmenting longitudinal images independently is that we use the information from all time points to improve the segmentations. Given a manual initialization, our framework automatically segments healthy structures (white matter, grey matter, cerebrospinal fluid) as well as different lesions such as hemorrhagic lesions and edema. Our framework can handle different sets of modalities at each time point, which provides flexibility in analyzing clinical scans. We show results on three subjects with acute baseline scans and chronic follow-up scans. The results demonstrate that joint analysis of all the time points yields improved segmentation compared to independent analysis of each time point.
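
    The personalized-atlas iteration described above can be sketched on toy probability maps: posteriors from each time point are averaged (the spatial warping step is omitted here) and the average serves as the updated prior for the next Bayesian pass. The maps and likelihood value below are illustrative.

```python
# Sketch of the personalized-atlas loop: average per-time-point posteriors,
# then use the average as the prior in a new Bayesian update.
import numpy as np

# Posterior maps for one tissue class at two time points (3x3 toy "images").
post_t1 = np.array([[0.9, 0.8, 0.1],
                    [0.7, 0.6, 0.2],
                    [0.1, 0.1, 0.0]])
post_t2 = np.array([[0.8, 0.9, 0.2],
                    [0.6, 0.8, 0.1],
                    [0.2, 0.0, 0.1]])

# Personalized atlas = voxelwise average of the per-time-point posteriors.
atlas = (post_t1 + post_t2) / 2.0

# The atlas becomes the updated prior; combined with an intensity likelihood it
# yields a new posterior (Bayes' rule, normalized over a two-class case).
likelihood = np.full_like(atlas, 0.7)          # P(intensity | tissue), toy value
evidence = atlas * likelihood + (1 - atlas) * (1 - likelihood)
new_post = atlas * likelihood / evidence
```

    Iterating this average-then-update loop is how information from every time point feeds back into the segmentation of each individual scan.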

  7. A Holistic Theoretical Approach to Intellectual Disability: Going Beyond the Four Current Perspectives.

    PubMed

    Schalock, Robert L; Luckasson, Ruth; Tassé, Marc J; Verdugo, Miguel Angel

    2018-04-01

    This article describes a holistic theoretical framework that can be used to explain intellectual disability (ID) and organize relevant information into a usable roadmap to guide understanding and application. Developing the framework involved analyzing the four current perspectives on ID and synthesizing this information into a holistic theoretical framework. Practices consistent with the framework are described, and examples are provided of how multiple stakeholders can apply the framework. The article concludes with a discussion of the advantages and implications of a holistic theoretical approach to ID.

  8. Electronic Chemical Potentials of Porous Metal–Organic Frameworks

    PubMed Central

    2014-01-01

    The binding energy of an electron in a material is a fundamental characteristic, which determines a wealth of important chemical and physical properties. For metal–organic frameworks this quantity is hitherto unknown. We present a general approach for determining the vacuum level of porous metal–organic frameworks and apply it to obtain the first ionization energy for six prototype materials including zeolitic, covalent, and ionic frameworks. This approach for valence band alignment can explain observations relating to the electrochemical, optical, and electrical properties of porous frameworks. PMID:24447027

  9. The City Blueprint Approach: Urban Water Management and Governance in Cities in the U.S.

    NASA Astrophysics Data System (ADS)

    Feingold, Daniel; Koop, Stef; van Leeuwen, Kees

    2018-01-01

    In this paper, we assess the challenges of water, waste and climate change in six cities across the U.S.: New York City, Boston, Milwaukee, Phoenix, Portland and Los Angeles. We apply the City Blueprint® Approach which consists of three indicator assessments: (1) the Trends and Pressures Framework (TPF), (2) the City Blueprint Framework (CBF) and (3) the water Governance Capacity Framework (GCF). The TPF summarizes the main social, environmental and financial pressures that may impede water management. The CBF provides an integrated overview of the management performances within the urban watercycle. Finally, the GCF provides a framework to identify key barriers and opportunities to develop governance capacity. The GCF has only been applied in NYC. Results show that all cities face pressures from heat risk. The management performances regarding resource efficiency and resource recovery from wastewater and solid waste show considerable room for improvement. Moreover, stormwater separation, infrastructure maintenance and green space require improvement in order to achieve a resilient urban watercycle. Finally, in New York City, the GCF results show that learning through smart monitoring, evaluation and cross-stakeholder learning is a limiting condition that needs to be addressed. We conclude that the City Blueprint Approach has large potential to assist cities in their strategic planning and exchange of knowledge, experiences and lessons. Because the methodology is well-structured, easy to understand, and concise, it may bridge the gap between science, policy and practice. It could therefore enable other cities to address their challenges of water, waste and climate change.

  10. The City Blueprint Approach: Urban Water Management and Governance in Cities in the U.S.

    PubMed

    Feingold, Daniel; Koop, Stef; van Leeuwen, Kees

    2018-01-01

    In this paper, we assess the challenges of water, waste and climate change in six cities across the U.S.: New York City, Boston, Milwaukee, Phoenix, Portland and Los Angeles. We apply the City Blueprint ® Approach which consists of three indicator assessments: (1) the Trends and Pressures Framework (TPF), (2) the City Blueprint Framework (CBF) and (3) the water Governance Capacity Framework (GCF). The TPF summarizes the main social, environmental and financial pressures that may impede water management. The CBF provides an integrated overview of the management performances within the urban watercycle. Finally, the GCF provides a framework to identify key barriers and opportunities to develop governance capacity. The GCF has only been applied in NYC. Results show that all cities face pressures from heat risk. The management performances regarding resource efficiency and resource recovery from wastewater and solid waste show considerable room for improvement. Moreover, stormwater separation, infrastructure maintenance and green space require improvement in order to achieve a resilient urban watercycle. Finally, in New York City, the GCF results show that learning through smart monitoring, evaluation and cross-stakeholder learning is a limiting condition that needs to be addressed. We conclude that the City Blueprint Approach has large potential to assist cities in their strategic planning and exchange of knowledge, experiences and lessons. Because the methodology is well-structured, easy to understand, and concise, it may bridge the gap between science, policy and practice. It could therefore enable other cities to address their challenges of water, waste and climate change.

  11. An inverse modeling approach for semilunar heart valve leaflet mechanics: exploitation of tissue structure.

    PubMed

    Aggarwal, Ankush; Sacks, Michael S

    2016-08-01

    Determining the biomechanical behavior of heart valve leaflet tissues in a noninvasive manner remains an important clinical goal. While advances in 3D imaging modalities have made in vivo valve geometric data available, optimal methods to exploit such information in order to obtain functional information remain to be established. Herein we present and evaluate a novel leaflet shape-based framework to estimate the biomechanical behavior of heart valves from surface deformations by exploiting tissue structure. We determined accuracy levels using an "ideal" in vitro dataset, in which the leaflet geometry, strains, mechanical behavior, and fibrous structure were known to a high level of precision. By utilizing a simplified structural model for the leaflet mechanical behavior, we were able to limit the number of parameters to be determined per leaflet to only two. This approach allowed us to dramatically reduce the computational time and easily visualize the cost function to guide the minimization process. We determined that the image resolution and the number of available imaging frames were important components in the accuracy of our framework. Furthermore, our results suggest that it is possible to detect differences in fiber structure using our framework, thus allowing an opportunity to diagnose asymptomatic valve diseases and begin treatment at their early stages. Lastly, we observed good agreement of the final resulting stress-strain response when an averaged fiber architecture was used. This suggests that population-averaged fiber structural data may be sufficient for the application of the present framework to in vivo studies, although clearly much work remains to extend the present approach to in vivo problems.
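
    With only two parameters per leaflet, the inverse problem reduces to minimizing a cost over a 2D grid that can be evaluated and visualized directly, as the abstract notes. The exponential stress-strain form and the synthetic "measured" data below are stand-ins for illustration, not the paper's structural constitutive model or imaging data.

```python
# Sketch of a two-parameter inverse fit: evaluate the data-misfit cost on a 2D
# parameter grid and take the minimizer. Model and data are illustrative.
import numpy as np

def model_stress(strain, c0, c1):
    # Simple exponential soft-tissue form (a stand-in constitutive model).
    return c0 * (np.exp(c1 * strain) - 1.0)

# Synthetic "measured" response generated with known parameters (c0=10, c1=8).
strain = np.linspace(0.0, 0.2, 20)
measured = model_stress(strain, 10.0, 8.0)

# Evaluate the least-squares cost over a coarse 2D parameter grid.
c0_grid = np.linspace(5.0, 15.0, 101)
c1_grid = np.linspace(4.0, 12.0, 101)
cost = np.array([[np.sum((model_stress(strain, a, b) - measured) ** 2)
                  for b in c1_grid] for a in c0_grid])

i, j = np.unravel_index(cost.argmin(), cost.shape)
c0_hat, c1_hat = c0_grid[i], c1_grid[j]   # recovered parameters
```

    Because the full cost surface is cheap to compute in two dimensions, it can be inspected directly to check identifiability before trusting the minimizer, which mirrors the visualization advantage described above.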

  12. European Qualifications Framework: Weighing Some Pros and Cons out of a French Perspective

    ERIC Educational Resources Information Center

    Bouder, Annie

    2008-01-01

    Purpose: The purpose of this paper is to question the appropriateness of a proposal for a new European Qualifications Framework. The framework has three perspectives: historical; analytical; and national. Design/methodology/approach: The approaches are diverse since the first insists on the institutional and decision-making processes at European…

  13. A sampling framework for incorporating quantitative mass spectrometry data in protein interaction analysis.

    PubMed

    Tucker, George; Loh, Po-Ru; Berger, Bonnie

    2013-10-04

    Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to high false-positive and false-negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely possible. Fruitful future directions may include investigating more sophisticated schemes for converting spectral counts to probabilities and applying the framework to direct protein complex prediction methods.
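
    The sampling idea (counts to probabilities, an ensemble of alternative binary outcomes, a binary method applied to each, results aggregated) can be sketched as follows. The count-to-probability map and the placeholder scorer are invented for illustration; the paper's actual functions and data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def count_to_prob(counts, k=2.0):
    """Assumed saturating map: low counts -> uncertain, high counts -> near-certain."""
    return counts / (counts + k)

def binary_scorer(binary_matrix):
    """Placeholder for any existing method that consumes binary PPI data;
    here simply the fraction of draws in which each pair is observed."""
    return binary_matrix.mean(axis=0)

spectral_counts = np.array([0, 1, 3, 10, 50])   # toy AP-MS counts per pair
probs = count_to_prob(spectral_counts)

# Ensemble of possible alternative experimental outcomes.
n_samples = 2000
ensemble = rng.random((n_samples, probs.size)) < probs

# Apply the binary method to each outcome and aggregate over the ensemble.
aggregated = binary_scorer(ensemble)
print(np.round(aggregated, 2))
```

    Pairs with low spectral counts end up with intermediate aggregated scores, reflecting their uncertainty, while high-count pairs converge toward confident calls.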

  14. Foundations of Effective Influence Operations: A Framework for Enhancing Army Capabilities

    DTIC Science & Technology

    2009-01-01

    interesting approaches we came across in our survey of social science approaches that might be suitable for supporting influence operations. In many...planning, conducting, and assessing the impact of influence operations on attitudes and behaviors. The approach is based on survey instruments that...second to latitude. Influencing Individuals 21 To illustrate, Figure 2.1 presents, in a three-dimensional form, the results from a Galileo survey

  15. Predicting change in epistemological beliefs, reflective thinking and learning styles: a longitudinal study.

    PubMed

    Phan, Huy P

    2008-03-01

    Although extensive research has examined epistemological beliefs, reflective thinking and learning approaches, very few studies have looked at these three theoretical frameworks in their totality. This research tested two separate structural models of epistemological beliefs, learning approaches, reflective thinking and academic performance among tertiary students over a period of 12 months. Participants were first-year Arts (N=616; 271 females, 345 males) and second-year Mathematics (N=581; 241 females, 341 males) university students. Students' epistemological beliefs were measured with the Schommer epistemological questionnaire (EQ, Schommer, 1990). Reflective thinking was measured with the reflective thinking questionnaire (RTQ, Kember et al., 2000). Student learning approaches were measured with the revised study process questionnaire (R-SPQ-2F, Biggs, Kember, & Leung, 2001). LISREL 8 was used to test two structural equation models - the cross-lag model and the causal-mediating model. In the cross-lag model involving Arts students, structural equation modelling showed that epistemological beliefs influenced student learning approaches rather than the contrary. In the causal-mediating model involving Mathematics students, the results indicated that both epistemological beliefs and learning approaches predicted reflective thinking and academic performance. Furthermore, learning approaches mediated the effect of epistemological beliefs on reflective thinking and academic performance. The results of this study are significant as they integrate the three theoretical frameworks within a single study.
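
    The logic of a cross-lagged model can be illustrated with a toy simulation. This is not the study's LISREL analysis; it is an invented ordinary-least-squares version with simulated data, showing how a directional influence (beliefs at time 1 predicting learning approach at time 2, but not the reverse) appears in the cross-lagged coefficients.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated panel: beliefs (B) and learning approach (A) at two time points.
# The generating model has a cross-lag path B1 -> A2 but no A1 -> B2 path.
n = 500
B1 = rng.normal(size=n)
A1 = rng.normal(size=n)
A2 = 0.6 * B1 + 0.3 * A1 + rng.normal(scale=0.5, size=n)
B2 = 0.7 * B1 + rng.normal(scale=0.5, size=n)

def cross_lag_coefs(y, x_auto, x_cross):
    """OLS for y = b0 + b1*x_auto + b2*x_cross; returns (b1, b2)."""
    X = np.column_stack([np.ones_like(y), x_auto, x_cross])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1], beta[2]

auto_a, cross_b_to_a = cross_lag_coefs(A2, A1, B1)  # cross-lag: beliefs -> approach
auto_b, cross_a_to_b = cross_lag_coefs(B2, B1, A1)  # cross-lag: approach -> beliefs
print(round(cross_b_to_a, 2), round(cross_a_to_b, 2))
```

    Here the estimated beliefs-to-approach coefficient is large while the reverse coefficient is near zero, which is the pattern the abstract reports for the Arts sample.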

  16. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…

  17. A framework for the social valuation of ecosystem services.

    PubMed

    Felipe-Lucia, María R; Comín, Francisco A; Escalera-Reyes, Javier

    2015-05-01

    Methods to assess ecosystem services using ecological or economic approaches are considerably better defined than methods for the social approach. To identify why the social approach remains unclear, we reviewed current trends in the literature. We found two main reasons: (i) the cultural ecosystem services are usually used to represent the whole social approach, and (ii) the economic valuation based on social preferences is typically included in the social approach. Next, we proposed a framework for the social valuation of ecosystem services that provides alternatives to economic methods, enables comparison across studies, and supports decision-making in land planning and management. The framework includes the agreements that emerged from the review, such as considering spatial-temporal flows, including stakeholders from all social ranges, and using two complementary methods to value ecosystem services. Finally, we provided practical recommendations learned from the application of the proposed framework in a case study.

  18. Comparative Human Health Impact Assessment of Engineered Nanomaterials in the Framework of Life Cycle Assessment.

    PubMed

    Fransman, Wouter; Buist, Harrie; Kuijpers, Eelco; Walser, Tobias; Meyer, David; Zondervan-van den Beuken, Esther; Westerhout, Joost; Klein Entink, Rinke H; Brouwer, Derk H

    2017-07-01

    For safe innovation, knowledge on potential human health impacts is essential. Ideally, these impacts are considered within a larger life-cycle-based context to support sustainable development of new applications and products. A methodological framework that accounts for human health impacts caused by inhalation of engineered nanomaterials (ENMs) in an indoor air environment has been previously developed. The objectives of this study are as follows: (i) evaluate the feasibility of applying the characterization factor (CF) framework to nanoparticle (NP) exposure in the workplace based on currently available data; and (ii) supplement any resulting knowledge gaps with methods and data from the life cycle approach and human risk assessment (LICARA) project to develop a modified case-specific version of the framework that will enable near-term inclusion of NP human health impacts in life cycle assessment (LCA) using a case study involving nanoscale titanium dioxide (nano-TiO2). The intent is to enhance typical LCA with elements of regulatory risk assessment, including its more detailed measure of uncertainty. The proof-of-principle demonstration of the framework highlighted the lack of available data for both the workplace emissions and human health effects of ENMs that is needed to calculate generalizable characterization factors using common human health impact assessment practices in LCA. The alternative approach of using intake fractions derived from workplace air concentration measurements and effect factors based on best-available toxicity data supported the current case-by-case approach for assessing the human health life cycle impacts of ENMs. Ultimately, the proposed framework and calculations demonstrate the potential utility of integrating elements of risk assessment with LCA for ENMs once the data are available. © 2016 Society for Risk Analysis.
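
    The alternative calculation the abstract describes follows the standard structure of human-health characterization in LCA: a characterization factor is the product of an intake fraction and an effect factor, and impact scales with the mass emitted. The numbers below are invented placeholders; only the structure of the calculation is meant to be illustrative.

```python
# Illustrative values only (not from the study).
intake_fraction = 1e-4   # kg inhaled per kg emitted, e.g. derived from
                         # workplace air-concentration measurements (assumed)
effect_factor = 0.1      # health effect per kg inhaled, e.g. from
                         # best-available toxicity data (assumed)

# Characterization factor: effect per kg emitted.
characterization_factor = intake_fraction * effect_factor

emission = 50.0          # kg of ENM emitted over the life cycle (assumed)
impact = emission * characterization_factor
print(impact)
```

    The study's point is that generalizable values for both factors are currently lacking for ENMs, which is why a case-by-case version of this calculation is used.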

  19. Geosites and geoheritage representations - a cartographic approach

    NASA Astrophysics Data System (ADS)

    Rocha, Joao; Brilha, José

    2016-04-01

    In recent years, the increasing awareness of the importance of nature conservation, particularly towards the protection, conservation and promotion of geological sites, has resulted in a wide range of scientific studies. The majority of geodiversity studies, geoconservation strategies, geosite inventories and geoheritage assessment projects use, at a particular stage, a cartographic representation - a map - of the most relevant geological and geomorphological features within the area of analysis. A wide range of geosite maps and geological heritage maps have been produced but, so far, a widely accepted conceptual cartographic framework with a specific symbology for their representation has not been created. In this work we address this lack of a systematic conceptual framework to support geoheritage and geosite mapping. We propose a cartographic approach aimed at the conceptualization and definition of a nomenclature and symbology system to be used on both geosite and geoheritage maps. We define a symbology framework for geosite and geoheritage mapping addressed to the general public and to secondary school students, to be used as geotouristic and didactic tools, respectively. Three different approaches support the definition of the symbology framework: i) symbols that correlate geosites with the geological time scale; ii) symbols related to each of the 27 geological frameworks defined in the Portuguese geoheritage inventory; and iii) symbols that represent groups of geosites sharing common geological and geomorphological features. The use of these different symbols in a map allows a quick understanding of a set of relevant information, in addition to the usual geographical distribution of geosites in a certain area.

  20. An expanded conceptual framework for solution-focused management of chemical pollution in European waters.

    PubMed

    Munthe, John; Brorström-Lundén, Eva; Rahmberg, Magnus; Posthuma, Leo; Altenburger, Rolf; Brack, Werner; Bunke, Dirk; Engelen, Guy; Gawlik, Bernd Manfred; van Gils, Jos; Herráez, David López; Rydberg, Tomas; Slobodnik, Jaroslav; van Wezel, Annemarie

    2017-01-01

    This paper describes a conceptual framework for solutions-focused management of chemical contaminants built on novel and systematic approaches for identifying, quantifying and reducing risks of these substances. The conceptual framework was developed in interaction with stakeholders representing relevant authorities and organisations responsible for managing the environmental quality of water bodies. Stakeholder needs were compiled via a survey and dialogue. The content of the conceptual framework was thereafter developed with inputs from relevant scientific disciplines. The conceptual framework consists of four access points: Chemicals, Environment, Abatement and Society, representing different aspects of and approaches to engaging in the issue of chemical contamination of surface waters. It widens the scope for assessment and management of chemicals in comparison to traditional (mostly) per-chemical risk assessment approaches by including abatement and societal approaches as optional solutions. The solution-focused approach implies an identification of abatement and policy options upfront in the risk assessment process. The conceptual framework was designed for use in current and future chemical pollution assessments for the aquatic environment, including the specific challenges encountered in prioritising individual chemicals and mixtures, and is applicable to the development of approaches for safe chemical management in a broader sense. The four access points of the conceptual framework are interlinked by four key topics representing the main scientific challenges that need to be addressed: identifying and prioritising hazardous chemicals at different scales; selecting relevant and efficient abatement options; providing regulatory support for chemicals management; and predicting and prioritising future chemical risks. The conceptual framework aligns with current challenges in the safe production and use of chemicals. The current state of knowledge and implementation of these challenges is described. The use of the conceptual framework, and addressing the challenges, is intended to support: (1) advancing sustainable use of chemicals, (2) identification of pollutants of priority concern for cost-effective management, (3) the selection of optimal abatement options and (4) the development and use of optimised legal and policy instruments.

  1. PUBLIC AND PATIENT INVOLVEMENT IN HEALTH TECHNOLOGY ASSESSMENT: A FRAMEWORK FOR ACTION.

    PubMed

    Abelson, Julia; Wagner, Frank; DeJean, Deirdre; Boesveld, Sarah; Gauvin, François-Pierre; Bean, Sally; Axler, Renata; Petersen, Stephen; Baidoobonso, Shamara; Pron, Gaylene; Giacomini, Mita; Lavis, John

    2016-01-01

    As health technology assessment (HTA) organizations in Canada and around the world seek to involve the public and patients in their activities, frameworks to guide decisions about whom to involve, through which mechanisms, and at what stages of the HTA process have been lacking. The aim of this study was to describe the development and outputs of a comprehensive framework for involving the public and patients in a government agency's HTA process. The framework was informed by a synthesis of international practice and published literature, a dialogue with local, national and international stakeholders, and the deliberations of a government agency's public engagement subcommittee in Ontario, Canada. The practice and literature synthesis failed to identify a single, optimal approach to involving the public and patients in HTA. Choice of methods should be considered in the context of each HTA stage, goals for incorporating societal and/or patient perspectives into the process, and relevant societal and/or patient values at stake. The resulting framework is structured around four actionable elements: (i) guiding principles and goals for public and patient involvement (PPI) in HTA, (ii) the establishment of a common language to support PPI efforts, (iii) a flexible array of PPI approaches, and (iv) on-going evaluation of PPI to inform adjustments over time. A public and patient involvement framework has been developed for implementation in a government agency's HTA process. Core elements of this framework may apply to other organizations responsible for HTA and health system quality improvement.

  2. A Function-Behavior-State Approach to Designing Human Machine Interface for Nuclear Power Plant Operators

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Zhang, W. J.

    2005-02-01

    This paper presents an approach to human-machine interface design for control room operators of nuclear power plants. The first step in designing an interface for a particular application is to determine the information content that needs to be displayed. The design methodology for this step is called the interface design framework (hereafter, the framework). Several frameworks have been proposed for applications at varying levels, including process plants. However, none is based on the design and manufacture of the plant system for which the interface is designed. This paper presents an interface design framework which originates from design theory and methodology for general technical systems. Specifically, the framework is based on a set of core concepts of a function-behavior-state model originally proposed by the artificial intelligence research community and widely applied in the design research community. Benefits of this new framework include the provision of a model-based fault diagnosis facility and the seamless integration of the design (manufacture, maintenance) of plants with the design of human-machine interfaces. The missing linkage between design and operation of a plant was one of the causes of the Three Mile Island nuclear reactor incident. A simulated plant system is presented to explain how to apply this framework in designing an interface. The resulting human-machine interface is discussed; specifically, several fault diagnosis examples are elaborated to demonstrate how this interface could support operators' fault diagnosis in an unanticipated situation.

  3. Mapping the Ethics of Translational Genomics: Situating Return of Results and Navigating the Research-Clinical Divide

    PubMed Central

    Wolf, Susan M.; Burke, Wylie; Koenig, Barbara A.

    2015-01-01

    Both bioethics and law have governed human genomics by distinguishing research from clinical practice. Yet the rise of translational genomics now makes this traditional dichotomy inadequate. This paper pioneers a new approach to the ethics of translational genomics. It maps the full range of ethical approaches needed, proposes a “layered” approach to determining the ethics framework for projects combining research and clinical care, and clarifies the key role that return of results can play in advancing translation. PMID:26479558

  4. Mapping the Ethics of Translational Genomics: Situating Return of Results and Navigating the Research-Clinical Divide.

    PubMed

    Wolf, Susan M; Burke, Wylie; Koenig, Barbara A

    2015-01-01

    Both bioethics and law have governed human genomics by distinguishing research from clinical practice. Yet the rise of translational genomics now makes this traditional dichotomy inadequate. This paper pioneers a new approach to the ethics of translational genomics. It maps the full range of ethical approaches needed, proposes a "layered" approach to determining the ethics framework for projects combining research and clinical care, and clarifies the key role that return of results can play in advancing translation. © 2015 American Society of Law, Medicine & Ethics, Inc.

  5. Advanced Information Technology in Simulation Based Life Cycle Design

    NASA Technical Reports Server (NTRS)

    Renaud, John E.

    2003-01-01

    In this research a Collaborative Optimization (CO) approach for multidisciplinary systems design is used to develop a decision-based design framework for non-deterministic optimization. To date, CO strategies have been developed for application to deterministic systems design problems. In this research the decision-based design (DBD) framework proposed by Hazelrigg is modified for use in a collaborative optimization framework. The Hazelrigg framework as originally proposed provides a single-level optimization strategy that combines engineering decisions with business decisions. By transforming this framework for use in collaborative optimization, one can decompose the business and engineering decision-making processes. In the new multilevel framework of Decision Based Collaborative Optimization (DBCO) the business decisions are made at the system level. These business decisions result in a set of engineering performance targets that disciplinary engineering design teams seek to satisfy as part of subspace optimizations. The Decision Based Collaborative Optimization framework more accurately models the existing relationship between business and engineering in multidisciplinary systems design.
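
    The two-level decomposition can be sketched with an invented toy problem (the business model, the performance map, and all numbers are assumptions, not the DBCO formulation itself): the system level makes the "business" decision by choosing a performance target, and a disciplinary subspace then tries to meet that target with its own design variable.

```python
import numpy as np

# System-level business model (invented): revenue grows linearly with the
# performance target while cost grows quadratically.
def profit(target_perf):
    revenue = 100.0 * target_perf
    cost = 4.0 * target_perf ** 2
    return revenue - cost

# System level: business decision = the target with the best business outcome.
targets = np.linspace(0.0, 20.0, 401)
best_target = targets[np.argmax(profit(targets))]

# Subspace level: an engineering design variable x yields performance 2*x;
# the discipline minimizes its deviation from the system-level target.
xs = np.linspace(0.0, 15.0, 601)
achieved = 2.0 * xs
best_x = xs[np.argmin((achieved - best_target) ** 2)]
print(best_target, best_x)
```

    The business decision fixes the target first; the engineering subspace only sees the target, which is the decomposition DBCO exploits.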

  6. Integrating landscape system and meta-ecosystem frameworks to advance the understanding of ecosystem function in heterogeneous landscapes: An analysis on the carbon fluxes in the Northern Highlands Lake District (NHLD) of Wisconsin and Michigan.

    PubMed

    Yang, Haile; Chen, Jiakuan

    2018-01-01

    The successful integration of ecosystem ecology with landscape ecology would be conducive to understanding how landscapes function. There have been several attempts at this, with two main approaches: (1) an ecosystem-based approach, such as the meta-ecosystem framework and (2) a landscape-based approach, such as the landscape system framework. These two frameworks are currently disconnected. To integrate them, we introduce a protocol and then demonstrate its application using a case study. The protocol includes four steps: 1) delineating landscape systems; 2) classifying landscape systems; 3) adjusting landscape systems to meta-ecosystems and 4) integrating landscape system and meta-ecosystem frameworks through meta-ecosystems. The case study applies this protocol to an analysis of the carbon fluxes in the Northern Highlands Lake District (NHLD) of Wisconsin and Michigan. The application revealed that one can follow this protocol to construct a meta-ecosystem and analyze it using the integrated landscape system and meta-ecosystem frameworks. That is, one can (1) appropriately describe and analyze the spatial heterogeneity of the meta-ecosystem and (2) understand the emergent properties arising from the spatial coupling of local ecosystems in the meta-ecosystem. In conclusion, this protocol is a useful approach for integrating the meta-ecosystem framework and the landscape system framework, advancing the description and analysis of the spatial heterogeneity and ecosystem function of interconnected ecosystems.

  7. Integrating landscape system and meta-ecosystem frameworks to advance the understanding of ecosystem function in heterogeneous landscapes: An analysis on the carbon fluxes in the Northern Highlands Lake District (NHLD) of Wisconsin and Michigan

    PubMed Central

    Chen, Jiakuan

    2018-01-01

    The successful integration of ecosystem ecology with landscape ecology would be conducive to understanding how landscapes function. There have been several attempts at this, with two main approaches: (1) an ecosystem-based approach, such as the meta-ecosystem framework and (2) a landscape-based approach, such as the landscape system framework. These two frameworks are currently disconnected. To integrate them, we introduce a protocol and then demonstrate its application using a case study. The protocol includes four steps: 1) delineating landscape systems; 2) classifying landscape systems; 3) adjusting landscape systems to meta-ecosystems and 4) integrating landscape system and meta-ecosystem frameworks through meta-ecosystems. The case study applies this protocol to an analysis of the carbon fluxes in the Northern Highlands Lake District (NHLD) of Wisconsin and Michigan. The application revealed that one can follow this protocol to construct a meta-ecosystem and analyze it using the integrated landscape system and meta-ecosystem frameworks. That is, one can (1) appropriately describe and analyze the spatial heterogeneity of the meta-ecosystem and (2) understand the emergent properties arising from the spatial coupling of local ecosystems in the meta-ecosystem. In conclusion, this protocol is a useful approach for integrating the meta-ecosystem framework and the landscape system framework, advancing the description and analysis of the spatial heterogeneity and ecosystem function of interconnected ecosystems. PMID:29415066

  8. Addressing the need for an infection prevention and control framework that incorporates the role of surveillance: a discussion paper.

    PubMed

    Mitchell, Brett G; Gardner, Anne

    2014-03-01

    To present a discussion on theoretical frameworks in infection prevention and control. Infection prevention and control programmes have been in place for several years in response to the incidence of healthcare-associated infections and their associated morbidity and mortality. Theoretical frameworks play an important role in formalizing the understanding of infection prevention activities. Discussion paper. A literature search using electronic databases was conducted for published articles in English addressing theoretical frameworks in infection prevention and control between 1980 and 2012. Nineteen papers that included a reference to frameworks were identified in the review. A narrative analysis of these papers was completed. Two models were identified and neither included the role of surveillance. To reduce the risk of acquiring a healthcare-associated infection, a multifaceted approach to infection prevention is required. One key component in this approach is surveillance. The review identified two infection prevention and control frameworks, yet these are rarely applied in infection prevention and control programmes. Only one framework considered the multifaceted approach required for infection prevention. It did not, however, incorporate the role of surveillance. We present a framework that incorporates the role of surveillance into a biopsychosocial approach to infection prevention and control. Infection prevention and control programmes and associated research are led primarily by nurses. There is a need for an explicit infection prevention and control framework incorporating the important role that surveillance has in infection prevention activities. This study presents one framework for further critique and discussion. © 2013 John Wiley & Sons Ltd.

  9. Bridging the gap between regulatory acceptance and industry use of non-animal methods.

    PubMed

    Clippinger, Amy J; Hill, Erin; Curren, Rodger; Bishop, Patricia

    2016-01-01

    Collaboration between industry and regulators resulted in the development of a decision tree approach using in vitro or ex vivo assays to replace animal tests when determining the eye irritation potential of antimicrobial cleaning products (AMCPs) under the United States Environmental Protection Agency (EPA) Office of Pesticide Programs' hazard classification and labeling system. A policy document issued by the EPA in 2013 and updated in 2015 describes the alternate testing framework that industry could apply to new registrations of AMCPs and, on a case-by-case basis, to conventional pesticide products. Despite the collaborative effort, the availability of relevant non-animal methods, and the EPA's change in policy, only a limited number of AMCPs have been registered using the framework. Companies continue to conduct animal tests when registering AMCPs due to various challenges surrounding adoption of the new testing framework; however, recent discussions between industry, regulators, and other interested parties have identified ways these challenges may be overcome. In this article we explore how use of the alternate framework could be expanded through efforts such as increasing international harmonization, more proactively publicizing the framework, and enhancing the training of regulatory reviewers. Not only can these strategies help to increase use of the EPA alternate eye irritation framework, they can also be applied to facilitate the uptake of other alternative approaches to animal testing in the future.

  10. Capital update factor: a new era approaches.

    PubMed

    Grimaldi, P L

    1993-02-01

    The Health Care Financing Administration (HCFA) has constructed a preliminary model of a new capital update method which is consistent with the framework being developed to refine the update method for PPS operating costs. HCFA's eventual goal is to develop a single update framework for operating and capital costs. Initial results suggest that adopting the new capital update method would reduce capital payments substantially, which might intensify creditor's concerns about extending loans to hospitals.

  11. Theory of Change: a theory-driven approach to enhance the Medical Research Council's framework for complex interventions.

    PubMed

    De Silva, Mary J; Breuer, Erica; Lee, Lucy; Asher, Laura; Chowdhary, Neerja; Lund, Crick; Patel, Vikram

    2014-07-05

    The Medical Research Council's framework for complex interventions has been criticized for not including theory-driven approaches to evaluation. Although the framework does include broad guidance on the use of theory, it contains little practical guidance for implementers, and there have been calls to develop a more comprehensive approach. A prospective, theory-driven process of intervention design and evaluation is required to develop complex healthcare interventions which are more likely to be effective, sustainable and scalable. We propose a theory-driven approach to the design and evaluation of complex interventions by adapting and integrating a programmatic design and evaluation tool, Theory of Change (ToC), into the MRC framework for complex interventions. We provide a guide to what ToC is, how to construct one, and how to integrate its use into research projects seeking to design, implement and evaluate complex interventions using the MRC framework. We test this approach by using ToC within two randomized controlled trials and one non-randomized evaluation of complex interventions. Our application of ToC in three research projects has shown that ToC can strengthen key stages of the MRC framework. It can aid the development of interventions by providing a framework for enhanced stakeholder engagement and by explicitly designing an intervention that is embedded in the local context. For the feasibility and piloting stage, ToC enables the systematic identification of knowledge gaps to generate research questions that strengthen intervention design. ToC may improve the evaluation of interventions by providing a comprehensive set of indicators to evaluate all stages of the causal pathway through which an intervention achieves impact, combining evaluations of intervention effectiveness with detailed process evaluations into one theoretical framework. Incorporating a ToC approach into the MRC framework holds promise for improving the design and evaluation of complex interventions, thereby increasing the likelihood that the intervention will be ultimately effective, sustainable and scalable. We urge researchers developing and evaluating complex interventions to consider using this approach, to evaluate its usefulness and to build an evidence base to further refine the methodology. ClinicalTrials.gov: NCT02160249.

  12. A Framework for Developing the Structure of Public Health Economic Models.

    PubMed

    Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P

    2016-01-01

    A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of Public Health economic models, supporting efficient allocation of scarce resources. 
Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  13. Processing approaches to cognition: the impetus from the levels-of-processing framework.

    PubMed

    Roediger, Henry L; Gallo, David A; Geraci, Lisa

    2002-01-01

    Processing approaches to cognition have a long history, from act psychology to the present, but perhaps their greatest boost was given by the success and dominance of the levels-of-processing framework. We review the history of processing approaches, and explore the influence of the levels-of-processing approach, the procedural approach advocated by Paul Kolers, and the transfer-appropriate processing framework. Processing approaches emphasise the procedures of mind and the idea that memory storage can be usefully conceptualised as residing in the same neural units that originally processed information at the time of encoding. Processing approaches emphasise the unity and interrelatedness of cognitive processes and maintain that they can be dissected into separate faculties only by neglecting the richness of mental life. We end by pointing to future directions for processing approaches.

  14. Mesoporous Polymer Frameworks from End-Reactive Bottlebrush Copolymers

    DOE PAGES

    Altay, Esra; Nykypanchuk, Dmytro; Rzayev, Javid

    2017-08-07

    Reticulated nanoporous materials generated by versatile molecular framework approaches are limited to pore dimensions on the scale of the utilized rigid molecular building blocks (<5 nm). The inherent flexibility of linear polymers precludes their utilization as long framework connectors for the extension of this strategy to larger length scales. We report a method for the fabrication of mesoporous frameworks by using bottlebrush copolymers with reactive end blocks serving as rigid macromolecular interconnectors with directional reactivity. End-reactive bottlebrush copolymers with pendant alkene functionalities were synthesized by a combination of controlled radical polymerization and polymer modification protocols. Ru-catalyzed cross-metathesis cross-linking of bottlebrush copolymers with two reactive end blocks resulted in the formation of polymer frameworks where isolated cross-linked domains were interconnected with bottlebrush copolymer bridges. The resulting materials were characterized by a continuous network pore structure with average pore sizes of 9–50 nm, conveniently tunable by the length of the utilized bottlebrush copolymer building blocks. As a result, the materials fabrication strategy described in this work expands the length scale of molecular framework materials and provides access to mesoporous polymers with a molecularly tunable reticulated pore structure without the need for templating, sacrificial component etching, or supercritical fluid drying.

  15. Gamifying Self-Management of Chronic Illnesses: A Mixed-Methods Study

    PubMed Central

    Wills, Gary; Ranchhod, Ashok

    2016-01-01

    Background Self-management of chronic illnesses is an ongoing issue in health care research. Gamification is a concept that arose in the field of computer science and has been borrowed by many other disciplines. It is perceived by many that gamification can improve the self-management experience of people with chronic illnesses. This paper discusses the validation of a framework (called The Wheel of Sukr) that was introduced to achieve this goal. Objective This research aims to (1) discuss a gamification framework targeting the self-management of chronic illnesses and (2) validate the framework with diabetic patients, medical professionals, and game experts. Methods A mixed-methods approach was used to validate the framework. Expert interviews (N=8) were conducted in order to validate the themes of the framework. Additionally, diabetic participants completed a questionnaire (N=42) in order to measure their attitudes toward the themes of the framework. Results The results provide a validation of the framework. This indicates that gamification might improve the self-management of chronic illnesses, such as diabetes. Namely, the eight themes in the Wheel of Sukr (fun, esteem, socializing, self-management, self-representation, motivation, growth, sustainability) were perceived positively by 71% (30/42) of the participants with a P value <.001. Conclusions In this research, both the interviews and the questionnaire yielded positive results that validate the framework (The Wheel of Sukr). Generally, this study indicates an overall acceptance of the notion of gamification in the self-management of diabetes. PMID:27612632

  16. Algorithm for lens calculations in the geometrized Maxwell theory

    NASA Astrophysics Data System (ADS)

    Kulyabov, Dmitry S.; Korolkova, Anna V.; Sevastianov, Leonid A.; Gevorkyan, Migran N.; Demidova, Anastasia V.

    2018-04-01

    Nowadays the geometric approach in optics is often used to recover media parameters from the propagation paths of rays, because in this case it is a direct problem. The inverse problem in the framework of geometrized optics, however, is usually given little attention. The aim of this work is to demonstrate the operation of the proposed algorithm, within the geometrized approach to optics, for finding the propagation path of electromagnetic radiation as a function of the medium parameters. The methods of differential geometry are used to construct effective metrics for isotropic and anisotropic media. In the effective metric space, ray trajectories are obtained in the form of geodesic curves. The introduced algorithm is applied to well-known objects, the Maxwell and Luneburg lenses. The similarity of the results obtained by the classical and geometric approaches is demonstrated.
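    The ray construction described in this record can be sketched numerically. In the Hamiltonian form of geometrical optics, a ray in a medium with index n(r) obeys dx/dτ = p, dp/dτ = ½∇n², which for the Luneburg profile n²(r) = 2 − r² reduces to a harmonic oscillator. The sketch below is not the authors' algorithm; the lens profile, step size, and stopping rule are illustrative assumptions. It reproduces the textbook focusing property: parallel rays exit the unit lens at the rim point (1, 0).

```python
import numpy as np

def trace_luneburg(b, dt=1e-3):
    """Trace a ray through a unit Luneburg lens, n(r)^2 = 2 - r^2.

    The ray equations dx/dtau = p, dp/dtau = 0.5 * grad(n^2) reduce,
    for this profile, to the oscillator system x' = p, p' = -x.
    The ray enters parallel to the x-axis at height b on the rim,
    where n = 1, so the initial momentum has unit magnitude.
    """
    x = np.array([-np.sqrt(1.0 - b**2), b])  # entry point on the unit circle
    p = np.array([1.0, 0.0])                 # |p| = n = 1 on the rim
    # Integrate until the ray is outside the lens AND moving outward.
    while np.dot(x, x) < 1.0 or p @ x < 0:
        # One 4th-order Runge-Kutta step for x' = p, p' = -x.
        k1x, k1p = p, -x
        k2x, k2p = p + 0.5*dt*k1p, -(x + 0.5*dt*k1x)
        k3x, k3p = p + 0.5*dt*k2p, -(x + 0.5*dt*k2x)
        k4x, k4p = p + dt*k3p, -(x + dt*k3x)
        x = x + dt*(k1x + 2*k2x + 2*k3x + k4x)/6
        p = p + dt*(k1p + 2*k2p + 2*k3p + k4p)/6
    return x

# Rays entering at different heights all exit near the focus (1, 0).
for b in (0.2, 0.5, 0.8):
    print(b, trace_luneburg(b))
```

The exact solution of the oscillator system exits at (1, 0) for every impact parameter b, so the numerical exit points differ from the focus only by the integration step.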

  17. Development of a new approach to cumulative effects assessment: a northern river ecosystem example.

    PubMed

    Dubé, Monique; Johnson, Brian; Dunn, Gary; Culp, Joseph; Cash, Kevin; Munkittrick, Kelly; Wong, Isaac; Hedley, Kathlene; Booty, William; Lam, David; Resler, Oskar; Storey, Alex

    2006-02-01

    If sustainable development of Canadian waters is to be achieved, a realistic and manageable framework is required for assessing cumulative effects. The objective of this paper is to describe an approach for aquatic cumulative effects assessment that was developed under the Northern Rivers Ecosystem Initiative. The approach is based on a review of existing monitoring practices in Canada and the presence of existing thresholds for aquatic ecosystem health assessments. It suggests that a sustainable framework for cumulative effects assessment of Canadian waters is possible, one that would integrate national indicators of aquatic health, integrate national initiatives (e.g., the water quality index, environmental effects monitoring), and provide an avenue whereby long-term monitoring programs could be integrated with baseline and follow-up monitoring conducted under the environmental assessment process.

  18. Analysis of eco-innovation with triple helix approach: case-study of biofloc catfish farming in Yogyakarta

    NASA Astrophysics Data System (ADS)

    Purwadi, D.; Nurlaily, I.

    2018-03-01

    Bringing environmental concerns into the focus of the innovation process expands the number of actors involved. Eco-innovation and the triple helix are frameworks often applied to analyse how environmental concerns are integrated into the innovation process and how different stakeholder groups interrelate. A case study of biofloc catfish farming in Yogyakarta is presented to demonstrate a possible approach for researching the success of triple helix frameworks. The case is considered on the basis of a survey among farmers, academics, and government. The paper concludes that creating a full triple helix encounters problems in practice. It also includes suggestions for further research on fisheries development.

  19. Basic research in evolution and ecology enhances forensics.

    PubMed

    Tomberlin, Jeffery K; Benbow, M Eric; Tarone, Aaron M; Mohr, Rachel M

    2011-02-01

    In 2009, the National Research Council recommended that the forensic sciences strengthen their grounding in basic empirical research to mitigate against criticism and improve accuracy and reliability. For DNA-based identification, this goal was achieved under the guidance of the population genetics community. This effort resulted in DNA analysis becoming the 'gold standard' of the forensic sciences. Elsewhere, we proposed a framework for streamlining research in decomposition ecology, which promotes quantitative approaches to collecting and applying data to forensic investigations involving decomposing human remains. To extend the ecological aspects of this approach, this review focuses on forensic entomology, although the framework can be extended to other areas of decomposition. Published by Elsevier Ltd.

  20. Knowledge acquisition in the fuzzy knowledge representation framework of a medical consultation system.

    PubMed

    Boegl, Karl; Adlassnig, Klaus-Peter; Hayashi, Yoichi; Rothenfluh, Thomas E; Leitich, Harald

    2004-01-01

    This paper describes the fuzzy knowledge representation framework of the medical computer consultation system MedFrame/CADIAG-IV as well as the specific knowledge acquisition techniques that have been developed to support the definition of knowledge concepts and inference rules. As in its predecessor system CADIAG-II, fuzzy medical knowledge bases are used to model the uncertainty and the vagueness of medical concepts, and fuzzy logic reasoning mechanisms provide the basic inference processes. The elicitation and acquisition of medical knowledge from domain experts has often been described as the most difficult and time-consuming task in knowledge-based system development in medicine. It comes as no surprise that this is even more so when unfamiliar representations like fuzzy membership functions are to be acquired. From previous projects we have learned that a user-centered approach is mandatory in complex and ill-defined knowledge domains such as internal medicine. This paper describes the knowledge acquisition framework that has been developed to make three main tasks easier and more accessible: (a) defining medical concepts; (b) providing appropriate interpretations for patient data; and (c) constructing inferential knowledge in a fuzzy knowledge representation framework. Special emphasis is laid on the motivations for some system design and data modeling decisions. The theoretical framework has been implemented in a software package, the Knowledge Base Builder Toolkit. The conception and the design of this system reflect the need for a user-centered, intuitive, and easy-to-handle tool. First results gained from pilot studies have shown that our approach can be successfully implemented in the context of a complex fuzzy theoretical framework. As a result, this critical aspect of knowledge-based system development can be accomplished more easily.
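    To make the idea of a fuzzy knowledge representation concrete, the sketch below shows a standard trapezoidal membership function of the kind used to interpret raw patient data as vague clinical concepts. The "fever" cut-off temperatures are hypothetical illustration values, not figures taken from CADIAG-IV.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 below a, rising linearly to 1
    on the plateau [b, c], falling back to 0 at d. Maps a raw
    measurement onto a degree of membership in a vague concept."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical interpretation of "fever" from body temperature in deg C.
fever = lambda t: trapezoid(t, 37.0, 38.5, 41.0, 43.0)
print(fever(36.5), fever(37.75), fever(39.0))  # 0.0 0.5 1.0
```

A membership value between 0 and 1, rather than a hard threshold, is what the fuzzy inference rules then propagate.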

  1. Automatized spleen segmentation in non-contrast-enhanced MR volume data using subject-specific shape priors

    NASA Astrophysics Data System (ADS)

    Gloger, Oliver; Tönnies, Klaus; Bülow, Robin; Völzke, Henry

    2017-07-01

    To develop the first fully automated 3D spleen segmentation framework derived from T1-weighted magnetic resonance (MR) imaging data and to verify its performance for spleen delineation and volumetry. This approach considers the issue of low contrast between spleen and adjacent tissue in non-contrast-enhanced MR images. Native T1-weighted MR volume data were acquired on a 1.5 T MR system in an epidemiological study. We analyzed random subsamples of MR examinations without pathologies to develop and verify the spleen segmentation framework. The framework is modularized to include different kinds of prior knowledge into the segmentation pipeline. Classification by support vector machines differentiates between five different shape types in computed foreground probability maps and recognizes characteristic spleen regions in axial slices of MR volume data. A spleen-shape space generated by training produces subject-specific prior shape knowledge that is then incorporated into a final 3D level set segmentation method. Individually adapted shape-driven forces as well as image-driven forces resulting from refined foreground probability maps steer the level set successfully to segment the spleen. The framework achieves promising segmentation results with mean Dice coefficients of nearly 0.91 and low volumetric mean errors of 6.3%. The presented spleen segmentation approach can delineate spleen tissue in native MR volume data. Several kinds of prior shape knowledge including subject-specific 3D prior shape knowledge can be used to guide segmentation processes, achieving promising results.
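    The two evaluation metrics quoted in this record, the Dice coefficient and the volumetric error, are standard and easy to state in code. A minimal sketch follows, with a toy 4×4×4 binary mask standing in for real MR label volumes (the example masks are assumptions for illustration only):

```python
import numpy as np

def dice_coefficient(seg, ref):
    """Spatial overlap between a binary segmentation and a reference:
    2|A ∩ B| / (|A| + |B|), from 0 (disjoint) to 1 (identical)."""
    seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
    return 2.0 * np.logical_and(seg, ref).sum() / (seg.sum() + ref.sum())

def volumetric_error(seg, ref):
    """Relative volume difference |V_seg - V_ref| / V_ref, in percent."""
    seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
    return 100.0 * abs(int(seg.sum()) - int(ref.sum())) / ref.sum()

# Toy 3D example: an 8-voxel reference "spleen" vs. a 7-voxel result.
ref = np.zeros((4, 4, 4), bool); ref[1:3, 1:3, 1:3] = True  # 8 voxels
seg = ref.copy(); seg[1, 1, 1] = False                      # 7 voxels
print(dice_coefficient(seg, ref))   # 2*7/(7+8) ≈ 0.933
print(volumetric_error(seg, ref))   # 12.5
```

A mean Dice near 0.91, as reported, thus corresponds to a large-majority voxel overlap with the reference delineation.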

  2. Primary Care Performance Measurement and Reporting at a Regional Level: Could a Matrix Approach Provide Actionable Information for Policy Makers and Clinicians?

    PubMed

    Langton, Julia M; Wong, Sabrina T; Johnston, Sharon; Abelson, Julia; Ammi, Mehdi; Burge, Fred; Campbell, John; Haggerty, Jeannie; Hogg, William; Wodchis, Walter P; McGrail, Kimberlyn

    2016-11-01

    Primary care services form the foundation of modern healthcare systems, yet the breadth and complexity of services and diversity of patient populations may present challenges for creating comprehensive primary care information systems. Our objective is to develop regional-level information on the performance of primary care in Canada. A scoping review was conducted to identify existing initiatives in primary care performance measurement and reporting across 11 countries. The results of this review were used by our international team of primary care researchers and clinicians to propose an approach for regional-level primary care reporting. We found a gap between conceptual primary care performance measurement frameworks in the peer-reviewed literature and real-world primary care performance measurement and reporting activities. We did not find a conceptual framework or analytic approach that could readily form the foundation of a regional-level primary care information system. Therefore, we propose an approach to reporting comprehensive and actionable performance information according to widely accepted core domains of primary care as well as different patient population groups. An approach that bridges the gap between conceptual frameworks and real-world performance measurement and reporting initiatives could address some of the potential pitfalls of existing ways of presenting performance information (i.e., by single diseases or by age). This approach could produce meaningful and actionable information on the quality of primary care services. Copyright © 2016 Longwoods Publishing.

  3. An expanded framework to define and measure shared decision-making in dialogue: A 'top-down' and 'bottom-up' approach.

    PubMed

    Callon, Wynne; Beach, Mary Catherine; Links, Anne R; Wasserman, Carly; Boss, Emily F

    2018-03-11

    We aimed to develop a comprehensive, descriptive framework to measure shared decision making (SDM) in clinical encounters. We combined a top-down (theoretical) approach with a bottom-up approach based on audio-recorded dialogue to identify all communication processes related to decision making. We coded 55 pediatric otolaryngology visits using the framework and report interrater reliability. We identified 14 clinician behaviors and 5 patient behaviors that have not been previously described, and developed a new SDM framework that is descriptive (what does happen) rather than normative (what should happen). Through the bottom-up approach we identified three broad domains not present in other SDM frameworks: socioemotional support, understandability of clinician dialogue, and recommendation-giving. We also specify the ways in which decision-making roles are assumed implicitly rather than discussed explicitly. Interrater reliability was >75% for 92% of the coded behaviors. This SDM framework allows for a more expansive understanding and analysis of how decision making takes place in clinical encounters, including new domains and behaviors not present in existing measures. We hope that this new framework will bring attention to a broader conception of SDM and allow researchers to further explore the new domains and behaviors identified. Copyright © 2018. Published by Elsevier B.V.

  4. Coordination Covalent Frameworks: A New Route for Synthesis and Expansion of Functional Porous Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elsaidi, Sameh K.; Mohamed, Mona H.; Loring, John S.

    The synthetic approaches for fine-tuning the structural properties of coordination polymers or metal-organic frameworks have grown exponentially during the last decade. This is due to the control over the properties of the resulting structures, such as stability, pore size, pore chemistry, and surface area, for myriad possible applications. Herein, we present a new class of porous materials called Covalent Coordination Frameworks (CCFs) that were designed and effectively synthesized using a two-step reticular chemistry approach. In the first step, a trigonal prismatic molecular building block was isolated using 4-aminobenzoic acid and a Cr(III) salt; subsequently, in the second step, polymerization of the isolated molecular building blocks (MBBs) takes place through the formation of strong covalent bonds, with small organic molecules connecting the MBBs to form extended porous CCF materials. All the isolated CCFs were found to be permanently porous while the discrete MBBs were non-porous. This approach would inevitably open a feasible path for the applications of reticular chemistry and the synthesis of novel porous materials with various topologies under ambient conditions, using simple organic molecules and versatile MBBs with different functionalities, which would not be possible using the traditional one-step approach.

  5. Mapping Informative Clusters in a Hierarchical Framework of fMRI Multivariate Analysis

    PubMed Central

    Xu, Rui; Zhen, Zonglei; Liu, Jia

    2010-01-01

    Pattern recognition methods have become increasingly popular in fMRI data analysis; they are powerful in discriminating between multi-voxel patterns of brain activity associated with different mental states. However, when they are used in functional brain mapping, the location of discriminative voxels varies significantly, raising difficulties in interpreting the locus of the effect. Here we proposed a hierarchical framework of multivariate analysis that maps informative clusters rather than voxels to achieve reliable functional brain mapping without compromising the discriminative power. In particular, we first searched for local homogeneous clusters that consisted of voxels with similar response profiles. Then, a multi-voxel classifier was built for each cluster to extract discriminative information from the multi-voxel patterns. Finally, through multivariate ranking, outputs from the classifiers served as a multi-cluster pattern to identify informative clusters by examining interactions among clusters. Results from both simulated and real fMRI data demonstrated that this hierarchical approach showed better performance in the robustness of functional brain mapping than traditional voxel-based multivariate methods. In addition, the mapped clusters were highly overlapped for two perceptually equivalent object categories, further confirming the validity of our approach. In short, the hierarchical framework of multivariate analysis is suitable for both pattern classification and brain mapping in fMRI studies. PMID:21152081

  6. When homogeneity meets heterogeneity: the geographically weighted regression with spatial lag approach to prenatal care utilization

    PubMed Central

    Shoff, Carla; Chen, Vivian Yi-Ju; Yang, Tse-Chuan

    2014-01-01

    Using geographically weighted regression (GWR), a recent study by Shoff and colleagues (2012) investigated the place-specific risk factors for prenatal care utilization in the US and found that most of the relationships between late or no prenatal care and its determinants are spatially heterogeneous. However, the GWR approach may be subject to the confounding effect of spatial homogeneity. The goal of this study is to address this concern by including both spatial homogeneity and heterogeneity into the analysis. Specifically, we employ an analytic framework where a spatially lagged (SL) effect of the dependent variable is incorporated into the GWR model, which is called GWR-SL. Using this innovative framework, we found evidence to argue that spatial homogeneity is neglected in the study by Shoff et al. (2012) and that the results change after considering the spatially lagged effect of prenatal care utilization. The GWR-SL approach allows us to gain a place-specific understanding of prenatal care utilization in US counties. In addition, we compared the GWR-SL results with the results of conventional approaches (i.e., OLS and spatial lag models) and found that GWR-SL is the preferred modeling approach. The new findings help us to better estimate how the predictors are associated with prenatal care utilization across space, and determine whether and how the level of prenatal care utilization in neighboring counties matters. PMID:24893033
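    The core of plain GWR can be sketched in a few lines: each location gets its own weighted least-squares fit, with weights decaying with distance. The numpy sketch below is illustrative only; the Gaussian kernel, bandwidth, and synthetic data are assumptions, and the spatially lagged outcome term (Wy) that distinguishes GWR-SL is noted in a comment but omitted.

```python
import numpy as np

def gwr(coords, x, y, bandwidth):
    """Plain geographically weighted regression: at each location i,
    fit weighted least squares with Gaussian kernel weights
    w_ij = exp(-d_ij^2 / (2 h^2)), so coefficients vary over space.
    (GWR-SL would additionally include a spatial lag Wy of the
    outcome as a regressor; that term is omitted here.)"""
    Xd = np.column_stack([np.ones(len(x)), x])   # intercept + predictor
    betas = []
    for i in range(len(coords)):
        d2 = ((coords - coords[i])**2).sum(axis=1)
        w = np.exp(-d2 / (2.0 * bandwidth**2))
        W = np.diag(w)
        beta, *_ = np.linalg.lstsq(Xd.T @ W @ Xd, Xd.T @ W @ y, rcond=None)
        betas.append(beta)
    return np.array(betas)

# Sanity check: on data generated by one global model y = 1 + 2x with no
# noise, every local fit recovers the same coefficients, whatever the kernel.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(30, 2))
x = rng.uniform(0, 1, 30)
y = 1.0 + 2.0 * x
local = gwr(coords, x, y, bandwidth=2.0)
print(np.allclose(local, [1.0, 2.0]))  # True
```

Spatially varying coefficients emerge only when the data-generating relationship itself differs across locations, which is exactly the heterogeneity the study tests for.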

  7. The potential of using the Ecosystem Approach in the implementation of the EU Water Framework Directive.

    PubMed

    Vlachopoulou, M; Coughlin, D; Forrow, D; Kirk, S; Logan, P; Voulvoulis, N

    2014-02-01

    The Ecosystem Approach provides a framework for looking at whole ecosystems in decision making to ensure that society can maintain a healthy and resilient natural environment now and for future generations. Although not explicitly mentioned in the Water Framework Directive, the Ecosystem Approach appears to be a promising concept to help its implementation, on the basis that there is a connection between the aims and objectives of the Directive (including good ecological status) and the provision of ecosystem services. In this paper, methodological linkages between the Ecosystem Approach and the Water Framework Directive have been reviewed, and a framework is proposed that links the Directive's implementation to the Ecosystem Approach, taking into consideration all ecosystem services and water management objectives. Individual River Basin Management Plan objectives are qualitatively assessed for the strength of their link with individual ecosystem services. The benefits of using this approach to provide a preliminary assessment of how it could support future implementation of the Directive have been identified and discussed. Findings also demonstrate its potential to encourage more systematic and systemic thinking, as it can provide a consistent framework for identifying shared aims and evaluating alternative water management scenarios and options in decision making. Allowing for a broad consideration of the benefits, costs and tradeoffs that occur in each case, this approach can further improve the economic case for certain measures, and can also help shift the focus from strict legislative compliance towards a more holistic implementation that can deliver the wider aims and intentions of the Directive. © 2013.

  8. Developing customer databases.

    PubMed

    Rao, S K; Shenbaga, S

    2000-01-01

    There is a growing consensus among pharmaceutical companies that more product and customer-specific approaches to marketing and selling a new drug can result in substantial increases in sales. Marketers and researchers taking a proactive micro-marketing approach to identifying, profiling, and communicating with target customers are likely to facilitate such approaches and outcomes. This article provides a working framework for creating customer databases that can be effectively mined to achieve a variety of such marketing and sales force objectives.

  9. Local linear discriminant analysis framework using sample neighbors.

    PubMed

    Fan, Zizhu; Xu, Yong; Zhang, David

    2011-07-01

    The linear discriminant analysis (LDA) is a very popular linear feature extraction approach. The algorithms of LDA usually perform well under the following two assumptions. The first assumption is that the global data structure is consistent with the local data structure. The second assumption is that the input data classes are Gaussian distributions. However, in real-world applications, these assumptions are not always satisfied. In this paper, we propose an improved LDA framework, the local LDA (LLDA), which can perform well without needing to satisfy the above two assumptions. Our LLDA framework can effectively capture the local structure of samples. According to different types of local data structure, our LLDA framework incorporates several different forms of linear feature extraction approaches, such as the classical LDA and principal component analysis. The proposed framework includes two LLDA algorithms: a vector-based LLDA algorithm and a matrix-based LLDA (MLLDA) algorithm. MLLDA is directly applicable to image recognition, such as face recognition. Our algorithms need to train only a small portion of the whole training set before testing a sample. They are suitable for learning large-scale databases especially when the input data dimensions are very high and can achieve high classification accuracy. Extensive experiments show that the proposed algorithms can obtain good classification results.
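    For reference, the classical (global) Fisher LDA that the paper's local variants build on fits a single projection from pooled within-class scatter. A minimal two-class sketch on synthetic, well-separated data follows; the toy data, seed, and midpoint threshold rule are illustrative assumptions, not the paper's LLDA/MLLDA algorithms.

```python
import numpy as np

def fisher_lda(X0, X1):
    """Classical two-class Fisher discriminant: w = Sw^{-1} (mu1 - mu0),
    where Sw is the pooled within-class scatter matrix. Local LDA
    variants would build such projections from a sample's neighbors
    instead of from the whole (global) training set."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0.T, bias=True) * len(X0) + np.cov(X1.T, bias=True) * len(X1)
    w = np.linalg.solve(Sw, mu1 - mu0)
    # Decision threshold at the midpoint of the projected class means.
    threshold = 0.5 * (X0 @ w).mean() + 0.5 * (X1 @ w).mean()
    return w, threshold

rng = np.random.default_rng(1)
X0 = rng.normal([0.0, 0.0], 0.5, size=(50, 2))  # class 0 cluster
X1 = rng.normal([8.0, 8.0], 0.5, size=(50, 2))  # class 1, well separated
w, t = fisher_lda(X0, X1)
acc = np.mean(np.concatenate([(X0 @ w) < t, (X1 @ w) > t]))
print(acc)  # 1.0 on this widely separated toy data
```

The Gaussian, globally consistent toy data is precisely the regime where classical LDA works; the paper's local framework targets the cases where these assumptions fail.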

  10. DyKOSMap: A framework for mapping adaptation between biomedical knowledge organization systems.

    PubMed

    Dos Reis, Julio Cesar; Pruski, Cédric; Da Silveira, Marcos; Reynaud-Delaître, Chantal

    2015-06-01

    Knowledge Organization Systems (KOS) and their associated mappings play a central role in several decision support systems. However, by virtue of knowledge evolution, KOS entities are modified over time, impacting mappings and potentially rendering them invalid. This requires semi-automatic methods to keep such semantic correspondences up-to-date at KOS evolution time. We define a complete and original framework based on formal heuristics that drives the adaptation of KOS mappings. Our approach takes into account the definition of established mappings, the evolution of KOS and the possible changes that can be applied to mappings. This study experimentally evaluates the proposed heuristics and the entire framework on realistic case studies borrowed from the biomedical domain, using official mappings between several biomedical KOSs. We demonstrate the overall performance of the approach over biomedical datasets of different characteristics and sizes. Our findings reveal the effectiveness in terms of precision, recall and F-measure of the suggested heuristics and methods defining the framework to adapt mappings affected by KOS evolution. The obtained results contribute to improving the quality of mappings over time. The proposed framework can adapt mappings largely automatically, thus facilitating the maintenance task. The implemented algorithms and tools support and minimize the work of users in charge of KOS mapping maintenance. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Approach for the Development of a Framework for the Identification of Activities of Daily Living Using Sensors in Mobile Devices.

    PubMed

    Pires, Ivan Miguel; Garcia, Nuno M; Pombo, Nuno; Flórez-Revuelta, Francisco; Spinsante, Susanna

    2018-02-21

    Sensors available on mobile devices allow the automatic identification of Activities of Daily Living (ADL). This paper describes an approach for the creation of a framework for the identification of ADL, taking into account several concepts, including data acquisition, data processing, data fusion, and pattern recognition. These concepts can be mapped onto different modules of the framework. The proposed framework should perform the identification of ADL without an Internet connection, performing these tasks locally on the mobile device and taking into account the hardware and software limitations of these devices. The main purpose of this paper is to present a new approach for the creation of a framework for the recognition of ADL, analyzing the sensors available on mobile devices and the existing methods available in the literature.
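    As an illustration of the data processing stage such a framework might contain, the sketch below extracts simple time-domain features from windowed accelerometer data. The sampling rate, window length, feature set, and synthetic signals are all assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def window_features(signal, fs, win_sec=2.0):
    """Split a 1-D accelerometer magnitude signal into fixed-length
    windows and compute simple time-domain features (mean, standard
    deviation, peak-to-peak) -- the kind of lightweight, on-device
    processing an ADL pipeline might run before pattern recognition."""
    n = int(win_sec * fs)
    windows = [signal[i:i+n] for i in range(0, len(signal) - n + 1, n)]
    return np.array([[w.mean(), w.std(), w.max() - w.min()] for w in windows])

fs = 50                                # 50 Hz sampling, plausible for phones
t = np.arange(0, 8, 1/fs)
rest = np.ones(len(t))                 # ~1 g at rest, no variation
walk = 1 + 0.3*np.sin(2*np.pi*2*t)     # synthetic 2 Hz gait oscillation
f_rest = window_features(rest, fs)
f_walk = window_features(walk, fs)
# Even these crude features separate the two activities: the standard
# deviation is zero at rest and clearly positive while "walking".
print(f_rest[:, 1].max(), f_walk[:, 1].min())
```

A classifier in the pattern recognition module would then operate on such per-window feature vectors rather than on raw samples.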

  12. Approach for the Development of a Framework for the Identification of Activities of Daily Living Using Sensors in Mobile Devices

    PubMed Central

    Pombo, Nuno

    2018-01-01

    Sensors available on mobile devices allow the automatic identification of Activities of Daily Living (ADL). This paper describes an approach for the creation of a framework for the identification of ADL, taking into account several concepts, including data acquisition, data processing, data fusion, and pattern recognition. These concepts can be mapped onto different modules of the framework. The proposed framework should perform the identification of ADL without an Internet connection, performing these tasks locally on the mobile device and taking into account the hardware and software limitations of these devices. The main purpose of this paper is to present a new approach for the creation of a framework for the recognition of ADL, analyzing the sensors available on mobile devices and the existing methods available in the literature. PMID:29466316

  13. Evaluation framework for 16 earmarked projects in Washington State

    DOT National Transportation Integrated Search

    2007-05-01

    This report documents the results of applying a previously developed, standardized approach for evaluating advanced traveler information systems (ATIS) projects to a much more diverse group of 16 intelligent transportation systems (ITS) projects. The...

  14. Attributes of innovations and approaches to scalability - lessons from a national program to extend the scope of practice of health professionals.

    PubMed

    Masso, Malcolm; Thompson, Cristina

    2016-01-01

    The context for the paper was the evaluation of a national program in Australia to investigate extended scopes of practice for health professionals (paramedics, physiotherapists, and nurses). The design of the evaluation involved a mixed-methods approach with multiple data sources. Four multidisciplinary models of extended scope of practice were tested over an 18-month period, involving 26 organizations, 224 health professionals, and 36 implementation sites. The evaluation focused on what could be learned to inform scaling up the extended scopes of practice on a national scale. The evaluation findings were used to develop a conceptual framework for use by clinicians, managers, and policy makers to determine appropriate strategies for scaling up effective innovations. Development of the framework was informed by the literature on the diffusion of innovations, particularly an understanding that certain attributes of innovations influence adoption. The framework recognizes the role played by three groups of stakeholders: evidence producers, evidence influencers, and evidence adopters. The use of the framework is illustrated with four case studies from the evaluation. The findings demonstrate how the scaling up of innovations can be influenced by three quite distinct approaches: letting adoption take place in an uncontrolled, unplanned way; actively helping the process of adoption; or taking deliberate steps to ensure that adoption takes place. Development of the conceptual framework resulted in two sets of questions to guide decisions about scalability, one for those considering whether to adopt the innovation (evidence adopters), and the other for those trying to decide on the optimal strategy for dissemination (evidence influencers).

  15. A Framework for Hierarchical Perception-Action Learning Utilizing Fuzzy Reasoning.

    PubMed

    Windridge, David; Felsberg, Michael; Shaukat, Affan

    2013-02-01

    Perception-action (P-A) learning is an approach to cognitive system building that seeks to reduce the complexity associated with conventional environment-representation/action-planning approaches. Instead, actions are directly mapped onto the perceptual transitions that they bring about, eliminating the need for intermediate representation and significantly reducing training requirements. We here set out a very general learning framework for cognitive systems in which online learning of the P-A mapping may be conducted within a symbolic processing context, so that complex contextual reasoning can influence the P-A mapping. In utilizing a variational calculus approach to define a suitable objective function, the P-A mapping can be treated as an online learning problem via gradient descent using partial derivatives. Our central theoretical result is to demonstrate top-down modulation of low-level perceptual confidences via the Jacobian of the higher levels of a subsumptive P-A hierarchy. Thus, the separation of the Jacobian as a multiplying factor between levels within the objective function naturally enables the integration of abstract symbolic manipulation in the form of fuzzy deductive logic into the P-A mapping learning. We experimentally demonstrate that the resulting framework achieves significantly better accuracy than using P-A learning without top-down modulation. We also demonstrate that it permits novel forms of context-dependent multilevel P-A mapping, applying the mechanism in the context of an intelligent driver assistance system.

  16. ConnectViz: Accelerated Approach for Brain Structural Connectivity Using Delaunay Triangulation.

    PubMed

    Adeshina, A M; Hashim, R

    2016-03-01

    Stroke is a cardiovascular disease with high mortality and long-term disability worldwide. Normal functioning of the brain is dependent on the adequate supply of oxygen and nutrients to the brain's complex network through the blood vessels. Stroke, occasionally a hemorrhagic stroke, ischemia or other blood vessel dysfunction, can affect patients during a cerebrovascular incident. Structurally, the left and right carotid arteries and the left and right vertebral arteries are responsible for supplying blood to the brain, scalp and face. However, a number of impairments in the function of the frontal lobes may occur as a result of any decrease in the flow of blood through one of the internal carotid arteries. Such impairment commonly results in numbness, weakness or paralysis. Recently, the concept of the brain's wiring representation, the connectome, was introduced. However, construction and visualization of such a brain network requires tremendous computation. Consequently, previously proposed approaches have been identified with the common problems of high memory consumption and slow execution. Furthermore, interactivity in previously proposed frameworks for brain networks is also an outstanding issue. This study proposes an accelerated approach for brain connectomic visualization based on the graph theory paradigm using the compute unified device architecture (CUDA), extending the previously proposed SurLens Visualization and computer-aided hepatocellular carcinoma frameworks. The accelerated brain structural connectivity framework was evaluated with stripped brain datasets from the Department of Surgery, University of North Carolina, Chapel Hill, USA. Significantly, our proposed framework is able to generate and extract points and edges of datasets, display nodes and edges in the datasets in the form of a network, and clearly map data volume to the corresponding brain surface. Moreover, with the framework, surfaces of the dataset were displayed simultaneously with the nodes and edges. The framework is very efficient in providing greater interactivity as a way of representing the nodes and edges intuitively, all achieved at a considerably interactive speed for instantaneous mapping of the datasets' features. Uniquely, the connectomic algorithm performed remarkably fast with normal hardware requirement specifications.
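
The edge-extraction step of a connectivity pipeline like this can be sketched with an off-the-shelf Delaunay triangulation. The sketch below uses SciPy on the CPU rather than a CUDA implementation, and the point coordinates are invented for illustration; it only shows how triangulation simplices yield the unique node-node edges of a network.

```python
import itertools

import numpy as np
from scipy.spatial import Delaunay

def delaunay_edges(points):
    """Unique node-node edges of a Delaunay triangulation: each 2-D simplex
    (triangle) contributes its three pairwise edges; the set collapses edges
    shared between neighbouring simplices."""
    tri = Delaunay(points)
    edges = set()
    for simplex in tri.simplices:
        for a, b in itertools.combinations(simplex, 2):
            edges.add((int(min(a, b)), int(max(a, b))))
    return edges

# invented "node" coordinates: a triangle (nodes 0-2) with one interior node (3)
pts = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 2.0], [1.0, 0.5]])
edges = delaunay_edges(pts)   # 3 boundary edges + 3 edges to the interior node
```

A GPU version would parallelize the per-simplex loop, but the edge set it produces is the same.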

  17. ConnectViz: Accelerated approach for brain structural connectivity using Delaunay triangulation.

    PubMed

    Adeshina, A M; Hashim, R

    2015-02-06

    Stroke is a cardiovascular disease with high mortality and long-term disability worldwide. Normal functioning of the brain is dependent on the adequate supply of oxygen and nutrients to the brain's complex network through the blood vessels. Stroke, occasionally a hemorrhagic stroke, ischemia or other blood vessel dysfunction, can affect patients during a cerebrovascular incident. Structurally, the left and right carotid arteries and the left and right vertebral arteries are responsible for supplying blood to the brain, scalp and face. However, a number of impairments in the function of the frontal lobes may occur as a result of any decrease in the flow of blood through one of the internal carotid arteries. Such impairment commonly results in numbness, weakness or paralysis. Recently, the concept of the brain's wiring representation, the connectome, was introduced. However, construction and visualization of such a brain network requires tremendous computation. Consequently, previously proposed approaches have been identified with the common problems of high memory consumption and slow execution. Furthermore, interactivity in previously proposed frameworks for brain networks is also an outstanding issue. This study proposes an accelerated approach for brain connectomic visualization based on the graph theory paradigm using Compute Unified Device Architecture (CUDA), extending the previously proposed SurLens Visualization and Computer Aided Hepatocellular Carcinoma (CAHECA) frameworks. The accelerated brain structural connectivity framework was evaluated with stripped brain datasets from the Department of Surgery, University of North Carolina, Chapel Hill, United States. Significantly, our proposed framework is able to generate and extract points and edges of datasets, display nodes and edges in the datasets in the form of a network, and clearly map data volume to the corresponding brain surface. Moreover, with the framework, surfaces of the dataset were displayed simultaneously with the nodes and edges. The framework is very efficient in providing greater interactivity as a way of representing the nodes and edges intuitively, all achieved at a considerably interactive speed for instantaneous mapping of the datasets' features. Uniquely, the connectomic algorithm performed remarkably fast with normal hardware requirement specifications.

  18. 'Governance of' and 'Governance by': implementing a clinical governance framework in an area mental health service.

    PubMed

    O'Connor, Nick; Paton, Michael

    2008-04-01

    A framework developed to promote the understanding and application of clinical governance principles in an area mental health service is described. The framework is operationalized through systems, processes, roles and responsibilities. The development of an explicit and operationalizable framework for clinical governance arose from the authors' experiences in leading and managing mental health services. There is a particular emphasis on improvement of quality of care and patient safety. The framework is informed by recent developments in thinking about clinical governance, including key documents from Australia and the United Kingdom. The operational nature of the framework allows for key components of clinical governance to be described explicitly, communicated effectively, and continually tested and improved. Further consideration and assessment of the value of differing approaches to this task are required. For example, a general, illustrative approach to raise clinician awareness can be contrasted with prescriptive and specified approaches which progressively encompass the many functions and processes of a mental health service. Mental health clinicians and managers can be guided by a framework that will ensure safe, high quality and continually improving processes of care.

  19. Meta-Synthetic Support Frameworks for Reuse of Government Information Resources on City Travel and Traffic: The Case of Beijing

    ERIC Educational Resources Information Center

    An, Xiaomi; Xu, Shaotong; Mu, Yong; Wang, Wei; Bai, Xian Yang; Dawson, Andy; Han, Hongqi

    2012-01-01

    Purpose: The purpose of this paper is to propose meta-synthetic ideas and knowledge asset management approaches to build a comprehensive strategic framework for Beijing City in China. Design/methodology/approach: Methods include a review of relevant literature in both English and Chinese, case studies of different types of support frameworks in…

  20. The Analysing Children's Creative Thinking Framework: Development of an Observation-Led Approach to Identifying and Analysing Young Children's Creative Thinking

    ERIC Educational Resources Information Center

    Robson, Sue

    2014-01-01

    Increased international recognition of the value of supporting creative thinking suggests the value of development of approaches to its identification in children. Development of an observation-led framework, the Analysing Children's Creative Thinking (ACCT) framework, is described, and a case made for the validity of inferring creative thinking…

  1. A Delphi approach to developing a core competency framework for family practice registered nurses in Ontario.

    PubMed

    Moaveni, Azadeh; Gallinaro, Anna; Conn, Lesley Gotlib; Callahan, Sheilagh; Hammond, Melanie; Oandasan, Ivy

    2010-12-01

    This paper describes the results of a Delphi panel process to gain consensus on a role description and competency framework for family practice registered nurses (FP-RNs) in Ontario. Based on the findings from interviews and focus groups with family practice registered nurses and their inter-professional colleagues throughout Ontario, a core competency framework for FP-RNs emerged consisting of six distinct roles - Professional, Expert, Communicator, Synergist, Health Educator and Lifelong Learner - with accompanying enabling competency statements. This framework was refined and validated by a panel of experts from various nursing and family medicine associations and organizations through a Delphi consensus process. This core competency framework for FP-RNs was developed as a stepping stone for clarifying this very important and poorly understood role in family practice. As a result of this research, we expect a greater acknowledgement of the contributions and expertise of the FP-RN as well as the need to celebrate and profile this role. This work has already led to the establishment of a network of stakeholders from nursing organizations in Ontario who are considering opportunities to move the development and use of the competency framework forward.

  2. How can we get close to zero? The potential contribution of biomedical prevention and the investment framework towards an effective response to HIV.

    PubMed

    Stover, John; Hallett, Timothy B; Wu, Zunyou; Warren, Mitchell; Gopalappa, Chaitra; Pretorius, Carel; Ghys, Peter D; Montaner, Julio; Schwartländer, Bernhard

    2014-01-01

    In 2011 an Investment Framework was proposed that described how the scale-up of key HIV interventions could dramatically reduce new HIV infections and deaths in low and middle income countries by 2015. This framework included ambitious coverage goals for prevention and treatment services resulting in a reduction of new HIV infections by more than half. However, it also estimated a leveling in the number of new infections at about 1 million annually after 2015. We modeled how the response to AIDS can be further expanded by scaling up antiretroviral treatment (ART) within the framework provided by the 2013 WHO treatment guidelines. We further explored the potential contributions of new prevention technologies: 'Test and Treat', pre-exposure prophylaxis and an HIV vaccine. Immediate aggressive scale up of existing approaches including the 2013 WHO guidelines could reduce new infections by 80%. A 'Test and Treat' approach could further reduce new infections. This could be further enhanced by a future highly effective pre-exposure prophylaxis and an HIV vaccine, so that a combination of all four approaches could reduce new infections to as low as 80,000 per year by 2050 and annual AIDS deaths to 260,000. In a set of ambitious scenarios, we find that immediate implementation of the 2013 WHO antiretroviral therapy guidelines could reduce new HIV infections by 80%. Further reductions may be achieved by moving to a 'Test and Treat' approach, and eventually by adding a highly effective pre-exposure prophylaxis and an HIV vaccine, if they become available.

  3. Deductive Evaluation: Implicit Code Verification With Low User Burden

    NASA Technical Reports Server (NTRS)

    Di Vito, Ben L.

    2016-01-01

    We describe a framework for symbolically evaluating C code using a deductive approach that discovers and proves program properties. The framework applies Floyd-Hoare verification principles in its treatment of loops, with a library of iteration schemes serving to derive loop invariants. During evaluation, theorem proving is performed on-the-fly, obviating the generation of verification conditions normally needed to establish loop properties. A PVS-based prototype is presented along with results for sample C functions.
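
The role played by a loop invariant derived from an iteration scheme can be illustrated with a toy example. The invariant below is a standard textbook one and is only an illustration of the idea, not the paper's PVS machinery; runtime asserts stand in for the on-the-fly theorem proving.

```python
def sum_first(n):
    """Sum of 0..n-1 with a Floyd-Hoare style loop invariant checked at runtime.

    Invariant: on entry to each iteration, s == i*(i-1)//2, i.e. s already
    holds the sum of the first i naturals. The asserts stand in for what a
    theorem prover would discharge on the fly.
    """
    s, i = 0, 0
    while i < n:
        assert s == i * (i - 1) // 2      # invariant holds on loop entry
        s += i
        i += 1
    assert s == n * (n - 1) // 2          # invariant + exit condition => postcondition
    return s
```

In the deductive setting, once the invariant is proved inductive, the postcondition follows without generating separate verification conditions.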

  4. Graphene/graphene-tube nanocomposites templated from cage-containing metal-organic frameworks for oxygen reduction in Li-O₂ batteries.

    PubMed

    Li, Qing; Xu, Ping; Gao, Wei; Ma, Shuguo; Zhang, Guoqi; Cao, Ruiguo; Cho, Jaephil; Wang, Hsing-Lin; Wu, Gang

    2014-03-05

    Nitrogen-doped graphene/graphene-tube nanocomposites are prepared by a high-temperature approach using a newly designed cage-containing metal-organic framework (MOF) to template nitrogen/carbon (dicyandiamide) and iron precursors. The resulting N-Fe-MOF catalysts universally exhibit high oxygen-reduction activity in acidic, alkaline, and non-aqueous electrolytes and superior cathode performance in Li-O2 batteries. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Ensemble Semi-supervised Frame-work for Brain Magnetic Resonance Imaging Tissue Segmentation.

    PubMed

    Azmi, Reza; Pishgoo, Boshra; Norozi, Narges; Yeganeh, Samira

    2013-04-01

    Brain magnetic resonance image (MRI) tissue segmentation is one of the most important parts of clinical diagnostic tools. Pixel classification methods have frequently been used in image segmentation with both supervised and unsupervised approaches. Supervised segmentation methods lead to high accuracy, but they need a large amount of labeled data, which is hard, expensive, and slow to obtain; moreover, they cannot use unlabeled data to train classifiers. On the other hand, unsupervised segmentation methods have no prior knowledge and lead to a low level of performance. However, semi-supervised learning, which uses a few labeled data together with a large amount of unlabeled data, achieves higher accuracy with less effort. In this paper, we propose an ensemble semi-supervised framework for segmenting brain MRI tissues that uses the results of several semi-supervised classifiers simultaneously. Selecting appropriate classifiers has a significant role in the performance of this framework. Hence, we present two semi-supervised algorithms, expectation filtering maximization and MCo_Training, which are improved versions of the semi-supervised methods expectation maximization and Co_Training and increase segmentation accuracy. Afterward, we use these improved classifiers together with a graph-based semi-supervised classifier as components of the ensemble framework. Experimental results show that the performance of segmentation in this approach is higher than that of both supervised methods and the individual semi-supervised classifiers.
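
The core idea of semi-supervised pixel classification — a few labels plus confident self-labeling of the unlabeled pool — can be sketched in a few lines. This toy uses nearest-centroid classification on synthetic 2-D "tissue" clusters; it is not the paper's EFM or MCo_Training algorithm, and all data are invented.

```python
import numpy as np

# Toy self-training sketch (hypothetical, not the paper's method): a
# nearest-centroid classifier starts from 4 labeled samples and repeatedly
# labels its most confident predictions on the unlabeled pool.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, size=(50, 2)),   # synthetic "tissue" class 0
               rng.normal(3.0, 0.5, size=(50, 2))])  # synthetic "tissue" class 1
y_true = np.repeat([0, 1], 50)

y = np.full(100, -1)                        # -1 marks unlabeled samples
y[[0, 1, 50, 51]] = y_true[[0, 1, 50, 51]]  # only 4 labeled samples

for _ in range(10):
    unl = np.where(y == -1)[0]
    if unl.size == 0:
        break
    cents = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
    d = np.linalg.norm(X[:, None, :] - cents[None, :, :], axis=2)
    pred = d.argmin(axis=1)
    conf = np.abs(d[:, 0] - d[:, 1])          # distance margin as confidence
    take = unl[np.argsort(-conf[unl])[:10]]   # 10 most confident unlabeled
    y[take] = pred[take]

accuracy = (y == y_true).mean()
```

An ensemble framework combines several such classifiers (e.g., by voting) instead of relying on one.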

  6. Framework for e-learning assessment in dental education: a global model for the future.

    PubMed

    Arevalo, Carolina R; Bayne, Stephen C; Beeley, Josie A; Brayshaw, Christine J; Cox, Margaret J; Donaldson, Nora H; Elson, Bruce S; Grayden, Sharon K; Hatzipanagos, Stylianos; Johnson, Lynn A; Reynolds, Patricia A; Schönwetter, Dieter J

    2013-05-01

    The framework presented in this article demonstrates strategies for a global approach to e-curricula in dental education by considering a collection of outcome assessment tools. By combining the outcomes for overall assessment, a global model for a pilot project that applies e-assessment tools to virtual learning environments (VLE), including haptics, is presented. Assessment strategies from two projects, HapTEL (Haptics in Technology Enhanced Learning) and UDENTE (Universal Dental E-learning), act as case-user studies that have helped develop the proposed global framework. They incorporate additional assessment tools and include evaluations from questionnaires and stakeholders' focus groups. These measure, in a standardized manner, each of the factors affecting the classical teaching/learning theory framework as defined by Entwistle. A mathematical combinatorial approach is proposed to join these results together as a global assessment. With the use of haptic-based simulation learning, exercises for tooth preparation assessing enamel and dentine were compared to plastic teeth in manikins. Equivalence of student performance for haptic versus traditional preparation methods was established, supporting the validity of the haptic solution for performing these exercises. Further data collected from HapTEL are still being analyzed, and pilots are being conducted to validate the proposed test measures. Initial results have been encouraging, but clearly the need persists to develop additional e-assessment methods for new learning domains.

  7. Assessing Students' Perceptions of Campus Community: A Focus Group Approach. Professional File. Number 95, Spring 2005

    ERIC Educational Resources Information Center

    Cheng, David X.

    2005-01-01

    This paper offers a focus group approach to the understanding of student perceptions of campus community. Using the Strange and Banning (2001) framework of community, the author argues that students' sense of campus community should be studied as it exists within the institutional environment. The results of the study include: 1) There is a strong…

  8. Using a Systematic Approach and Theoretical Framework to Design a Curriculum for the Shaping Healthy Choices Program.

    PubMed

    Linnell, Jessica D; Zidenberg-Cherr, Sheri; Briggs, Marilyn; Scherr, Rachel E; Brian, Kelley M; Hillhouse, Carol; Smith, Martin H

    2016-01-01

    To examine the use of a systematic approach and theoretical framework to develop an inquiry-based, garden-enhanced nutrition curriculum for the Shaping Healthy Choices Program. Curriculum development occurred in 3 steps: identification of learning objectives, determination of evidence of learning, and activity development. Curriculum activities were further refined through pilot-testing, which was conducted in 2 phases. Formative data collected during pilot-testing resulted in improvements to activities. Using a systematic, iterative process resulted in a curriculum called Discovering Healthy Choices, which has a strong foundation in Social Cognitive Theory and constructivist learning theory. Furthermore, the Backward Design method provided the design team with a systematic approach to ensure activities addressed targeted learning objectives and overall Shaping Healthy Choices Program goals. The process by which a nutrition curriculum is developed may have a direct effect on student outcomes. Processes by which nutrition curricula are designed and learning objectives are selected, and how theory and pedagogy are applied should be further investigated so that effective approaches to developing garden-enhanced nutrition interventions can be determined and replicated. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  9. An approach to multiscale modelling with graph grammars.

    PubMed

    Ong, Yongzhi; Streit, Katarína; Henke, Michael; Kurth, Winfried

    2014-09-01

    Functional-structural plant models (FSPMs) simulate biological processes at different spatial scales. Methods exist for multiscale data representation and modification, but the advantages of using multiple scales in the dynamic aspects of FSPMs remain unclear. Results from multiscale models in various other areas of science that share fundamental modelling issues with FSPMs suggest that potential advantages do exist, and this study therefore aims to introduce an approach to multiscale modelling in FSPMs. A three-part graph data structure and grammar is revisited, and presented with a conceptual framework for multiscale modelling. The framework is used for identifying roles, categorizing and describing scale-to-scale interactions, thus allowing alternative approaches to model development as opposed to correlation-based modelling at a single scale. Reverse information flow (from macro- to micro-scale) is catered for in the framework. The methods are implemented within the programming language XL. Three example models are implemented using the proposed multiscale graph model and framework. The first illustrates the fundamental usage of the graph data structure and grammar, the second uses probabilistic modelling for organs at the fine scale in order to derive crown growth, and the third combines multiscale plant topology with ozone trends and metabolic network simulations in order to model juvenile beech stands under exposure to a toxic trace gas. The graph data structure supports data representation and grammar operations at multiple scales. The results demonstrate that multiscale modelling is a viable method in FSPM and an alternative to correlation-based modelling. Advantages and disadvantages of multiscale modelling are illustrated by comparisons with single-scale implementations, leading to motivations for further research in sensitivity analysis and run-time efficiency for these models.
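
The scale-to-scale interactions described above can be illustrated with a minimal two-scale graph. Node names and attributes here are hypothetical stand-ins written in plain Python; the models in the paper use the XL language and its three-part graph structure.

```python
# Hypothetical two-scale plant graph: a macro-scale "branch" node refines into
# micro-scale organ nodes. All names and numbers are invented for illustration.
graph = {
    "branch": {"scale": "macro", "refines_to": ["leaf_a", "leaf_b", "internode"]},
    "leaf_a": {"scale": "micro", "biomass": 0.4},
    "leaf_b": {"scale": "micro", "biomass": 0.3},
    "internode": {"scale": "micro", "biomass": 1.1},
}

def upscale_biomass(graph, macro_node):
    """Micro-to-macro information flow: aggregate organ biomass upward."""
    return sum(graph[m]["biomass"] for m in graph[macro_node]["refines_to"])

def downscale_stress(graph, macro_node, stress):
    """Macro-to-micro (reverse) flow: push a crown-level stress onto each organ."""
    for m in graph[macro_node]["refines_to"]:
        graph[m]["stress"] = stress

branch_biomass = upscale_biomass(graph, "branch")   # fine scale drives coarse scale
downscale_stress(graph, "branch", 0.8)              # coarse scale drives fine scale
```

The two functions correspond to the framework's two directions of information flow; a grammar-based implementation would express them as graph rewrite rules rather than explicit loops.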

  10. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and temporal-logic statements to be checked and verified regarding these specifications, making the verification process faster and cost-effective. Particularly, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements, and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process.
Copyright 2010 Elsevier Inc. All rights reserved.

  11. On process optimization considering LCA methodology.

    PubMed

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to research the state-of-the-art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept to define the system boundaries is the most used approach in practice, instead of the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is linearly expressed by the characterization factors; thus, synergistic effects of the contaminants are neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most used approach for dealing with this kind of problem, where the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing costs of environmental externalities. Finally, a trend towards dealing with multi-period scenarios in integrated LCA-optimization frameworks can be distinguished, providing more accurate results as data become available. Copyright © 2011 Elsevier Ltd. All rights reserved.
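
The ε-constraint method mentioned above can be shown on a toy bi-objective problem. The candidate designs and their cost/impact numbers below are invented: minimizing cost subject to a bound on the environmental objective, and sweeping that bound, recovers the Pareto set.

```python
# Hypothetical epsilon-constraint sketch for a bi-objective design problem.
# Candidate designs and their cost/impact values are invented for illustration.
designs = [
    {"name": "A", "cost": 100.0, "impact": 9.0},
    {"name": "B", "cost": 120.0, "impact": 6.0},
    {"name": "C", "cost": 150.0, "impact": 4.0},
    {"name": "D", "cost": 160.0, "impact": 8.0},  # dominated by B: dearer and dirtier
]

def eps_constraint(designs, eps):
    """Minimize cost subject to the environmental objective bounded by eps."""
    feasible = [d for d in designs if d["impact"] <= eps]
    return min(feasible, key=lambda d: d["cost"]) if feasible else None

# Sweeping the bound epsilon traces the Pareto set of (cost, impact) trade-offs;
# the dominated design D is never selected.
pareto = []
for eps in (9.0, 6.0, 4.0):
    best = eps_constraint(designs, eps)
    if best is not None and best not in pareto:
        pareto.append(best)
```

In real LCA-optimization studies the inner minimization is a full process model solved by an optimizer rather than a lookup over a short list, but the ε-sweep works the same way.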

  12. FEAST fundamental framework for electronic structure calculations: Reformulation and solution of the muffin-tin problem

    NASA Astrophysics Data System (ADS)

    Levin, Alan R.; Zhang, Deyin; Polizzi, Eric

    2012-11-01

    In a recent article (Polizzi (2009) [15]), the FEAST algorithm was presented as a general-purpose eigenvalue solver which is ideally suited for addressing the numerical challenges in electronic structure calculations. Here, FEAST is presented beyond the “black-box” solver as a fundamental modeling framework which can naturally address the original numerical complexity of the electronic structure problem as formulated by Slater in 1937 [3]. The non-linear eigenvalue problem arising from the muffin-tin decomposition of the real-space domain is first derived and then reformulated to be solved exactly within the FEAST framework. This new framework is presented as a fundamental and practical solution for performing both accurate and scalable electronic structure calculations, bypassing the various issues of using traditional approaches such as linearization and pseudopotential techniques. A finite element implementation of this FEAST framework along with simulation results for various molecular systems is also presented and discussed.

  13. A Latent Class Approach to Estimating Test-Score Reliability

    ERIC Educational Resources Information Center

    van der Ark, L. Andries; van der Palm, Daniel W.; Sijtsma, Klaas

    2011-01-01

    This study presents a general framework for single-administration reliability methods, such as Cronbach's alpha, Guttman's lambda-2, and method MS. This general framework was used to derive a new approach to estimating test-score reliability by means of the unrestricted latent class model. This new approach is the latent class reliability…
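
Cronbach's alpha, one of the single-administration reliability methods this framework covers, is easy to state concretely. The score matrix below is invented for illustration.

```python
from statistics import variance  # sample variance (n-1 denominator)

def cronbach_alpha(scores):
    """Cronbach's alpha for a persons-by-items score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(scores[0])
    item_vars = [variance(item) for item in zip(*scores)]
    total_var = variance([sum(person) for person in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# invented 4-person, 3-item data for illustration
scores = [[1, 1, 1], [2, 1, 2], [2, 2, 3], [3, 2, 3]]
alpha = cronbach_alpha(scores)   # ≈ 0.915
```

The latent class approach the study proposes estimates reliability differently, but fits the same single-administration framework as alpha and lambda-2.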

  14. A Hybrid Approach for Supporting Adaptivity in E-Learning Environments

    ERIC Educational Resources Information Center

    Al-Omari, Mohammad; Carter, Jenny; Chiclana, Francisco

    2016-01-01

    Purpose: The purpose of this paper is to identify a framework to support adaptivity in e-learning environments. The framework reflects a novel hybrid approach incorporating the concept of the event-condition-action (ECA) model and intelligent agents. Moreover, a system prototype is developed reflecting the hybrid approach to supporting adaptivity…

  15. Sustainability assessment framework for scenarios – SAFS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arushanyan, Yevgeniya, E-mail: yevgeniya.arushanyan@abe.kth.se; KTH Royal Institute of Technology, Centre for Sustainable Communications; Ekener, Elisabeth

    To address current challenges regarding sustainable development and support planning for this form of development, new learning about different possible futures and their potential sustainability implications is needed. One way of facilitating this learning is by combining the futures studies and sustainability assessment (SA) research fields. This paper presents the sustainability assessment framework for scenarios (SAFS), a method developed for assessing the environmental and social risks and opportunities of future scenarios, provides guidelines for its application and demonstrates how the framework can be applied. SAFS suggests assessing environmental and social aspects using a consumption perspective and a life cycle approach, and provides qualitative results. SAFS does not suggest any modelling using precise data, but instead offers guidelines on how to carry out a qualitative assessment, where both the process of assessing and the outcome of the assessment are valuable and can be used as a basis for discussion. The benefits, drawbacks and potential challenges of applying SAFS are also discussed in the paper. SAFS uses systems thinking, looking at future societies as a whole and considering both environmental and social consequences. This encourages researchers and decision-makers to consider the whole picture, and not just individual elements, when considering different futures. - Highlights: • The paper presents a new methodological framework for qualitative sustainability assessment of future scenarios with transformative changes. • The framework suggests qualitative assessment with a consumption perspective and a life cycle approach. • The paper presents the framework and provides guidelines for its application. • The paper demonstrates with an example how the framework can be applied. • The benefits, drawbacks and challenges of the framework application and the need for further development are discussed.

  16. A Framework for Characterizing eHealth Literacy Demands and Barriers

    PubMed Central

    Chan, Connie V

    2011-01-01

    Background Consumer eHealth interventions are of a growing importance in the individual management of health and health behaviors. However, a range of access, resources, and skills barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies, and communicating health concepts effectively. Objective We propose a theoretical and methodological framework for characterizing complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. Methods We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained and potential analyses that can be performed using this method. Results The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. 
Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase accuracy of predictions. Conclusions The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum. PMID:22094891

  17. Annotating images by mining image search results.

    PubMed

    Wang, Xin-Jing; Zhang, Lei; Li, Xirong; Ma, Wei-Ying

    2008-11-01

    Although it has been studied for years by the computer vision and machine learning communities, image annotation is still far from practical. In this paper, we propose a novel attempt at model-free image annotation: a data-driven approach that annotates images by mining their search results. Some 2.4 million images with their surrounding text are collected from a few photo forums to support this approach. The entire process is formulated in a divide-and-conquer framework where a query keyword is provided along with the uncaptioned image to improve both effectiveness and efficiency. This is helpful when the collected data set is not dense everywhere. In this sense, our approach contains three steps: 1) the search process to discover visually and semantically similar search results, 2) the mining process to identify salient terms from textual descriptions of the search results, and 3) the annotation rejection process to filter out noisy terms yielded by step 2. To ensure real-time annotation, two key techniques are leveraged: one is to map the high-dimensional image visual features into hash codes; the other is to implement the system in a distributed fashion, with the search and mining processes provided as Web services. As a typical result, the entire process finishes in less than 1 second. Since no training data set is required, our approach enables annotation with an unlimited vocabulary and is highly scalable and robust to outliers. Experimental results on both real Web images and a benchmark image data set show the effectiveness and efficiency of the proposed algorithm. It is also worth noting that, although the entire approach is illustrated within the divide-and-conquer framework, a query keyword is not crucial to our current implementation. We provide experimental results to prove this.
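
    The hashing step described above can be sketched with random-hyperplane (sign) locality-sensitive hashing, a standard way to map high-dimensional visual features to compact bit codes; the feature dimension, bit width, and vectors below are invented for illustration, not taken from the paper.

```python
import numpy as np

def make_hasher(dim, n_bits, seed=0):
    """Random-hyperplane (sign) LSH: nearby vectors get similar bit codes."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((n_bits, dim))
    return lambda v: tuple((planes @ v > 0).astype(int))

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

hasher = make_hasher(dim=64, n_bits=32)
rng = np.random.default_rng(1)
query = rng.standard_normal(64)                 # feature of the uncaptioned image
near = query + 0.05 * rng.standard_normal(64)   # a visually similar image
far = rng.standard_normal(64)                   # an unrelated image

d_near = hamming(hasher(query), hasher(near))
d_far = hamming(hasher(query), hasher(far))
print(d_near, d_far)  # the similar image should differ in far fewer bits
```

    Hamming distance between codes then stands in for visual similarity during the search step, which is what makes sub-second lookup over millions of images plausible.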

  18. A Flexible Socioeconomic Scenarios Framework for the Study of Plausible Arctic Futures

    NASA Astrophysics Data System (ADS)

    Reissell, A. K.; Peters, G. P.; Riahi, K.; Kroglund, M.; Lovecraft, A. L.; Nilsson, A. E.; Preston, B. L.; van Ruijven, B. J.

    2016-12-01

    Future developments of the Arctic region are associated with different drivers of change - climate, environmental, and socio-economic - and their interactions, and are highly uncertain. This uncertainty poses challenges for decision-making, calling for the development of new analytical frameworks. Scenarios - coherent narratives describing potential futures, pathways to those futures, and drivers of change along the way - can be used to explore the consequences of the key uncertainties, particularly in the long term. In a participatory scenarios workshop, we used both top-down and bottom-up approaches for the development of a flexible socioeconomic scenarios framework. The top-down approach was linked to the global Integrated Assessment Modeling framework and its Shared Socio-Economic Pathways (SSPs), developing an Arctic extension of the set of five storylines on the main socioeconomic uncertainties in global climate change research. The bottom-up approach included participatory development of narratives originating from within the Arctic region. To extend the global SSPs to the regional level, we compared the key elements in the global SSPs (Population, Human Development, Economy & Lifestyle, Policies & Institutions, Technology, and Environment & Natural Resources) with key elements in the Arctic. Additional key elements for the Arctic scenarios include, for example, seasonal migration, the large role of traditional knowledge and culture, mixed economy, nested governance structure, human and environmental security, and the quality of infrastructure. The bottom-up results suggested that the scenarios developed independently of the SSPs could be mapped back to the SSPs, demonstrating consistency with respect to representing similar boundary conditions. The two approaches are complementary: the top-down approach can be used to set the global socio-economic and climate boundary conditions, while the bottom-up approach provides the regional context.
One key uncertainty and driving force is the demand for resources (global or regional), which was mapped against the role of governance as well as the adaptive and transformative capacity of actors within the Arctic. Resource demand has a significant influence on the society, culture, economy, and environment of the Arctic.

  19. Development of framework for sustainable Lean implementation: an ISM approach

    NASA Astrophysics Data System (ADS)

    Jadhav, Jagdish Rajaram; Mantha, S. S.; Rane, Santosh B.

    2014-07-01

    The survival of any organization depends upon its competitive edge. Even though Lean is one of the most powerful quality improvement methodologies, nearly two-thirds of Lean implementations result in failure, and less than one-fifth of those implemented sustain their results. One of the most significant tasks of top management is to identify, understand and deploy significant Lean practices such as quality circles, Kanban, and just-in-time purchasing. The term `bundle' is used to group inter-related and internally consistent Lean practices. Eight significant Lean practice bundles have been identified based on the literature reviewed and the opinion of experts. The order of execution of Lean practice bundles is very important, so Lean practitioners must be able to understand the interrelationships between these practice bundles. The objective of this paper is to develop a framework for sustainable Lean implementation using an interpretive structural modelling (ISM) approach.
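
    For readers unfamiliar with the ISM machinery the paper applies, the core computation is a reachability matrix (the transitive closure of the pairwise "leads to" relation among practice bundles) followed by a level partition. The sketch below uses a hypothetical four-bundle chain, not the paper's eight bundles.

```python
def reachability(adj):
    """Transitive closure (Warshall) of a directed 0/1 adjacency matrix,
    including self-reachability, as used in interpretive structural modelling."""
    n = len(adj)
    r = [[1 if (adj[i][j] or i == j) else 0 for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

def level_partition(r):
    """ISM level partition: elements whose reachability set (within the
    remaining elements) is contained in their antecedent set form the top level."""
    n = len(r)
    remaining = set(range(n))
    levels = []
    while remaining:
        level = set()
        for i in remaining:
            reach = {j for j in remaining if r[i][j]}
            ante = {j for j in remaining if r[j][i]}
            if reach <= ante:
                level.add(i)
        levels.append(sorted(level))
        remaining -= level
    return levels

# Hypothetical "leads to" relations among four practice bundles: 0 -> 1 -> 2 -> 3.
adj = [[0, 1, 0, 0],
       [0, 0, 1, 0],
       [0, 0, 0, 1],
       [0, 0, 0, 0]]
print(level_partition(reachability(adj)))  # [[3], [2], [1], [0]]
```

    The most dependent bundle (3) lands at the top level; the driving bundle (0) at the bottom, which is what suggests an order of execution.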

  20. Digital Storytelling Promoting Twenty-First Century Skills and Student Engagement

    ERIC Educational Resources Information Center

    Niemi, Hannele; Multisilta, Jari

    2016-01-01

    This article presents results on how students became engaged and motivated when using digital storytelling in knowledge creation in Finland, Greece and California. The theoretical framework is based on sociocultural theories. Learning is seen as a result of dialogical interactions between people, substances and artefacts. This approach has been…

  1. Network Approach to Disease Diagnosis

    NASA Astrophysics Data System (ADS)

    Sharma, Amitabh; Bashan, Amir; Barabasi, Albert-Laszlo

    2014-03-01

    Human diseases can be viewed as perturbations of the underlying biological system, and a thorough understanding of the topological and dynamical properties of that system is crucial to explaining the mechanisms of many complex diseases. Recently, network-based approaches have provided a framework for integrating multi-dimensional biological data, resulting in a better understanding of the pathophysiological state of complex diseases. Here we provide a network-based framework to improve the diagnosis of complex diseases, based on the integration of transcriptomics and the interactome. We analyze the overlap between differentially expressed (DE) genes and disease genes (DGs) based on their locations in the molecular interaction network (the ''interactome''). Disease genes and their protein products tend to be much more highly connected than expected at random, hence defining a disease sub-graph (called the disease module) in the interactome. DE genes, even though different from the known set of DGs, may be significantly associated with the disease when considering their closeness to the disease module in the interactome. This new network approach holds the promise of improving the diagnosis of patients who cannot be diagnosed using conventional tools. Support was provided by HL066289 and HL105339 grants from the U.S. National Institutes of Health.
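
    The closeness-to-disease-module idea can be illustrated on a toy interactome: compute the shortest hop distance from each DE gene to the nearest known disease gene, then average over the DE set. The gene names and edges below are invented.

```python
from collections import deque

def bfs_dist(graph, sources):
    """Shortest hop distance from any source node (the disease module)."""
    dist = {s: 0 for s in sources}
    q = deque(sources)
    while q:
        u = q.popleft()
        for v in graph.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Toy interactome (undirected, stored both ways); hypothetical gene names.
edges = [("DG1", "DG2"), ("DG2", "A"), ("A", "B"), ("B", "C"), ("DG1", "X")]
graph = {}
for u, v in edges:
    graph.setdefault(u, set()).add(v)
    graph.setdefault(v, set()).add(u)

module = {"DG1", "DG2"}          # known disease genes
de_genes = ["A", "C", "X"]       # differentially expressed genes
dist = bfs_dist(graph, module)
closeness = sum(dist[g] for g in de_genes) / len(de_genes)
print({g: dist[g] for g in de_genes}, closeness)
```

    A low average distance (here A and X sit one hop from the module) is the kind of signal the framework uses to associate DE genes with the disease even when they are not themselves known DGs.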

  2. Using a service sector segmented approach to identify community stakeholders who can improve access to suicide prevention services for veterans.

    PubMed

    Matthieu, Monica M; Gardiner, Giovanina; Ziegemeier, Ellen; Buxton, Miranda

    2014-04-01

    Veterans in need of social services may access many different community agencies within the public and private sectors. Each of these settings has the potential to be a pipeline for attaining needed health, mental health, and benefits services; however, many service providers lack information on how to conceptualize where Veterans go for services within their local community. This article describes a conceptual framework for outreach that uses a service sector segmented approach. This framework was developed to aid recruitment of a provider-based sample of stakeholders (N = 70) for a study on improving access to the Department of Veterans Affairs and community-based suicide prevention services. Results indicate that although there are statistically significant differences in the percentage of Veterans served by the different service sectors (F(9, 55) = 2.71, p = 0.04), exposure to suicidal Veterans and providers' referral behavior is consistent across the sectors. Challenges to using this framework include isolating the appropriate sectors for targeted outreach efforts. The service sector segmented approach holds promise for identifying and referring at-risk Veterans in need of services. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  3. Social Organization, Population, and Land Use*

    PubMed Central

    Axinn, William G.; Ghimire, Dirgha J.

    2011-01-01

    We present a new approach to the investigation of human influences on environmental change that explicitly adds consideration of social organization. This approach identifies social organization as an influence on the environment that is independent of population size, affluence, and technology. The framework we present also identifies population events, such as births, that are likely to influence environmental outcomes beyond the consequences of population size. The theoretical framework we construct explains that explicit attention to social organization is necessary for micro-level investigation of the population-environment relationship because social organization influences both. We use newly available longitudinal, multilevel, mixed-method measures of local land use changes, local population dynamics, and social organization from the Nepalese Himalayas to provide empirical tests of this new framework. These tests reveal that measures of change in social organization are strongly associated with measures of change in land use, and that the association is independent of common measures of population size, affluence, and technology. Also, local birth events shape local land use changes and key proximate determinants of land use change. Together the empirical results demonstrate key new scientific opportunities arising from the approach we present. PMID:21876607

  4. Distributed memory parallel Markov random fields using graph partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinemann, C.; Perciano, T.; Ushizima, D.

    Markov random fields (MRF) based algorithms have attracted a large amount of interest in image analysis due to their ability to exploit contextual information about data. Image data generated by experimental facilities, though, continues to grow larger and more complex, making it more difficult to analyze in a reasonable amount of time. Applying image processing algorithms to large datasets requires alternative approaches to circumvent performance problems. Aiming to provide scientists with a new tool to recover valuable information from such datasets, we developed a general purpose distributed memory parallel MRF-based image analysis framework (MPI-PMRF). MPI-PMRF overcomes performance and memory limitations by distributing data and computations across processors. The proposed approach was successfully tested with synthetic and experimental datasets. Additionally, the performance of the MPI-PMRF framework is analyzed through a detailed scalability study. We show that a performance increase is obtained while maintaining an accuracy of the segmentation results higher than 98%. The contributions of this paper are: (a) development of a distributed memory MRF framework; (b) measurement of the performance increase of the proposed approach; (c) verification of segmentation accuracy in both synthetic and experimental, real-world datasets.
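
    As a minimal illustration of how an MRF exploits contextual information in segmentation (a serial sketch, not the authors' MPI-PMRF code, which would partition this work across MPI ranks), here is an Iterated Conditional Modes sweep on a Potts-style model; the unary data term and smoothing weight are invented.

```python
import numpy as np

def icm_denoise(obs, n_labels=2, beta=1.5, iters=5):
    """Iterated Conditional Modes on a Potts-style MRF: each pixel picks the
    label minimizing a data cost plus disagreement with its 4-neighbours."""
    labels = obs.copy()
    h, w = obs.shape
    for _ in range(iters):
        for i in range(h):
            for j in range(w):
                best, best_e = labels[i, j], float("inf")
                for lab in range(n_labels):
                    e = float(lab != obs[i, j])  # unary data term
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w:
                            e += beta * (lab != labels[ni, nj])
                    if e < best_e:
                        best, best_e = lab, e
                labels[i, j] = best
    return labels

# A noisy two-region image: left half 0s, right half 1s, one flipped pixel.
obs = np.zeros((6, 6), dtype=int)
obs[:, 3:] = 1
obs[2, 1] = 1  # salt noise in the 0 region
print(icm_denoise(obs))  # the isolated 1 is smoothed away by its neighbours
```

    In a distributed setting the image would be split into row blocks per process, with boundary rows exchanged between neighbours each sweep; that halo exchange is what MPI adds on top of this core update.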

  5. 'I am an Intensive Guy': The Possibility and Conditions of Reconciliation Through the Ecological Intensification Framework.

    PubMed

    Levain, Alix; Vertès, Françoise; Ruiz, Laurent; Delaby, Luc; Gascuel-Odoux, Chantal; Barbier, Marc

    2015-11-01

    The need for better reconciliation between food production and environmental protection calls for new conceptual approaches in agronomy. Ecological intensification (EI) is one of the most encouraging and successful conceptual frameworks for designing more sustainable agricultural systems, though it relies upon semantic ambivalences and epistemic tensions. This article discusses the abilities and limits of the EI framework in a context of strong social and environmental pressure for agricultural transition. The purpose is thus to put EI at stake in light of the results of an interdisciplinary and participatory research project that explicitly adopted EI goals in semi-industrialized livestock farming systems. Is it possible to maintain livestock production systems that are simultaneously productive, sustainable, and viable and have low nitrate emissions in vulnerable coastal areas? If so, how do local stakeholders use these approaches? The main steps of the innovation process are described, and the effects of political and social dynamics on the continuity of the transition process are analyzed with a reflexive approach. This experiment invites one to consider that making EI operational in a context of socio-technical transition toward agroecology represents system innovation, requiring on-going dialogue, reflexivity, and long-term involvement by researchers.

  6. `I am an Intensive Guy': The Possibility and Conditions of Reconciliation Through the Ecological Intensification Framework

    NASA Astrophysics Data System (ADS)

    Levain, Alix; Vertès, Françoise; Ruiz, Laurent; Delaby, Luc; Gascuel-Odoux, Chantal; Barbier, Marc

    2015-11-01

    The need for better reconciliation between food production and environmental protection calls for new conceptual approaches in agronomy. Ecological intensification (EI) is one of the most encouraging and successful conceptual frameworks for designing more sustainable agricultural systems, though it relies upon semantic ambivalences and epistemic tensions. This article discusses the abilities and limits of the EI framework in a context of strong social and environmental pressure for agricultural transition. The purpose is thus to put EI at stake in light of the results of an interdisciplinary and participatory research project that explicitly adopted EI goals in semi-industrialized livestock farming systems. Is it possible to maintain livestock production systems that are simultaneously productive, sustainable, and viable and have low nitrate emissions in vulnerable coastal areas? If so, how do local stakeholders use these approaches? The main steps of the innovation process are described, and the effects of political and social dynamics on the continuity of the transition process are analyzed with a reflexive approach. This experiment invites one to consider that making EI operational in a context of socio-technical transition toward agroecology represents system innovation, requiring on-going dialogue, reflexivity, and long-term involvement by researchers.

  7. Operational framework for quantum measurement simulability

    NASA Astrophysics Data System (ADS)

    Guerini, Leonardo; Bavaresco, Jessica; Terra Cunha, Marcelo; Acín, Antonio

    2017-09-01

    We introduce a framework for simulating quantum measurements based on classical processing of a set of accessible measurements. Well-known concepts such as joint measurability and projective simulability naturally emerge as particular cases of our framework, but our study also leads to novel results and questions. First, a generalisation of joint measurability is derived, which yields a hierarchy for the incompatibility of sets of measurements. A similar hierarchy is defined based on the number of outcomes necessary to perform a simulation of a given measurement. This general approach also allows us to identify connections between different kinds of simulability and, in particular, we characterise the qubit measurements that are projective-simulable in terms of joint measurability. Finally, we discuss how our framework can be interpreted in the context of resource theories.
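
    One way to state the simulability condition underlying such a framework (our paraphrase for illustration, not the paper's exact notation): a target measurement $\{M_a\}$ is simulable by an accessible set $\{N^{(x)}_b\}$ when it decomposes into classical pre-processing (a random choice $x$ of which accessible measurement to perform) and classical post-processing of its outcome $b$:

```latex
M_a \;=\; \sum_{x} p(x) \sum_{b} q(a \mid b, x)\, N^{(x)}_b ,
\qquad p(x) \ge 0,\ \ \sum_x p(x) = 1,
\qquad q(a \mid b, x) \ge 0,\ \ \sum_a q(a \mid b, x) = 1 .
```

    Joint measurability then appears as the special case in which a single parent measurement reproduces several target measurements at once through different post-processings.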

  8. Reusable Component Model Development Approach for Parallel and Distributed Simulation

    PubMed Central

    Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng

    2014-01-01

    Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diversiform interfaces, couple tightly, and bind closely with simulation platforms. As a result, they are difficult to reuse across different simulation platforms and applications. To address the problem, this paper first proposes a reusable component model framework. Based on this framework, our reusable model development approach is then elaborated, which contains two phases: (1) domain experts create simulation computational modules observing three principles to achieve their independence; (2) a model developer encapsulates these simulation computational modules with six standard service interfaces to improve their reusability. The case study of a radar model indicates that the model developed using our approach has good reusability and is easy to use in different simulation platforms and applications. PMID:24729751

  9. Depth Reconstruction from Single Images Using a Convolutional Neural Network and a Condition Random Field Model.

    PubMed

    Liu, Dan; Liu, Xuejun; Wu, Yiguang

    2018-04-24

    This paper presents an effective approach for depth reconstruction from a single image through the incorporation of semantic information and local details from the image. A unified framework for depth acquisition is constructed by joining a deep Convolutional Neural Network (CNN) and a continuous pairwise Conditional Random Field (CRF) model. Semantic information and relative depth trends of local regions inside the image are integrated into the framework. A deep CNN is first used to automatically learn a hierarchical feature representation of the image. To capture more local detail, the relative depth trends of local regions are incorporated into the network. Combined with semantic information of the image, a continuous pairwise CRF is then established and used as the loss function of the unified model. Experiments on real scenes demonstrate that the proposed approach is effective and obtains satisfactory results.

  10. A new framework for comprehensive, robust, and efficient global sensitivity analysis: 2. Application

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin V.

    2016-01-01

    Based on the theoretical framework for sensitivity analysis called "Variogram Analysis of Response Surfaces" (VARS), developed in the companion paper, we develop and implement a practical "star-based" sampling strategy (called STAR-VARS) for the application of VARS to real-world problems. We also develop a bootstrap approach to provide confidence level estimates for the VARS sensitivity metrics and to evaluate the reliability of inferred factor rankings. The effectiveness, efficiency, and robustness of STAR-VARS are demonstrated via two real-data hydrological case studies (a 5-parameter conceptual rainfall-runoff model and a 45-parameter land surface scheme hydrology model), and a comparison with the "derivative-based" Morris and "variance-based" Sobol approaches is provided. Our results show that STAR-VARS provides reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being 1-2 orders of magnitude more efficient than the Morris or Sobol approaches.
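
    The directional-variogram idea at the heart of VARS can be sketched in a few lines: perturb one factor by a lag h across many sampled centres and average half the squared response change. The toy model and lag below are invented, not the hydrological cases in the paper.

```python
import numpy as np

def variogram(f, dim, factor, h, n_centres=50, seed=0):
    """Toy VARS-style directional variogram:
    gamma_k(h) = 0.5 * E[(f(x + h * e_k) - f(x))^2], estimated by sampling
    centres x uniformly in [0, 1 - h]^dim and perturbing only factor k."""
    rng = np.random.default_rng(seed)
    centres = rng.uniform(0, 1 - h, size=(n_centres, dim))
    diffs = []
    for c in centres:
        c2 = c.copy()
        c2[factor] += h
        diffs.append((f(c2) - f(c)) ** 2)
    return 0.5 * float(np.mean(diffs))

# Hypothetical model in which factor 0 dominates the response.
f = lambda x: 10 * x[0] + 0.1 * x[1] ** 2
g0 = variogram(f, dim=2, factor=0, h=0.1)
g1 = variogram(f, dim=2, factor=1, h=0.1)
print(g0, g1)  # gamma for factor 0 dwarfs that for factor 1
```

    STAR-VARS organizes exactly such perturbations into "star" patterns around sampled centres and adds bootstrap resampling over the centres to put confidence levels on the resulting metrics.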

  11. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education, categorized by three different learning frameworks, and studies comparing the effects of different simulation environments. My intent was to identify the learning contexts and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain significant results. Based on the analysis of the reviewed literature, I proposed effective approaches to integrating computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). These research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with the computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation.
After receiving model reasoning online, students were asked to submit their revised answers electronically. Students in the TRAD group were not granted access to the CLCS material and followed their normal classroom routine. At the end of the study, both the CLCS and TRAD students took a post-test. Questions on the post-test were divided into "what" questions, "how" questions, and an open response question. Analysis of students' post-test performance showed mixed results. While the TRAD students scored higher on the "what" questions, the CLCS students scored higher on the "how" questions and the open response question. This result suggested that more TRAD students knew what kinds of conditions may or may not cause electromagnetic induction without understanding how electromagnetic induction works. Analysis of the CLCS students' learning also suggested that frequent disruptions and technical trouble might pose threats to the effectiveness of the CLCS learning framework. Beyond the mixed post-test results, the study revealed some limitations of the CLCS learning framework in promoting conceptual understanding in physics. Improvements can be made by providing students with the background knowledge necessary to understand model reasoning and by incorporating the CLCS learning framework with other learning frameworks to promote integration of various physics concepts. In addition, the reflective questions in the CLCS learning framework may be refined to better address students' difficulties. Limitations of the study, as well as suggestions for future research, are also presented.

  12. Infant Expressions in an Approach/Withdrawal Framework

    PubMed Central

    Sullivan, Margaret Wolan

    2014-01-01

    Since the introduction of empirical methods for studying facial expression, the interpretation of infant facial expressions has generated much debate. The premise of this paper is that action tendencies of approach and withdrawal constitute a core organizational feature of emotion in humans, promoting coherence of behavior, facial signaling, and physiological responses. The approach/withdrawal framework can provide a taxonomy of contexts and the neurobehavioral framework for the systematic, empirical study of individual differences in expression, physiology, and behavior within individuals as well as across contexts over time. By adopting this framework in developmental work on basic emotion processes, it may be possible to better understand the behavioral principles governing facial displays, how individual differences in them are related to physiology and behavior, and how they function in context. PMID:25412273

  13. LIPID11: A Modular Framework for Lipid Simulations using Amber

    PubMed Central

    Skjevik, Åge A.; Madej, Benjamin D.; Walker, Ross C.; Teigen, Knut

    2013-01-01

    Accurate simulation of complex lipid bilayers has long been a goal in condensed phase molecular dynamics (MD). Structure and function of membrane-bound proteins are highly dependent on the lipid bilayer environment and are challenging to study through experimental methods. Within Amber, there has been limited focus on lipid simulations, although some success has been seen with the use of the General Amber Force Field (GAFF). However, to date there are no dedicated Amber lipid force fields. In this paper we describe a new charge derivation strategy for lipids consistent with the Amber RESP approach, and a new atom and residue naming and type convention. In the first instance, we have combined this approach with GAFF parameters. The result is LIPID11, a flexible, modular framework for the simulation of lipids that is fully compatible with the existing Amber force fields. The charge derivation procedure, capping strategy and nomenclature for LIPID11, along with preliminary simulation results and a discussion of the planned long-term parameter development are presented here. Our findings suggest that LIPID11 is a modular framework feasible for phospholipids and a flexible starting point for the development of a comprehensive, Amber-compatible lipid force field. PMID:22916730

  14. A Scalable Framework For Segmenting Magnetic Resonance Images

    PubMed Central

    Hore, Prodip; Goldgof, Dmitry B.; Gu, Yuhua; Maudsley, Andrew A.; Darkazanli, Ammar

    2009-01-01

    A fast, accurate and fully automatic method of segmenting magnetic resonance images of the human brain is introduced. The approach scales well allowing fast segmentations of fine resolution images. The approach is based on modifications of the soft clustering algorithm, fuzzy c-means, that enable it to scale to large data sets. Two types of modifications to create incremental versions of fuzzy c-means are discussed. They are much faster when compared to fuzzy c-means for medium to extremely large data sets because they work on successive subsets of the data. They are comparable in quality to application of fuzzy c-means to all of the data. The clustering algorithms coupled with inhomogeneity correction and smoothing are used to create a framework for automatically segmenting magnetic resonance images of the human brain. The framework is applied to a set of normal human brain volumes acquired from different magnetic resonance scanners using different head coils, acquisition parameters and field strengths. Results are compared to those from two widely used magnetic resonance image segmentation programs, Statistical Parametric Mapping and the FMRIB Software Library (FSL). The results are comparable to FSL while providing significant speed-up and better scalability to larger volumes of data. PMID:20046893
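
    A sketch of the "successive subsets" idea (a loose reconstruction for illustration, not the authors' exact single-pass algorithm): run weighted fuzzy c-means on each chunk, then carry the resulting centres into the next chunk as pseudo-points weighted by the fuzzy mass they explain.

```python
import numpy as np

def fcm(data, c, m=2.0, iters=50, weights=None, init=None, seed=0):
    """Standard fuzzy c-means; optional point weights let condensed
    representatives of earlier chunks keep their influence."""
    rng = np.random.default_rng(seed)
    n = len(data)
    w = np.ones(n) if weights is None else np.asarray(weights, float)
    centers = data[rng.choice(n, c, replace=False)] if init is None else init.copy()
    for _ in range(iters):
        d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # membership u[k, i] = 1 / sum_j (d_ki / d_kj)^(2 / (m - 1))
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        um = (u ** m) * w[:, None]
        centers = (um.T @ data) / um.sum(axis=0)[:, None]
    return centers, u

def incremental_fcm(chunks, c, m=2.0):
    """Single-pass sketch: cluster each chunk, then pass the chunk's centres
    forward as weighted pseudo-points prepended to the next chunk."""
    centers, carry_w = None, None
    for chunk in chunks:
        if centers is None:
            data, w = chunk, np.ones(len(chunk))
        else:
            data = np.vstack([centers, chunk])
            w = np.concatenate([carry_w, np.ones(len(chunk))])
        centers, u = fcm(data, c, m=m, weights=w, init=centers)
        # weight of each carried centre ~ fuzzy mass it currently explains
        carry_w = (u ** m * w[:, None]).sum(axis=0)
    return centers

rng = np.random.default_rng(3)
blob1 = rng.normal(0.0, 0.1, size=(300, 2))
blob2 = rng.normal(5.0, 0.1, size=(300, 2))
data = rng.permutation(np.vstack([blob1, blob2]))
chunks = np.array_split(data, 3)  # three passes over successive subsets
centers = incremental_fcm(chunks, c=2)
print(np.round(sorted(centers[:, 0]), 2))  # centres should land near 0 and 5
```

    Because each chunk fits in memory, this kind of scheme scales to volumes where running fuzzy c-means on all the data at once would not.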

  15. On the Use of CAD and Cartesian Methods for Aerodynamic Optimization

    NASA Technical Reports Server (NTRS)

    Nemec, M.; Aftosmis, M. J.; Pulliam, T. H.

    2004-01-01

    The objective of this paper is to present the development of an optimization capability for Cart3D, a Cartesian inviscid-flow analysis package. We present the construction of a new optimization framework and we focus on the following issues: 1) Component-based geometry parameterization approach using parametric-CAD models and CAPRI. A novel geometry server is introduced that addresses the issue of parallel efficiency while only sparingly consuming CAD resources; 2) The use of genetic and gradient-based algorithms for three-dimensional aerodynamic design problems. The influence of noise on the optimization methods is studied. Our goal is to create a responsive and automated framework that efficiently identifies design modifications that result in substantial performance improvements. In addition, we examine the architectural issues associated with the deployment of a CAD-based approach in a heterogeneous parallel computing environment that contains both CAD workstations and dedicated compute engines. We demonstrate the effectiveness of the framework for a design problem that features topology changes and complex geometry.

  16. A Novel Strategy Using Factor Graphs and the Sum-Product Algorithm for Satellite Broadcast Scheduling Problems

    NASA Astrophysics Data System (ADS)

    Chen, Jung-Chieh

    This paper presents a low complexity algorithmic framework for finding a broadcasting schedule in a low-altitude satellite system, i.e., the satellite broadcast scheduling (SBS) problem, based on the recent modeling and computational methodology of factor graphs. Inspired by the huge success of low density parity check (LDPC) codes in the field of error control coding, we transform the SBS problem into an LDPC-like problem through a factor graph, instead of using the conventional neural network approaches to solve the SBS problem. Within the factor graph framework, soft information, describing the probability that each satellite will broadcast information to a terminal at a specific time slot, is exchanged among the local processing nodes via the sum-product algorithm to iteratively optimize the satellite broadcasting schedule. Numerical results show that the proposed approach not only obtains optimal solutions but also has the low complexity suitable for integrated-circuit implementation.
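
    The sum-product machinery itself is easy to demonstrate on a toy two-variable factor graph (unrelated to the SBS formulation): messages from factors to variables sum out the neighbouring variables, and on a tree the resulting marginals are exact. The factor values below are invented.

```python
import numpy as np

# Tiny factor graph: p(x1, x2) proportional to f1(x1) * f12(x1, x2), binary variables.
f1 = np.array([0.9, 0.1])
f12 = np.array([[0.8, 0.2],
                [0.3, 0.7]])

# Sum-product on this tree: the message from f1 to x1 is f1 itself;
# the message from f12 to x2 sums out x1, weighted by x1's incoming message.
msg_f1_to_x1 = f1
msg_x1_to_f12 = msg_f1_to_x1
msg_f12_to_x2 = (f12 * msg_x1_to_f12[:, None]).sum(axis=0)
marginal_x2 = msg_f12_to_x2 / msg_f12_to_x2.sum()

# Brute-force check of the same marginal.
joint = f1[:, None] * f12
brute = joint.sum(axis=0) / joint.sum()
print(marginal_x2, brute)  # [0.75 0.25] both ways
```

    In the SBS setting the variables and factors encode which satellite serves which terminal in which slot, and the same message updates are iterated on a loopy graph rather than a tree.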

  17. CONFU: Configuration Fuzzing Testing Framework for Software Vulnerability Detection

    PubMed Central

    Dai, Huning; Murphy, Christian; Kaiser, Gail

    2010-01-01

    Many software security vulnerabilities only reveal themselves under certain conditions, i.e., particular configurations and inputs together with a certain runtime environment. One approach to detecting these vulnerabilities is fuzz testing. However, typical fuzz testing makes no guarantees regarding the syntactic and semantic validity of the input, or of how much of the input space will be explored. To address these problems, we present a new testing methodology called Configuration Fuzzing. Configuration Fuzzing is a technique whereby the configuration of the running application is mutated at certain execution points, in order to check for vulnerabilities that only arise in certain conditions. As the application runs in the deployment environment, this testing technique continuously fuzzes the configuration and checks “security invariants” that, if violated, indicate a vulnerability. We discuss the approach and introduce a prototype framework called ConFu (CONfiguration FUzzing testing framework) for implementation. We also present the results of case studies that demonstrate the approach’s feasibility and evaluate its performance. PMID:21037923
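
    The Configuration Fuzzing loop can be sketched as: mutate one configuration option, run the request, check a security invariant. The toy application, flag names, and invariant below are invented for illustration, not ConFu's API.

```python
import random

def handle_request(user, config):
    """Hypothetical buggy app: the 'legacy_auth' option skips the role check."""
    if config.get("legacy_auth") or user == "admin":
        return "admin dashboard"
    return "public page"

def check_invariants(config):
    """Security invariant: anonymous users must never receive admin content,
    whatever the configuration."""
    resp = handle_request(user="anonymous", config=config)
    return "admin" not in resp

def fuzz_config(base, flags, trials=100, seed=0):
    """Mutate a random boolean configuration option on each trial and report
    every mutant configuration that violates the security invariant."""
    rng = random.Random(seed)
    violations = []
    for _ in range(trials):
        cfg = dict(base)
        cfg[rng.choice(flags)] = rng.choice([True, False])
        if not check_invariants(cfg):
            violations.append(cfg)
    return violations

base = {"legacy_auth": False, "verbose": False}
bad = fuzz_config(base, flags=["legacy_auth", "verbose"], trials=100)
print(len(bad), bad[0] if bad else None)  # every violation has legacy_auth=True
```

    The point of doing this in the deployment environment, as the paper proposes, is that mutations are exercised against real runtime conditions rather than a synthetic test harness.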

  18. Filtering Based Adaptive Visual Odometry Sensor Framework Robust to Blurred Images

    PubMed Central

    Zhao, Haiying; Liu, Yong; Xie, Xiaojia; Liao, Yiyi; Liu, Xixi

    2016-01-01

    Visual odometry (VO) estimation from blurred images is a challenging problem in practical robot applications, as blurred images severely reduce the estimation accuracy of the VO. In this paper, we address the problem of visual odometry estimation from blurred images and present an adaptive visual odometry estimation framework that is robust to them. Our approach employs an objective measure of images, named small image gradient distribution (SIGD), to evaluate the blurring degree of an image; an adaptive blurred image classification algorithm is then proposed to recognize blurred images; finally, we propose an anti-blurred key-frame selection algorithm to make the VO robust to blurred images. We also carried out comparative experiments to evaluate the performance of VO algorithms with our anti-blur framework under various blurred images. The experimental results show that our approach achieves superior performance compared to state-of-the-art methods on blurred images while adding little computational cost to the original VO algorithms. PMID:27399704
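    The paper's SIGD measure is not reproduced here; the sketch below uses a generic mean-squared-gradient score as a stand-in blur measure, showing how blurring lowers the score and could drive key-frame rejection:

    ```python
    def gradient_energy(img):
        """Mean squared finite-difference gradient; low values suggest blur."""
        h, w = len(img), len(img[0])
        total, n = 0.0, 0
        for y in range(h):
            for x in range(w):
                if x + 1 < w:
                    total += (img[y][x + 1] - img[y][x]) ** 2; n += 1
                if y + 1 < h:
                    total += (img[y + 1][x] - img[y][x]) ** 2; n += 1
        return total / n

    def box_blur(img):
        # 3x3 box blur with edge clamping, simulating motion/defocus blur.
        h, w = len(img), len(img[0])
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                vals = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                        for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
                out[y][x] = sum(vals) / 9.0
        return out

    sharp = [[255.0 if (x + y) % 2 == 0 else 0.0 for x in range(8)] for y in range(8)]
    blurred = box_blur(sharp)
    # A key-frame selector could reject frames whose score falls below a threshold.
    print(gradient_energy(sharp), gradient_energy(blurred))
    ```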

  19. Quantifying changes in water use and groundwater availability in a megacity using novel integrated systems modeling

    NASA Astrophysics Data System (ADS)

    Hyndman, D. W.; Xu, T.; Deines, J. M.; Cao, G.; Nagelkirk, R.; Viña, A.; McConnell, W.; Basso, B.; Kendall, A. D.; Li, S.; Luo, L.; Lupi, F.; Ma, D.; Winkler, J. A.; Yang, W.; Zheng, C.; Liu, J.

    2017-08-01

    Water sustainability in megacities is a growing challenge with far-reaching effects. Addressing sustainability requires an integrated, multidisciplinary approach able to capture interactions among hydrology, population growth, and socioeconomic factors and to reflect changes due to climate variability and land use. We developed a new systems modeling framework to quantify the influence of changes in land use, crop growth, and urbanization on groundwater storage for Beijing, China. This framework was then used to understand and quantify causes of observed decreases in groundwater storage from 1993 to 2006, revealing that the expansion of Beijing's urban areas at the expense of croplands has enhanced recharge while reducing water lost to evapotranspiration, partially ameliorating groundwater declines. The results demonstrate the efficacy of such a systems approach to quantify the impacts of changes in climate and land use on water sustainability for megacities, while providing a quantitative framework to improve mitigation and adaptation strategies that can help address future water challenges.

  20. An Approach to a Comprehensive Test Framework for Analysis and Evaluation of Text Line Segmentation Algorithms

    PubMed Central

    Brodic, Darko; Milivojevic, Dragan R.; Milivojevic, Zoran N.

    2011-01-01

    The paper introduces a testing framework for the evaluation and validation of text line segmentation algorithms. Text line segmentation represents the key action for correct optical character recognition. Many of the tests for the evaluation of text line segmentation algorithms deal with text databases as reference templates. Because of this mismatch, a reliable testing framework is required. Hence, a new approach to a comprehensive experimental framework for the evaluation of text line segmentation algorithms is proposed. It consists of synthetic multiline text samples as well as real handwritten text. Although the tests are mutually independent, the results are cross-linked. The proposed method can be used for different types of scripts and languages. Furthermore, two different procedures for evaluating algorithm efficiency based on the obtained error-type classification are proposed. The first is based on the segmentation line error description, while the second incorporates well-known signal detection theory. Each has different capabilities and conveniences, but they can be used as supplements to make the evaluation process efficient. Overall, the proposed procedure based on the segmentation line error description has some advantages, characterized by five measures that describe the measurement procedures. PMID:22164106

  1. An approach to a comprehensive test framework for analysis and evaluation of text line segmentation algorithms.

    PubMed

    Brodic, Darko; Milivojevic, Dragan R; Milivojevic, Zoran N

    2011-01-01

    The paper introduces a testing framework for the evaluation and validation of text line segmentation algorithms. Text line segmentation represents the key action for correct optical character recognition. Many of the tests for the evaluation of text line segmentation algorithms deal with text databases as reference templates. Because of this mismatch, a reliable testing framework is required. Hence, a new approach to a comprehensive experimental framework for the evaluation of text line segmentation algorithms is proposed. It consists of synthetic multiline text samples as well as real handwritten text. Although the tests are mutually independent, the results are cross-linked. The proposed method can be used for different types of scripts and languages. Furthermore, two different procedures for evaluating algorithm efficiency based on the obtained error-type classification are proposed. The first is based on the segmentation line error description, while the second incorporates well-known signal detection theory. Each has different capabilities and conveniences, but they can be used as supplements to make the evaluation process efficient. Overall, the proposed procedure based on the segmentation line error description has some advantages, characterized by five measures that describe the measurement procedures.

  2. A Machine Learning Framework for Plan Payment Risk Adjustment.

    PubMed

    Rose, Sherri

    2016-12-01

    To introduce cross-validation and a nonparametric machine learning framework for plan payment risk adjustment and then assess whether they have the potential to improve risk adjustment. 2011-2012 Truven MarketScan database. We compare the performance of multiple statistical approaches within a broad machine learning framework for estimation of risk adjustment formulas. Total annual expenditure was predicted using age, sex, geography, inpatient diagnoses, and hierarchical condition category variables. The methods included regression, penalized regression, decision trees, neural networks, and an ensemble super learner, all in concert with screening algorithms that reduce the set of variables considered. The performance of these methods was compared based on cross-validated R². Our results indicate that a simplified risk adjustment formula selected via this nonparametric framework maintains much of the efficiency of a traditional larger formula. The ensemble approach also outperformed classical regression and all other algorithms studied. The implementation of cross-validated machine learning techniques provides novel insight into risk adjustment estimation, possibly allowing for a simplified formula, thereby reducing incentives for increased coding intensity as well as the ability of insurers to "game" the system with aggressive diagnostic upcoding. © Health Research and Educational Trust.
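    The study's claims-data models are far richer, but the core procedure of comparing candidate estimators by cross-validated R² on held-out folds can be sketched with synthetic data and two toy predictors:

    ```python
    import random

    def r2(y_true, y_pred):
        mean = sum(y_true) / len(y_true)
        ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
        ss_tot = sum((t - mean) ** 2 for t in y_true)
        return 1 - ss_res / ss_tot

    def fit_mean(xs, ys):
        # Baseline: always predict the training mean.
        m = sum(ys) / len(ys)
        return lambda x: m

    def fit_linear(xs, ys):
        # Simple one-feature least-squares fit.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        return lambda x: my + b * (x - mx)

    def cv_r2(fit, xs, ys, k=5):
        # k-fold cross-validation: train on k-1 folds, score R² on the held-out fold.
        folds = [range(i, len(xs), k) for i in range(k)]
        scores = []
        for fold in folds:
            test = set(fold)
            tr_x = [x for i, x in enumerate(xs) if i not in test]
            tr_y = [y for i, y in enumerate(ys) if i not in test]
            model = fit(tr_x, tr_y)
            scores.append(r2([ys[i] for i in fold], [model(xs[i]) for i in fold]))
        return sum(scores) / k

    rng = random.Random(0)
    xs = [rng.uniform(0, 10) for _ in range(100)]
    ys = [2.0 * x + rng.gauss(0, 1) for x in xs]
    print(cv_r2(fit_mean, xs, ys), cv_r2(fit_linear, xs, ys))
    ```

    In the paper's setting the candidate set also includes penalized regressions, trees, neural networks, and an ensemble super learner, compared by the same held-out criterion.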

  3. Wasatch: An architecture-proof multiphysics development environment using a Domain Specific Language and graph theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saad, Tony; Sutherland, James C.

    To address the coding and software challenges of modern hybrid architectures, we propose an approach to multiphysics code development for high-performance computing. This approach is based on using a Domain Specific Language (DSL) in tandem with a directed acyclic graph (DAG) representation of the problem to be solved that allows runtime algorithm generation. When coupled with a large-scale parallel framework, the result is a portable development framework capable of executing on hybrid platforms and handling the challenges of multiphysics applications. In addition, we share our experience developing a code in such an environment – an effort that spans an interdisciplinary team of engineers and computer scientists.
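    A DAG-driven runtime scheduler of the kind described can be sketched as a topological sort over declared dependencies (node names are hypothetical, not Wasatch's API):

    ```python
    from collections import defaultdict, deque

    # Hypothetical expression graph: each node is a computation, and its list
    # holds the nodes it depends on. Executing in topological order is what
    # lets a DAG-based framework assemble the algorithm at runtime.
    deps = {
        "density":  [],
        "velocity": [],
        "momentum": ["density", "velocity"],
        "kinetic":  ["momentum", "velocity"],
    }

    def schedule(deps):
        indeg = {n: len(ds) for n, ds in deps.items()}
        users = defaultdict(list)
        for n, ds in deps.items():
            for d in ds:
                users[d].append(n)
        ready = deque(n for n, d in indeg.items() if d == 0)
        order = []
        while ready:
            n = ready.popleft()
            order.append(n)
            for u in users[n]:
                indeg[u] -= 1
                if indeg[u] == 0:
                    ready.append(u)
        if len(order) != len(deps):
            raise ValueError("cycle in expression graph")
        return order

    order = schedule(deps)
    print(order)
    ```

    Nodes with no remaining unmet dependencies are independent, which is also where such a framework finds task-level parallelism.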

  4. Wasatch: An architecture-proof multiphysics development environment using a Domain Specific Language and graph theory

    DOE PAGES

    Saad, Tony; Sutherland, James C.

    2016-05-04

    To address the coding and software challenges of modern hybrid architectures, we propose an approach to multiphysics code development for high-performance computing. This approach is based on using a Domain Specific Language (DSL) in tandem with a directed acyclic graph (DAG) representation of the problem to be solved that allows runtime algorithm generation. When coupled with a large-scale parallel framework, the result is a portable development framework capable of executing on hybrid platforms and handling the challenges of multiphysics applications. In addition, we share our experience developing a code in such an environment – an effort that spans an interdisciplinary team of engineers and computer scientists.

  5. Palatini actions and quantum gravity phenomenology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olmo, Gonzalo J., E-mail: gonzalo.olmo@csic.es

    2011-10-01

    We show that an invariant and universal length scale can be consistently introduced in a generally covariant theory through the gravitational sector using the Palatini approach. The resulting theory is able to capture different aspects of quantum gravity phenomenology in a single framework. In particular, it is found that in this theory field excitations propagating with different energy densities perceive different background metrics, which is a fundamental characteristic of the DSR and Rainbow Gravity approaches. We illustrate these properties with a particular gravitational model and explicitly show how the soccer ball problem is avoided in this framework. The isotropic and anisotropic cosmologies of this model also avoid the big bang singularity by means of a big bounce.

  6. Using the Multimodal Approach as a Framework for Eclectic Counselor Education.

    ERIC Educational Resources Information Center

    Greenburg, Sharon L.

    1982-01-01

    Shows how counselor educators may use the multimodal approach of Lazarus to provide students with a conceptual framework for integrating and evaluating various counseling theories and making systematic choices about appropriate interventions for their clients. (Author)

  7. Integrating gene and protein expression data with genome-scale metabolic networks to infer functional pathways.

    PubMed

    Pey, Jon; Valgepea, Kaspar; Rubio, Angel; Beasley, John E; Planes, Francisco J

    2013-12-08

    The study of cellular metabolism in the context of high-throughput -omics data has allowed us to decipher novel mechanisms of importance in biotechnology and health. To continue with this progress, it is essential to efficiently integrate experimental data into metabolic modeling. We present here an in-silico framework to infer relevant metabolic pathways for a particular phenotype under study based on its gene/protein expression data. This framework is based on the Carbon Flux Path (CFP) approach, a mixed-integer linear program that expands classical path finding techniques by considering additional biophysical constraints. In particular, the objective function of the CFP approach is amended to account for gene/protein expression data and influence obtained paths. This approach is termed integrative Carbon Flux Path (iCFP). We show that gene/protein expression data also influences the stoichiometric balancing of CFPs, which provides a more accurate picture of active metabolic pathways. This is illustrated in both a theoretical and real scenario. Finally, we apply this approach to find novel pathways relevant in the regulation of acetate overflow metabolism in Escherichia coli. As a result, several targets which could be relevant for better understanding of the phenomenon leading to impaired acetate overflow are proposed. A novel mathematical framework that determines functional pathways based on gene/protein expression data is presented and validated. We show that our approach is able to provide new insights into complex biological scenarios such as acetate overflow in Escherichia coli.
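    The CFP/iCFP formulation is a mixed-integer linear program and is not reproduced here; as a loose illustration of expression-weighted path finding, the sketch below runs Dijkstra's algorithm over a toy metabolite graph in which highly expressed reactions (hypothetical scores) are cheaper to traverse:

    ```python
    import heapq

    # Toy metabolite graph: edges are reactions, weighted so that highly
    # expressed reactions (hypothetical expression scores) are cheap to use.
    expression = {"r1": 9.0, "r2": 1.0, "r3": 8.0, "r4": 7.5}
    edges = {  # metabolite -> [(reaction, next_metabolite)]
        "glucose": [("r1", "g6p")],
        "g6p":     [("r2", "f6p"), ("r3", "pep")],
        "f6p":     [("r4", "pep")],
        "pep":     [],
    }

    def best_path(src, dst):
        # Dijkstra over reactions, cost = 1 / expression (cheap if well expressed).
        pq = [(0.0, src, [])]
        seen = set()
        while pq:
            cost, node, path = heapq.heappop(pq)
            if node == dst:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for rxn, nxt in edges[node]:
                heapq.heappush(pq, (cost + 1.0 / expression[rxn], nxt, path + [rxn]))
        return float("inf"), []

    cost, path = best_path("glucose", "pep")
    print(path)
    ```

    The actual iCFP approach additionally enforces stoichiometric balancing and other biophysical constraints that a plain shortest path ignores.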

  8. Single-particle dynamics of the Anderson model: a local moment approach

    NASA Astrophysics Data System (ADS)

    Glossop, Matthew T.; Logan, David E.

    2002-07-01

    A non-perturbative local moment approach to single-particle dynamics of the general asymmetric Anderson impurity model is developed. The approach encompasses all energy scales and interaction strengths. It captures thereby strong coupling Kondo behaviour, including the resultant universal scaling behaviour of the single-particle spectrum; as well as the mixed valence and essentially perturbative empty orbital regimes. The underlying approach is physically transparent and innately simple, and as such is capable of practical extension to lattice-based models within the framework of dynamical mean-field theory.

  9. Group Problem Solving as a Different Participatory Approach to Citizenship Education

    ERIC Educational Resources Information Center

    Guérin, Laurence

    2017-01-01

    Purpose: The main goal of this article is to define and justify group problem solving as an approach to citizenship education. It is demonstrated that the choice of theoretical framework of democracy has consequences for the chosen learning goals, educational approach and learning activities. The framework used here is an epistemic theory…

  10. Computation of elementary modes: a unifying framework and the new binary approach

    PubMed Central

    Gagneur, Julien; Klamt, Steffen

    2004-01-01

    Background Metabolic pathway analysis has been recognized as a central approach to the structural analysis of metabolic networks. The concept of elementary (flux) modes provides a rigorous formalism to describe and assess pathways and has proven to be valuable for many applications. However, computing elementary modes is a hard computational task. Recent years have seen a multiplication of algorithms dedicated to it, calling for a summarizing point of view and a continued improvement of the current methods. Results We show that computing the set of elementary modes is equivalent to computing the set of extreme rays of a convex cone. This standard mathematical representation provides a unified framework that encompasses the most prominent algorithmic methods that compute elementary modes and allows a clear comparison between them. Taking lessons from this benchmark, we here introduce a new method, the binary approach, which computes the elementary modes as binary patterns of participating reactions from which the respective stoichiometric coefficients can be computed in a post-processing step. We implemented the binary approach in FluxAnalyzer 5.1, a software tool that is free for academics. The binary approach decreases the memory demand by up to 96% without loss of speed, making it the most efficient method available for computing elementary modes to date. Conclusions The equivalence between elementary modes and extreme ray computations offers opportunities for employing tools from polyhedral computation for metabolic pathway analysis. The new binary approach introduced herein was derived from this general theoretical framework and facilitates the computation of elementary modes in considerably larger networks. PMID:15527509
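    The idea of treating modes as binary patterns of participating reactions can be sketched with bitmasks; the minimality test below (keep a pattern only if no other pattern's support is a strict subset of it) is illustrative, with made-up candidate patterns rather than patterns derived from a real stoichiometric matrix:

    ```python
    # Candidate flux patterns as bitmasks over 4 reactions (hypothetical data).
    # An elementary mode's support must not strictly contain the support of
    # another feasible mode -- the minimality test used in binary approaches.
    candidates = [0b0011, 0b0111, 0b1100, 0b1010]

    def elementary(patterns):
        keep = []
        for p in patterns:
            # q is a strict subset of p iff q != p and (q & p) == q.
            if not any(q != p and (q & p) == q for q in patterns):
                keep.append(p)
        return keep

    print([bin(p) for p in elementary(candidates)])
    ```

    Storing only bit patterns, with stoichiometric coefficients recovered in post-processing, is what yields the memory savings the abstract reports.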

  11. An operational structured decision making framework for ...

    EPA Pesticide Factsheets

    Pressure to develop an operational framework for decision makers to employ the concepts of ecosystem goods and services for assessing changes to human well-being has been increasing since these concepts gained widespread notoriety after the Millennium Ecosystem Assessment Report. Many conceptual frameworks have been proposed, but most do not propose methodologies and tools to make this approach to decision making implementable. Building on common components of existing conceptual frameworks for ecosystem services and human well-being assessment we apply a structured decision making approach to develop a standardized operational framework and suggest tools and methods for completing each step. The structured decision making approach consists of six steps: 1) Clarify the Decision Context 2) Define Objectives and Evaluation Criteria 3) Develop Alternatives 4) Estimate Consequences 5) Evaluate Trade-Offs and Select and 6) Implement and Monitor. These six steps include the following activities, and suggested tools, when applied to ecosystem goods and services and human well-being conceptual frameworks: 1) Characterization of decision specific human beneficiaries using the Final Ecosystem Goods and Services (FEGS) approach and Classification System (FEGS-CS) 2) Determine beneficiaries’ relative priorities for human well-being domains in the Human Well-Being Index (HWBI) through stakeholder engagement and identify beneficiary-relevant metrics of FEGS using the Nat
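    Step 5 of the six-step loop, evaluating trade-offs to select an alternative, can be sketched as a weighted score over evaluation criteria; the alternatives, criteria, and weights below are hypothetical:

    ```python
    # Hypothetical decision: weights reflect stakeholder priorities across
    # criteria; each alternative is scored 0-1 on each criterion.
    weights = {"water_quality": 0.5, "recreation": 0.3, "cost": 0.2}
    alternatives = {
        "restore_wetland": {"water_quality": 0.9, "recreation": 0.6, "cost": 0.4},
        "build_levee":     {"water_quality": 0.3, "recreation": 0.2, "cost": 0.8},
        "no_action":       {"water_quality": 0.1, "recreation": 0.5, "cost": 1.0},
    }

    def score(criteria_scores):
        # Weighted-sum aggregation across criteria.
        return sum(weights[c] * criteria_scores[c] for c in weights)

    best = max(alternatives, key=lambda a: score(alternatives[a]))
    print(best)
    ```

    A weighted sum is only one aggregation rule; frameworks like the one described would derive the weights from stakeholder engagement and may use richer trade-off methods.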

  12. Health information systems: a survey of frameworks for developing countries.

    PubMed

    Marcelo, A B

    2010-01-01

    The objective of this paper is to perform a survey of excellent research on health information systems (HIS) analysis and design, and their underlying theoretical frameworks. It classifies these frameworks along major themes, and analyzes the different approaches to HIS development that are practical in resource-constrained environments. Literature review based on PubMed citations and conference proceedings, as well as Internet searches on information systems in general, and health information systems in particular. The field of health information systems development has been studied extensively. Despite this, failed implementations are still common. Theoretical frameworks for HIS development are available that can guide implementers. As awareness, acceptance, and demand for health information systems increase globally, the variety of approaches and strategies will also follow. For developing countries with scarce resources, a trial-and-error approach can be very costly. Lessons from the successes and failures of initial HIS implementations have been abstracted into theoretical frameworks. These frameworks organize complex HIS concepts into methodologies that standardize techniques in implementation. As globalization continues to impact healthcare in the developing world, demand for more responsive health systems will become urgent. More comprehensive frameworks and practical tools to guide HIS implementers will be imperative.

  13. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  14. How Organizations Can Enhance the Quality of Life of Their Clients and Assess Their Results: The Concept of QOL Enhancement

    ERIC Educational Resources Information Center

    Reinders, Hans S.; Schalock, Robert L.

    2014-01-01

    This article presents the framework of a dynamic approach to quality of life (QOL) enhancement based on the conceptualization and measurement of individual-referenced quality of life. Sections of the article summarize the premises of QOL enhancement, provide the rationale for a dynamic approach to QOL enhancement, discuss six components of QOL…

  15. Model and Interoperability using Meta Data Annotations

    NASA Astrophysics Data System (ADS)

    David, O.

    2011-12-01

    Software frameworks and architectures are in need of metadata to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing metadata, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to metadata representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as Annotations or Attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to its originating code.
    Since models and modeling components are not directly bound to the framework by the use of specific APIs and/or data types, they can more easily be reused both within the framework and outside of it. While providing all those capabilities, a significant reduction in the size of the model source code was achieved. To support the benefit of annotations for a modeler, studies were conducted to evaluate the effectiveness of an annotation-based framework approach against other modeling frameworks and libraries, and a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A typical hydrological model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks.
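    A Python analogue of the annotation idea (OMS itself is Java-based; the decorator and component names here are illustrative) shows how declared metadata, rather than framework API calls, can drive component wiring:

    ```python
    # Decorators attach metadata that a framework reads to wire components
    # together, instead of the component calling framework APIs.
    def inputs(*names):
        def wrap(fn):
            fn.inputs = names
            return fn
        return wrap

    def outputs(*names):
        def wrap(fn):
            fn.outputs = names
            return fn
        return wrap

    @inputs("precip", "temp")
    @outputs("runoff")
    def runoff_component(precip, temp):
        return {"runoff": max(0.0, precip - 0.1 * temp)}

    @inputs("runoff")
    @outputs("streamflow")
    def routing_component(runoff):
        return {"streamflow": 0.8 * runoff}

    def run(components, state):
        # Assemble execution purely from declared metadata: run a component
        # once all of its declared inputs are present in the shared state.
        pending = list(components)
        while pending:
            for c in list(pending):
                if all(i in state for i in c.inputs):
                    state.update(c(**{i: state[i] for i in c.inputs}))
                    pending.remove(c)
        return state

    state = run([routing_component, runoff_component], {"precip": 10.0, "temp": 20.0})
    print(state["streamflow"])
    ```

    Note that the components carry no reference to the framework at all, which is the non-invasiveness property the abstract emphasizes.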

  16. Conceptual Change from the Framework Theory Side of the Fence

    NASA Astrophysics Data System (ADS)

    Vosniadou, Stella; Skopeliti, Irini

    2014-07-01

    We describe the main principles of the framework theory approach to conceptual change and briefly report the results of a text comprehension study that investigated some of the hypotheses that derive from it. We claim that children construct a naive physics which is based on observation in the context of lay culture and which forms a relatively coherent conceptual system—i.e., a framework theory—that can be used as a basis for explanation and prediction of everyday phenomena. Learning science requires fundamental ontological, epistemological, and representational changes in naive physics. These conceptual changes take a long time to be achieved, giving rise to fragmentation and synthetic conceptions. We also argue that both fragmentation and synthetic conceptions can be explained as resulting from learners' attempts to assimilate scientific information into their existing but incompatible naive physics.

  17. Drinking water quality management: a holistic approach.

    PubMed

    Rizak, S; Cunliffe, D; Sinclair, M; Vulcano, R; Howard, J; Hrudey, S; Callan, P

    2003-01-01

    A growing list of water contaminants has led to some water suppliers relying primarily on compliance monitoring as a mechanism for managing drinking water quality. While such monitoring is a necessary part of drinking water quality management, experiences with waterborne disease threats and outbreaks have shown that compliance monitoring for numerical limits is not, in itself, sufficient to guarantee the safety and quality of drinking water supplies. To address these issues, the Australian National Health and Medical Research Council (NHMRC) has developed a Framework for Management of Drinking Water Quality (the Framework) for incorporation in the Australian Drinking Water Guidelines, the primary reference on drinking water quality in Australia. The Framework was developed specifically for drinking water supplies and provides a comprehensive and preventive risk management approach from catchment to consumer. It includes holistic guidance on a range of issues considered good practice for system management. The Framework addresses four key areas: Commitment to Drinking Water Quality Management, System Analysis and System Management, Supporting Requirements, and Review. The Framework represents a significantly enhanced approach to the management and regulation of drinking water quality and offers a flexible and proactive means of optimising drinking water quality and protecting public health. Rather than the primary reliance on compliance monitoring, the Framework emphasises prevention, the importance of risk assessment, maintaining the integrity of water supply systems and application of multiple barriers to assure protection of public health. Development of the Framework was undertaken in collaboration with the water industry, regulators and other stakeholders, and will promote a common and unified approach to drinking water quality management throughout Australia. The Framework has attracted international interest.

  18. Comparative analysis of nursing and midwifery regulatory and professional bodies' scope of practice and associated decision-making frameworks: a discussion paper.

    PubMed

    Kennedy, Catriona; O'Reilly, Pauline; Fealy, Gerard; Casey, Mary; Brady, Anne-Marie; McNamara, Martin; Prizeman, Geraldine; Rohde, Daniela; Hegarty, Josephine

    2015-08-01

    To review, discuss and compare nursing and midwifery regulatory and professional bodies' scope of practice and associated decision-making frameworks. Scope of practice in professional nursing and midwifery is an evolving process which needs to be responsive to clinical, service, societal, demographic and fiscal changes. Codes and frameworks offer a system of rules and principles by which the nursing and midwifery professions are expected to regulate members and demonstrate responsibility to society. Discussion paper. Twelve scope of practice and associated decision-making frameworks (January 2000-March 2014). Two main approaches to the regulation of the scope of practice and associated decision-making frameworks exist internationally. The first approach is policy and regulation driven and behaviour oriented. The second approach is based on notions of autonomous decision-making, professionalism and accountability. The two approaches are not mutually exclusive, but have similar elements with a different emphasis. Both approaches lack explicit recognition of the aesthetic aspects of care and patient choice, which is a fundamental principle of evidence-based practice. Nursing organizations, regulatory authorities and nurses should recognize that scope of practice and the associated responsibility for decision-making provides a very public statement about the status of nursing in a given jurisdiction. © 2015 John Wiley & Sons Ltd.

  19. [GRADE Evidence to Decision (EtD) frameworks: a systematic and transparent approach to making well informed healthcare choices. 2: Clinical practice guidelines].

    PubMed

    Morgano, Gian Paolo; Parmelli, Elena; Amato, Laura; Iannone, Primiano; Marchetti, Marco; Moja, Lorenzo; Davoli, Marina; Schünemann, Holger

    2018-05-01

    In the first article in this series we described the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Evidence to Decision (EtD) frameworks and their rationale for different types of decisions. In this second article, we describe the use of EtD frameworks for clinical recommendations and how it can help clinicians and patients who use those recommendations. EtD frameworks for clinical practice recommendations provide a structured and transparent approach for guideline panels. The framework helps ensure consideration of key criteria that determine whether an intervention should be recommended and that judgments are informed by the best available evidence. Frameworks are also a way for panels to make guideline users aware of the rationale (justification) for their recommendations.

  20. Attributes of innovations and approaches to scalability – lessons from a national program to extend the scope of practice of health professionals

    PubMed Central

    Masso, Malcolm; Thompson, Cristina

    2016-01-01

    The context for the paper was the evaluation of a national program in Australia to investigate extended scopes of practice for health professionals (paramedics, physiotherapists, and nurses). The design of the evaluation involved a mixed-methods approach with multiple data sources. Four multidisciplinary models of extended scope of practice were tested over an 18-month period, involving 26 organizations, 224 health professionals, and 36 implementation sites. The evaluation focused on what could be learned to inform scaling up the extended scopes of practice on a national scale. The evaluation findings were used to develop a conceptual framework for use by clinicians, managers, and policy makers to determine appropriate strategies for scaling up effective innovations. Development of the framework was informed by the literature on the diffusion of innovations, particularly an understanding that certain attributes of innovations influence adoption. The framework recognizes the role played by three groups of stakeholders: evidence producers, evidence influencers, and evidence adopters. The use of the framework is illustrated with four case studies from the evaluation. The findings demonstrate how the scaling up of innovations can be influenced by three quite distinct approaches – letting adoption take place in an uncontrolled, unplanned, way; actively helping the process of adoption; or taking deliberate steps to ensure that adoption takes place. Development of the conceptual framework resulted in two sets of questions to guide decisions about scalability, one for those considering whether to adopt the innovation (evidence adopters), and the other for those trying to decide on the optimal strategy for dissemination (evidence influencers). PMID:27616889

  1. The discipline of hospital development: a conceptual framework incorporating marketing, managerial, consumer behavior, and adult learning theories.

    PubMed

    Shirley, S; Stampfl, R

    1997-12-01

    The purpose of this explanatory and prescriptive article is to identify interdisciplinary theories used by hospital development to direct its practice. The article explores, explains, and applies theories and principles from behavioral, social, and managerial disciplines. Learning, motivational, organizational, marketing, and attitudinal theories are incorporated and transformed into the fundamental components of a conceptual framework that provides an overview of the practice of hospital development. How this discipline incorporates these theories to design, explain, and prescribe the focus of its own practice is demonstrated. This interdisciplinary approach results in a framework for practice that is adaptable to changing social, cultural, economic, political, and technological environments.

  2. Video copy protection and detection framework (VPD) for e-learning systems

    NASA Astrophysics Data System (ADS)

    Zandi, Babak; Doustarmoghaddam, Danial; Pour, Mahsa R.

    2013-03-01

    This article reviews and compares copyright-protection approaches for digital video files, which can be categorized as content-based copy detection and digital-watermarking copy detection. We then describe how to protect a digital video using a dedicated video data-hiding method and algorithm, and how to detect copies of the file. Building on the direction of video copy-detection technology and combining it with our own research results, we put forward a new video protection and copy-detection approach for plagiarism detection in e-learning systems based on video data hiding. Finally, we introduce a framework for video protection and detection in e-learning systems (the VPD framework).
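The abstract does not specify which data-hiding algorithm the VPD framework uses; purely to illustrate the general idea of video data hiding, the sketch below embeds and recovers watermark bits in the least significant bits of a frame's pixel values (the frame size, bit pattern, and function names are all assumptions, not the authors' method):

```python
import numpy as np

def embed_lsb(frame: np.ndarray, bits: np.ndarray) -> np.ndarray:
    """Hide watermark bits in the least significant bit of the first pixels."""
    flat = frame.flatten()                       # flatten() copies; input stays intact
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits
    return flat.reshape(frame.shape)

def extract_lsb(frame: np.ndarray, n_bits: int) -> np.ndarray:
    """Recover the first n_bits hidden bits from a (possibly copied) frame."""
    return frame.flatten()[:n_bits] & 1

frame = np.random.default_rng(0).integers(0, 256, size=(8, 8), dtype=np.uint8)
mark = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
marked = embed_lsb(frame, mark)
assert np.array_equal(extract_lsb(marked, mark.size), mark)
```

A copy-detection step would then look for this recoverable bit pattern in suspect files; real watermarking schemes add redundancy and robustness to re-encoding, which this sketch omits.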

  3. The Legacy Ecosystem Management Framework: From Theory to Application in the Detention Pond Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coty, J; Stevenson, M; Vogt, K A

    The Detention Pond is a constructed and lined storm water treatment basin at Lawrence Livermore National Laboratory that serves multiple stakeholder objectives and programmatic goals. This paper examines the process and outcome involved in the development of a new management plan for the Detention Pond. The plan was created using a new ecosystem management tool, the Legacy Framework. This stakeholder-driven conceptual framework provides an interdisciplinary methodology for determining ecosystem health, appropriate management strategies, and sensitive indicators. The conceptual framework, the Detention Ponds project, and the use of the framework in the context of the project, are described and evaluated, and evaluative criteria for this and other ecosystem management frameworks are offered. The project benefited in several ways from use of the Legacy Framework, although refinements to the framework are suggested. The stakeholder process created a context and environment in which team members became receptive to using an ecosystem management approach to evaluate and support management alternatives previously not considered. This allowed for the unanimous agreement to pursue support from upper management and organizational funding to implement a progressive management strategy. The greatly improved stakeholder relations resulted in upper management support for the project.

  4. Applying systems thinking to inform studies of wildlife trade in primates.

    PubMed

    Blair, Mary E; Le, Minh D; Thạch, Hoàng M; Panariello, Anna; Vũ, Ngọc B; Birchette, Mark G; Sethi, Gautam; Sterling, Eleanor J

    2017-11-01

    Wildlife trade presents a major threat to primate populations, which are in demand from local to international scales for a variety of uses from food and traditional medicine to the exotic pet trade. We argue that an interdisciplinary framework to facilitate integration of socioeconomic, anthropological, and biological data across multiple spatial and temporal scales is essential to guide the study of wildlife trade dynamics and its impacts on primate populations. Here, we present a new way to design research on wildlife trade in primates using a systems thinking framework. We discuss how we constructed our framework, which follows a social-ecological system framework, to design an ongoing study of local, regional, and international slow loris (Nycticebus spp.) trade in Vietnam. We outline the process of iterative variable exploration and selection via this framework to inform study design. Our framework, guided by systems thinking, enables recognition of complexity in study design, from which the results can inform more holistic, site-appropriate, and effective trade management practices. We place our framework in the context of other approaches to studying wildlife trade and discuss options to address foreseeable challenges to implementing this new framework. © 2017 Wiley Periodicals, Inc.

  5. Effective integrated frameworks for assessing mining sustainability.

    PubMed

    Virgone, K M; Ramirez-Andreotta, M; Mainhagu, J; Brusseau, M L

    2018-05-28

    The objectives of this research are to review existing methods used for assessing mining sustainability, analyze the limited prior research that has evaluated the methods, and identify key characteristics that would constitute an enhanced sustainability framework that would serve to improve sustainability reporting in the mining industry. Five of the most relevant frameworks were selected for comparison in this analysis, and the results show that there are many commonalities among the five, as well as some disparities. In addition, relevant components are missing from all five. An enhanced evaluation system and framework were created to provide a more holistic, comprehensive method for sustainability assessment and reporting. The proposed framework has five components that build from and encompass the twelve evaluation characteristics used in the analysis. The components include Foundation, Focus, Breadth, Quality Assurance, and Relevance. The enhanced framework promotes a comprehensive, location-specific reporting approach with a concise set of well-defined indicators. Built into the framework is quality assurance, as well as a defined method to use information from sustainability reports to inform decisions. The framework incorporates human health and socioeconomic aspects via initiatives such as community-engaged research, economic valuations, and community-initiated environmental monitoring.

  6. Getting inside acupuncture trials - Exploring intervention theory and rationale

    PubMed Central

    2011-01-01

    Background Acupuncture can be described as a complex intervention. In reports of clinical trials the mechanism of acupuncture (that is, the process by which change is effected) is often left unstated or not known. This is problematic in assisting understanding of how acupuncture might work and in drawing together evidence on the potential benefits of acupuncture. Our aim was to aid the identification of the assumed mechanisms underlying the acupuncture interventions in clinical trials by developing an analytical framework to differentiate two contrasting approaches to acupuncture (traditional acupuncture and Western medical acupuncture). Methods Based on the principles of realist review, an analytical framework to differentiate these two contrasting approaches was developed. In order to see how useful the framework was in uncovering the theoretical rationale, it was applied to a set of trials of acupuncture for fatigue and vasomotor symptoms, identified from a wider literature review of acupuncture and early stage breast cancer. Results When examined for the degree to which a study demonstrated adherence to a theoretical model, two of the fourteen selected studies could be considered TA, five MA, with the remaining seven not fitting into any recognisable model. When examined by symptom, five of the nine vasomotor studies, all from one group of researchers, are arguably in the MA category, and two a TA model; in contrast, none of the five fatigue studies could be classed as either MA or TA and all studies had a weak rationale for the chosen treatment for fatigue. Conclusion Our application of the framework to the selected studies suggests that it is a useful tool to help uncover the therapeutic rationale of acupuncture interventions in clinical trials, for distinguishing between TA and MA approaches and for exploring issues of model validity. English language acupuncture trials frequently fail to report enough detail relating to the intervention. 
We advocate using this framework to aid reporting, along with further testing and refinement of the framework. PMID:21414187

  7. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Slob, Wout

    2015-01-01

    Background When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives We developed a unified framework for probabilistic dose–response assessment. Methods We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results Probabilistically derived exposure limits are based on estimating a “target human dose” (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions. 
Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063
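The probabilistic derivation itself is not reproduced in the abstract; the following sketch only illustrates the general Monte Carlo idea of propagating lognormal uncertainty factors into a distribution for a target human dose and reading off a 90% confidence interval (the point of departure and all distribution parameters are invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# All numbers below are invented for illustration:
pod = 10.0  # animal point of departure, mg/kg-day (assumed)
# Lognormal adjustment factors (medians 3 and 5; geometric SDs 2 and 2.5):
interspecies = rng.lognormal(mean=np.log(3.0), sigma=np.log(2.0), size=n)
intraspecies = rng.lognormal(mean=np.log(5.0), sigma=np.log(2.5), size=n)

hd_mi = pod / (interspecies * intraspecies)   # samples of the target human dose

lower, upper = np.percentile(hd_mi, [5, 95])  # 90% confidence interval
print(f"HD_MI 90% CI: [{lower:.3g}, {upper:.3g}] mg/kg-day")
```

The spread between the interval endpoints is what the paper summarizes as the fold-range of the probabilistically derived exposure limit.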

  8. Examining the Value of a Scaffolded Critique Framework to Promote Argumentative and Explanatory Writings within an Argument-Based Inquiry Approach

    ERIC Educational Resources Information Center

    Jang, Jeong-yoon; Hand, Brian

    2017-01-01

    This study investigated the value of using a scaffolded critique framework to promote two different types of writing--argumentative writing and explanatory writing--with different purposes within an argument-based inquiry approach known as the Science Writing Heuristic (SWH) approach. A quasi-experimental design with sixth and seventh grade…

  9. The European Smoking Prevention Framework Approach (ESFA): Effects after 24 and 30 Months

    ERIC Educational Resources Information Center

    de Vries, Hein; Dijk, Froukje; Wetzels, Joyce; Mudde, Aart; Kremers, Stef; Ariza, Carles; Vitoria, Paulo Duarte; Fielder, Anne; Holm, Klavs; Janssen, Karin; Lehtovuori, Riku; Candel, Math

    2006-01-01

    The European Smoking Prevention Framework Approach (ESFA) study in six countries tested the effects of a comprehensive smoking prevention approach after 24 (T3; N = 10751) and 30 months (T4; N = 9282). The programme targeted four levels, i.e. adolescents in schools, school policies, parents and the community. In Portugal, 12.4% of the T1…

  10. A dynamic systems approach to psychotherapy: A meta-theoretical framework for explaining psychotherapy change processes.

    PubMed

    Gelo, Omar Carlo Gioacchino; Salvatore, Sergio

    2016-07-01

    Notwithstanding the many methodological advances made in the field of psychotherapy research, at present a metatheoretical, school-independent framework to explain psychotherapy change processes taking into account their dynamic and complex nature is still lacking. Over the last years, several authors have suggested that a dynamic systems (DS) approach might provide such a framework. In the present paper, we review the main characteristics of a DS approach to psychotherapy. After an overview of the general principles of the DS approach, we describe the extent to which psychotherapy can be considered as a self-organizing open complex system, whose developmental change processes are described in terms of a dialectic dynamics between stability and change over time. Empirical evidence in support of this conceptualization is provided and discussed. Finally, we propose a research design strategy for the empirical investigation of psychotherapy from a DS approach, together with a research case example. We conclude that a DS approach may provide a metatheoretical, school-independent framework allowing us to constructively rethink and enhance the way we conceptualize and empirically investigate psychotherapy. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
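The "dialectic dynamics between stability and change" has a standard toy illustration in dynamical systems theory (this example is not from the paper): in a pitchfork-style system dx/dt = r·x − x³, the resting state x = 0 is stable while the control parameter r is negative, but once r crosses zero the system self-organizes into a qualitatively new stable pattern:

```python
def settle(r: float, x0: float = 0.1, dt: float = 0.01, steps: int = 20_000) -> float:
    """Integrate dx/dt = r*x - x**3 with forward Euler and return the end state."""
    x = x0
    for _ in range(steps):
        x += dt * (r * x - x ** 3)
    return x

before = settle(r=-0.5)  # r < 0: the state decays back to 0 ("stability")
after = settle(r=+0.5)   # r > 0: x = 0 destabilizes; a new attractor near sqrt(r) ("change")
print(f"before: {before:.4f}, after: {after:.4f}")
```

The same small perturbation dies out in one regime and grows into a new equilibrium in the other, which is the qualitative picture the DS approach uses for therapeutic change.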

  11. Development of fuzzy multi-criteria approach to prioritize locations of treated wastewater use considering climate change scenarios.

    PubMed

    Chung, Eun-Sung; Kim, Yeonjoo

    2014-12-15

    This study proposed a robust prioritization framework to identify priority locations for treated wastewater (TWW) use, taking into account the various uncertainties inherent in climate change scenarios and in the decision-making process. First, a fuzzy concept was applied because forecast precipitation and the associated hydrological impact results displayed significant variance across climate change scenarios and long periods (e.g., 2010-2099). Second, several multi-criteria decision making (MCDM) techniques, including the weighted sum method (WSM), the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), and fuzzy TOPSIS, were introduced for robust prioritization, because different MCDM methods rest on different decision philosophies. Third, decision making methods under complete uncertainty (DMCU), including maximin, maximax, minimax regret, Hurwicz, and equal likelihood, were used to obtain robust final rankings. The framework was then applied to a Korean urban watershed. As a result, clearly different rankings appeared between fuzzy TOPSIS and the non-fuzzy MCDMs (e.g., WSM and TOPSIS), because only fuzzy TOPSIS considered the inter-annual variability in effectiveness. Robust prioritizations were then derived from 18 rankings spanning nine decadal periods of RCP4.5 and RCP8.5, and five DMCU approaches applied to the fuzzy TOPSIS rankings yielded still more robust rankings. This framework, combining fuzzy TOPSIS with DMCU approaches, can be rendered less controversial among stakeholders under the complete uncertainty of changing environments. Copyright © 2014 Elsevier Ltd. All rights reserved.
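Of the MCDM techniques named above, classical (non-fuzzy) TOPSIS is compact enough to sketch. The decision matrix, weights, and benefit/cost designations below are invented for illustration and are not the study's data:

```python
import numpy as np

def topsis(matrix: np.ndarray, weights: np.ndarray, benefit: np.ndarray) -> np.ndarray:
    """Score alternatives (rows) over criteria (columns) by closeness to the ideal."""
    norm = matrix / np.linalg.norm(matrix, axis=0)           # vector-normalize each criterion
    v = norm * weights                                        # apply criterion weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))   # best value per criterion
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))    # worst value per criterion
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)                       # closeness in [0, 1]

# Three hypothetical TWW-use sites scored on an effectiveness criterion
# (maximize) and a cost criterion (minimize):
m = np.array([[0.7, 120.0],
              [0.9, 100.0],
              [0.4, 150.0]])
scores = topsis(m, weights=np.array([0.6, 0.4]), benefit=np.array([True, False]))
ranking = np.argsort(-scores)  # best site first
```

Fuzzy TOPSIS replaces the crisp matrix entries with fuzzy numbers (e.g., triangular ranges across scenarios), which is how the study captures inter-annual variability.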

  12. The role of simulation in mixed-methods research: a framework & application to patient safety.

    PubMed

    Guise, Jeanne-Marie; Hansen, Matthew; Lambert, William; O'Brien, Kerth

    2017-05-04

    Research in patient safety is an important area of health services research and is a national priority. It is challenging to investigate rare occurrences, explore potential causes, and account for the complex, dynamic context of healthcare - yet all are required in patient safety research. Simulation technologies have become widely accepted as education and clinical tools, but have yet to become a standard tool for research. We developed a framework for research that integrates accepted patient safety models with mixed-methods research approaches and describe the performance of the framework in a working example of a large National Institutes of Health (NIH)-funded R01 investigation. This worked example of a framework in action, identifies the strengths and limitations of qualitative and quantitative research approaches commonly used in health services research. Each approach builds essential layers of knowledge. We describe how the use of simulation ties these layers of knowledge together and adds new and unique dimensions of knowledge. A mixed-methods research approach that includes simulation provides a broad multi-dimensional approach to health services and patient safety research.

  13. A motion sensing-based framework for robotic manipulation.

    PubMed

    Deng, Hao; Xia, Zeyang; Weng, Shaokui; Gan, Yangzhou; Fang, Peng; Xiong, Jing

    2016-01-01

    To date, outside of controlled environments, robots normally perform manipulation tasks under human operation. This pattern requires robot operators to undergo extensive technical training on varied teach-pendant operating systems. Motion sensing technology, which enables human-machine interaction through a novel and natural gesture interface, inspired us to adopt this user-friendly and straightforward operation mode for robotic manipulation. Thus, in this paper, we present a motion sensing-based framework for robotic manipulation, which recognizes gesture commands captured from a motion sensing input device and drives the actions of robots. For compatibility, a general hardware interface layer was also developed within the framework. Simulation and physical experiments were conducted for preliminary validation. The results show that the proposed framework is an effective approach to general robotic manipulation with motion sensing control.

  14. A comparison of the legal frameworks supporting water management in Europe and China.

    PubMed

    Yang, X; Griffiths, I M

    2010-01-01

    This paper compares the legal frameworks supporting water management in Europe and China, with a special focus on integrated river basin management (IRBM), to identify synergies and opportunities in policymaking and implementation. The research shows that China has committed to the efficient management of water resources through various policy tools during the current period. This commitment, however, has often been interrupted and distorted by politics, resulting in the neglect of socioeconomic and environmental priorities. The European legal framework supporting water management underwent a complex and lengthy development but, with the adoption of the Water Framework Directive, now provides a policy model on which to develop an integrated and sustainable approach to river basin management, elements of which may help to meet the demands that the emerging 21st-century Chinese society places on these critical natural resources.

  15. Gamifying Self-Management of Chronic Illnesses: A Mixed-Methods Study.

    PubMed

    AlMarshedi, Alaa; Wills, Gary; Ranchhod, Ashok

    2016-09-09

    Self-management of chronic illnesses is an ongoing issue in health care research. Gamification is a concept that arose in the field of computer science and has been borrowed by many other disciplines. It is perceived by many that gamification can improve the self-management experience of people with chronic illnesses. This paper discusses the validation of a framework (called The Wheel of Sukr) that was introduced to achieve this goal. This research aims to (1) discuss a gamification framework targeting the self-management of chronic illnesses and (2) validate the framework by diabetic patients, medical professionals, and game experts. A mixed-method approach was used to validate the framework. Expert interviews (N=8) were conducted in order to validate the themes of the framework. Additionally, diabetic participants completed a questionnaire (N=42) in order to measure their attitudes toward the themes of the framework. The results provide a validation of the framework. This indicates that gamification might improve the self-management of chronic illnesses, such as diabetes. Namely, the eight themes in the Wheel of Sukr (fun, esteem, socializing, self-management, self-representation, motivation, growth, sustainability) were perceived positively by 71% (30/42) of the participants with P value <.001. In this research, both the interviews and the questionnaire yielded positive results that validate the framework (The Wheel of Sukr). Generally, this study indicates an overall acceptance of the notion of gamification in the self-management of diabetes.
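The abstract does not state which statistical test produced P < .001 for the 30/42 positive responses; purely as an illustration of testing such a proportion against a 50% chance rate, an exact one-sided binomial test can be computed directly:

```python
from math import comb

def binom_p_one_sided(k: int, n: int, p: float = 0.5) -> float:
    """Exact one-sided tail probability P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_value = binom_p_one_sided(30, 42)  # 30 of 42 participants responded positively
print(f"{30/42:.0%} positive; one-sided exact binomial p = {p_value:.4f}")
```

This particular test rejects chance at conventional levels but does not reproduce the paper's exact P value, since the authors' test statistic is not specified in the abstract.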

  16. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Issen, Kathleen

    2017-06-05

    This project employed a continuum approach to formulate an elastic constitutive model for Castlegate sandstone. The resulting constitutive framework for high-porosity sandstone is thermodynamically sound (i.e., it does not violate the first and second laws of thermodynamics), represents known material constitutive response, and can be calibrated using available mechanical response data. To authenticate the accuracy of this model, a series of validation criteria were employed, using an existing mechanical response data set for Castlegate sandstone. The resulting constitutive framework is applicable to high-porosity sandstones in general and is tractable for scientists and researchers endeavoring to solve problems of practical interest.

  17. Dimensions of lay health worker programmes: results of a scoping study and production of a descriptive framework.

    PubMed

    South, Jane; Meah, Angela; Bagnall, Anne-Marie; Jones, Rebecca

    2013-03-01

    Approaches that engage and support lay health workers in the delivery of health improvement activities have been widely applied across different health issues and populations. The lack of a common terminology, inconsistency in the use of role descriptors and poor indexing of lay health worker roles are all barriers to the development of a shared evidence base for lay health worker interventions. The aim of the paper is to report results from a scoping study of approaches to involve lay people in public health roles and to present a framework for categorisation of the different dimensions of lay health worker programmes. Our scoping study comprised a systematic scoping review to map the literature on lay health worker interventions and to identify role dimensions and common models. The review, which was limited to interventions relevant to UK public health priorities, covered a total of 224 publications. The scoping study also drew on experiential evidence from UK practice. Research-based and practice-based evidence confirmed the variety of role descriptors in use and the complexity of role dimensions. Five common models that define the primary role of the lay health worker were identified from the literature. A framework was later developed that grouped features of lay health worker programmes into four dimensions: intervention, role, professional support/service and the community. More account needs to be taken of the variations that occur between lay health worker programmes. This framework, with the mapping of key categories of difference, may enable better description of lay health worker programmes, which will in turn assist in building a shared evidence base. More research is needed to examine the transferability of the framework within different contexts.

  18. 'Healthy Eating and Lifestyle in Pregnancy (HELP)' trial: Process evaluation framework.

    PubMed

    Simpson, Sharon A; Cassidy, Dunla; John, Elinor

    2014-07-01

    We developed and tested in a cluster RCT a theory-driven group-based intervention for obese pregnant women. It was designed to support women to moderate weight gain during pregnancy and reduce BMI one year after birth, in addition to targeting secondary health and wellbeing outcomes. In line with MRC guidance on developing and evaluating complex interventions in health, we conducted a process evaluation alongside the trial. This paper describes the development of the process evaluation framework. This cluster RCT recruited 598 pregnant women. Women in the intervention group were invited to attend a weekly weight-management group. Following a review of relevant literature, we developed a process evaluation framework which outlined the key process indicators that we wanted to address and how we would measure them. Central to the process evaluation was understanding the mechanism of effect of the intervention. We utilised a logic-modelling approach to describe the intervention, which helped us focus on which potential mediators of intervention effect to measure, and how. The resulting process evaluation framework was designed to address 9 core elements: context, reach, exposure, recruitment, fidelity, recruitment, retention, contamination and theory-testing. These were assessed using a variety of qualitative and quantitative approaches. The logic model explained the processes by which intervention components bring about change in target outcomes through various mediators and theoretical pathways including self-efficacy, social support, self-regulation and motivation. Process evaluation is a key element in assessing the effect of any RCT. We developed a process evaluation framework and logic model, and the results of analyses using these will offer insights into why the intervention is or is not effective. Copyright © 2014.

  19. Ensemble Semi-supervised Frame-work for Brain Magnetic Resonance Imaging Tissue Segmentation

    PubMed Central

    Azmi, Reza; Pishgoo, Boshra; Norozi, Narges; Yeganeh, Samira

    2013-01-01

    Brain magnetic resonance image (MRI) tissue segmentation is one of the most important parts of clinical diagnostic tools. Pixel classification methods have frequently been used for image segmentation, with both supervised and unsupervised approaches. Supervised segmentation methods lead to high accuracy, but they need a large amount of labeled data, which is hard, expensive, and slow to obtain; moreover, they cannot use unlabeled data to train classifiers. Unsupervised segmentation methods, on the other hand, have no prior knowledge and lead to a low level of performance. However, semi-supervised learning, which uses a few labeled data together with a large amount of unlabeled data, achieves higher accuracy with less trouble. In this paper, we propose an ensemble semi-supervised frame-work for segmenting brain MRI tissues that simultaneously uses the results of several semi-supervised classifiers. Selecting appropriate classifiers has a significant role in the performance of this frame-work. Hence, we present two semi-supervised algorithms, expectation filtering maximization and MCo_Training, which are improved versions of the semi-supervised methods expectation maximization and Co_Training and which increase segmentation accuracy. We then use these improved classifiers together with a graph-based semi-supervised classifier as components of the ensemble frame-work. Experimental results show that segmentation performance with this approach is higher than with both supervised methods and the individual semi-supervised classifiers. PMID:24098863
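The abstract does not detail how the component classifiers' outputs are combined; one common combination rule for such an ensemble is per-voxel majority voting, sketched below (the label encoding and tie-breaking rule are assumptions, not necessarily the paper's method):

```python
import numpy as np

def ensemble_vote(predictions: np.ndarray) -> np.ndarray:
    """Combine per-voxel tissue labels from several classifiers by majority
    vote; shape is (n_classifiers, n_voxels). Ties go to the lowest label id."""
    n_classes = int(predictions.max()) + 1
    # count votes per class for every voxel
    counts = np.stack([(predictions == c).sum(axis=0) for c in range(n_classes)])
    return counts.argmax(axis=0)

# Three classifiers' labels (0=CSF, 1=GM, 2=WM) for five voxels:
preds = np.array([[0, 1, 2, 1, 0],
                  [0, 1, 2, 2, 1],
                  [1, 1, 2, 1, 1]])
fused = ensemble_vote(preds)
```

In the paper's setting the three rows would come from expectation filtering maximization, MCo_Training, and the graph-based classifier, each trained on the same labeled/unlabeled split.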

  20. Risk Assessment and Risk Governance of Liquefied Natural Gas Development in Gladstone, Australia.

    PubMed

    van der Vegt, R G

    2018-02-26

    This article is a retrospective analysis of liquefied natural gas (LNG) development in Gladstone, Australia, using the structure of the risk governance framework developed by the International Risk Governance Council (IRGC). Since 2010 the port of Gladstone has undergone extensive expansion to facilitate increasing coal exports as well as the development of three recently completed LNG facilities. Significant environmental and socio-economic impacts and concerns have resulted from these developments. The overall aim of the article, therefore, is to identify the risk governance deficits that arose and to formulate processes capable of improving similar decision-making problems in the future. The structure of the IRGC framework is followed because it represents a broad analytical approach for considering risk assessment and risk governance in Gladstone in ways that include, but also go beyond, the risk approach of the ISO 31000:2009 standard that was employed at the time. The IRGC risk framework is argued to be a consistent and comprehensive risk governance framework that integrates scientific, economic, social, and cultural aspects and advocates inclusive risk governance through stakeholder communication and involvement. Key aspects related to risk preassessment, risk appraisal, risk tolerability and acceptability, risk management, and stakeholder communication and involvement are considered. The results indicate that the risk governance deficits include aspects related to (i) the risk matrix methodology, (ii) reflecting uncertainties, (iii) cumulative risks, (iv) the regulatory process, and (v) stakeholder communication and involvement. © 2018 Society for Risk Analysis.
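Deficit (i), the risk matrix methodology, can be illustrated with a toy example (the band edges below are invented, not taken from the article or from ISO 31000:2009): a qualitative likelihood-consequence matrix can assign the same cell to risks whose expected losses differ by more than an order of magnitude:

```python
import bisect

P_BANDS = [0.001, 0.01, 0.1, 0.5]   # assumed annual-probability band edges
C_BANDS = [0.1, 1.0, 10.0, 100.0]   # assumed consequence band edges, $M

def cell(likelihood: float, consequence: float) -> tuple:
    """Map a risk to its 1-5 likelihood and consequence bands in a 5x5 matrix."""
    return (bisect.bisect(P_BANDS, likelihood) + 1,
            bisect.bisect(C_BANDS, consequence) + 1)

r1 = cell(0.11, 11.0)   # expected loss ~ $1.2M per year
r2 = cell(0.49, 99.0)   # expected loss ~ $48.5M per year, yet the same cell
assert r1 == r2 == (4, 4)
```

This coarseness is one reason fully probabilistic treatments are preferred when cumulative risks and uncertainties matter.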

  1. Methods for integrating moderation and mediation: a general analytical framework using moderated path analysis.

    PubMed

    Edwards, Jeffrey R; Lambert, Lisa Schurer

    2007-03-01

    Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated and the mediated effects under investigation. This article presents a general analytical framework for combining moderation and mediation that integrates moderated regression analysis and path analysis. This framework clarifies how moderator variables influence the paths that constitute the direct, indirect, and total effects of mediated models. The authors empirically illustrate this framework and give step-by-step instructions for estimation and interpretation. They summarize the advantages of their framework over current approaches, explain how it subsumes moderated mediation and mediated moderation, and describe how it can accommodate additional moderator and mediator variables, curvilinear relationships, and structural equation models with latent variables. (c) 2007 APA, all rights reserved.
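The regression machinery behind a first-stage moderated mediation model can be sketched with simulated data (the coefficients and variable names are invented for illustration; this is not the authors' code): estimate the mediator model with an x-by-w interaction, estimate the outcome model, then form the conditional indirect effect a(w)·b at low and high moderator values:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)                           # predictor
w = rng.normal(size=n)                           # moderator
m = 0.5 * x + 0.4 * x * w + rng.normal(size=n)   # first stage is moderated
y = 0.6 * m + 0.2 * x + rng.normal(size=n)       # mediator carries the effect

def ols(cols, target):
    """Least-squares coefficients with an intercept prepended."""
    X = np.column_stack([np.ones(len(target)), *cols])
    return np.linalg.lstsq(X, target, rcond=None)[0]

a = ols([x, w, x * w], m)   # m = a0 + a1*x + a2*w + a3*x*w
b = ols([m, x], y)          # y = b0 + b1*m + c1*x

# Indirect effect of x on y through m, at moderator values -1 SD and +1 SD:
ind_low = (a[1] + a[3] * -1.0) * b[1]
ind_high = (a[1] + a[3] * +1.0) * b[1]
print(f"indirect effect: {ind_low:.2f} at w=-1, {ind_high:.2f} at w=+1")
```

The indirect effect differing across moderator values is exactly the pattern the framework's path-analytic notation makes explicit.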

  2. Simple approach in understanding interzeolite transformations using ring building units

    NASA Astrophysics Data System (ADS)

    Suhendar, D.; Buchari; Mukti, R. R.; Ismunandar

    2018-04-01

    Recently, two general approaches have been used to understand interzeolite transformations: a thermodynamic one, represented by framework density (FD), and a kinetic one, based on structural building units. The two types of structural building units are composite building units (CBUs) and secondary building units (SBUs). This study examines these approaches using interzeolite transformation data available in the literature and proposes a possible alternative. In a number of zeolite transformation cases, neither the FD nor the CBU approach is suitable. The FD approach fails in cases involving parent zeolites with moderate or high FDs, while the CBU approach fails because CBUs present in parent zeolites are unavailable in their transformation products. The SBU approach is most likely to fit, because SBUs are units with a basic ring structure, closer to the state and shape of the oligomeric fragments present during zeolite synthesis or dissolution. Thus, a new approach can be considered for understanding interzeolite transformations, namely the ring building unit (RBU) approach. Its advantage is that RBUs can easily be derived from all framework types, whereas several framework types cannot be expressed in SBU form.

  3. From framework to action: the DESIRE approach to combat desertification.

    PubMed

    Hessel, R; Reed, M S; Geeson, N; Ritsema, C J; van Lynden, G; Karavitis, C A; Schwilch, G; Jetten, V; Burger, P; van der Werff Ten Bosch, M J; Verzandvoort, S; van den Elsen, E; Witsenburg, K

    2014-11-01

    It has become increasingly clear that desertification can only be tackled through a multi-disciplinary approach that not only involves scientists but also stakeholders. In the DESIRE project such an approach was taken. As a first step, a conceptual framework was developed in which the factors and processes that may lead to land degradation and desertification were described. Many of these factors do not work independently, but can reinforce or weaken one another, and to illustrate these relationships sustainable management and policy feedback loops were included. This conceptual framework can be applied globally, but can also be made site-specific to take into account that each study site has a unique combination of bio-physical, socio-economic and political conditions. Once the conceptual framework was defined, a methodological framework was developed in which the methodological steps taken in the DESIRE approach were listed and their logic and sequence were explained. The last step was to develop a concrete working plan to put the project into action, involving stakeholders throughout the process. This series of steps, in full or in part, offers explicit guidance for other organizations or projects that aim to reduce land degradation and desertification.

  4. A novel framework for analyzing conservation impacts: evaluation, theory, and marine protected areas.

    PubMed

    Mascia, Michael B; Fox, Helen E; Glew, Louise; Ahmadia, Gabby N; Agrawal, Arun; Barnes, Megan; Basurto, Xavier; Craigie, Ian; Darling, Emily; Geldmann, Jonas; Gill, David; Holst Rice, Susie; Jensen, Olaf P; Lester, Sarah E; McConney, Patrick; Mumby, Peter J; Nenadovic, Mateja; Parks, John E; Pomeroy, Robert S; White, Alan T

    2017-07-01

    Environmental conservation initiatives, including marine protected areas (MPAs), have proliferated in recent decades. Designed to conserve marine biodiversity, many MPAs also seek to foster sustainable development. As is the case for many other environmental policies and programs, the impacts of MPAs are poorly understood. Social-ecological systems, impact evaluation, and common-pool resource governance are three complementary scientific frameworks for documenting and explaining the ecological and social impacts of conservation interventions. We review key components of these three frameworks and their implications for the study of conservation policy, program, and project outcomes. Using MPAs as an illustrative example, we then draw upon these three frameworks to describe an integrated approach for rigorous empirical documentation and causal explanation of conservation impacts. This integrated three-framework approach for impact evaluation of governance in social-ecological systems (3FIGS) accounts for alternative explanations, builds upon and advances social theory, and provides novel policy insights in ways that no single approach affords. Despite the inherent complexity of social-ecological systems and the difficulty of causal inference, the 3FIGS approach can dramatically advance our understanding of, and the evidentiary basis for, effective MPAs and other conservation initiatives. © 2017 New York Academy of Sciences.

  5. Surgical model-view-controller simulation software framework for local and collaborative applications.

    PubMed

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
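The decoupled-simulation idea, separate controllers running at the rates each process requires while sharing one model, can be sketched with two threads. This is a minimal stand-in for the MVC framework described above, not its actual architecture; all names and rates are illustrative.

```python
import threading
import time

class SharedScene:
    """Model state written by the fast haptics controller, read by the slower viewer."""
    def __init__(self):
        self.lock = threading.Lock()
        self.position = 0.0
        self.haptic_steps = 0
        self.render_steps = 0

def haptics_loop(scene, stop, dt=0.001):     # ~1 kHz target rate
    while not stop.is_set():
        with scene.lock:
            scene.position += 0.001          # force/physics update
            scene.haptic_steps += 1
        time.sleep(dt)

def render_loop(scene, stop, dt=0.033):      # ~30 Hz target rate
    while not stop.is_set():
        with scene.lock:
            _ = scene.position               # snapshot the scene for drawing
            scene.render_steps += 1
        time.sleep(dt)

scene, stop = SharedScene(), threading.Event()
threads = [threading.Thread(target=f, args=(scene, stop))
           for f in (haptics_loop, render_loop)]
for t in threads:
    t.start()
time.sleep(0.5)                              # let both controllers run briefly
stop.set()
for t in threads:
    t.join()
# the haptics controller has taken many more steps than the viewer
```

Because neither loop waits on the other, adding a slow networked viewer would not throttle the haptic update rate, which is the property the framework exploits.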

  6. The role of language in learning physics

    NASA Astrophysics Data System (ADS)

    Brookes, David T.

    Many studies in PER suggest that language poses a serious difficulty for students learning physics. These difficulties are mostly attributed to misunderstanding of specialized terminology. This terminology often assigns new meanings to everyday terms used to describe physical models and phenomena. In this dissertation I present a novel approach to analyzing the role of language in learning physics. This approach is based on the analysis of the historical development of physics ideas, the language of modern physicists, and students' difficulties in the areas of quantum mechanics, classical mechanics, and thermodynamics. These data are analyzed using linguistic tools borrowed from cognitive linguistics and systemic functional grammar. Specifically, I combine the idea of conceptual metaphor and grammar to build a theoretical framework that accounts for: (1) the role and function that language serves for physicists when they speak and reason about physical ideas and phenomena, (2) specific features of students' reasoning and difficulties that may be related to or derived from language that students read or hear. The theoretical framework is developed using the methodology of a grounded theory approach. The theoretical framework allows us to make predictions about the relationship between student discourse and their conceptual and problem solving difficulties. Tests of the theoretical framework are presented in the context of "heat" in thermodynamics and "force" in dynamics. In each case the language that students use to reason about the concepts of "heat" and "force" is analyzed using the theoretical framework. The results of this analysis show that language is very important in students' learning.
In particular, students are (1) using features of physicists' conceptual metaphors to reason about physical phenomena, often overextending and misapplying these features, (2) drawing cues from the grammar of physicists' speech and writing to categorize physics concepts; this categorization of physics concepts plays a key role in students' ability to solve physics problems. In summary, I present a theoretical framework that provides a possible explanation of the role that language plays in learning physics. The framework also attempts to account for how and why physicists' language influences students in the way that it does.

  7. Consensus-based approach to develop a measurement framework and identify a core set of indicators to track implementation and progress towards effective coverage of facility-based Kangaroo Mother Care

    PubMed Central

    Guenther, Tanya; Moxon, Sarah; Valsangkar, Bina; Wetzel, Greta; Ruiz, Juan; Kerber, Kate; Blencowe, Hannah; Dube, Queen; Vani, Shashi N; Vivio, Donna; Magge, Hema; De Leon-Mendoza, Socorro; Patterson, Janna; Mazia, Goldy

    2017-01-01

    Background As efforts to scale up the delivery of Kangaroo Mother Care (KMC) in facilities are increasing, a standardized approach to measure implementation and progress towards effective coverage is needed. Here, we describe a consensus-based approach to develop a measurement framework and identify a core set of indicators for monitoring facility-based KMC that would be feasible to measure within existing systems. Methods The KMC measurement framework and core list of indicators were developed through: 1) a scoping exercise to identify potential indicators through literature review and requests from researchers and program implementers; and 2) face-to-face consultations with KMC and measurement experts working at country and global levels to review candidate indicators and finalize selection and definitions. Results The KMC measurement framework includes two main components: 1) service readiness, based on the WHO building blocks framework; and 2) service delivery action sequence covering identification, service initiation, continuation to discharge, and follow-up to graduation. Consensus was reached on 10 core indicators for KMC, which were organized according to the measurement framework. We identified 4 service readiness indicators, capturing national level policy for KMC, availability of KMC indicators in HMIS, costed operational plans for KMC and availability of KMC services at health facilities with inpatient maternity services. Six indicators were defined for service delivery, including weighing of babies at birth, identification of those ≤2000 g, initiation of facility-based KMC, monitoring the quality of KMC, status of babies at discharge from the facility and levels of follow-up (according to country-specific protocol). 
Conclusions These core KMC indicators, identified with input from a wide range of global and country-level KMC and measurement experts, can aid efforts to strengthen monitoring systems and facilitate global tracking of KMC implementation. As data collection systems advance, we encourage program managers and evaluators to document their experiences using this framework to measure progress and allow indicator refinement, with the overall aim of working towards sustainable, country-led data systems. PMID:29057074

  8. Structural controllability of unidirectional bipartite networks

    NASA Astrophysics Data System (ADS)

    Nacher, Jose C.; Akutsu, Tatsuya

    2013-04-01

    The interactions between fundamental life molecules, people and social organisations build complex architectures that often result in undesired behaviours. Despite all of the advances made in our understanding of network structures over the past decade, similar progress has not been achieved in the controllability of real-world networks. In particular, an analytical framework to address the controllability of bipartite networks is still absent. Here, we present a dominating set (DS)-based approach to bipartite network controllability that identifies the topologies that are relatively easy to control with the minimum number of driver nodes. Our theoretical calculations, assisted by computer simulations and an evaluation of real-world networks, offer a promising framework to control unidirectional bipartite networks. Our analysis should open a new approach to reverting the undesired behaviours in unidirectional bipartite networks at will.
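A dominating set is a node set such that every node is either in the set or adjacent to a member; in the controllability setting, its members act as driver nodes. The sketch below uses a generic greedy approximation on a toy bipartite graph, not the paper's exact algorithm; the graph and node names are invented.

```python
def greedy_dominating_set(adj):
    """Greedy approximation of a minimum dominating set.

    adj maps each node to its set of neighbours (undirected graph).
    Every node ends up either in the returned set or adjacent to it.
    """
    undominated = set(adj)
    drivers = set()
    while undominated:
        # pick the node that dominates the most still-undominated nodes
        best = max(adj, key=lambda v: len(({v} | adj[v]) & undominated))
        drivers.add(best)
        undominated -= {best} | adj[best]
    return drivers

# toy bipartite network: "top" nodes t1, t2 linked to "bottom" nodes b1..b4
adj = {
    "t1": {"b1", "b2", "b3"},
    "t2": {"b3", "b4"},
    "b1": {"t1"}, "b2": {"t1"}, "b3": {"t1", "t2"}, "b4": {"t2"},
}
drivers = greedy_dominating_set(adj)
print(sorted(drivers))                       # → ['t1', 't2']
```

Here the two hub nodes suffice to dominate the whole network, the kind of topology the paper identifies as relatively easy to control.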

  9. Neurocognitive therapeutics: from concept to application in the treatment of negative attention bias.

    PubMed

    Schnyer, David M; Beevers, Christopher G; deBettencourt, Megan T; Sherman, Stephanie M; Cohen, Jonathan D; Norman, Kenneth A; Turk-Browne, Nicholas B

    2015-01-01

    There is growing interest in the use of neuroimaging for the direct treatment of mental illness. Here, we present a new framework for such treatment, neurocognitive therapeutics. What distinguishes neurocognitive therapeutics from prior approaches is the use of precise brain-decoding techniques within a real-time feedback system, in order to adapt treatment online and tailor feedback to individuals' needs. We report an initial feasibility study that uses this framework to alter negative attention bias in a small number of patients experiencing significant mood symptoms. The results are consistent with the promise of neurocognitive therapeutics to improve mood symptoms and alter brain networks mediating attentional control. Future work should focus on optimizing the approach, validating its effectiveness, and expanding the scope of targeted disorders.

  10. New approach for determination of the influence of long-range order and selected ring oscillations on IR spectra in zeolites

    NASA Astrophysics Data System (ADS)

    Mikuła, Andrzej; Król, Magdalena; Mozgawa, Włodzimierz; Koleżyński, Andrzej

    2018-04-01

    Vibrational spectroscopy can be considered one of the most important methods for the structural characterization of various porous aluminosilicate materials, including zeolites. On the other hand, vibrational spectra of zeolites are still difficult to interpret, particularly in the pseudolattice region, where bands related to ring oscillations can be observed. Using a combination of theoretical and computational approaches, a detailed analysis of these regions of the spectra is possible; such analysis should, however, be carried out employing models of different levels of complexity at the same level of theory. In this work, an attempt was made to identify ring oscillations in the vibrational spectra of selected zeolite structures. A series of ab initio calculations was carried out on S4R, S6R and, as a novelty, 5-1 isolated clusters, as well as on periodic siliceous frameworks built from those building units (ferrierite (FER), mordenite (MOR) and heulandite (HEU) types). Owing to the hierarchical structure of zeolite frameworks, the total envelope of a zeolite spectrum can be expected to be, to good accuracy, a sum of the spectra of the structural elements that build the framework. Based on the results of HF calculations, normal vibrations have been visualized and a detailed analysis of the pseudolattice range of the resulting theoretical spectra has been carried out. The results obtained have been applied to the interpretation of experimental spectra of selected zeolites.

  11. Deformation-Aware Log-Linear Models

    NASA Astrophysics Data System (ADS)

    Gass, Tobias; Deselaers, Thomas; Ney, Hermann

    In this paper, we present a novel deformation-aware discriminative model for handwritten digit recognition. Unlike previous approaches, our model directly considers image deformations and allows discriminative training of all parameters, including those accounting for non-linear transformations of the image. This is achieved by extending a log-linear framework to incorporate a latent deformation variable. The resulting model has an order of magnitude fewer parameters than competing approaches to handling image deformations. We tune and evaluate our approach on the USPS task and show its generalization capabilities by applying the tuned model to the MNIST task. We gain interesting insights and achieve highly competitive results on both tasks.
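The latent deformation variable can be illustrated by scoring each class as the maximum of a linear (log-linear) score over a small set of translations. This toy uses hand-set template weights rather than the discriminative training described in the paper; the images, weights, and shift range are invented for illustration.

```python
import numpy as np

def shift(img, dx, dy):
    """Cyclically translate a 2-D image (the single allowed deformation here)."""
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

def deformation_aware_score(img, weights, max_shift=1):
    """Log-linear score maximised over a latent translation variable."""
    return max(
        float(np.sum(weights * shift(img, dx, dy)))
        for dx in range(-max_shift, max_shift + 1)
        for dy in range(-max_shift, max_shift + 1)
    )

def classify(img, class_weights):
    scores = {c: deformation_aware_score(img, w) for c, w in class_weights.items()}
    return max(scores, key=scores.get)

# toy 5x5 templates: a vertical bar vs a horizontal bar
bar_v = np.zeros((5, 5)); bar_v[:, 2] = 1.0
bar_h = np.zeros((5, 5)); bar_h[2, :] = 1.0
weights = {"vertical": bar_v - 0.5, "horizontal": bar_h - 0.5}

# a vertical bar moved one pixel off-template is still recognised
print(classify(shift(bar_v, 1, 0), weights))  # → vertical
```

A plain linear score would be penalised by the one-pixel displacement; maximising over the latent shift absorbs it, which is the effect the model trains through.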

  12. Using the ecological framework to identify barriers and enablers to implementing Namaste Care in Canada's long-term care system.

    PubMed

    Hunter, Paulette V; Kaasalainen, Sharon; Froggatt, Katherine A; Ploeg, Jenny; Dolovich, Lisa; Simard, Joyce; Salsali, Mahvash

    2017-10-01

    Higher acuity of care at the time of admission to long-term care (LTC) is resulting in a shorter time from admission to death, yet most LTC homes in Canada do not have formalized approaches to palliative care. Namaste Care is a palliative care approach specifically tailored to persons with advanced cognitive impairment who are living in LTC. The purpose of this study was to employ the ecological framework to identify barriers and enablers to the implementation of Namaste Care. Six group interviews were conducted with families, unlicensed staff, and licensed staff at two Canadian LTC homes that were planning to implement Namaste Care. None of the interviewees had prior experience implementing Namaste Care. The resulting qualitative data were analyzed using a template organizing approach. We found that the strongest implementation enablers were positive perceptions of need for the program, benefits of the program, and fit within a resident-centred or palliative approach to care. Barriers included a generally low resource base for LTC, the need to adjust highly developed routines to accommodate the program, and reliance on a casual work force. We conclude that within the Canadian LTC system, positive perceptions of Namaste Care are tempered by concerns about organizational capacity to support new programming.

  13. Overcoming the matched-sample bottleneck: an orthogonal approach to integrate omic data.

    PubMed

    Nguyen, Tin; Diaz, Diana; Tagett, Rebecca; Draghici, Sorin

    2016-07-12

    MicroRNAs (miRNAs) are small non-coding RNA molecules whose primary function is to regulate the expression of gene products via hybridization to mRNA transcripts, resulting in suppression of translation or mRNA degradation. Although miRNAs have been implicated in complex diseases, including cancer, their impact on distinct biological pathways and phenotypes is largely unknown. Current integration approaches require sample-matched miRNA/mRNA datasets, resulting in limited applicability in practice. Since these approaches cannot integrate heterogeneous information available across independent experiments, they neither account for bias inherent in individual studies, nor do they benefit from increased sample size. Here we present a novel framework able to integrate miRNA and mRNA data (vertical data integration) available in independent studies (horizontal meta-analysis) allowing for a comprehensive analysis of the given phenotypes. To demonstrate the utility of our method, we conducted a meta-analysis of pancreatic and colorectal cancer, using 1,471 samples from 15 mRNA and 14 miRNA expression datasets. Our two-dimensional data integration approach greatly increases the power of statistical analysis and correctly identifies pathways known to be implicated in the phenotypes. The proposed framework is sufficiently general to integrate other types of data obtained from high-throughput assays.
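The horizontal (across-study) half of such an integration can be illustrated with Fisher's method for combining p-values from independent experiments. The paper's two-dimensional framework is considerably richer; this sketch, with invented p-values, shows only why pooling independent studies increases statistical power.

```python
import math

def fisher_combined_p(p_values):
    """Fisher's method: combine p-values from independent studies.

    X = -2 * sum(ln p_i) follows a chi-square distribution with
    2k degrees of freedom under the joint null hypothesis.
    """
    k = len(p_values)
    x = -2.0 * sum(math.log(p) for p in p_values)
    # chi-square survival function for even df = 2k (closed form)
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= (x / 2.0) / i
        total += term
    return math.exp(-x / 2.0) * total

# three independent studies of the same marker, none individually decisive
p_values = [0.08, 0.05, 0.11]
combined = fisher_combined_p(p_values)
print(round(combined, 3))                    # → 0.017
```

No single study reaches significance at 0.05, but the combined evidence does, without requiring the studies to share samples.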

  14. A framework for managing runoff and pollution in the rural landscape using a Catchment Systems Engineering approach.

    PubMed

    Wilkinson, M E; Quinn, P F; Barber, N J; Jonczyk, J

    2014-01-15

    Intense farming plays a key role in increasing local scale runoff and erosion rates, resulting in water quality issues and flooding problems. There is potential for agricultural management to become a major part of improved strategies for controlling runoff. Here, a Catchment Systems Engineering (CSE) approach has been explored to solve the above problem. CSE is an interventionist approach to altering the catchment scale runoff regime through the manipulation of hydrological flow pathways throughout the catchment. By targeting hydrological flow pathways at source, such as overland flow, field drain and ditch function, a significant component of the runoff generation can be managed, in turn reducing soil nutrient losses. The Belford catchment (5.7 km²) is a catchment scale study for which a CSE approach has been used to tackle a number of environmental issues. A variety of Runoff Attenuation Features (RAFs) have been implemented throughout the catchment to address diffuse pollution and flooding issues. The RAFs include bunds disconnecting flow pathways, diversion structures in ditches to spill and store high flows, large wood debris structures within the channel, and riparian zone management. Here a framework for applying a CSE approach to the catchment is presented as a step-by-step guide to implementing mitigation measures in the Belford Burn catchment. The framework is based around engagement with catchment stakeholders and uses evidence arising from field science. Using the framework, the flooding issue has been addressed at the catchment scale by altering the runoff regime. Initial findings suggest that RAFs have functioned as designed to reduce/attenuate runoff locally. However, evidence suggested that some RAFs needed modification, and new RAFs needed to be created, to address diffuse pollution issues during storm events. 
Initial findings from these modified RAFs are showing improvements in sediment trapping capacities and reductions in phosphorus, nitrate and suspended sediment losses during storm events. © 2013.

  15. Dynamic and scalable audio classification by collective network of binary classifiers framework: an evolutionary approach.

    PubMed

    Kiranyaz, Serkan; Mäkinen, Toni; Gabbouj, Moncef

    2012-10-01

    In this paper, we propose a novel framework based on a collective network of evolutionary binary classifiers (CNBC) to address the problems of feature and class scalability. The main goal of the proposed framework is to achieve a high classification performance over dynamic audio and video repositories. The proposed framework adopts a "Divide and Conquer" approach in which an individual network of binary classifiers (NBC) is allocated to discriminate each audio class. An evolutionary search is applied to find the best binary classifier in each NBC with respect to a given criterion. Through the incremental evolution sessions, the CNBC framework can dynamically adapt to each new incoming class or feature set without resorting to a full-scale re-training or re-configuration. Therefore, the CNBC framework is particularly designed for dynamically varying databases where no conventional static classifiers can adapt to such changes. In short, it is entirely a novel topology, an unprecedented approach for dynamic, content/data adaptive and scalable audio classification. A large set of audio features can be effectively used in the framework, where the CNBCs make appropriate selections and combinations so as to achieve the highest discrimination among individual audio classes. Experiments demonstrate a high classification accuracy (above 90%) and efficiency of the proposed framework over large and dynamic audio databases. Copyright © 2012 Elsevier Ltd. All rights reserved.
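The class-scalability idea, one network of binary classifiers per class so that a new class can be attached without retraining the others, can be sketched as follows. The real CNBC evolves each binary classifier through evolutionary search; this toy substitutes a fixed perceptron, and all data and labels are invented.

```python
class BinaryClassifier:
    """A tiny perceptron standing in for the best classifier of one NBC."""
    def __init__(self, dim):
        self.w, self.b = [0.0] * dim, 0.0
    def score(self, x):
        return sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
    def train(self, xs, ys, epochs=20, lr=0.1):
        for _ in range(epochs):
            for x, y in zip(xs, ys):         # y in {-1, +1}
                if y * self.score(x) <= 0:
                    self.w = [wi + lr * y * xi for wi, xi in zip(self.w, x)]
                    self.b += lr * y

class CollectiveNetwork:
    """One binary classifier per class; new classes attach without retraining."""
    def __init__(self, dim):
        self.dim, self.nets = dim, {}
    def add_class(self, label, xs, ys):
        clf = BinaryClassifier(self.dim)
        clf.train(xs, ys)                    # only the new class's net is trained
        self.nets[label] = clf
    def predict(self, x):
        return max(self.nets, key=lambda c: self.nets[c].score(x))

A = [(1.0, 0.0), (0.9, 0.1)]
B = [(0.0, 1.0), (0.1, 0.9)]
net = CollectiveNetwork(2)
net.add_class("A", A + B, [+1, +1, -1, -1])
net.add_class("B", A + B, [-1, -1, +1, +1])

# later, samples of a brand-new class arrive; existing nets are untouched
C = [(-1.0, -1.0), (-0.9, -1.1)]
net.add_class("C", A + B + C, [-1, -1, -1, -1, +1, +1])
print(net.predict((1.0, 0.0)), net.predict((-1.0, -1.0)))  # → A C
```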

  16. An eHealth Capabilities Framework for Graduates and Health Professionals: Mixed-Methods Study.

    PubMed

    Brunner, Melissa; McGregor, Deborah; Keep, Melanie; Janssen, Anna; Spallek, Heiko; Quinn, Deleana; Jones, Aaron; Tseris, Emma; Yeung, Wilson; Togher, Leanne; Solman, Annette; Shaw, Tim

    2018-05-15

    The demand for an eHealth-ready and adaptable workforce is placing increasing pressure on universities to deliver eHealth education. At present, eHealth education is largely focused on components of eHealth rather than considering a curriculum-wide approach. This study aimed to develop a framework that could be used to guide health curriculum design based on current evidence, and stakeholder perceptions of eHealth capabilities expected of tertiary health graduates. A 3-phase, mixed-methods approach incorporated the results of a literature review, focus groups, and a Delphi process to develop a framework of eHealth capability statements. Participants (N=39) with expertise or experience in eHealth education, practice, or policy provided feedback on the proposed framework, and following the fourth iteration of this process, consensus was achieved. The final framework consisted of 4 higher-level capability statements that describe the learning outcomes expected of university graduates across the domains of (1) digital health technologies, systems, and policies; (2) clinical practice; (3) data analysis and knowledge creation; and (4) technology implementation and codesign. Across the capability statements are 40 performance cues that provide examples of how these capabilities might be demonstrated. The results of this study inform a cross-faculty eHealth curriculum that aligns with workforce expectations. There is a need for educational curriculum to reinforce existing eHealth capabilities, adapt existing capabilities to make them transferable to novel eHealth contexts, and introduce new learning opportunities for interactions with technologies within education and practice encounters. As such, the capability framework developed may assist in the application of eHealth by emerging and existing health care professionals. Future research needs to explore the potential for integration of findings into workforce development programs. 
©Melissa Brunner, Deborah McGregor, Melanie Keep, Anna Janssen, Heiko Spallek, Deleana Quinn, Aaron Jones, Emma Tseris, Wilson Yeung, Leanne Togher, Annette Solman, Tim Shaw. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.05.2018.

  17. A Health Economics Approach to US Value Assessment Frameworks-Introduction: An ISPOR Special Task Force Report [1].

    PubMed

    Neumann, Peter J; Willke, Richard J; Garrison, Louis P

    2018-02-01

    Concerns about rising spending on prescription drugs and other areas of health care have led to multiple initiatives in the United States designed to measure and communicate the value of pharmaceuticals and other technologies for decision making. In this section we introduce the work of the International Society for Pharmacoeconomics and Outcomes Research Special Task Force on US Value Assessment Frameworks formed to review relevant perspectives and appropriate approaches and methods to support the definition and use of high-quality value frameworks. The Special Task Force was part of the International Society for Pharmacoeconomics and Outcomes Research Initiative on US Value Assessment Frameworks, which enlisted the expertise of leading health economists, concentrating on what the field of health economics can provide to help inform the development and use of value assessment frameworks. We focus on five value framework initiatives: the American College of Cardiology/American Heart Association, the American Society of Clinical Oncology, the Institute for Clinical and Economic Review, the Memorial Sloan Kettering Cancer Center, and the National Comprehensive Cancer Network. These entities differ in their missions, scope of activities, and methodological approaches. Because they are gaining visibility and some traction in the United States, it is essential to scrutinize whether the frameworks use approaches that are transparent as well as conceptually and methodologically sound. Our objectives were to describe the conceptual bases for value and its use in decision making, critically examine existing value frameworks, discuss the importance of sound conceptual underpinning, identify key elements of value relevant to specific decision contexts, and recommend good practice in value definition and implementation as well as areas for further research. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). 
Published by Elsevier Inc. All rights reserved.

  18. Professional Ethics of Software Engineers: An Ethical Framework.

    PubMed

    Lurie, Yotam; Mark, Shlomo

    2016-04-01

    The purpose of this article is to propose an ethical framework for software engineers that connects software developers' ethical responsibilities directly to their professional standards. The implementation of such an ethical framework can overcome the traditional dichotomy between professional skills and ethical skills, which plagues the engineering professions, by proposing an approach to the fundamental tasks of the practitioner, i.e., software development, in which the professional standards are intrinsically connected to the ethical responsibilities. In so doing, the ethical framework improves the practitioner's professionalism and ethics. We call this approach Ethical-Driven Software Development (EDSD), as an approach to software development. EDSD manifests the advantages of an ethical framework as an alternative to the all too familiar approach in professional ethics that advocates "stand-alone codes of ethics". We believe that one outcome of this synergy between professional and ethical skills is simply better engineers. Moreover, since there are often different software solutions, which the engineer can provide to an issue at stake, the ethical framework provides a guiding principle, within the process of software development, that helps the engineer evaluate the advantages and disadvantages of different software solutions. It does not and cannot affect the end-product in and of itself. However, it can, and should, make the software engineer more conscious and aware of the ethical ramifications of certain engineering decisions within the process.

  19. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite-state models of software, and an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
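The assume-guarantee rule behind the framework can be illustrated on toy components modelled as finite sets of traces. This is only a sanity check of the rule under simple trace-set semantics; the paper's actual machinery (predicate abstraction, automata learning, the COMFORT implementation) is far richer, and every name and trace below is invented.

```python
# Components and properties modelled as finite sets of traces (tuples of events).
# Rule:  <A> M1 <P>  and  <true> M2 <A>  together imply  <true> M1 || M2 <P>.

def parallel(t1, t2):
    """Synchronous composition over a shared alphabet: traces both components allow."""
    return t1 & t2

def satisfies(traces, prop):
    """<true> M <prop>: every behaviour of M is allowed by the property."""
    return traces <= prop

def satisfies_under(traces, assumption, prop):
    """<assumption> M <prop>: behaviours permitted by the assumption satisfy prop."""
    return (traces & assumption) <= prop

# toy request/grant protocol: 'r' = request, 'g' = grant
M1 = {("r", "g"), ("g", "r")}      # server component: may grant unprompted
M2 = {("r", "g")}                  # client component: always requests first
A  = {("r", "g")}                  # assumption: a request precedes the grant
P  = {("r", "g"), ("r", "r")}      # property: no unsolicited grant

premise1 = satisfies_under(M1, A, P)          # <A> M1 <P>
premise2 = satisfies(M2, A)                   # <true> M2 <A>
conclusion = satisfies(parallel(M1, M2), P)   # <true> M1 || M2 <P>
print(premise1, premise2, conclusion)         # → True True True
```

The compositional payoff is that M1 is checked only against the small assumption A, never against M2's full state space; the learning algorithm's job in the framework is to construct such an A automatically.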

  20. Transcultural Responses to Aesthetic and Therapeutic Experience: An Ethological Approach.

    ERIC Educational Resources Information Center

    Henley, David R.

    1994-01-01

    Focuses on group dynamics of multiethnic peoples interacting with installation sculpture by Jean Dubuffet. Identifies those dynamics that become activated when responding to an aesthetic stimulus. Considers resulting behaviors within ethological/phenomenological framework and applied to practice of art therapy. (NB)

  1. An Integrative Data Mining Approach to Identify Adverse Outcome Pathway Signatures

    EPA Science Inventory

    Adverse Outcome Pathways (AOPs) provide a formal framework for describing the mechanisms underlying the toxicity of chemicals in our environment. This process improves our ability to incorporate high-throughput toxicity testing (HTT) results and biomarker information on early key...

  2. Navigating the sustainability landscape: a systematic review of sustainability approaches in healthcare.

    PubMed

    Lennox, L; Maher, L; Reed, J

    2018-02-09

    Improvement initiatives offer a valuable mechanism for delivering and testing innovations in healthcare settings. Many of these initiatives deliver meaningful and necessary changes to patient care and outcomes. However, many improvement initiatives fail to sustain to a point where their full benefits can be realised. This has led many researchers and healthcare practitioners to develop frameworks, models and tools to support and monitor sustainability. This work aimed to identify what approaches are available to assess and influence sustainability in healthcare and to describe the different perspectives, applications and constructs within these approaches to guide their future use. A systematic review was carried out following PRISMA guidelines to identify publications that reported approaches to support or influence sustainability in healthcare. Eligibility criteria were defined through an iterative process in which two reviewers independently assessed 20% of articles to test the objectivity of the selection criteria. Data were extracted from the identified articles, and a template analysis was undertaken to identify and assess the sustainability constructs within each reported approach. The search strategy identified 1748 publications with 227 articles retrieved in full text for full documentary analysis. In total, 62 publications identifying a sustainability approach were included in this review (32 frameworks, 16 models, 8 tools, 4 strategies, 1 checklist and 1 process). Constructs across approaches were compared and 40 individual constructs for sustainability were found. Comparison across approaches demonstrated consistent constructs were seen regardless of proposed interventions, setting or level of application with 6 constructs included in 75% of the approaches. Although similarities were found, no approaches contained the same combination of the constructs nor did any single approach capture all identified constructs. 
From these results, a consolidated framework for sustainability constructs in healthcare was developed. Choosing a sustainability method can pose a challenge because of the diverse approaches reported in the literature. This review provides a valuable resource to researchers, healthcare professionals and improvement practitioners by providing a summary of available sustainability approaches and their characteristics. This review was registered on the PROSPERO database: CRD42016040081 in June 2016.
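The construct-comparison step described above is straightforward to reproduce once each approach has been coded against a construct list. A minimal sketch in Python, with entirely hypothetical approach names and constructs (the review's actual 40-construct coding is not reproduced here):

```python
from collections import Counter

def construct_frequencies(approaches):
    """Count how many approaches mention each sustainability construct."""
    counts = Counter()
    for constructs in approaches.values():
        counts.update(set(constructs))  # each approach counts a construct once
    return counts

def core_constructs(approaches, threshold=0.75):
    """Constructs appearing in at least `threshold` of the approaches."""
    counts = construct_frequencies(approaches)
    n = len(approaches)
    return sorted(c for c, k in counts.items() if k / n >= threshold)

# Hypothetical coded data: approach name -> constructs identified in it.
approaches = {
    "Framework A": ["leadership", "resources", "monitoring", "training"],
    "Model B": ["leadership", "resources", "engagement"],
    "Tool C": ["leadership", "monitoring", "resources"],
    "Strategy D": ["leadership", "resources", "engagement", "monitoring"],
}

print(core_constructs(approaches))  # → ['leadership', 'monitoring', 'resources']
```

With this toy coding, three constructs clear the 75% bar, mirroring how the review identified its 6 widely shared constructs.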

  3. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, a concrete experimental comparison of the hybrid framework with traditional fusion methods, demonstrating and quantifying this benefit, has been lacking. The goal of this research, therefore, is to provide a statistical comparison of the accuracy and performance of the hybrid network approach against pure Bayesian and Fuzzy systems and against an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain-specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo simulation, against situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed to quantify the benefit of hybrid inference relative to other fusion tools.
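The Monte Carlo evaluation protocol sketched above (simulate scenarios with known ground truth, score each inference method, compare errors) can be outlined as a generic harness. The scenario, estimators and noise levels below are hypothetical toys, not the Fusion 2+ models:

```python
import random

def monte_carlo_compare(estimators, simulate, n_trials=2000, seed=1):
    """Run each estimator on simulated scenarios with known ground truth
    and return its root-mean-square error."""
    rng = random.Random(seed)
    sq_err = {name: 0.0 for name in estimators}
    for _ in range(n_trials):
        truth, observation = simulate(rng)
        for name, estimate in estimators.items():
            sq_err[name] += (estimate(observation) - truth) ** 2
    return {name: (e / n_trials) ** 0.5 for name, e in sq_err.items()}

# Toy scenario: estimate a threat probability p from one noisy observation.
def simulate(rng):
    p = rng.random()
    obs = min(1.0, max(0.0, p + rng.gauss(0, 0.2)))
    return p, obs

results = monte_carlo_compare(
    {"identity": lambda o: o,                     # trust the observation
     "shrunk": lambda o: 0.5 + 0.7 * (o - 0.5)},  # shrink toward the prior mean
    simulate)
print(results)
```

The same loop structure scales to richer scenarios: each competing fusion method becomes one entry in the estimator dictionary, and the simulator supplies the situational ground truth.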

  4. Health, Safety, and Environmental Screening and Ranking Framework for Geologic CO2 Storage Site Selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oldenburg, Curtis M.

    2005-09-19

    This report describes a screening and ranking framework (SRF) developed to evaluate potential geologic carbon dioxide (CO2) storage sites on the basis of health, safety, and environmental (HSE) risk arising from possible CO2 leakage. The approach is based on the assumption that HSE risk due to CO2 leakage is dependent on three basic characteristics of a geologic CO2 storage site: (1) the potential for primary containment by the target formation; (2) the potential for secondary containment if the primary formation leaks; and (3) the potential for attenuation and dispersion of leaking CO2 if the primary formation leaks and secondary containment fails. The framework is implemented in a spreadsheet in which users enter numerical scores representing expert opinions or general information available from published materials, along with estimates of uncertainty, to evaluate the three basic characteristics in order to screen and rank candidate sites. Application of the framework to the Rio Vista Gas Field, Ventura Oil Field, and Mammoth Mountain demonstrates the approach. Refinements and extensions are possible through the use of more detailed data or model results in place of property proxies. Revisions and extensions to improve the approach are anticipated in the near future as it is used and tested by colleagues and collaborators.
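The spreadsheet logic described (numerical scores plus uncertainty estimates, aggregated per characteristic) can be mimicked in a few lines. The attributes, weights and scores below are hypothetical placeholders, not values from the SRF:

```python
def site_score(attributes):
    """Aggregate weighted scores for one candidate site.
    Each attribute maps to (score in [0, 1], uncertainty in [0, 1], weight)."""
    total_w = sum(w for _, _, w in attributes.values())
    score = sum(s * w for s, _, w in attributes.values()) / total_w
    spread = sum(u * w for _, u, w in attributes.values()) / total_w
    band = (max(0.0, score - spread), min(1.0, score + spread))
    return score, band

# Hypothetical scores for the three basic site characteristics.
candidate = {
    "primary containment": (0.8, 0.1, 3.0),
    "secondary containment": (0.6, 0.2, 2.0),
    "attenuation and dispersion": (0.5, 0.3, 1.0),
}
score, band = site_score(candidate)
print(round(score, 3), round(band[0], 3), round(band[1], 3))  # → 0.683 0.517 0.85
```

Ranking candidate sites then reduces to comparing their aggregate scores, with the uncertainty bands indicating where expert opinion is too thin to separate sites.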

  5. Reconciling farming and wild nature: Integrating human-wildlife coexistence into the land-sharing and land-sparing framework.

    PubMed

    Crespin, Silvio J; Simonetti, Javier A

    2018-05-11

    Land has traditionally been spared to protect biodiversity; however, this approach has not succeeded by itself and requires a complementary strategy in human-dominated landscapes: land-sharing. Human-wildlife conflicts are rampant in a land-sharing context where wildlife co-occur with crops or livestock, but whose resulting interactions adversely affect the wellbeing of land owners, ultimately impeding coexistence. Therefore, true land-sharing only works if coexistence is also considered an end goal. We reviewed the literature on land-sharing and found that conflicts have not yet found their way into the land-sharing/sparing framework, with wildlife and humans co-occurring without coexisting in a dynamic process. To successfully implement a land-sharing approach, we must first acknowledge our failure to integrate the body of work on human-wildlife conflicts into the framework and work to implement multidisciplinary approaches from the ecological, economic, and sociological sciences to overcome and prevent conflicts. We suggest the use of Conflict Transformation by means of the Levels of Conflict Model to perceive both visible and deep-rooted causes of conflicts as opportunities to create problem-solving dynamics in affected socio-ecological landscapes. Reconciling farming and nature is possible by aiming for a transition to landscapes that truly share space by virtue of coexistence.

  6. EEG-fMRI Bayesian framework for neural activity estimation: a simulation study

    NASA Astrophysics Data System (ADS)

    Croce, Pierpaolo; Basti, Alessio; Marzetti, Laura; Zappasodi, Filippo; Del Gratta, Cosimo

    2016-12-01

    Objective. Due to the complementary nature of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI), and given the possibility of simultaneous acquisition, joint data analysis can afford a better understanding of the underlying neural activity. In this simulation study we show the benefit of joint EEG-fMRI neural activity estimation in a Bayesian framework. Approach. We built a dynamic Bayesian framework in order to perform joint EEG-fMRI estimation of the neural activity time course. The neural activity originates in a given brain area and is detected by both measurement techniques. We chose a resting-state neural activity scenario to address the worst case in terms of signal-to-noise ratio. To infer information from EEG and fMRI concurrently we used a tool belonging to the sequential Monte Carlo (SMC) family: the particle filter (PF). Main results. First, despite a high computational cost, we showed the feasibility of such an approach. Second, we obtained an improvement in neural activity reconstruction when using both EEG and fMRI measurements. Significance. The proposed simulation shows the improvement in neural activity reconstruction attainable with simultaneous EEG-fMRI data. The application of such an approach to real data allows a better comprehension of the neural dynamics.
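A bootstrap particle filter of the kind used here can be sketched compactly. The toy model below assumes AR(1) latent dynamics, a fast noisy channel standing in for EEG, and an intermittent channel standing in for fMRI; all dynamics and noise parameters are illustrative, not those of the study:

```python
import math
import random

def particle_filter(obs_fast, obs_slow, n=500, seed=0):
    """Bootstrap particle filter tracking a latent activity x_t seen through a
    fast noisy channel (EEG-like) and an intermittent channel (fMRI-like)."""
    rng = random.Random(seed)
    particles = [rng.gauss(0, 1) for _ in range(n)]
    estimates = []
    for t, y_fast in enumerate(obs_fast):
        # propagate particles through toy AR(1) latent dynamics
        particles = [0.9 * x + rng.gauss(0, 0.3) for x in particles]
        weights = []
        for x in particles:
            ll = -((y_fast - x) ** 2) / (2 * 0.5 ** 2)   # fast-channel likelihood
            y_slow = obs_slow[t] if t < len(obs_slow) else None
            if y_slow is not None:                       # slow channel, when sampled
                ll += -((y_slow - x) ** 2) / (2 * 0.8 ** 2)
            weights.append(math.exp(ll))
        total = sum(weights)
        if total == 0.0:         # degenerate case: fall back to uniform weights
            weights = [1.0 / n] * n
        else:
            weights = [w / total for w in weights]
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        particles = rng.choices(particles, weights=weights, k=n)  # resample
    return estimates

# Constant activity seen every step by the fast channel, every 5th by the slow one.
ests = particle_filter([1.0] * 30, [1.0 if t % 5 == 0 else None for t in range(30)])
print(round(ests[-1], 2))  # the estimate settles near the observed level
```

Fusing the two channels amounts to multiplying their likelihoods in the weight update, which is where the joint EEG-fMRI information enters.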

  7. A three-tiered approach for linking pharmacokinetic ...

    EPA Pesticide Factsheets

    The power of the adverse outcome pathway (AOP) framework arises from its utilization of pathway-based data to describe the initial interaction of a chemical with a molecular target (the molecular initiating event; MIE), followed by a progression through a series of key events that lead to an adverse outcome relevant for regulatory purposes. The AOP itself is not chemical specific, thus providing the biological context necessary for interpreting high-throughput (HT) toxicity screening results. Application of the AOP framework and HT predictions in ecological and human health risk assessment, however, requires the consideration of chemical-specific properties that influence external exposure doses and target tissue doses. To address this requirement, a three-tiered approach was developed to provide a workflow for connecting biology-based AOPs to biochemical-based pharmacokinetic properties (absorption, distribution, metabolism, excretion; ADME), and then to chemical/human activity-based exposure pathways. This approach included: (1) …

  8. User-Centric Approach for Benchmark RDF Data Generator in Big Data Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purohit, Sumit; Paulson, Patrick R.; Rodriguez, Luke R.

    This research focuses on a user-centric approach to building such tools and proposes a flexible, extensible, and easy-to-use framework to support performance analysis of Big Data systems. Finally, case studies from two different domains are presented to validate the framework.

  9. Spin phase-space entropy production

    NASA Astrophysics Data System (ADS)

    Santos, Jader P.; Céleri, Lucas C.; Brito, Frederico; Landi, Gabriel T.; Paternostro, Mauro

    2018-05-01

    Quantifying the degree of irreversibility of an open system dynamics represents a problem of both fundamental and applied relevance. Even though a well-known framework exists for thermal baths, it gives diverging results in the limit of zero temperature and is not readily extended to nonequilibrium reservoirs, such as dephasing baths. Aimed at filling this gap, in this paper we introduce a phase-space-entropy production framework for quantifying the irreversibility of spin systems undergoing Lindblad dynamics. The theory is based on the spin Husimi-Q function and its corresponding phase-space entropy, known as the Wehrl entropy. Unlike the von Neumann entropy production rate, we show that in our framework the Wehrl entropy production rate remains valid at any temperature and is also readily extended to arbitrary nonequilibrium baths. As an application, we discuss the irreversibility associated with the interaction of a two-level system with a single-photon pulse, a problem which cannot be treated using the conventional approach.
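For reference, the two central objects named in the abstract are standard: the spin Husimi-Q function over spin coherent states and its Wehrl entropy. A sketch of the definitions only (normalization conventions vary, and the paper's entropy-production rate is not reproduced here):

```latex
Q(\theta,\phi) = \langle \theta,\phi \,|\, \rho \,|\, \theta,\phi \rangle ,
\qquad
S_W = -\frac{2j+1}{4\pi} \int_0^{2\pi}\!\mathrm{d}\phi \int_0^{\pi}
       \sin\theta \,\mathrm{d}\theta \; Q(\theta,\phi) \ln Q(\theta,\phi) ,
```

where $|\theta,\phi\rangle$ is the spin-$j$ coherent state in the direction $(\theta,\phi)$. Because $Q$ is strictly positive for any state, $S_W$ and its production rate stay finite where the von Neumann quantities diverge at zero temperature.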

  10. A New Framework of Removing Salt and Pepper Impulse Noise for the Noisy Image Including Many Noise-Free White and Black Pixels

    NASA Astrophysics Data System (ADS)

    Li, Song; Wang, Caizhu; Li, Yeqiu; Wang, Ling; Sakata, Shiro; Sekiya, Hiroo; Kuroiwa, Shingo

    In this paper, we propose a new framework for removing salt-and-pepper impulse noise. In our proposed framework, the key point is that the number of noise-free white and black pixels in a noisy image can be determined using the noise rates estimated by the Fuzzy Impulse Noise Detection and Reduction Method (FINDRM) and the Efficient Detail-Preserving Approach (EDPA). For a noisy image that includes many noise-free white and black pixels, each pixel detected as noisy by the FINDRM is re-checked using the alpha-trimmed mean. Finally, the impulse noise filtering phase of the FINDRM is used to restore the image. Simulation results show that for noisy images including many noise-free white and black pixels, the proposed framework decreases the False Hit Rate (FHR) efficiently compared with the FINDRM. Therefore, the proposed framework can be used more widely than the FINDRM.
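The alpha-trimmed mean re-check is easy to illustrate: a flagged extreme pixel is kept as noise-free when it agrees with the trimmed local average. The window values, trim count and threshold below are illustrative choices, not the FINDRM/EDPA parameters:

```python
def alpha_trimmed_mean(values, trim=2):
    """Mean after discarding the `trim` smallest and `trim` largest values."""
    ordered = sorted(values)
    kept = ordered[trim:len(ordered) - trim]
    return sum(kept) / len(kept)

def recheck_pixel(window, center, threshold=60):
    """Re-check a pixel flagged as impulse noise: keep it as noise-free when it
    is close to the alpha-trimmed mean of its 3x3 neighbourhood."""
    return abs(center - alpha_trimmed_mean(window)) <= threshold

# Mostly-white region: a 255 here is likely a noise-free white pixel.
print(recheck_pixel([250, 255, 248, 252, 255, 251, 249, 253, 254], 255))  # → True
# Mid-grey region: a 255 here is likely salt noise.
print(recheck_pixel([120, 118, 125, 122, 255, 119, 121, 124, 123], 255))  # → False
```

Trimming the extremes first is what keeps genuine white and black pixels from biasing the local average, which is why this re-check lowers the false hit rate on images rich in such pixels.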

  11. Framework to model neutral particle flux in convex high aspect ratio structures using one-dimensional radiosity

    NASA Astrophysics Data System (ADS)

    Manstetten, Paul; Filipovic, Lado; Hössinger, Andreas; Weinbub, Josef; Selberherr, Siegfried

    2017-02-01

    We present a computationally efficient framework to compute the neutral flux in high aspect ratio structures during three-dimensional plasma etching simulations. The framework is based on a one-dimensional radiosity approach and is applicable to simulations of convex rotationally symmetric holes and convex symmetric trenches with a constant cross-section. The framework is intended to replace the full three-dimensional simulation step required to calculate the neutral flux during plasma etching simulations. Especially for high aspect ratio structures, the computational effort required to perform the full three-dimensional simulation of the neutral flux at the desired spatial resolution conflicts with practical simulation time constraints. Our results are in agreement with those obtained by three-dimensional Monte Carlo based ray tracing simulations for various aspect ratios and convex geometries. With this framework we present a comprehensive analysis of the influence of the geometrical properties of high aspect ratio structures, as well as of the particle sticking probability, on the neutral particle flux.
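The one-dimensional radiosity idea can be sketched as a fixed-point iteration: each wall segment's flux is its direct flux from the opening plus flux re-emitted from other segments with probability 1 - sticking. The coupling kernel and direct-flux profile below are hypothetical stand-ins, not the closed-form view factors of the framework:

```python
def neutral_flux(n_segments=20, sticking=0.1, aspect_ratio=10.0, iters=200):
    """Fixed-point iteration of a 1D radiosity balance along a trench wall:
    flux_i = direct_i + (1 - sticking) * sum_j vf_ij * flux_j."""
    depth = [(i + 0.5) * aspect_ratio / n_segments for i in range(n_segments)]
    # direct line-of-sight flux from the opening, decaying with depth (toy profile)
    direct = [1.0 / (1.0 + d ** 2) for d in depth]
    # hypothetical segment-to-segment coupling kernel, row-normalized
    vf = [[0.0 if i == j else 1.0 / (1.0 + (depth[i] - depth[j]) ** 2)
           for j in range(n_segments)] for i in range(n_segments)]
    for row in vf:
        total = sum(row)
        for j in range(n_segments):
            row[j] /= total
    flux = direct[:]
    for _ in range(iters):
        flux = [direct[i] + (1.0 - sticking) *
                sum(vf[i][j] * flux[j] for j in range(n_segments))
                for i in range(n_segments)]
    return flux

flux = neutral_flux()
print(round(flux[0] / flux[-1], 1))  # flux drops toward the trench bottom
```

Because the re-emission factor (1 - sticking) times a row-stochastic kernel has spectral radius below one, the iteration converges; the same contraction argument is what makes a 1D radiosity solve so much cheaper than full 3D ray tracing.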

  12. Optimal Wastewater Loading under Conflicting Goals and Technology Limitations in a Riverine System.

    PubMed

    Rafiee, Mojtaba; Lyon, Steve W; Zahraie, Banafsheh; Destouni, Georgia; Jaafarzadeh, Nemat

    2017-03-01

    This paper investigates a novel simulation-optimization (S-O) framework for identifying optimal treatment levels and treatment processes for multiple wastewater dischargers to rivers. A commonly used water quality simulation model, Qual2K, was linked to a Genetic Algorithm optimization model for exploration of relevant fuzzy objective-function formulations for addressing imprecision and conflicting goals of pollution control agencies and various dischargers. Results showed a dynamic flow dependence of optimal wastewater loading with good convergence to near global optimum. Explicit considerations of real-world technological limitations, which were developed here in a new S-O framework, led to better compromise solutions between conflicting goals than those identified within traditional S-O frameworks. The newly developed framework, in addition to being more technologically realistic, is also less complicated and converges on solutions more rapidly than traditional frameworks. This technique marks a significant step forward for development of holistic, riverscape-based approaches that balance the conflicting needs of the stakeholders.
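The max-min fuzzy formulation coupled to a genetic algorithm can be sketched with a toy objective standing in for Qual2K: each goal gets a membership function, and the GA maximizes the satisfaction of the worst-met goal. All membership shapes and GA settings here are illustrative:

```python
import random

def fuzzy_satisfaction(levels):
    """Max-min fuzzy objective: membership of the water-quality goal (prefers
    high treatment) vs. the dischargers' cost goal (prefers low treatment)."""
    avg = sum(levels) / len(levels)
    quality = min(1.0, avg / 0.8)
    cost = max(0.0, 1.0 - avg)
    return min(quality, cost)        # satisfaction of the worst-met goal

def genetic_algorithm(n_dischargers=4, pop=40, gens=60, seed=3):
    rng = random.Random(seed)
    population = [[rng.random() for _ in range(n_dischargers)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fuzzy_satisfaction, reverse=True)
        parents = population[: pop // 2]             # elitist selection
        children = []
        while len(parents) + len(children) < pop:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_dischargers)    # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                   # mutation
                i = rng.randrange(n_dischargers)
                child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        population = parents + children
    best = max(population, key=fuzzy_satisfaction)
    return best, fuzzy_satisfaction(best)

best, value = genetic_algorithm()
print(round(value, 3))  # near the analytic optimum 5/9 for this toy objective
```

In the real framework the fitness call would invoke the river-quality simulation for each candidate loading pattern, with technological limits encoded as constraints on the treatment levels.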

  13. Development of the Modes of Collaboration framework

    NASA Astrophysics Data System (ADS)

    Pawlak, Alanna; Irving, Paul W.; Caballero, Marcos D.

    2018-01-01

    Group work is becoming increasingly common in introductory physics classrooms. Understanding how students engage in these group learning environments is important for designing and facilitating productive learning opportunities for students. We conducted a study in which we collected video of groups of students working on conceptual electricity and magnetism problems in an introductory physics course. In this setting, students needed to negotiate a common understanding and coordinate group decisions in order to complete the activity successfully. We observed students interacting in several distinct ways while solving these problems. Analysis of these observations focused on identifying the different ways students interacted and articulating what defines and distinguishes them, resulting in the development of the modes of collaboration framework. The modes of collaboration framework defines student interactions along three dimensions: social, discursive, and disciplinary content. This multidimensional approach offers a unique lens through which to consider group work and provides a flexibility that could allow the framework to be adapted for a variety of contexts. We present the framework and several examples of its application here.

  14. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open-source co-simulation framework, the “Framework for Network Co-Simulation” (FNCS), together with a decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.

  15. A Framework for the Study of Emotions in Organizational Contexts.

    ERIC Educational Resources Information Center

    Fiebig, Greg V.; Kramer, Michael W.

    1998-01-01

    Approaches the study of emotions in organizations holistically, based on a proposed framework. Provides descriptive data that suggests the presence of the framework's major elements. States that future examination of emotions based on this framework should assist in understanding emotions, which are frequently ignored in a rational model. (PA)

  16. Leveraging the Zachman framework implementation using action-research methodology - a case study: aligning the enterprise architecture and the business goals

    NASA Astrophysics Data System (ADS)

    Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo

    2013-02-01

    With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value to construct an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires an important effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate for this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework to assist and facilitate its implementation. Following the explanation of the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using an action-research approach.

  17. Prologue: Toward Accurate Identification of Developmental Language Disorder Within Linguistically Diverse Schools.

    PubMed

    Oetting, Janna B

    2018-04-05

    Although the 5 studies presented within this clinical forum include children who differ widely in locality, language learning profile, and age, all were motivated by a desire to improve the accuracy with which developmental language disorder is identified within linguistically diverse schools. The purpose of this prologue is to introduce readers to a conceptual framework that unites the studies while also highlighting the approaches and methods each research team is pursuing to improve assessment outcomes within their respective linguistically diverse communities. A disorder-within-diversity framework is presented to replace previous difference vs. disorder approaches. Then, the 5 studies within the forum are reviewed by clinical question, type of tool(s), and analytical approach. Across studies of different linguistically diverse groups, research teams are seeking answers to similar questions about child language screening and diagnostic practices, using similar analytical approaches to answer their questions, and finding promising results with tools focused on morphosyntax. More studies that are modeled after or designed to extend those in this forum are needed to improve the accuracy with which developmental language disorder is identified.

  18. Computational system identification of continuous-time nonlinear systems using approximate Bayesian computation

    NASA Astrophysics Data System (ADS)

    Krishnanathan, Kirubhakaran; Anderson, Sean R.; Billings, Stephen A.; Kadirkamanathan, Visakan

    2016-11-01

    In this paper, we derive a system identification framework for continuous-time nonlinear systems, for the first time using a simulation-focused computational Bayesian approach. Simulation approaches to nonlinear system identification have been shown to outperform regression methods under certain conditions, such as non-persistently exciting inputs and fast-sampling. We use the approximate Bayesian computation (ABC) algorithm to perform simulation-based inference of model parameters. The framework has the following main advantages: (1) parameter distributions are intrinsically generated, giving the user a clear description of uncertainty, (2) the simulation approach avoids the difficult problem of estimating signal derivatives as is common with other continuous-time methods, and (3) as noted above, the simulation approach improves identification under conditions of non-persistently exciting inputs and fast-sampling. Term selection is performed by judging parameter significance using parameter distributions that are intrinsically generated as part of the ABC procedure. The results from a numerical example demonstrate that the method performs well in noisy scenarios, especially in comparison to competing techniques that rely on signal derivative estimation.
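ABC rejection, the core inference step named above, is easy to sketch: draw parameters from the prior, simulate the continuous-time model, and keep draws whose simulated output lies close to the data. The toy system dx/dt = -theta * x, the tolerance and the prior below are illustrative, not the paper's setup:

```python
import random

def simulate(theta, x0=1.0, dt=0.05, steps=40):
    """Euler simulation of the toy continuous-time system dx/dt = -theta * x."""
    xs, x = [], x0
    for _ in range(steps):
        x += dt * (-theta * x)
        xs.append(x)
    return xs

def abc_rejection(data, prior, n_samples=5000, eps=0.3, seed=7):
    """Keep prior draws whose simulated trajectory lies within eps of the data;
    the kept draws are samples from the approximate posterior."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_samples):
        theta = prior(rng)
        sim = simulate(theta)
        dist = sum((a - b) ** 2 for a, b in zip(sim, data)) ** 0.5
        if dist < eps:
            accepted.append(theta)
    return accepted

true_theta = 1.5
noise = random.Random(0)
data = [x + noise.gauss(0, 0.02) for x in simulate(true_theta)]
posterior = abc_rejection(data, prior=lambda rng: rng.uniform(0.0, 3.0))
estimate = sum(posterior) / len(posterior)
print(round(estimate, 2))  # posterior mean near the true value 1.5
```

Note that the comparison is made between simulated and observed trajectories directly, which is how the approach sidesteps the signal-derivative estimation that other continuous-time identification methods require; the spread of the accepted draws provides the parameter uncertainty description.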

  19. The Spawns of Creative Behavior in Team Sports: A Creativity Developmental Framework.

    PubMed

    Santos, Sara D L; Memmert, Daniel; Sampaio, Jaime; Leite, Nuno

    2016-01-01

    Developing creativity in team sports players is becoming an increasing focus in sports sciences. The Creativity Developmental Framework is presented to provide an updated, science-based background. This Framework describes five incremental creative stages (beginner, explorer, illuminati, creator, and rise) and combines them into multidisciplinary approaches embodied in creative assumptions. In the first training stages, the emphasis is placed on enrollment in diversification, deliberate play and physical literacy approaches grounded in nonlinear pedagogies. These approaches allow more freedom to discover different movement patterns, increasing the likelihood that novel, adaptive and functional solutions will emerge. In the later stages, progressive specialization in sports and a commitment to differential learning are extremely important to push the limits of creative progress at higher levels of performance by increasing the range of skill configurations. Notwithstanding, during all developmental stages the teaching games for understanding approach, a game-centered approach, linked with the constraints-led approach, plays an important role in boosting tactical creative behavior. Both perspectives might encourage players to explore all action possibilities (improving divergent thinking) and prevent standardization in their actions. Overall, considering the aforementioned practice conditions, the Creativity Developmental Framework scrutinizes the main directions that lead to a long-term improvement of creative behavior in team sports. Nevertheless, this framework should be seen as a work in progress to be later used as a paramount reference in creativity training.

  20. The Spawns of Creative Behavior in Team Sports: A Creativity Developmental Framework

    PubMed Central

    Santos, Sara D. L.; Memmert, Daniel; Sampaio, Jaime; Leite, Nuno

    2016-01-01

    Developing creativity in team sports players is becoming an increasing focus in sports sciences. The Creativity Developmental Framework is presented to provide an updated, science-based background. This Framework describes five incremental creative stages (beginner, explorer, illuminati, creator, and rise) and combines them into multidisciplinary approaches embodied in creative assumptions. In the first training stages, the emphasis is placed on enrollment in diversification, deliberate play and physical literacy approaches grounded in nonlinear pedagogies. These approaches allow more freedom to discover different movement patterns, increasing the likelihood that novel, adaptive and functional solutions will emerge. In the later stages, progressive specialization in sports and a commitment to differential learning are extremely important to push the limits of creative progress at higher levels of performance by increasing the range of skill configurations. Notwithstanding, during all developmental stages the teaching games for understanding approach, a game-centered approach, linked with the constraints-led approach, plays an important role in boosting tactical creative behavior. Both perspectives might encourage players to explore all action possibilities (improving divergent thinking) and prevent standardization in their actions. Overall, considering the aforementioned practice conditions, the Creativity Developmental Framework scrutinizes the main directions that lead to a long-term improvement of creative behavior in team sports. Nevertheless, this framework should be seen as a work in progress to be later used as a paramount reference in creativity training. PMID:27617000

  1. A Simulation Modeling Approach Method Focused on the Refrigerated Warehouses Using Design of Experiment

    NASA Astrophysics Data System (ADS)

    Cho, G. S.

    2017-09-01

    For performance optimization of Refrigerated Warehouses, design parameters are selected based on physical parameters, such as the number of equipment units and aisles and the speed of forklifts, for ease of modification. This paper provides a comprehensive framework approach for the system design of Refrigerated Warehouses. We propose a modeling approach which aims at simulation optimization so as to meet the required design specifications using Design of Experiment (DOE), and analyzes the simulation model using an integrated aspect-oriented modeling approach (i-AOMA). As a result, this suggested method can evaluate the performance of a variety of Refrigerated Warehouse operations.
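A full-factorial Design of Experiment over the physical parameters mentioned (equipment counts, aisles, forklift speeds) can be enumerated in a few lines. The factor levels and the toy response function standing in for the warehouse simulation are hypothetical:

```python
from itertools import product

def full_factorial(factors):
    """Enumerate a full-factorial design: every combination of factor levels."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

# Hypothetical design parameters for a refrigerated-warehouse model.
factors = {
    "forklifts": [2, 4, 6],
    "aisles": [8, 12],
    "forklift_speed": [1.5, 2.0],   # m/s
}
design = full_factorial(factors)
print(len(design))  # → 12 runs

def throughput(run):
    """Toy response surface standing in for the warehouse simulation model."""
    return run["forklifts"] * run["forklift_speed"] * 10 - run["aisles"]

best = max(design, key=throughput)
print(best)  # the run with the highest simulated throughput
```

In a DOE-driven study each design point would be fed to the simulation model, and the responses analysed to find which factors and interactions drive warehouse performance.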

  2. Incorporating geologic information into hydraulic tomography: A general framework based on geostatistical approach

    NASA Astrophysics Data System (ADS)

    Zha, Yuanyuan; Yeh, Tian-Chyi J.; Illman, Walter A.; Onoe, Hironori; Mok, Chin Man W.; Wen, Jet-Chau; Huang, Shao-Yang; Wang, Wenke

    2017-04-01

    Hydraulic tomography (HT) has become a mature aquifer test technology over the last two decades. It collects nonredundant information of aquifer heterogeneity by sequentially stressing the aquifer at different wells and collecting aquifer responses at other wells during each stress. The collected information is then interpreted by inverse models. Among these models, the geostatistical approaches, built upon the Bayesian framework, first conceptualize hydraulic properties to be estimated as random fields, which are characterized by means and covariance functions. They then use the spatial statistics as prior information with the aquifer response data to estimate the spatial distribution of the hydraulic properties at a site. Since the spatial statistics describe the generic spatial structures of the geologic media at the site rather than site-specific ones (e.g., known spatial distributions of facies, faults, or paleochannels), the estimates are often not optimal. To improve the estimates, we introduce a general statistical framework, which allows the inclusion of site-specific spatial patterns of geologic features. Subsequently, we test this approach with synthetic numerical experiments. Results show that this approach, using conditional mean and covariance that reflect site-specific large-scale geologic features, indeed improves the HT estimates. Afterward, this approach is applied to HT surveys at a kilometer-scale fractured-granite field site with a distinct fault zone. We find that by including fault information from outcrops and boreholes in the HT analysis, the estimated hydraulic properties are improved. The improved estimates subsequently lead to better prediction of flow during a different pumping test at the site.

  3. What do District Health Planners in Tanzania think about improving priority setting using 'Accountability for Reasonableness'?

    PubMed Central

    Mshana, Simon; Shemilu, Haji; Ndawi, Benedict; Momburi, Roman; Olsen, Oystein Evjen; Byskov, Jens; Martin, Douglas K

    2007-01-01

    Background Priority setting in every health system is complex and difficult. In less wealthy countries the dominant approach to priority setting has been Burden of Disease (BOD) and cost-effectiveness analysis (CEA), which is helpful, but insufficient because it focuses on a narrow range of values – need and efficiency – and not the full range of relevant values, including legitimacy and fairness. 'Accountability for reasonableness' is a conceptual framework for legitimate and fair priority setting and is empirically based and ethically justified. It connects priority setting to broader, more fundamental, democratic deliberative processes that have an impact on social justice and equity. Can 'accountability for reasonableness' be helpful for improving priority setting in less wealthy countries? Methods In 2005, Tanzanian scholars from the Primary Health Care Institute (PHCI) conducted 6 capacity-building workshops with senior health staff, district planners and managers, and representatives of the Tanzanian Ministry of Health to discuss improving priority setting in Tanzania using 'accountability for reasonableness'. The purpose of this paper is to describe this initiative and the participants' views about the approach. Results The approach to improving priority setting using 'accountability for reasonableness' was viewed by district decision makers with enthusiastic favour because it was the first framework that directly addressed their priority setting concerns. High-level Ministry of Health participants were also very supportive of the approach. Conclusion Both Tanzanian district and governmental health planners viewed the 'accountability for reasonableness' approach with enthusiastic favour because it was the first framework that directly addressed their concerns. PMID:17997824

  4. Improve Biomedical Information Retrieval using Modified Learning to Rank Methods.

    PubMed

    Xu, Bo; Lin, Hongfei; Lin, Yuan; Ma, Yunlong; Yang, Liang; Wang, Jian; Yang, Zhihao

    2016-06-14

    In recent years, the number of biomedical articles has increased exponentially, making it difficult for biologists to capture all the needed information manually. Information retrieval technologies, as the core of search engines, can deal with the problem automatically, providing users with the needed information. However, it is a great challenge to apply these technologies directly to biomedical retrieval, because of the abundance of domain-specific terminologies. To enhance biomedical retrieval, we propose a novel framework based on learning to rank. Learning to rank refers to a family of state-of-the-art information retrieval techniques that have proven effective in many information retrieval tasks. In the proposed framework, we attempt to tackle the problem of the abundance of terminologies by constructing ranking models which focus not only on retrieving the most relevant documents, but also on diversifying the search results to increase the completeness of the resulting list for a given query. In the model training, we propose two novel document labeling strategies, and combine several traditional retrieval models as learning features. In addition, we investigate the usefulness of different learning to rank approaches in our framework. Experimental results on TREC Genomics datasets demonstrate the effectiveness of our framework for biomedical information retrieval.
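A minimal pairwise learning-to-rank sketch (RankNet-style logistic loss on score differences) illustrates the model-training idea; the two features and document vectors below are hypothetical, not the paper's combined retrieval-model features:

```python
import math

def train_pairwise_ranker(pairs, n_features, lr=0.1, epochs=200):
    """Learn weights w so that score(relevant) > score(non-relevant) for each
    training pair, via logistic loss on the score difference (RankNet-style)."""
    w = [0.0] * n_features
    for _ in range(epochs):
        for pos, neg in pairs:
            diff = [p - q for p, q in zip(pos, neg)]
            s = sum(wi * di for wi, di in zip(w, diff))
            grad = -1.0 / (1.0 + math.exp(s))   # derivative of log(1 + e^{-s})
            w = [wi - lr * grad * di for wi, di in zip(w, diff)]
    return w

def rank(docs, w):
    """Sort documents by learned score, most relevant first."""
    return sorted(docs, key=lambda d: -sum(wi * f for wi, f in zip(w, d[1])))

# Hypothetical features per document: [keyword-match score, terminology-match score].
pairs = [([0.9, 0.8], [0.4, 0.1]),   # (relevant, non-relevant) feature vectors
         ([0.7, 0.9], [0.6, 0.2]),
         ([0.8, 0.6], [0.3, 0.3])]
w = train_pairwise_ranker(pairs, n_features=2)
docs = [("d1", [0.5, 0.2]), ("d2", [0.9, 0.8]), ("d3", [0.2, 0.1])]
print([name for name, _ in rank(docs, w)])  # → ['d2', 'd1', 'd3']
```

In a full system each feature would be the score of one traditional retrieval model, and a diversification step would re-order the ranked list before presentation.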

  5. Decoding the integrated approach to yoga therapy: Qualitative evidence based conceptual framework

    PubMed Central

    Villacres, Maria Del Carmen; Jagannathan, Aarti; Nagarathna, R; Ramakrsihna, Jayashree

    2014-01-01

    Aim: The aim of this study was to define, decode, and append to the conceptual framework of the integrated approach to yoga therapy (IAYT). Materials and Methods: Four stakeholders who followed two in-patients with depression over a period of 2 weeks in the residential center Arogyadhama (of Swami Vivekananda Yoga Anusandana Samsthana, Bangalore, India) were interviewed before the start of the IAYT treatment and prior to discharge of the patient. The patients were also interviewed pre and post and were observed once during their session. The data from the audio recordings of eight in-depth interviews were transcribed manually and qualitative analysis was conducted. Results: The conceptual framework of IAYT depicts that patient-related factors (“co-operation of patient”, “patient's awareness of his/her condition”), therapist-related factors (“ability to guide”, “the assistance to the patients”, “explanation of the exercises”) and treatment-related factors (“combination of psychiatric or Ayurvedic medication with yoga”, “counseling during the IAYT treatment”, “duration of treatment”) play an integrated role in reaching the “aim of IAYT” and experiencing “improvements and changes”. Conclusion: The IAYT is a holistic program and the ability of the patient to cooperate with and integrate the available factors (therapist related and treatment related) could enable best results. PMID:25035604

  6. Consumer trust in food safety--a multidisciplinary approach and empirical evidence from Taiwan.

    PubMed

    Chen, Mei-Fang

    2008-12-01

    Food scandals in recent years have increased consumers' risk perceptions of foods and decreased their trust in food safety. A better understanding of consumer trust in food safety can improve the effectiveness of public policy and allow the development of best practice in risk communication. This study proposes a research framework from a psychometric approach to investigate the relationships between the consumer's trust in food safety and the antecedents of risk perceptions of foods, based on a reflexive modernization perspective and a cultural theory perspective, in the hope of benefiting future empirical studies. The empirical results from a structural equation modeling analysis of Taiwan as a case in point reveal that this research framework, based on a multidisciplinary perspective, can be a valuable tool for a growing understanding of consumer trust in food safety. The antecedents in the psychometric research framework, comprising reflexive modernization factors and cultural theory factors, were all supported in this study except the consumer's perception of pessimism toward food. Moreover, the empirical results of repeated measures analysis of variance give more detailed information to grasp empirical implications and to provide some suggestions to the actors and institutions involved in the food supply chain in Taiwan.

  7. A multi-resolution strategy for a multi-objective deformable image registration framework that accommodates large anatomical differences

    NASA Astrophysics Data System (ADS)

    Alderliesten, Tanja; Bosman, Peter A. N.; Sonke, Jan-Jakob; Bel, Arjan

    2014-03-01

    Currently, two major challenges dominate the field of deformable image registration. The first challenge is related to the tuning of the developed methods to specific problems (i.e. how to best combine different objectives such as similarity measure and transformation effort). This is one of the reasons why, despite significant progress, clinical implementation of such techniques has proven to be difficult. The second challenge is to account for large anatomical differences (e.g. large deformations, (dis)appearing structures) that occurred between image acquisitions. In this paper, we study a framework based on multi-objective optimization to improve registration robustness and to simplify tuning for specific applications. Within this framework we specifically consider the use of an advanced model-based evolutionary algorithm for optimization and a dual-dynamic transformation model (i.e. two "non-fixed" grids: one for the source and one for the target image) to accommodate large anatomical differences. The framework computes and presents multiple outcomes that represent efficient trade-offs between the different objectives (a so-called Pareto front). In image processing it is common practice, for reasons of robustness and accuracy, to use a multi-resolution strategy. This is, however, only well-established for single-objective registration methods. Here we describe how such a strategy can be realized for our multi-objective approach and compare its results with a single-resolution strategy. For this study we selected the case of prone-supine breast MRI registration. Results show that the well-known advantages of a multi-resolution strategy are successfully transferred to our multi-objective approach, resulting in superior (i.e. Pareto-dominating) outcomes.
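The Pareto-front concept at the heart of this framework can be illustrated with a minimal dominance filter. The objective pairs below are hypothetical stand-ins for the paper's objectives (similarity measure and transformation effort), both treated as costs to minimize.

```python
import numpy as np

# Dominance filter returning the Pareto front of a set of outcomes,
# with all objectives to be minimized. The outcome values are
# hypothetical (similarity-error, deformation-effort) pairs.
def pareto_front(points):
    points = np.asarray(points, dtype=float)
    front = []
    for i, p in enumerate(points):
        # p is dominated if some other point is no worse in every
        # objective and strictly better in at least one.
        dominated = any(
            np.all(q <= p) and np.any(q < p)
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front

outcomes = [(0.2, 0.9), (0.5, 0.5), (0.9, 0.2), (0.6, 0.8)]
front = pareto_front(outcomes)
```

Here the last outcome is dominated by (0.5, 0.5) and is excluded; the remaining three are the efficient trade-offs the framework would present to the user.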

  8. A unified framework for approximation in inverse problems for distributed parameter systems

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Ito, K.

    1988-01-01

    A theoretical framework is presented that can be used to treat approximation techniques for very general classes of parameter estimation problems involving distributed systems that are either first or second order in time. Using the approach developed, one can obtain both convergence and stability (continuous dependence of parameter estimates with respect to the observations) under very weak regularity and compactness assumptions on the set of admissible parameters. This unified theory can be used for many problems found in the recent literature and in many cases offers significant improvements to existing results.

  9. History and utility of zeolite framework-type discovery from a data-science perspective

    DOE PAGES

    Zimmermann, Nils E. R.; Haranczyk, Maciej

    2016-05-02

    Mature applications such as fluid catalytic cracking and hydrocracking rely critically on early zeolite structures. With a data-driven approach, we find that the discovery of exceptional zeolite framework types around the new millennium was spurred by exciting new utilization routes. The promising processes have yet not been successfully implemented (“valley of death” effect), mainly because of the lack of thermal stability of the crystals. As a result, this foreshadows limited deployability of recent zeolite discoveries that were achieved by novel crystal synthesis routes.

  10. Uncertainty Analysis of Consequence Management (CM) Data Products.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole

    The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.
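Probabilistic frameworks of this kind are commonly built on Monte Carlo propagation of input uncertainty through to the derived product. The sketch below assumes a placeholder product model with two uncertain inputs; it is illustrative only, not the CM program's actual pipeline.

```python
import random

# Minimal Monte Carlo sketch of propagating input uncertainty into a
# derived data product. The model and inputs are placeholders.
random.seed(0)

def data_product(source_term, dispersion_factor):
    return source_term * dispersion_factor  # placeholder combination

# Sample each uncertain input from its assumed distribution and
# push the samples through the product model.
samples = [
    data_product(random.gauss(1.0, 0.1), random.gauss(2.0, 0.2))
    for _ in range(10_000)
]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
```

The resulting sample mean and variance characterize the uncertainty of the product; a full analysis would also track correlated errors across contributing skill sets.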

  11. A framework for evaluating complex networks measurements

    NASA Astrophysics Data System (ADS)

    Comin, Cesar H.; Silva, Filipi N.; Costa, Luciano da F.

    2015-06-01

    A good deal of current research in complex networks involves the characterization and/or classification of the topological properties of given structures, which has motivated the development of numerous measurements. This letter proposes a framework for evaluating the quality of complex-network measurements in terms of their effective resolution, degree of degeneracy and discriminability. The potential of the suggested approach is illustrated with respect to comparing the characterization of several model and real-world networks by using concentric and symmetry measurements. The results indicate a markedly superior performance for the latter type of mapping.

  12. A Bayesian framework for adaptive selection, calibration, and validation of coarse-grained models of atomistic systems

    NASA Astrophysics Data System (ADS)

    Farrell, Kathryn; Oden, J. Tinsley; Faghihi, Danial

    2015-08-01

    A general adaptive modeling algorithm for selection and validation of coarse-grained models of atomistic systems is presented. A Bayesian framework is developed to address uncertainties in parameters, data, and model selection. Algorithms for computing output sensitivities to parameter variances, model evidence and posterior model plausibilities for given data, and for computing what are referred to as Occam Categories in reference to a rough measure of model simplicity, make up components of the overall approach. Computational results are provided for representative applications.
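The posterior model plausibilities described above follow from normalizing model evidences via Bayes' rule over the candidate model set. The log-evidence values in this sketch are illustrative, not results from the paper.

```python
import numpy as np

# Posterior model plausibilities p(M_k | data) from model evidences,
# as in Bayesian model selection. Log-evidence values are illustrative.
def model_plausibilities(log_evidences, log_priors=None):
    log_ev = np.asarray(log_evidences, dtype=float)
    if log_priors is None:
        log_priors = np.zeros_like(log_ev)  # uniform prior over models
    log_post = log_ev + log_priors
    log_post -= log_post.max()  # subtract max for numerical stability
    post = np.exp(log_post)
    return post / post.sum()

plaus = model_plausibilities([-10.0, -12.0, -11.0])
```

Working in log space avoids underflow when evidences are small, which is the usual situation for coarse-grained models fit to many observations.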

  13. Crossing boundaries in interprofessional education: A call for instructional integration of two script concepts.

    PubMed

    Kiesewetter, Jan; Kollar, Ingo; Fernandez, Nicolas; Lubarsky, Stuart; Kiessling, Claudia; Fischer, Martin R; Charlin, Bernard

    2016-09-01

    Clinical work occurs in a context which is heavily influenced by social interactions. The absence of theoretical frameworks underpinning the design of collaborative learning has become a roadblock for interprofessional education (IPE). This article proposes a script-based framework for the design of IPE. This framework provides suggestions for designing learning environments intended to foster competences we feel are fundamental to successful interprofessional care. The current literature describes two script concepts: "illness scripts" and "internal/external collaboration scripts". Illness scripts are specific knowledge structures that link general disease categories and specific examples of diseases. "Internal collaboration scripts" refer to an individual's knowledge about how to interact with others in a social situation. "External collaboration scripts" are instructional scaffolds designed to help groups collaborate. Instructional research relating to illness scripts and internal collaboration scripts supports (a) putting learners in authentic situations in which they need to engage in clinical reasoning, and (b) scaffolding their interaction with others with "external collaboration scripts". Thus, well-established experiential instructional approaches should be combined with more fine-grained script-based scaffolding approaches. The resulting script-based framework offers instructional designers insights into how students can be supported to develop the necessary skills to master complex interprofessional clinical situations.

  14. A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification

    DOE PAGES

    Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; ...

    2013-01-01

    Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches: a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification.

  15. A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification

    PubMed Central

    Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Varnum, Susan M.; Brown, Joseph N.; Riensche, Roderick M.; Adkins, Joshua N.; Jacobs, Jon M.; Hoidal, John R.; Scholand, Mary Beth; Pounds, Joel G.; Blackburn, Michael R.; Rodland, Karin D.; McDermott, Jason E.

    2013-01-01

    Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches; a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification. PMID:24223463

  16. A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.

    2013-10-01

    Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches: a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification.

  17. An environmental decision framework applied to marine engine control technologies.

    PubMed

    Corbett, James J; Chapman, David

    2006-06-01

    This paper develops a decision framework for considering emission control technologies on marine engines, informed by standard decision theory, with an open structure that may be adapted by operators with specific vessel and technology attributes different from those provided here. Attributes relate objectives important to choosing control technologies with specific alternatives that may meet several of the objectives differently. The transparent framework enables multiple stakeholders to understand how different subjective judgments and varying attribute properties may result in different technology choices. Standard scoring techniques ensure that attributes are not biased by subjective scoring and that weights are the primary quantitative input where subjective preferences are exercised. An expected value decision structure is adopted that considers probabilities (likelihood) that a given alternative can meet its claims; alternative decision criteria are discussed. Capital and annual costs are combined using a net present value approach. An iterative approach is advocated that allows for screening and disqualifying alternatives that do not meet minimum conditions for acceptance, such as engine warranty or U.S. Coast Guard requirements. This decision framework assists vessel operators in considering explicitly important attributes and in representing choices clearly to other stakeholders concerned about reducing air pollution from vessels. This general decision structure may also be applied similarly to other environmental controls in marine applications.
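The net present value combination of capital and annual costs reduces to discounting each year's cost back to the present. The figures below are hypothetical, not taken from the paper's technology assessments.

```python
# Net present value of an emission-control alternative: up-front
# capital cost plus annual costs discounted over the evaluation
# horizon. Figures are hypothetical.
def net_present_value_cost(capital, annual, rate, years):
    return capital + sum(annual / (1 + rate) ** t for t in range(1, years + 1))

npv = net_present_value_cost(capital=100_000, annual=10_000, rate=0.07, years=10)
```

In the expected-value decision structure described above, a cost like this would be weighted by the probability that the alternative actually delivers its claimed emission reduction.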

  18. Advancing the adverse outcome pathway framework and its ...

    EPA Pesticide Factsheets

    Regulatory agencies worldwide are confronted with the challenging task of assessing the risks of thousands of chemicals to protect both human health and the environment. Traditional toxicity testing largely relies on apical endpoints from whole animal studies, which, in addition to ethical concerns, is costly and time prohibitive. As a result, the utility of mechanism-based in silico, in vitro, and in vivo approaches to support chemical safety evaluations has increasingly been explored. An approach that has gained traction for capturing available knowledge describing the linkage between mechanistic data and apical toxicity endpoints, required for regulatory assessments, is the adverse outcome pathway (AOP) framework. A number of international workshops and expert meetings have been held over the past years focusing on the AOP framework and its applications to chemical risk assessment. Although these interactions have illustrated the necessity of expert guidance in moving the science of AOPs and their applications forward, there is also the recognition that a broader survey of the scientific community could be useful in guiding future initiatives in the AOP arena. To that end, a Horizon Scanning exercise was conducted to solicit questions from the global scientific community concerning the challenges or limitations that must be addressed in order to realize the full potential of the AOP framework in research and regulatory decision making. Over a 4 month ques

  19. Landmark-based deep multi-instance learning for brain disease diagnosis.

    PubMed

    Liu, Mingxia; Zhang, Jun; Adeli, Ehsan; Shen, Dinggang

    2018-01-01

    In conventional Magnetic Resonance (MR) image based methods, two stages are often involved to capture brain structural information for disease diagnosis, i.e., 1) manually partitioning each MR image into a number of regions-of-interest (ROIs), and 2) extracting pre-defined features from each ROI for diagnosis with a certain classifier. However, these pre-defined features often limit the performance of the diagnosis, due to challenges in 1) defining the ROIs and 2) extracting effective disease-related features. In this paper, we propose a landmark-based deep multi-instance learning (LDMIL) framework for brain disease diagnosis. Specifically, we first adopt a data-driven learning approach to discover disease-related anatomical landmarks in the brain MR images, along with their nearby image patches. Then, our LDMIL framework learns an end-to-end MR image classifier for capturing both the local structural information conveyed by image patches located by landmarks and the global structural information derived from all detected landmarks. We have evaluated our proposed framework on 1526 subjects from three public datasets (i.e., ADNI-1, ADNI-2, and MIRIAD), and the experimental results show that our framework can achieve superior performance over state-of-the-art approaches. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Figure-Ground Segmentation Using Factor Graphs

    PubMed Central

    Shen, Huiying; Coughlan, James; Ivanchenko, Volodymyr

    2009-01-01

    Foreground-background segmentation has recently been applied [26,12] to the detection and segmentation of specific objects or structures of interest from the background as an efficient alternative to techniques such as deformable templates [27]. We introduce a graphical model (i.e. Markov random field)-based formulation of structure-specific figure-ground segmentation based on simple geometric features extracted from an image, such as local configurations of linear features, that are characteristic of the desired figure structure. Our formulation is novel in that it is based on factor graphs, which are graphical models that encode interactions among arbitrary numbers of random variables. The ability of factor graphs to express interactions higher than pairwise order (the highest order encountered in most graphical models used in computer vision) is useful for modeling a variety of pattern recognition problems. In particular, we show how this property makes factor graphs a natural framework for performing grouping and segmentation, and demonstrate that the factor graph framework emerges naturally from a simple maximum entropy model of figure-ground segmentation. We cast our approach in a learning framework, in which the contributions of multiple grouping cues are learned from training data, and apply our framework to the problem of finding printed text in natural scenes. Experimental results are described, including a performance analysis that demonstrates the feasibility of the approach. PMID:20160994
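The higher-than-pairwise interactions that factor graphs can express are easy to see in a tiny example: brute-force MAP inference over three binary variables with per-variable unary factors and one third-order factor. The potentials are toy values, not the paper's text-detection model.

```python
import itertools

# Brute-force MAP inference over a tiny factor graph: three binary
# variables, unary factors, and one third-order factor that rewards
# agreement among all three variables. Potentials are toy values.
unary = [{0: 0.4, 1: 0.6}, {0: 0.7, 1: 0.3}, {0: 0.5, 1: 0.5}]

def triple_factor(a, b, c):
    return 2.0 if a == b == c else 1.0  # beyond-pairwise interaction

best, best_score = None, -1.0
for assign in itertools.product([0, 1], repeat=3):
    score = triple_factor(*assign)
    for u, x in zip(unary, assign):
        score *= u[x]  # multiply in each unary potential
    if score > best_score:
        best, best_score = assign, score
```

A pairwise model cannot encode this "all three agree" preference with a single factor; the third-order factor does so directly, which is the property the abstract highlights for grouping and segmentation.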

  1. The application of the Practitioners in Applied Practice Model during breaking bad news communication training for medical students: a case study.

    PubMed

    Dunning, Rose; Laidlaw, Anita

    2015-11-01

    Breaking bad news is a key skill within clinical communication and one which can impact outcomes for both the patient and practitioner. The evidence base for effective clinical communication training in breaking bad news is scarce. Frameworks have been found to assist the practitioner, such as SPIKES; however, the pedagogical approach used alongside such frameworks can vary. This study sought to examine the impact of utilising the Practitioners in Applied Practice Model (PAPM) alongside the SPIKES framework for training undergraduate medical students in breaking bad news. A case study approach is used to highlight the impact of training based on the PAPM and SPIKES on patient-centred communication and simulated patient satisfaction with the clinical communication behaviour. Results showed that following training, both patient-centred behaviour and patient satisfaction improved. With detailed communication behaviour changes, a balance was established between rapport building behaviour, lifestyle and psychosocial talk alongside biomedical information. This case study shows how the PAPM could be utilised alongside the SPIKES framework to improve breaking bad news communication in medical undergraduate students and describes the behavioural basis of the improvement. Further research is required to show the generalisability of this training intervention. © The Author(s) 2015.

  2. Perspectives on the Origins of Life in Science Textbooks from a Christian Publisher: Implications for Teaching Science

    ERIC Educational Resources Information Center

    Santos Baptista, Geilsa Costa; da Silva Santos, Rodrigo; Cobern, William W.

    2016-01-01

    This paper presents the results of research regarding approaches to the origin of life featured in science textbooks produced by an Evangelical publisher. The research was qualitative in nature, using document analysis and an interpretive framework based on Epistemological Pluralism. Overall, the results indicate that there are four perspectives on the…

  3. A closer look at the FTEM framework. Response to "More of the same? Comment on 'An integrated framework for the optimisation of sport and athlete development: a practitioner approach'".

    PubMed

    Gulbin, Jason P; Croser, Morag J; Morley, Elissa J; Weissensteiner, Juanita R

    2014-01-01

    The Foundations, Talent, Elite and Mastery (FTEM) framework was designed through the lens of a world leading high-performance sport agency to assist sporting stakeholders operationalise and research their whole of sport development pathways (Gulbin, J. P., Croser, M. J., Morley, E. J., & Weissensteiner, J. R. (2013). An integrated framework for the optimisation of sport and athlete development: A practitioner approach. Journal of Sport Sciences, 31, 1319-1331). In response to the commentary by MacNamara and Collins (2013) (Journal of Sports Sciences, doi:10.1080/02640414.2013.855805), it was possible to document many inaccurate, false and misleading statements based on inattentive reading of the original article. We reinforce that: FTEM is a holistic framework of sport and athlete development and not a surrogate for a talent identification (TID) model; bio-psycho-social components of development are liberally embedded throughout the FTEM framework; and the combined research and applied insights of development practitioners provide strong ecological validity for the consideration of stakeholders looking to explore applied approaches to athlete pathway management.

  4. Locomotion Dynamics for Bio-inspired Robots with Soft Appendages: Application to Flapping Flight and Passive Swimming

    NASA Astrophysics Data System (ADS)

    Boyer, Frédéric; Porez, Mathieu; Morsli, Ferhat; Morel, Yannick

    2017-08-01

    In animal locomotion, whether in fish or flying insects, the use of flexible terminal organs or appendages greatly improves the performance of locomotion (thrust and lift). In this article, we propose a general unified framework for modeling and simulating the (bio-inspired) locomotion of robots using soft organs. The proposed approach is based on the model of Mobile Multibody Systems (MMS). The distributed flexibilities are modeled according to two major approaches: the Floating Frame Approach (FFA) and the Geometrically Exact Approach (GEA). Encompassing these two approaches in the Newton-Euler modeling formalism of robotics, this article proposes a unique modeling framework suited to the fast numerical integration of the dynamics of an MMS in both the FFA and the GEA. This general framework is applied to two illustrative examples drawn from bio-inspired locomotion: passive swimming in a von Karman vortex street, and hovering flight with flexible flapping wings.

  5. A 'systems' approach to suicide prevention: radical change or doing the same things better?

    PubMed

    Fitzpatrick, Scott J; Hooker, Claire

    2017-04-27

    Suicide is a significant public health concern. Continued high suicide rates, coupled with emerging international evidence, have led to the development of a 'systems' approach to suicide prevention, which is now being trialled as part of a proposed Suicide Prevention Framework for NSW (New South Wales, Australia). The Framework replicates successful international approaches. It is organised around nine components, ranging from individual to population-level approaches, to improve coordination and integration of existing services. If implemented fully, the Framework may lead to a significant reduction in suicide. However, to ensure its long-term success, we must attend to underlying structures within the system and their interrelationships. Such an approach will also ensure that policy makers and local suicide prevention action groups, particularly in rural areas, are able to respond to local challenges and incorporate multiple perspectives into their practice, including evidence for the broader social determinants of suicide.

  6. Narrative Approaches to Organizational Development: A Case Study of Implementation of Collaborative Helping.

    PubMed

    Madsen, William C

    2016-06-01

    Across North America, community agencies and state/provincial jurisdictions are embracing family-centered approaches to service delivery that are grounded in strength-based, culturally responsive, accountable partnerships with families. This article details a collaborative consultation process to initiate and sustain organizational change toward this effort. It draws on innovative ideas from narrative theory, organizational development, and implementation science to highlight a three-component approach. This approach includes the use of appreciative inquiry focus groups to elicit existing best practices; the provision of clinical training and ongoing coaching with practice leaders to build on those better moments and develop concrete practice frameworks; and leadership coaching and organizational consultation to develop organizational structures that institutionalize family-centered practice. While the article uses a principle-based practice framework, Collaborative Helping, to illustrate this process, the approach is applicable with a variety of clinical frameworks grounded in family-centered values and principles. © 2016 Family Process Institute.

  7. Ab initio relaxation times and time-dependent Hamiltonians within the steepest-entropy-ascent quantum thermodynamic framework

    NASA Astrophysics Data System (ADS)

    Kim, Ilki; von Spakovsky, Michael R.

    2017-08-01

    Quantum systems driven by time-dependent Hamiltonians are considered here within the framework of steepest-entropy-ascent quantum thermodynamics (SEAQT) and used to study the thermodynamic characteristics of such systems. In doing so, a generalization of the SEAQT framework valid for all such systems is provided, leading to the development of an ab initio physically relevant expression for the intrarelaxation time, an important element of this framework and one that had as of yet not been uniquely determined as an integral part of the theory. The resulting expression for the relaxation time is valid as well for time-independent Hamiltonians as a special case and makes the description provided by the SEAQT framework more robust at the fundamental level. In addition, the SEAQT framework is used to help resolve a fundamental issue of thermodynamics in the quantum domain, namely, that concerning the unique definition of process-dependent work and heat functions. The developments presented lead to the conclusion that this framework is not just an alternative approach to thermodynamics in the quantum domain but instead one that uniquely sheds new light on various fundamental but as of yet not completely resolved questions of thermodynamics.

  8. Geocapabilities: Toward an International Framework for Researching the Purposes and Values of Geography Education

    ERIC Educational Resources Information Center

    Solem, Michael; Lambert, David; Tani, Sirpa

    2013-01-01

    GeoCapabilities is a transatlantic collaborative project for researching the purposes and values of geography education through a "capabilities approach." Inspired by the writings of economist Amartya Sen and philosopher Martha Nussbaum, the capabilities approach provides a normative framework for understanding the broader aims of…

  9. New Educational Services Development: Framework for Technology Entrepreneurship Education at Universities in Egypt

    ERIC Educational Resources Information Center

    Abou-Warda, Sherein Hamed

    2016-01-01

    Purpose: The overall objective of the current study is to explore how universities can better develop new educational services. The purpose of this paper is to develop a framework for technology entrepreneurship education (TEPE) within universities. Design/Methodology/Approach: Qualitative and quantitative research approaches were employed. This…

  10. A Systematic Framework of Virtual Laboratories Using Mobile Agent and Design Pattern Technologies

    ERIC Educational Resources Information Center

    Li, Yi-Hsung; Dow, Chyi-Ren; Lin, Cheng-Min; Chen, Sheng-Chang; Hsu, Fu-Wei

    2009-01-01

    Innovations in network and information technology have transformed traditional classroom lectures into new approaches that have given universities the opportunity to create a virtual laboratory. However, there is no systematic framework in existing approaches for the development of virtual laboratories. Further, developing a virtual laboratory…

  11. An Integrated Approach to Disability Policy Development, Implementation, and Evaluation

    ERIC Educational Resources Information Center

    Shogren, Karrie A.; Luckasson, Ruth; Schalock, Robert L.

    2017-01-01

    This article provides a framework for an integrated approach to disability policy development, implementation, and evaluation. The article discusses how a framework that combines systems thinking and valued outcomes can be used by coalition partners across ecological systems to implement disability policy, promote the effective use of resources,…

  12. Best Practices Inquiry: A Multidimensional, Value-Critical Framework

    ERIC Educational Resources Information Center

    Petr, Christopher G.; Walter, Uta M.

    2005-01-01

    This article offers a multidimensional framework that broadens current approaches to "best practices" inquiry to include (1) the perspectives of both the consumers of services and professional practitioners and (2) a value-based critique. The predominant empirical approach to best practices inquiry is a necessary, but not sufficient, component of…

  13. Teaching Conflict Management Using a Scenario-Based Approach

    ERIC Educational Resources Information Center

    Callanan, Gerard A.; Perri, David F.

    2006-01-01

    In this article, the authors present a framework for the teaching of conflict management in college courses. The framework describes an experiential learning approach for helping individuals understand the influence of contextual factors in the selection of conflict handling strategy. It also includes a comparison of participants' choice of style,…

  14. Communication, Constructivism, and Transfer of Knowledge in the Education of Bilingual Learners.

    ERIC Educational Resources Information Center

    Olivares, Rafael A.

    2002-01-01

    Discusses a theoretical framework to educate bilingual learners that links the communicative approach and the constructivist approach to learning with the transfer of knowledge from one language to another. The framework is illustrated in the communication, constructivism, and transference of knowledge (CCT) model where bilingual students use…

  15. Insight into model mechanisms through automatic parameter fitting: a new methodological framework for model development

    PubMed Central

    2014-01-01

    Background Striking a balance between model complexity and parameter identifiability, while still producing biologically feasible simulations, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. Results The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input–output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. 
Our analysis revealed that model parameters could be constrained to a standard deviation of on average 15% of the mean values over the succeeding parameter sets. Conclusions Our results indicate that the presented approach is effective for comparing model alternatives and reducing models to the minimum complexity replicating measured data. We therefore believe that this approach has significant potential for reparameterising existing frameworks, for identification of redundant model components of large biophysical models and to increase their predictive capacity. PMID:24886522
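The iterative "zooming" strategy the abstract describes, generating new candidate parameter sets and narrowing the search region around simulations closest to the measured data, can be sketched in a toy setting. The exponential-decay model, parameter box, and sample sizes below are illustrative assumptions, not the cardiac mechanics model used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(theta, t):
    # Toy deterministic model standing in for a biophysical simulator:
    # exponential decay with amplitude a and rate b.
    a, b = theta
    return a * np.exp(-b * t)

t = np.linspace(0.0, 2.0, 20)
theta_true = np.array([2.0, 1.5])
data = model(theta_true, t) + rng.normal(0.0, 0.01, t.size)

# Parameter box to zoom into: [a_lo, a_hi] x [b_lo, b_hi]
lo = np.array([0.1, 0.1])
hi = np.array([5.0, 5.0])

for it in range(8):
    # "Experimental design": sample candidate parameter sets in the box.
    cand = rng.uniform(lo, hi, size=(200, 2))
    sse = np.array([np.sum((model(c, t) - data) ** 2) for c in cand])
    # Keep the candidates whose simulations lie closest to the data...
    best = cand[np.argsort(sse)[:20]]
    # ...and shrink the box to their bounding region (the "zoom").
    lo, hi = best.min(axis=0), best.max(axis=0)

theta_hat = best[0]
print(theta_hat)
```

The spread of the surviving candidates at each iteration also gives a crude, implicit view of parameter identifiability, in the spirit of the pipeline described above.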

  16. A framework for outcome-level evaluation of in-service training of health care workers

    PubMed Central

    2013-01-01

    Background In-service training is a key strategic approach to addressing the severe shortage of health care workers in many countries. However, there is a lack of evidence linking these health care worker trainings to improved health outcomes. In response, the United States President’s Emergency Plan for AIDS Relief’s Human Resources for Health Technical Working Group initiated a project to develop an outcome-focused training evaluation framework. This paper presents the methods and results of that project. Methods A general inductive methodology was used for the conceptualization and development of the framework. Fifteen key informant interviews were conducted to explore contextual factors, perceived needs, barriers and facilitators affecting the evaluation of training outcomes. In addition, a thematic analysis of 70 published articles reporting health care worker training outcomes identified key themes and categories. These were integrated, synthesized and compared to several existing training evaluation models. This formed an overall typology which was used to draft a new framework. Finally, the framework was refined and validated through an iterative process of feedback, pilot testing and revision. Results The inductive process resulted in identification of themes and categories, as well as relationships among several levels and types of outcomes. The resulting framework includes nine distinct types of outcomes that can be evaluated, which are organized within three nested levels: individual, organizational and health system/population. The outcome types are: (1) individual knowledge, attitudes and skills; (2) individual performance; (3) individual patient health; (4) organizational systems; (5) organizational performance; (6) organizational-level patient health; (7) health systems; (8) population-level performance; and (9) population-level health. 
The framework also addresses contextual factors which may influence the outcomes of training, as well as the ability of evaluators to determine training outcomes. In addition, a group of user-friendly resources, the Training Evaluation Framework and Tools (TEFT) were created to help evaluators and stakeholders understand and apply the framework. Conclusions Feedback from pilot users suggests that using the framework and accompanying tools may support outcome evaluation planning. Further assessment will assist in strengthening guidelines and tools for operationalization. PMID:24083635

  17. Fraying connections of caring women: an exemplar of including difference in the development of explanatory frameworks.

    PubMed

    Wuest, J

    1997-01-01

    While research exploring diverse groups enhances understanding of their unique perspectives and experiences, it also contributes to the exclusion of such groups from mainstream frameworks and solutions. The feminist grounded theory method allows for inclusion of marginalized groups through theoretical sensitivity to feminist theory and theoretical sampling. This paper demonstrates how this approach results in an explanatory framework that accounts for diverse realities in a study of women's caring. Fraying connections were identified as women's initial response to competing and changing caring demands. The range of dimensions and properties of fraying connections was identified through theoretical sampling guided by the emerging themes and theoretical sensitivity to issues of gender, culture, age, ability, class, and sexual orientation.

  18. Clinical research with children: the European legal framework and its implementation in French and Italian law.

    PubMed

    Altavilla, Annagrazia

    2008-07-01

    According to the International Convention on the Rights of the Child, an improvement in the protection of the rights of children in Europe should be accomplished by inserting the principles of best interests and evolving capacities into the legal framework related to paediatric clinical research. In this article, an overview is given of the European legal framework governing clinical research on minors, using a comparative approach. The lack of coordination between different international and European ethical/legal statements, and its impact on national legislations, is evaluated by analyzing the provisions that have been foreseen in Italy and in France as a result of the ratification/implementation process. A presentation of the perspectives of paediatric research in Europe is provided.

  19. A colour image reproduction framework for 3D colour printing

    NASA Astrophysics Data System (ADS)

    Xiao, Kaida; Sohiab, Ali; Sun, Pei-li; Yates, Julian M.; Li, Changjun; Wuerger, Sophie

    2016-10-01

    In this paper, current technologies in full-colour 3D printing are introduced, and a colour image reproduction framework for 3D colour printing is proposed, with a special focus on colour management for 3D printed objects. Two approaches, colorimetric colour reproduction and spectral-based colour reproduction, are proposed in order to faithfully reproduce colours in 3D objects. Two key studies, colour reproduction for soft tissue prostheses and colour uniformity correction across different orientations, are described subsequently. The results clearly show that applying the proposed colour image reproduction framework significantly enhances colour reproduction performance, and that post-processing colour corrections achieve a further improvement for 3D printed objects.

  20. A proposed framework on hybrid feature selection techniques for handling high dimensional educational data

    NASA Astrophysics Data System (ADS)

    Shahiri, Amirah Mohamed; Husain, Wahidah; Rashid, Nur'Aini Abd

    2017-10-01

    Huge amounts of data in educational datasets can make it difficult to produce quality data. Recently, data mining approaches have been increasingly used by educational data mining researchers to analyze data patterns. However, many research studies have concentrated on selecting suitable learning algorithms instead of performing a feature selection process. As a result, these analyses suffer from high computational complexity and long classification times. The main objective of this research is to provide an overview of feature selection techniques that have been used to identify the most significant features. This research then proposes a framework to improve the quality of students' datasets. The proposed framework uses filter- and wrapper-based techniques to support the prediction process in future studies.
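A hybrid filter-plus-wrapper pipeline of the kind the abstract proposes can be sketched as follows. The synthetic dataset, correlation-based filter, and nearest-centroid wrapper below are illustrative assumptions, not the authors' actual techniques:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, 10))
X[:, 0] += 2.0 * y          # informative feature
X[:, 1] -= 1.5 * y          # informative feature
# remaining 8 features are pure noise

# --- Filter stage: rank features by |correlation| with the label ---
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
filtered = np.argsort(corr)[::-1][:5]          # keep the 5 best-ranked features

# --- Wrapper stage: greedy forward selection with a nearest-centroid model ---
train, test = np.arange(0, 200), np.arange(200, n)

def accuracy(feats):
    Xtr, Xte = X[np.ix_(train, feats)], X[np.ix_(test, feats)]
    c0 = Xtr[y[train] == 0].mean(axis=0)
    c1 = Xtr[y[train] == 1].mean(axis=0)
    pred = np.linalg.norm(Xte - c1, axis=1) < np.linalg.norm(Xte - c0, axis=1)
    return np.mean(pred == y[test])

selected, best_acc = [], 0.0
improved = True
while improved:
    improved = False
    for j in filtered:
        if j in selected:
            continue
        acc = accuracy(selected + [j])
        if acc > best_acc:
            best_acc, best_j, improved = acc, j, True
    if improved:
        selected.append(best_j)

print(sorted(selected), round(best_acc, 3))
```

The cheap filter prunes the candidate set before the expensive wrapper search, which is the usual motivation for hybrid schemes on high-dimensional data.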

  1. Multiobjective optimization of temporal processes.

    PubMed

    Song, Zhe; Kusiak, Andrew

    2010-06-01

    This paper presents a dynamic predictive-optimization framework for a nonlinear temporal process. Data-mining (DM) and evolutionary strategy algorithms are integrated in the framework for solving the optimization model. DM algorithms learn dynamic equations from the process data. An evolutionary strategy algorithm is then applied to solve the optimization problem guided by the knowledge extracted by the DM algorithm. The concept presented in this paper is illustrated with data from a power plant, where the goal is to maximize the boiler efficiency and minimize the limestone consumption. This multiobjective optimization problem can either be transformed into a single-objective optimization problem through preference aggregation approaches or be treated as a Pareto-optimal optimization problem. The computational results have shown the effectiveness of the proposed optimization framework.
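The preference-aggregation route can be illustrated with a toy sketch: two quadratic surrogate objectives are combined with fixed weights and minimized with a simple (1+1) evolution strategy. The objectives, weights, and step-size rule are assumptions for illustration, not the data-mined power plant models:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy surrogate objectives standing in for data-mined process models:
# f1 plays the role of (negated) boiler efficiency, f2 of limestone use.
def f1(x): return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2
def f2(x): return (x[0] + 0.5) ** 2 + (x[1] - 1.0) ** 2

w1, w2 = 0.7, 0.3          # preference weights aggregating the objectives

def J(x): return w1 * f1(x) + w2 * f2(x)

# (1+1) evolution strategy on the aggregated single objective.
x = np.zeros(2)
sigma = 0.5
for step in range(2000):
    child = x + sigma * rng.normal(size=2)
    if J(child) <= J(x):
        x = child
        sigma *= 1.1       # expand step size on success...
    else:
        sigma *= 0.98      # ...contract on failure (crude 1/5-rule flavour)

print(x, J(x))
```

For these weights the aggregated optimum sits at x = (0.55, -0.05); sweeping the weights instead of fixing them would trace out the Pareto front mentioned above.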

  2. Correlation between the Availability of Resources and Efficiency of the School System within the Framework of the Implementation of Competency-Based Teaching Approaches in Cameroon

    ERIC Educational Resources Information Center

    Esongo, Njie Martin

    2017-01-01

    The study takes an in-depth examination of the extent to which the availability of resources relates to the efficiency of the school system within the framework of the implementation of competency-based teaching approaches in Cameroon. The study employed a mix of probability sampling approaches, namely simple, cluster and stratified random…

  3. Social Exclusion and Education Inequality: Towards an Integrated Analytical Framework for the Urban-Rural Divide in China

    ERIC Educational Resources Information Center

    Wang, Li

    2012-01-01

    The aim of this paper is to build a capability-based framework, drawing upon the strengths of other approaches, which is applicable to the complexity of the urban-rural divide in education in China. It starts with a brief introduction to the capability approach. This is followed by a discussion of how the rights-based approach and resource-based…

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solis, John Hector

    In this paper, we present a modular framework for constructing a secure and efficient program obfuscation scheme. Our approach, inspired by the obfuscation with respect to oracle machines model of [4], retains an interactive online protocol with an oracle, but relaxes the original computational and storage restrictions. We argue this is reasonable given the computational resources of modern personal devices. Furthermore, we relax the information-theoretic security requirement to computational security in order to utilize established cryptographic primitives. With this additional flexibility we are free to explore different cryptographic building blocks. Our approach combines authenticated encryption with private information retrieval to construct a secure program obfuscation framework. We give a formal specification of our framework, based on desired functionality and security properties, and provide an example instantiation. In particular, we implement AES in Galois/Counter Mode for authenticated encryption and the Gentry-Ramzan [13] constant communication-rate private information retrieval scheme. We present our implementation results and show that non-trivial sized programs can be realized, but scalability is quickly limited by computational overhead. Finally, we include a discussion on security considerations when instantiating specific modules.
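The authenticated-encryption building block (seal a program block, refuse to run it if tampered) can be sketched with Python's standard library alone. Note this toy SHA-256 keystream with encrypt-then-MAC stands in for the AES-GCM construction the paper actually uses and is not a vetted cipher:

```python
import hashlib, hmac, os

def _keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Derive a keystream by hashing key || nonce || counter (CTR-mode flavour).
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def seal(enc_key: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(enc_key, nonce, len(plaintext))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()   # encrypt-then-MAC
    return nonce + ct + tag

def open_(enc_key: bytes, mac_key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expect = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("authentication failed")   # tampered program block
    return bytes(c ^ k for c, k in zip(ct, _keystream(enc_key, nonce, len(ct))))

ek, mk = os.urandom(32), os.urandom(32)
blob = seal(ek, mk, b"obfuscated program block")
print(open_(ek, mk, blob))
```

Any bit-flip in the sealed blob makes `open_` raise before decryption, which is the integrity property the obfuscation framework relies on.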

  5. Optimization-Based Sensor Fusion of GNSS and IMU Using a Moving Horizon Approach

    PubMed Central

    Girrbach, Fabian; Hol, Jeroen D.; Bellusci, Giovanni; Diehl, Moritz

    2017-01-01

    The rise of autonomous systems operating close to humans imposes new challenges in terms of robustness and precision on the estimation and control algorithms. Approaches based on nonlinear optimization, such as moving horizon estimation, have been shown to improve the accuracy of the estimated solution compared to traditional filter techniques. This paper introduces an optimization-based framework for multi-sensor fusion following a moving horizon scheme. The framework is applied to the often occurring estimation problem of motion tracking by fusing measurements of a global navigation satellite system receiver and an inertial measurement unit. The resulting algorithm is used to estimate position, velocity, and orientation of a maneuvering airplane and is evaluated against an accurate reference trajectory. A detailed study of the influence of the horizon length on the quality of the solution is presented and evaluated against filter-like and batch solutions of the problem. The versatile configuration possibilities of the framework are finally used to analyze the estimated solutions at different evaluation times exposing a nearly linear behavior of the sensor fusion problem. PMID:28534857
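A heavily simplified moving-horizon flavour can be sketched as a sliding-window least-squares fit to noisy 1-D position measurements. The constant-velocity model, noise level, and horizon length below are illustrative assumptions, far simpler than the nonlinear GNSS/IMU fusion problem solved in the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, H = 0.1, 20                 # sample time and horizon length (assumed values)
t = np.arange(0, 10, dt)
pos_true = 2.0 + 1.5 * t        # true constant-velocity trajectory
meas = pos_true + rng.normal(0, 0.5, t.size)   # noisy "GNSS" positions

est_pos, est_vel = [], []
for k in range(t.size):
    i0 = max(0, k - H + 1)                 # moving horizon: last H samples only
    tw, zw = t[i0:k + 1], meas[i0:k + 1]
    A = np.vstack([np.ones_like(tw), tw]).T
    sol, *_ = np.linalg.lstsq(A, zw, rcond=None)   # LS fit over the window
    p0, v = sol
    est_pos.append(p0 + v * t[k])
    est_vel.append(v)

rmse = np.sqrt(np.mean((np.array(est_pos) - pos_true) ** 2))
print(round(rmse, 3), round(est_vel[-1], 3))
```

Growing the horizon H trades latency and computation for accuracy, which is exactly the trade-off the paper studies between filter-like and batch solutions.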

  6. Integrated presentation of ecological risk from multiple stressors

    NASA Astrophysics Data System (ADS)

    Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman

    2016-10-01

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.
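The prevalence idea, i.e. the fraction of simulated environmental scenarios in which an effect endpoint is triggered, can be sketched with a Monte Carlo toy. The exposure distributions and growth model below are invented for illustration, not the framework's actual ecological models:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

def prevalence(conc_mean, temp_mean):
    # Environmental scenario: sample chemical exposure and temperature jointly.
    conc = rng.lognormal(mean=np.log(conc_mean), sigma=0.5, size=N)
    temp = rng.normal(temp_mean, 2.0, size=N)
    # Toy effect model: growth declines with exposure, faster when warm,
    # so the chemical and ecological stressors interact.
    growth = 1.2 - 0.3 * conc * (1 + 0.05 * (temp - 15.0))
    return np.mean(growth < 1.0)      # fraction of scenarios showing an effect

p_cool = prevalence(conc_mean=0.5, temp_mean=12.0)   # "cool region" scenario
p_warm = prevalence(conc_mean=0.5, temp_mean=20.0)   # "warm region" scenario
print(round(p_cool, 3), round(p_warm, 3))
```

Plotting such prevalence values over a grid of exposure levels and regions yields the prevalence plots described above: the same chemical exposure carries different risk under different environmental conditions.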

  7. Integrated presentation of ecological risk from multiple stressors.

    PubMed

    Goussen, Benoit; Price, Oliver R; Rendal, Cecilie; Ashauer, Roman

    2016-10-26

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.

  8. Optimization-Based Sensor Fusion of GNSS and IMU Using a Moving Horizon Approach.

    PubMed

    Girrbach, Fabian; Hol, Jeroen D; Bellusci, Giovanni; Diehl, Moritz

    2017-05-19

    The rise of autonomous systems operating close to humans imposes new challenges in terms of robustness and precision on the estimation and control algorithms. Approaches based on nonlinear optimization, such as moving horizon estimation, have been shown to improve the accuracy of the estimated solution compared to traditional filter techniques. This paper introduces an optimization-based framework for multi-sensor fusion following a moving horizon scheme. The framework is applied to the often occurring estimation problem of motion tracking by fusing measurements of a global navigation satellite system receiver and an inertial measurement unit. The resulting algorithm is used to estimate position, velocity, and orientation of a maneuvering airplane and is evaluated against an accurate reference trajectory. A detailed study of the influence of the horizon length on the quality of the solution is presented and evaluated against filter-like and batch solutions of the problem. The versatile configuration possibilities of the framework are finally used to analyze the estimated solutions at different evaluation times exposing a nearly linear behavior of the sensor fusion problem.

  9. UNITY: Confronting Supernova Cosmology's Statistical and Systematic Uncertainties in a Unified Bayesian Framework

    NASA Astrophysics Data System (ADS)

    Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The

    2015-11-01

    While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
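The outlier treatment that such a unified Bayesian framework must handle can be illustrated with a toy grid posterior using a two-component mixture likelihood (narrow inlier component plus broad outlier component). The data, component widths, and outlier fraction are assumptions for illustration, not the UNITY model itself:

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic standardized residuals: most points scatter tightly around the
# true offset, while a few catastrophic outliers sit far away.
true_mu = 0.10
data = np.concatenate([rng.normal(true_mu, 0.1, 40), rng.normal(2.0, 0.5, 3)])

def normal_pdf(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Grid posterior for mu under the mixture likelihood, flat prior on mu.
mu_grid = np.linspace(-1, 1, 2001)
f_out = 0.1                       # assumed prior outlier fraction
logpost = np.array([
    np.sum(np.log((1 - f_out) * normal_pdf(data, m, 0.1)
                  + f_out * normal_pdf(data, 0.0, 2.0)))
    for m in mu_grid
])
mu_mix = mu_grid[np.argmax(logpost)]
mu_naive = data.mean()            # plain average, dragged off by the outliers
print(round(mu_mix, 3), round(mu_naive, 3))
```

The mixture posterior downweights the outliers automatically instead of requiring a hard rejection cut, which is the statistically well-justified behaviour the abstract argues for.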

  10. Detecting spatial patterns of rivermouth processes using a geostatistical framework for near-real-time analysis

    USGS Publications Warehouse

    Xu, Wenzhao; Collingsworth, Paris D.; Bailey, Barbara; Carlson Mazur, Martha L.; Schaeffer, Jeff; Minsker, Barbara

    2017-01-01

    This paper proposes a geospatial analysis framework and software to interpret water-quality sampling data from towed undulating vehicles in near-real time. The framework includes data quality assurance and quality control processes, automated kriging interpolation along undulating paths, and local hotspot and cluster analyses. These methods are implemented in an interactive Web application developed using the Shiny package in the R programming environment to support near-real time analysis along with 2- and 3-D visualizations. The approach is demonstrated using historical sampling data from an undulating vehicle deployed at three rivermouth sites in Lake Michigan during 2011. The normalized root-mean-square error (NRMSE) of the interpolation averages approximately 10% in 3-fold cross validation. The results show that the framework can be used to track river plume dynamics and provide insights on mixing, which could be related to wind and seiche events.
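The cross-validated interpolation-error metric reported above can be sketched as follows; inverse-distance weighting stands in for the kriging interpolator here, and the synthetic water-quality field is an assumption:

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic "towed vehicle" samples: scattered positions with a smooth field.
x = rng.uniform(0, 10, 200)
y = rng.uniform(0, 10, 200)
z = np.sin(x) + 0.5 * np.cos(y) + rng.normal(0, 0.05, 200)

def idw(xq, yq, xs, ys, zs, k=8, power=2.0):
    # Inverse-distance weighting over the k nearest sampled points.
    d = np.hypot(xs - xq, ys - yq)
    nn = np.argsort(d)[:k]
    w = 1.0 / np.maximum(d[nn], 1e-9) ** power
    return np.sum(w * zs[nn]) / np.sum(w)

# 3-fold cross validation of the interpolator, scored by NRMSE.
idx = rng.permutation(200)
folds = np.array_split(idx, 3)
errs = []
for f in folds:
    train = np.setdiff1d(idx, f)
    pred = np.array([idw(x[i], y[i], x[train], y[train], z[train]) for i in f])
    errs.append(pred - z[f])
err = np.concatenate(errs)
nrmse = np.sqrt(np.mean(err ** 2)) / (z.max() - z.min())
print(round(100 * nrmse, 1), "%")
```

Normalising the RMSE by the observed range, as here, is what lets the paper quote a single percentage figure across sites with different water-quality scales.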

  11. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework which has been described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). In this framework, the following areas are considered: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues, such as co-located modules and passive safety features; use of modern, open-source and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  12. Developing a Modeling Framework for Ecosystem Forecasting: The Lake Michigan Pilot

    EPA Science Inventory

    Recent multi-party efforts to coordinate modeling activities that support ecosystem management decision-making in the Great Lakes have resulted in the recommendation to convene an interagency working group that will develop a pilot approach for Lake Michigan. The process will br...

  13. Framework for modeling urban restoration resilience time in the aftermath of an extreme event

    USGS Publications Warehouse

    Ramachandran, Varun; Long, Suzanna K.; Shoberg, Thomas G.; Corns, Steven; Carlo, Héctor

    2015-01-01

    The impacts of extreme events continue long after the emergency response has terminated. Effective reconstruction of supply-chain strategic infrastructure (SCSI) elements is essential for postevent recovery and the reconnectivity of a region with the outside. This study uses an interdisciplinary approach to develop a comprehensive framework to model resilience time. The framework is tested by comparing resilience time results for a simulated EF-5 tornado with ground truth data from the tornado that devastated Joplin, Missouri, on May 22, 2011. Data for the simulated tornado were derived for Overland Park, Johnson County, Kansas, in the greater Kansas City, Missouri, area. Given the simulated tornado, a combinatorial graph considering the damages in terms of interconnectivity between different SCSI elements is derived. Reconstruction in the aftermath of the simulated tornado is optimized using the proposed framework to promote a rapid recovery of the SCSI. This research shows promising results when compared with the independent quantifiable data obtained from Joplin, Missouri, returning a resilience time of 22 days compared with 25 days reported by city and state officials.

  14. A game theory analysis of green infrastructure stormwater management policies

    NASA Astrophysics Data System (ADS)

    William, Reshmina; Garg, Jugal; Stillwell, Ashlynn S.

    2017-09-01

    Green stormwater infrastructure has been demonstrated as an innovative water resources management approach that addresses multiple challenges facing urban environments. However, there is little consensus on what policy strategies can be used to best incentivize green infrastructure adoption by private landowners. Game theory, an analysis framework that has historically been under-utilized within the context of stormwater management, is uniquely suited to address this policy question. We used a cooperative game theory framework to investigate the potential impacts of different policy strategies used to incentivize green infrastructure installation. The results indicate that municipal regulation leads to the greatest reduction in pollutant loading. However, the choice of the "best" regulatory approach will depend on a variety of different factors including politics and financial considerations. Large, downstream agents have a disproportionate share of bargaining power. Results also reveal that policy impacts are highly dependent on agents' spatial position within the stormwater network, leading to important questions of social equity and environmental justice.

  15. Game theoretic approach for cooperative feature extraction in camera networks

    NASA Astrophysics Data System (ADS)

    Redondi, Alessandro E. C.; Baroffio, Luca; Cesana, Matteo; Tagliasacchi, Marco

    2016-07-01

    Visual sensor networks (VSNs) consist of several camera nodes with wireless communication capabilities that can perform visual analysis tasks such as object identification, recognition, and tracking. Often, VSN deployments result in many camera nodes with overlapping fields of view. In the past, such redundancy has been exploited in two different ways: (1) to improve the accuracy/quality of the visual analysis task by exploiting multiview information or (2) to reduce the energy consumed for performing the visual task, by applying temporal scheduling techniques among the cameras. We propose a game theoretic framework based on the Nash bargaining solution to bridge the gap between the two aforementioned approaches. The key tenet of the proposed framework is for cameras to reduce the consumed energy in the analysis process by exploiting the redundancy in the reciprocal fields of view. Experimental results in both simulated and real-life scenarios confirm that the proposed scheme is able to increase the network lifetime, with a negligible loss in terms of visual analysis accuracy.
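The Nash bargaining solution at the heart of the proposed framework can be sketched numerically: maximize the product of the two cameras' utility gains over the disagreement point. The energy models and cost constants below are illustrative assumptions, not the paper's measurements:

```python
import numpy as np

# Two cameras with overlapping fields of view split the analysis workload:
# camera 1 takes fraction a of the shared region, camera 2 takes 1 - a.
# Toy energy models (assumed): camera 1's cost grows quadratically in its
# share, camera 2's linearly. Utility = remaining battery budget.
c1, c2 = 0.6, 0.4

def u1(a): return 1.0 - c1 * a ** 2
def u2(a): return 1.0 - c2 * (1.0 - a)

# Disagreement point: no cooperation, so each camera analyses the whole
# overlapping region on its own.
d1, d2 = 1.0 - c1, 1.0 - c2

# Nash bargaining solution: maximise the product of utility gains.
a_grid = np.linspace(0.0, 1.0, 10_001)
product = (u1(a_grid) - d1) * (u2(a_grid) - d2)
a_star = a_grid[np.argmax(product)]
print(round(a_star, 3), round(u1(a_star) - d1, 3), round(u2(a_star) - d2, 3))
```

Both cameras end up strictly better off than at the disagreement point, which is what makes the cooperative split individually rational and extends the network lifetime.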

  16. The Step approach to Message Design and Testing (SatMDT): A conceptual framework to guide the development and evaluation of persuasive health messages.

    PubMed

    Lewis, Ioni; Watson, Barry; White, Katherine M

    2016-12-01

    This paper provides an important and timely overview of a conceptual framework designed to assist with the development of message content, as well as the evaluation, of persuasive health messages. While an earlier version of this framework was presented in a prior publication by the authors in 2009, important refinements to the framework have seen it evolve in recent years, warranting the need for an updated review. This paper outlines the Step approach to Message Design and Testing (or SatMDT) in accordance with the theoretical evidence which underpins, as well as empirical evidence which demonstrates the relevance and feasibility of, each of the framework's steps. The development and testing of the framework have thus far been based exclusively within the road safety advertising context; however, the view expressed herein is that the framework may have broader appeal and application to the health persuasion context. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Improving visibility of rear surface cracks during inductive thermography of metal plates using Autoencoder

    NASA Astrophysics Data System (ADS)

    Xie, Jing; Xu, Changhang; Chen, Guoming; Huang, Weiping

    2018-06-01

Inductive thermography is one kind of infrared thermography (IRT) technique, which is effective in detecting front surface cracks in metal plates. However, rear surface cracks are usually missed due to their weak indications during inductive thermography. Here we propose a novel approach (AET: AE Thermography) to improve the visibility of rear surface cracks during inductive thermography by employing the Autoencoder (AE) algorithm, an important building block of deep learning architectures. We construct an integrated framework for processing the raw inspection data of inductive thermography using the AE algorithm. Through this framework, underlying features of rear surface cracks are efficiently extracted and new, clearer images are constructed. Inductive thermography experiments were conducted on steel specimens to verify the efficacy of the proposed approach. We visually compare the raw thermograms, the empirical orthogonal functions (EOFs) of the principal component thermography (PCT) technique, and the results of AET. We further quantitatively evaluated AET by calculating crack contrast and the signal-to-noise ratio (SNR). The results demonstrate that the proposed AET approach can remarkably improve the visibility of rear surface cracks and thereby improve the capability of inductive thermography in detecting rear surface cracks in metal plates.
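    The paper's AET pipeline is not reproduced here; the following is only a minimal single-hidden-layer autoencoder in NumPy, illustrating the kind of feature compression an AE applies to per-pixel thermogram sequences. The network size, training schedule, and synthetic cooling-curve data are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TinyAutoencoder:
    def __init__(self, n_in, n_hidden, lr=0.5):
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_hidden, n_in))
        self.b2 = np.zeros(n_in)
        self.lr = lr

    def forward(self, X):
        self.H = sigmoid(X @ self.W1 + self.b1)   # compressed features
        return sigmoid(self.H @ self.W2 + self.b2)

    def train_step(self, X):
        R = self.forward(X)
        err = R - X                                # reconstruction error
        dZ2 = err * R * (1 - R)                    # backprop through sigmoid
        dZ1 = (dZ2 @ self.W2.T) * self.H * (1 - self.H)
        n = X.shape[0]
        self.W2 -= self.lr * self.H.T @ dZ2 / n
        self.b2 -= self.lr * dZ2.mean(axis=0)
        self.W1 -= self.lr * X.T @ dZ1 / n
        self.b1 -= self.lr * dZ1.mean(axis=0)
        return float((err ** 2).mean())

# Synthetic "thermogram pixels": each row is one pixel's temperature
# evolution over 16 frames, lying near a low-dimensional cooling pattern.
t = np.linspace(0, 1, 16)
X = np.clip(rng.uniform(0.2, 0.8, (200, 1)) * np.exp(-t)
            + rng.normal(0, 0.02, (200, 16)), 0, 1)

ae = TinyAutoencoder(n_in=16, n_hidden=4)
losses = [ae.train_step(X) for _ in range(500)]
```

    The learned hidden activations `ae.H` play the role of the extracted features from which clearer crack images would be reconstructed.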

  18. A framework for characterizing eHealth literacy demands and barriers.

    PubMed

    Chan, Connie V; Kaufman, David R

    2011-11-17

Consumer eHealth interventions are of growing importance in the individual management of health and health behaviors. However, a range of access, resource, and skill barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy refers to a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies and in communicating health concepts effectively. We propose a theoretical and methodological framework for characterizing the complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained, and the potential analyses that can be performed using this method. The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks.
Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase accuracy of predictions. The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum.

  19. Tensor scale-based fuzzy connectedness image segmentation

    NASA Astrophysics Data System (ADS)

    Saha, Punam K.; Udupa, Jayaram K.

    2003-05-01

Tangible solutions to image segmentation are vital in many medical imaging applications. Toward this goal, a framework based on fuzzy connectedness was developed in our laboratory. A fundamental notion called "affinity" - a local fuzzy hanging-togetherness relation on voxels - determines the effectiveness of this segmentation framework in real applications. In this paper, we introduce the notion of "tensor scale" - a recently developed local morphometric parameter - into the affinity definition and study its effectiveness. Although our previous notion of "local scale" using the spherical model successfully incorporated local structure size into affinity and resulted in measurable improvements in segmentation results, a major limitation of the previous approach was that it ignored local structural orientation and anisotropy. The current approach of using tensor scale in affinity computation allows an effective utilization of local size, orientation, and anisotropy in a unified manner. Tensor scale is used for computing both the homogeneity- and object-feature-based components of affinity. Preliminary results of the proposed method on several medical images and computer-generated phantoms of realistic shapes are presented. Further extensions of this work are discussed.
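    As a hedged sketch of the two affinity components named above, with Gaussian forms and all parameters chosen purely for illustration (not the paper's tensor-scale definitions): affinity between neighbouring voxels is high when their intensities are mutually similar (homogeneity) and when both are close to an expected object intensity (object feature).

```python
import numpy as np

def homogeneity(f_a, f_b, sigma_h=10.0):
    """High when the two voxel intensities are similar."""
    return np.exp(-((f_a - f_b) ** 2) / (2 * sigma_h ** 2))

def object_feature(f, mean_obj=100.0, sigma_o=20.0):
    """High when an intensity is close to the expected object intensity."""
    return np.exp(-((f - mean_obj) ** 2) / (2 * sigma_o ** 2))

def affinity(f_a, f_b):
    mu_h = homogeneity(f_a, f_b)
    mu_o = min(object_feature(f_a), object_feature(f_b))
    return np.sqrt(mu_h * mu_o)   # geometric-mean combination

# Two similar intensities near the object mean: strong hanging-togetherness.
high = affinity(98.0, 102.0)
# Similar intensities far from the object mean: weak hanging-togetherness.
low = affinity(10.0, 12.0)
```

    The tensor-scale contribution in the paper replaces the fixed spherical parameters here with locally estimated size, orientation, and anisotropy.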

  20. A unified framework for evaluating the risk of re-identification of text de-identification tools.

    PubMed

    Scaiano, Martin; Middleton, Grant; Arbuckle, Luk; Kolhatkar, Varada; Peyton, Liam; Dowling, Moira; Gipson, Debbie S; El Emam, Khaled

    2016-10-01

It has become regular practice to de-identify unstructured medical text for use in research using automatic methods, the goal of which is to remove patient identifying information to minimize re-identification risk. The metrics commonly used to determine if these systems are performing well do not accurately reflect the risk of a patient being re-identified. We therefore developed a framework for measuring the risk of re-identification associated with textual data releases. We apply the proposed evaluation framework to a data set from the University of Michigan Medical School. Our risk assessment results are then compared with those that would be obtained using a typical contemporary micro-average evaluation of recall in order to illustrate the difference between the proposed evaluation framework and the current baseline method. We demonstrate how this framework compares against common measures of the re-identification risk associated with an automated text de-identification process. For the probability of re-identification using our evaluation framework, we obtained a mean value for direct identifiers of 0.0074 and a mean value for quasi-identifiers of 0.0022. The 95% confidence intervals for these estimates were below the relevant thresholds. The threshold for direct identifier risk was based on previously used approaches in the literature. The threshold for quasi-identifiers was determined based on the context of the data release, following commonly used de-identification criteria for structured data. Our framework attempts to correct for poorly distributed evaluation corpora, accounts for the data release context, and avoids the often optimistic assumptions that are made using the more traditional evaluation approach. It therefore provides a more realistic estimate of the true probability of re-identification. 
This framework should be used as a basis for computing re-identification risk in order to more realistically evaluate future text de-identification tools. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
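    A toy example (all numbers invented) of the gap such a framework targets: micro-averaged recall can look reassuring while the per-record exposure is high, because one badly handled short document is swamped by many well-handled long ones.

```python
# Each document lists its true direct identifiers and how many of them
# the de-identification tool caught. Counts are illustrative.
docs = [
    {"true": 10, "caught": 10},
    {"true": 10, "caught": 10},
    {"true": 10, "caught": 10},
    {"true": 1,  "caught": 0},   # one short note, its identifier missed
]

def micro_recall(docs):
    """Pooled recall over all identifier mentions."""
    caught = sum(d["caught"] for d in docs)
    total = sum(d["true"] for d in docs)
    return caught / total

def doc_level_risk(docs):
    """Fraction of documents leaking at least one direct identifier --
    a crude per-record proxy for re-identification probability."""
    leaking = sum(1 for d in docs if d["caught"] < d["true"])
    return leaking / len(docs)

recall = micro_recall(docs)    # 30/31: looks safe
risk = doc_level_risk(docs)    # 1/4: one in four records exposed
```

    The paper's framework goes much further (release context, thresholds, quasi-identifiers), but this contrast is the basic motivation.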

  1. Disseminating research findings: what should researchers do? A systematic scoping review of conceptual frameworks

    PubMed Central

    2010-01-01

Background Addressing deficiencies in the dissemination and transfer of research-based knowledge into routine clinical practice is high on the policy agenda both in the UK and internationally. However, there is a lack of clarity among funding agencies as to what constitutes dissemination. Moreover, the expectations and guidance provided to researchers vary from one agency to another. Against this background, we performed a systematic scoping review to identify and describe any conceptual/organising frameworks that could be used by researchers to guide their dissemination activity. Methods We searched twelve electronic databases (including MEDLINE, EMBASE, CINAHL, and PsycINFO), the reference lists of included studies, and individual funding agency websites to identify potential studies for inclusion. To be included, papers had to present an explicit framework or plan either designed for use by researchers or that could be used to guide dissemination activity. Papers which mentioned dissemination (but did not provide any detail) in the context of a wider knowledge translation framework were excluded. References were screened independently by at least two reviewers; disagreements were resolved by discussion. For each included paper, the source, the date of publication, a description of the main elements of the framework, and whether there was any implicit/explicit reference to theory were extracted. A narrative synthesis was undertaken. Results Thirty-three frameworks met our inclusion criteria, 20 of which were designed to be used by researchers to guide their dissemination activities. Twenty-eight of the included frameworks were underpinned, at least in part, by one or more of three different theoretical approaches, namely persuasive communication, diffusion of innovations theory, and social marketing. 
Conclusions There are currently a number of theoretically-informed frameworks available to researchers that can be used to help guide their dissemination planning and activity. Given the current emphasis on enhancing the uptake of knowledge about the effects of interventions into routine practice, funders could consider encouraging researchers to adopt a theoretically-informed approach to their research dissemination. PMID:21092164

  2. Design of CIAO, a research program to support the development of an integrated approach to prevent overweight and obesity in the Netherlands.

    PubMed

    van Koperen, Marije Tm; van der Kleij, Rianne Mjj; Renders, Carry Cm; Crone, Matty Mr; Hendriks, Anna-Marie Am; Jansen, Maria M; van de Gaar, Vivian Vm; Raat, Hein Jh; Ruiter, Emilie Elm; Molleman, Gerard Grm; Schuit, Jantine Aj; Seidell, Jacob Jc

    2014-01-01

The aim of this paper is to describe the research aims, concepts and methods of the research Consortium Integrated Approach of Overweight (CIAO). CIAO is a concerted action of five Academic Collaborative Centres: local collaborations between academic institutions, regional public health services, local authorities and other relevant sectors in the Netherlands. Prior research revealed lacunae in the knowledge and skills related to five elements of the integrated approach to overweight prevention in children (based upon the French EPODE approach), namely political support, parental education, implementation, social marketing and evaluation. CIAO aims to gain theoretical and practical insight into these elements through five sub-studies and to develop, based on these data, a framework for monitoring and evaluation. For this research program, mixed methods are used in all five sub-studies. First, problem specification through literature research and consultation of stakeholders, experts, health promotion specialists, parents and policy makers will be carried out. Based on this information, models, theoretical frameworks and practical instruments will be developed, tested and evaluated in the communities that implement the integrated approach to prevent overweight in children. Knowledge obtained from these studies and insights from experts and stakeholders will be combined to create an evaluation framework to evaluate the integrated approach at central, local and individual levels that will be applicable to daily practice. This innovative research program stimulates sub-studies to collaborate with local stakeholders and to share and integrate their knowledge, methodology and results. Therefore, the output of this program (both knowledge and practical tools) will be matched and will form the building blocks of a blueprint for a local evidence- and practice-based integrated approach towards the prevention of overweight in children. 
The output will then support various communities to further optimize the implementation and subsequently the effects of this approach.

  3. Pedestrian detection from thermal images: A sparse representation based approach

    NASA Astrophysics Data System (ADS)

    Qi, Bin; John, Vijay; Liu, Zheng; Mita, Seiichi

    2016-05-01

Pedestrian detection, a key technology in computer vision, plays a paramount role in applications of advanced driver assistance systems (ADASs) and autonomous vehicles. The objective of pedestrian detection is to identify and locate people in a dynamic environment so that accidents can be avoided. With significant variations introduced by illumination, occlusion, articulated pose, and complex background, pedestrian detection is a challenging task for visual perception. Different from visible images, thermal images are captured and presented as intensity maps based on objects' emissivity, and thus have an enhanced spectral range that makes human beings perceptible against the cool background. In this study, a sparse representation based approach is proposed for pedestrian detection from thermal images. We first adopt the histogram of sparse codes to represent image features and then detect pedestrians with the extracted features in a unimodal and a multimodal framework, respectively. In the unimodal framework, two types of dictionaries, i.e. a joint dictionary and individual dictionaries, are built by learning from prepared training samples. In the multimodal framework, a weighted fusion scheme is proposed to further highlight the contributions from features with higher separability. To validate the proposed approach, experiments were conducted to compare it with three widely used features: Haar wavelets (HWs), histogram of oriented gradients (HOG), and histogram of phase congruency (HPC), as well as two classification methods, i.e. AdaBoost and support vector machine (SVM). Experimental results on a publicly available data set demonstrate the superiority of the proposed approach.
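    A minimal sketch of "histogram of sparse codes" features, assuming a simple matching-pursuit coder; the dictionary and patches below are random stand-ins for learned atoms and thermal-image patches, and the pooling is reduced to a single global histogram:

```python
import numpy as np

rng = np.random.default_rng(1)

def matching_pursuit(D, x, n_nonzero=3):
    """Greedy sparse coding: repeatedly pick the atom best matching
    the residual (atoms in D are assumed unit-norm columns)."""
    code = np.zeros(D.shape[1])
    residual = x.copy()
    for _ in range(n_nonzero):
        corr = D.T @ residual
        k = int(np.argmax(np.abs(corr)))
        code[k] += corr[k]
        residual -= corr[k] * D[:, k]
    return code

def histogram_of_sparse_codes(D, patches, n_nonzero=3):
    """Pool per-patch activation magnitudes into a normalised histogram."""
    hist = np.zeros(D.shape[1])
    for x in patches:
        hist += np.abs(matching_pursuit(D, x, n_nonzero))
    return hist / max(hist.sum(), 1e-12)

# 64-dimensional patches coded against 32 unit-norm atoms.
D = rng.normal(size=(64, 32))
D /= np.linalg.norm(D, axis=0)
patches = rng.normal(size=(50, 64))
feat = histogram_of_sparse_codes(D, patches)
```

    The resulting fixed-length vector `feat` is the kind of feature that would then be fed to AdaBoost or an SVM.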

  4. Phylogenomic Insights into Mouse Evolution Using a Pseudoreference Approach

    PubMed Central

    Sarver, Brice A.J.; Keeble, Sara; Cosart, Ted; Tucker, Priscilla K.; Dean, Matthew D.

    2017-01-01

Comparative genomic studies are now possible across a broad range of evolutionary timescales, but the generation and analysis of genomic data across many different species still present a number of challenges. The most sophisticated genotyping and downstream analytical frameworks are still predominantly based on comparisons to high-quality reference genomes. However, established genomic resources are often limited within a given group of species, necessitating comparisons to divergent reference genomes that could restrict or bias comparisons across a phylogenetic sample. Here, we develop a scalable pseudoreference approach to iteratively incorporate sample-specific variation into a genome reference and reduce the effects of systematic mapping bias in downstream analyses. To characterize this framework, we used targeted capture to sequence whole exomes (∼54 Mbp) in 12 lineages (ten species) of mice spanning the Mus radiation. We generated whole exome pseudoreferences for all species and show that this iterative reference-based approach improved basic genomic analyses that depend on mapping accuracy while preserving the associated annotations of the mouse reference genome. We then use these pseudoreferences to resolve evolutionary relationships among these lineages while accounting for phylogenetic discordance across the genome, contributing an important resource for comparative studies in the mouse system. We also describe patterns of genomic introgression among lineages and compare our results to previous studies. Our general approach can be applied to whole or partitioned genomic data and is easily portable to any system with sufficient genomic resources, providing a useful framework for phylogenomic studies in mice and other taxa. PMID:28338821
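    The iterative idea can be sketched as follows. The toy variant "caller" and sequences are invented; in the real pipeline, variants are called from reads remapped against the current pseudoreference, and iteration stops when no new variants are found:

```python
def apply_variants(reference, variants):
    """Substitute SNVs (position -> base) into a reference sequence."""
    seq = list(reference)
    for pos, base in variants.items():
        seq[pos] = base
    return "".join(seq)

def build_pseudoreference(reference, call_variants, max_iter=5):
    """Iteratively fold sample-specific variants into the reference."""
    ref = reference
    for _ in range(max_iter):
        variants = call_variants(ref)   # e.g. from remapped reads
        if not variants:
            break                        # converged: sample matches ref
        ref = apply_variants(ref, variants)
    return ref

# Toy "caller": reports remaining differences from a target genome.
target = "ACGTACGTTGCA"
def toy_caller(ref):
    return {i: b for i, (a, b) in enumerate(zip(ref, target)) if a != b}

pseudo = build_pseudoreference("ACGTACGTACGT", toy_caller)
```

    Because only bases are substituted (no indels in this sketch), coordinates, and hence the reference's annotations, are preserved, which is the property the abstract highlights.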

  5. An Enhanced Text-Mining Framework for Extracting Disaster Relevant Data through Social Media and Remote Sensing Data Fusion

    NASA Astrophysics Data System (ADS)

    Scheele, C. J.; Huang, Q.

    2016-12-01

In the past decade, the rise of social media has led to the development of a vast number of social media services and applications. Disaster management represents one such application, leveraging the massive data generated for event detection, response, and recovery. In order to find disaster-relevant social media data, current approaches utilize natural language processing (NLP) methods based on keywords, or machine learning algorithms relying on text only. However, these approaches cannot be perfectly accurate due to the variability and uncertainty in language used on social media. To improve current methods, an enhanced text-mining framework is proposed that incorporates location information from social media and authoritative remote sensing datasets for detecting disaster-relevant social media posts, which are determined by assessing the textual content using common text mining methods and how the post relates spatiotemporally to the disaster event. To assess the framework, geo-tagged Tweets were collected for three disaster events differing in spatial and temporal character: a hurricane, a flood, and a tornado. Remote sensing data and products for each event were then collected using RealEarth™. Both Naive Bayes and Logistic Regression classifiers were used to compare the accuracy within the enhanced text-mining framework. Finally, the accuracies from the enhanced text-mining framework were compared to those of the current text-only methods for each of the case study disaster events. The results from this study address the need for more authoritative data when using social media in disaster management applications.
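    One hedged way to picture the fusion step: combine a text-only relevance probability with a spatial weight derived from the remotely sensed event extent, so that an ambiguous post near the observed flood is kept while the same text far away is discarded. The decay scale, mixing weight, and threshold below are invented, not the paper's model:

```python
import math

def spatial_weight(dist_km, scale_km=20.0):
    """Exponentially decaying trust in proximity to the sensed event extent."""
    return math.exp(-dist_km / scale_km)

def fused_score(p_text, dist_km, alpha=0.6):
    """Convex combination of text evidence and spatial evidence."""
    return alpha * p_text + (1 - alpha) * spatial_weight(dist_km)

# The same ambiguous text ("streets are under water?") at two locations:
near = fused_score(p_text=0.55, dist_km=2.0)    # inside the flooded area
far = fused_score(p_text=0.55, dist_km=300.0)   # unrelated region
relevant_near = near > 0.5
relevant_far = far > 0.5
```

    In the paper the text probability would come from the Naive Bayes or Logistic Regression classifier rather than being supplied by hand.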

  6. A theoretical framework for the associations between identity and psychopathology.

    PubMed

    Klimstra, Theo A; Denissen, Jaap J A

    2017-11-01

    Identity research largely emerged from clinical observations. Decades of empirical work advanced the field in refining existing approaches and adding new approaches. Furthermore, the existence of linkages of identity with psychopathology is now well established. Unfortunately, both the directionality of effects between identity aspects and psychopathology symptoms, and the mechanisms underlying associations are unclear. In the present paper, we present a new framework to inspire hypothesis-driven empirical research to overcome this limitation. The framework has a basic resemblance to theoretical models for the study of personality and psychopathology, so we provide examples of how these might apply to the study of identity. Next, we explain that unique features of identity may come into play in individuals suffering from psychopathology that are mostly related to the content of one's identity. These include pros and cons of identifying with one's diagnostic label. Finally, inspired by Hermans' dialogical self theory and principles derived from Piaget's, Swann's and Kelly's work, we delineate a framework with identity at the core of an individual multidimensional space. In this space, psychopathology symptoms have a known distance (representing relevance) to one's identity, and individual multidimensional spaces are connected to those of other individuals in one's social network. We discuss methodological (quantitative and qualitative, idiographic and nomothetic) and statistical procedures (multilevel models and network models) to test the framework. Resulting evidence can boost the field of identity research in demonstrating its high practical relevance for the emergence and conservation of psychopathology. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. Two controller design approaches for decentralized systems

    NASA Technical Reports Server (NTRS)

    Ozguner, U.; Khorrami, F.; Iftar, A.

    1988-01-01

    Two different philosophies for designing the controllers of decentralized systems are considered within a quadratic regulator framework which is generalized to admit decentralized frequency weighting. In the first approach, the total system model is examined, and the feedback strategy for each channel or subsystem is determined. In the second approach, separate, possibly overlapping, and uncoupled models are analyzed for each channel, and the results can be combined to study the original system. The two methods are applied to the example of a model of the NASA COFS Mast Flight System.
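    The two philosophies can be caricatured with a plain discrete-time LQR design (frequency weighting omitted): once on the total system model, and once per decoupled channel sub-model with the gains combined block-diagonally. The 2-channel system, weights, and `dlqr_gain` helper are all illustrative, not the paper's setup:

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=500):
    """Feedback gain K via fixed-point iteration of the discrete Riccati
    equation: P = Q + A'P(A - BK), K = (R + B'PB)^-1 B'PA."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

A = np.array([[0.9, 0.1], [0.1, 0.9]])   # two weakly coupled channels
B = np.eye(2)
Q, R = np.eye(2), np.eye(2)

# Approach 1: one centralized design on the full model.
K_cent = dlqr_gain(A, B, Q, R)

# Approach 2: each channel designed on its own scalar sub-model,
# ignoring the coupling; results combined as a block-diagonal gain.
K_dec = np.zeros((2, 2))
for i in range(2):
    Ai = A[i:i + 1, i:i + 1]
    K_dec[i, i] = dlqr_gain(Ai, np.eye(1), np.eye(1), np.eye(1))[0, 0]

spectral_radius = lambda M: max(abs(np.linalg.eigvals(M)))
stable_cent = spectral_radius(A - B @ K_cent) < 1.0
stable_dec = spectral_radius(A - B @ K_dec) < 1.0
```

    For weak coupling, both closed loops are stable; the decentralized design trades some performance for a structured (here diagonal) gain.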

  8. A Unified Estimation Framework for State-Related Changes in Effective Brain Connectivity.

    PubMed

    Samdin, S Balqis; Ting, Chee-Ming; Ombao, Hernando; Salleh, Sh-Hussain

    2017-04-01

    This paper addresses the critical problem of estimating time-evolving effective brain connectivity. Current approaches based on sliding window analysis or time-varying coefficient models do not simultaneously capture both slow and abrupt changes in causal interactions between different brain regions. To overcome these limitations, we develop a unified framework based on a switching vector autoregressive (SVAR) model. Here, the dynamic connectivity regimes are uniquely characterized by distinct vector autoregressive (VAR) processes and allowed to switch between quasi-stationary brain states. The state evolution and the associated directed dependencies are defined by a Markov process and the SVAR parameters. We develop a three-stage estimation algorithm for the SVAR model: 1) feature extraction using time-varying VAR (TV-VAR) coefficients, 2) preliminary regime identification via clustering of the TV-VAR coefficients, 3) refined regime segmentation by Kalman smoothing and parameter estimation via expectation-maximization algorithm under a state-space formulation, using initial estimates from the previous two stages. The proposed framework is adaptive to state-related changes and gives reliable estimates of effective connectivity. Simulation results show that our method provides accurate regime change-point detection and connectivity estimates. In real applications to brain signals, the approach was able to capture directed connectivity state changes in functional magnetic resonance imaging data linked with changes in stimulus conditions, and in epileptic electroencephalograms, differentiating ictal from nonictal periods. The proposed framework accurately identifies state-dependent changes in brain network and provides estimates of connectivity strength and directionality. The proposed approach is useful in neuroscience studies that investigate the dynamics of underlying brain states.
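    Stages 1 and 2 of the estimation pipeline can be caricatured in one dimension: extract time-varying AR coefficients with a sliding window, then separate them into regimes. The switching AR(1) stand-in, window length, and threshold-based labelling below are illustrative simplifications of the TV-VAR features and clustering:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_switching_ar1(phis, n_per_regime=300):
    """Concatenate AR(1) segments, one per regime coefficient."""
    x, prev = [], 0.0
    for phi in phis:
        for _ in range(n_per_regime):
            prev = phi * prev + rng.normal(0.0, 1.0)
            x.append(prev)
    return np.array(x)

def sliding_ar1_coeffs(x, window=60):
    """Least-squares AR(1) coefficient in each sliding window."""
    coeffs = []
    for start in range(len(x) - window):
        seg = x[start:start + window]
        coeffs.append(np.dot(seg[1:], seg[:-1]) / np.dot(seg[:-1], seg[:-1]))
    return np.array(coeffs)

x = simulate_switching_ar1([0.9, -0.5])   # regime switch halfway through
coeffs = sliding_ar1_coeffs(x)
# Crude two-regime labelling in place of clustering: split at the mean.
labels = (coeffs < coeffs.mean()).astype(int)
```

    In the paper these preliminary labels initialize the SVAR parameters, which are then refined by Kalman smoothing and expectation-maximization.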

  9. An Agent-Based Optimization Framework for Engineered Complex Adaptive Systems with Application to Demand Response in Electricity Markets

    NASA Astrophysics Data System (ADS)

    Haghnevis, Moeed

The main objective of this research is to develop an integrated method to study emergent behavior and the consequences of evolution and adaptation in engineered complex adaptive systems (ECASs). A multi-layer conceptual framework and modeling approach covering behavioral and structural aspects is provided to describe the structure of a class of engineered complex systems and predict their future adaptive patterns. The approach allows the examination of complexity in the structure and behavior of components as a result of their connections and in relation to their environment. This research describes and uses the major differences between natural complex adaptive systems (CASs) and artificial/engineered CASs to build a framework and platform for ECASs. While this framework focuses on the critical factors of an engineered system, it also enables one to synthetically employ engineering and mathematical models to analyze and measure complexity in such systems. In this way, concepts of complex systems science are adapted to management science and system-of-systems engineering. In particular, an integrated consumer-based optimization and agent-based modeling (ABM) platform is presented that enables managers to predict and partially control patterns of behavior in ECASs. Demonstrated on the U.S. electricity markets, ABM is integrated with normative and subjective decision behavior recommended by the U.S. Department of Energy (DOE) and Federal Energy Regulatory Commission (FERC). The approach integrates social networks, social science, complexity theory, and diffusion theory. Furthermore, it makes a unique and significant contribution by exploring and representing concrete managerial insights for ECASs and offering new optimized actions and modeling paradigms in agent-based simulation.

  10. Implementing vertex dynamics models of cell populations in biology within a consistent computational framework.

    PubMed

    Fletcher, Alexander G; Osborne, James M; Maini, Philip K; Gavaghan, David J

    2013-11-01

    The dynamic behaviour of epithelial cell sheets plays a central role during development, growth, disease and wound healing. These processes occur as a result of cell adhesion, migration, division, differentiation and death, and involve multiple processes acting at the cellular and molecular level. Computational models offer a useful means by which to investigate and test hypotheses about these processes, and have played a key role in the study of cell-cell interactions. However, the necessarily complex nature of such models means that it is difficult to make accurate comparison between different models, since it is often impossible to distinguish between differences in behaviour that are due to the underlying model assumptions, and those due to differences in the in silico implementation of the model. In this work, an approach is described for the implementation of vertex dynamics models, a discrete approach that represents each cell by a polygon (or polyhedron) whose vertices may move in response to forces. The implementation is undertaken in a consistent manner within a single open source computational framework, Chaste, which comprises fully tested, industrial-grade software that has been developed using an agile approach. This framework allows one to easily change assumptions regarding force generation and cell rearrangement processes within these models. The versatility and generality of this framework is illustrated using a number of biological examples. In each case we provide full details of all technical aspects of our model implementations, and in some cases provide extensions to make the models more generally applicable. Copyright © 2013 Elsevier Ltd. All rights reserved.
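    A minimal vertex-dynamics step, assuming a single cell and only an area-elasticity energy with a numerical gradient; Chaste's actual force laws are richer (perimeter, line-tension, and adhesion terms, plus cell rearrangement rules):

```python
import numpy as np

def polygon_area(v):
    """Shoelace formula for a polygon with vertices v (n x 2)."""
    x, y = v[:, 0], v[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def energy(v, target_area):
    """Area-elasticity energy: penalise deviation from the target area."""
    return (polygon_area(v) - target_area) ** 2

def relax(v, target_area, steps=200, dt=0.05, eps=1e-6):
    """Overdamped vertex motion down the numerical energy gradient."""
    v = v.astype(float).copy()
    for _ in range(steps):
        grad = np.zeros_like(v)
        for i in range(v.shape[0]):          # forward-difference gradient
            for j in range(2):
                vp = v.copy()
                vp[i, j] += eps
                grad[i, j] = (energy(vp, target_area)
                              - energy(v, target_area)) / eps
        v -= dt * grad
    return v

# A unit square relaxes outward until it reaches the target area.
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]])
relaxed = relax(square, target_area=2.0)
```

    Swapping in a different `energy` is exactly the kind of assumption change the framework is designed to make easy.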

  11. Developing a curriculum framework for global health in family medicine: emerging principles, competencies, and educational approaches.

    PubMed

    Redwood-Campbell, Lynda; Pakes, Barry; Rouleau, Katherine; MacDonald, Colla J; Arya, Neil; Purkey, Eva; Schultz, Karen; Dhatt, Reena; Wilson, Briana; Hadi, Abdullahel; Pottie, Kevin

    2011-07-22

    Recognizing the growing demand from medical students and residents for more comprehensive global health training, and the paucity of explicit curricula on such issues, global health and curriculum experts from the six Ontario Family Medicine Residency Programs worked together to design a framework for global health curricula in family medicine training programs. A working group comprised of global health educators from Ontario's six medical schools conducted a scoping review of global health curricula, competencies, and pedagogical approaches. The working group then hosted a full day meeting, inviting experts in education, clinical care, family medicine and public health, and developed a consensus process and draft framework to design global health curricula. Through a series of weekly teleconferences over the next six months, the framework was revised and used to guide the identification of enabling global health competencies (behaviours, skills and attitudes) for Canadian Family Medicine training. The main outcome was an evidence-informed interactive framework http://globalhealth.ennovativesolution.com/ to provide a shared foundation to guide the design, delivery and evaluation of global health education programs for Ontario's family medicine residency programs. The curriculum framework blended a definition and mission for global health training, core values and principles, global health competencies aligning with the Canadian Medical Education Directives for Specialists (CanMEDS) competencies, and key learning approaches. The framework guided the development of subsequent enabling competencies. The shared curriculum framework can support the design, delivery and evaluation of global health curriculum in Canada and around the world, lay the foundation for research and development, provide consistency across programmes, and support the creation of learning and evaluation tools to align with the framework. 
The process used to develop this framework can be applied to other aspects of residency curriculum development.

  12. Catchment Classification: Connecting Climate, Structure and Function

    NASA Astrophysics Data System (ADS)

    Sawicz, K. A.; Wagener, T.; Sivapalan, M.; Troch, P. A.; Carrillo, G. A.

    2010-12-01

Hydrology does not yet possess a generally accepted catchment classification framework. Such a classification framework needs to: [1] give names to things, i.e. the main classification step, [2] permit transfer of information, i.e. regionalization of information, [3] permit development of generalizations, i.e. to develop new theory, and [4] provide a first order environmental change impact assessment, i.e., the hydrologic implications of climate, land use and land cover change. One strategy is to create a catchment classification framework based on the notion of catchment functions (partitioning, storage, and release). Results of the empirical study presented here connect climate and structure to catchment function (in the form of select hydrologic signatures), based on an analysis of over 300 US catchments. Initial results indicate a wide assortment of signature relationships with properties of climate, geology, and vegetation. The uncertainty in the different regionalized signatures varies widely, and therefore there is variability in the robustness of classifying ungauged basins. This research provides insight into the controls on the hydrologic behavior of a catchment, and enables a classification framework applicable to gauged and ungauged catchments across the study domain. This study sheds light on what we can expect to achieve in mapping climate, structure and function in a top-down manner. Results of this study complement work done using a bottom-up, physically-based modeling framework to generalize this approach (Carrillo et al., this session).
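    The classification step itself can be sketched as clustering catchments in a space of hydrologic signatures. The signatures (runoff ratio, baseflow index), the two synthetic regimes, and the plain k-means below are illustrative stand-ins for the study's analysis of over 300 US catchments:

```python
import numpy as np

rng = np.random.default_rng(2)

def kmeans(X, init_idx, iters=50):
    """Lloyd's algorithm with deterministic initial centers."""
    centers = X[init_idx].astype(float)
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(len(centers)):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two synthetic regimes in (runoff ratio, baseflow index) space:
# flashy, runoff-dominated catchments vs. damped, baseflow-dominated ones.
flashy = rng.normal([0.7, 0.2], 0.05, (20, 2))
damped = rng.normal([0.3, 0.8], 0.05, (20, 2))
X = np.vstack([flashy, damped])
labels, centers = kmeans(X, init_idx=[0, 39])
```

    An ungauged basin could then be assigned the class of its nearest center computed from regionalized signatures, which is where the regionalization uncertainty noted in the abstract enters.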

  13. Multisensor satellite data for water quality analysis and water pollution risk assessment: decision making under deep uncertainty with fuzzy algorithm in framework of multimodel approach

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim

    2017-10-01

    A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in a multi-model approach for socio-ecological risk assessment is formally defined, and a method for using observation, measurement and modeling data within this framework is described. A methodology and models for risk assessment within a decision support approach are defined and described. A method of water quality assessment using satellite observation data, based on the analysis of the spectral reflectance of water bodies, is presented. Spectral signatures of freshwater bodies and offshore waters are analyzed, and correlations between spectral reflectance, pollution and selected water quality parameters are quantified. Data from the MODIS, MISR, AIRS and Landsat sensors received in 2002-2014 have been utilized and verified by in-field spectrometry and laboratory measurements. A fuzzy-logic-based approach for decision support concerning water quality degradation risk is discussed: the decision on the water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters, with data from satellite observations, field measurements and modeling utilized within the proposed framework. It is shown that this algorithm allows estimation of the water quality degradation rate and pollution risks. Problems of constructing spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
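
    A fuzzy decision on a water quality category from uncertain inputs can be sketched as below. The triangular membership functions, Mamdani-style min/max rules, indicator names and breakpoints are all hypothetical stand-ins, not the paper's actual rule base or parameters.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def water_quality_category(turbidity, chlorophyll):
    """Fuzzy decision on a water quality category from two uncertain,
    normalized (0..1) reflectance-derived indicators."""
    low_t, high_t = tri(turbidity, -0.5, 0.0, 0.5), tri(turbidity, 0.5, 1.0, 1.5)
    low_c, high_c = tri(chlorophyll, -0.5, 0.0, 0.5), tri(chlorophyll, 0.5, 1.0, 1.5)
    # Mamdani-style rules: min for AND, max to aggregate
    good = min(low_t, low_c)
    poor = max(high_t, high_c)
    moderate = 1.0 - max(good, poor)
    scores = {"good": good, "moderate": moderate, "poor": poor}
    return max(scores, key=scores.get)

c_clean = water_quality_category(0.1, 0.2)
c_dirty = water_quality_category(0.9, 0.8)
print(c_clean, c_dirty)   # -> good poor
```

    The pattern generalizes: each uncertain parameter gets membership degrees for its linguistic levels, rules combine them, and the highest-scoring category is reported.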

  14. Implementing accountability for reasonableness framework at district level in Tanzania: a realist evaluation.

    PubMed

    Maluka, Stephen; Kamuzora, Peter; Sansebastián, Miguel; Byskov, Jens; Ndawi, Benedict; Olsen, Øystein E; Hurtig, Anna-Karin

    2011-02-10

    Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management processes and outcomes. As a result, the ability to draw scientifically sound lessons for the application of the framework to services and interventions is limited. This paper evaluates the experiences of implementing the A4R approach in Mbarali District, Tanzania, in order to find out how the innovation was shaped, enabled, and constrained by the interaction between contexts, mechanisms and outcomes. This study draws on the principles of realist evaluation -- a largely qualitative approach, chiefly concerned with testing and refining programme theories by exploring the complex interactions of contexts, mechanisms, and outcomes. Mixed methods were used in data collection, including individual interviews, non-participant observation, and document reviews. A thematic framework approach was adopted for the data analysis. The study found that while the A4R approach to priority setting was helpful in strengthening transparency, accountability, stakeholder engagement, and fairness, the efforts at integrating it into the current district health system were challenging. Participatory structures under the decentralisation framework, central government's call for partnership in district-level planning and priority setting, perceived needs of stakeholders, as well as active engagement between researchers and decision makers all facilitated the adoption and implementation of the innovation. In contrast, however, limited local autonomy, low level of public awareness, unreliable and untimely funding, inadequate accountability mechanisms, and limited local resources were the major contextual factors that hampered the full implementation. 
This study documents an important first step in the effort to introduce the ethical framework A4R into district planning processes. This study supports the idea that a greater involvement and accountability among local actors through the A4R process may increase the legitimacy and fairness of priority-setting decisions. Support from researchers in providing a broader and more detailed analysis of health system elements, and the socio-cultural context, could lead to better prediction of the effects of the innovation and pinpoint stakeholders' concerns, thereby illuminating areas that require special attention to promote sustainability.

  15. A data fusion framework for meta-evaluation of intelligent transportation system effectiveness

    DOT National Transportation Integrated Search

    This study presents a framework for the meta-evaluation of Intelligent Transportation System effectiveness. The framework is based on data fusion approaches that adjust for data biases and violations of other standard statistical assumptions. Operati...

  16. Beyond PARR - PMEL's Integrated Data Management Strategy

    NASA Astrophysics Data System (ADS)

    Burger, E. F.; O'Brien, K.; Manke, A. B.; Schweitzer, R.; Smith, K. M.

    2016-12-01

    NOAA's Pacific Marine Environmental Laboratory (PMEL) hosts a wide range of scientific projects that span a number of scientific and environmental research disciplines. Each of these 14 research projects has its own data streams that are as diverse as the research. With its requirements for public access to federally funded research results and data, the 2013 White House Office of Science and Technology memo on Public Access to Research Results (PARR) changed the data management landscape for Federal agencies. In 2015, with support from the PMEL Director, Dr. Christopher Sabine, PMEL's Science Data Integration Group (SDIG) initiated a multi-year effort to formulate and implement an integrated data-management strategy for PMEL research efforts. Instead of using external requirements, such as PARR, to define our approach, we focused on strategies to provide PMEL science projects with a unified framework for data submission, interoperable data access, data storage, and easier data archival to National Data Centers. This improves data access for PMEL scientists, their collaborators, and the public, and also provides a unified lab framework that allows our projects to meet their own data management objectives as well as those required by PARR. We are implementing this solution in stages that allow us to test technology and architecture choices before committing to a large-scale implementation. SDIG developers have completed the first year of development, in which our approach has been to reuse and leverage existing frameworks and standards. This presentation will describe our data management strategy, explain our phased implementation approach and our software and framework choices, and show how these elements help us meet the objectives of this strategy. We will share the lessons learned in dealing with diverse and complex datasets in this first year of implementation and how these outcomes will shape our decisions for this ongoing effort. 
The data management capabilities now available to scientific projects, and other services being developed to manage and preserve PMEL's scientific data assets for our researchers, their collaborators, and future generations, will be described.

  17. Obesity Policy Action framework and analysis grids for a comprehensive policy approach to reducing obesity.

    PubMed

    Sacks, G; Swinburn, B; Lawrence, M

    2009-01-01

    A comprehensive policy approach is needed to control the growing obesity epidemic. This paper proposes the Obesity Policy Action (OPA) framework, modified from the World Health Organization framework for the implementation of the Global Strategy on Diet, Physical Activity and Health, to provide specific guidance for governments to systematically identify areas for obesity policy action. The proposed framework incorporates three different public health approaches to addressing obesity: (i) 'upstream' policies influence either the broad social and economic conditions of society (e.g. taxation, education, social security) or the food and physical activity environments to make healthy eating and physical activity choices easier; (ii) 'midstream' policies are aimed at directly influencing population behaviours; and (iii) 'downstream' policies support health services and clinical interventions. A set of grids for analysing potential policies to support obesity prevention and management is presented. The general pattern that emerges from populating the analysis grids as they relate to the Australian context is that all sectors and levels of government, non-governmental organizations and private businesses have multiple opportunities to contribute to reducing obesity. The proposed framework and analysis grids provide a comprehensive approach to mapping the policy environment related to obesity, and a tool for identifying policy gaps, barriers and opportunities.

  18. Primary Care Practice Transformation Is Hard Work

    PubMed Central

    Crabtree, Benjamin F.; Nutting, Paul A.; Miller, William L.; McDaniel, Reuben R.; Stange, Kurt C.; Jaén, Carlos Roberto; Stewart, Elizabeth

    2010-01-01

    Background Serious shortcomings remain in clinical care in the United States despite widespread use of improvement strategies for enhancing clinical performance based on knowledge transfer approaches. Recent calls to transform primary care practice to a patient-centered medical home present even greater challenges and require more effective approaches. Methods Our research team conducted a series of National Institutes of Health funded descriptive and intervention projects to understand organizational change in primary care practice settings, emphasizing a complexity science perspective. The result was a developmental research effort that enabled the identification of critical lessons relevant to enabling practice change. Results A summary of findings from a 15-year program of research highlights the limitations of viewing primary care practices in the mechanistic terms that underlie current or traditional approaches to quality improvement. A theoretical perspective that views primary care practices as dynamic complex adaptive systems, with “agents” who have the capacity to learn and the freedom to act in unpredictable ways, provides a better framework for grounding quality improvement strategies. This framework strongly emphasizes that quality improvement interventions should not only adopt a complex-systems perspective but also incorporate continual reflection, careful tailoring of interventions, and ongoing attention to the quality of interactions among agents in the practice. Conclusions It is unlikely that current strategies for quality improvement will be successful in transforming current primary care practice to a patient-centered medical home without a stronger guiding theoretical foundation. Our work suggests that a theoretical framework guided by complexity science can help in the development of quality improvement strategies that will more effectively facilitate practice change. PMID:20856145

  19. Redressing the Epidemics of Opioid Overdose and HIV among People who Inject Drugs in Central Asia: The Need for a Syndemic Approach

    PubMed Central

    Gilbert, Louisa; Primbetova, Sholpan; Nikitin, Danil; Hunt, Timothy; Terlikbayeva, Assel; Momenghalibaf, Azzi; Ruziev, Murodali; El-Bassel, Nabila

    2013-01-01

    Background Accumulating evidence suggests that opioid overdose and HIV infection are burgeoning intertwined epidemics among people who inject drugs (PWID) in Central Asia. To date, however, research on overdose and its associations with HIV risks among PWID in Central Asia remains virtually absent. This paper aims to provide a regional overview of the hidden epidemic of overdose and how it is linked to HIV among PWID in Central Asia, using a syndemic framework that is guided by risk environment research. Methods We conducted a comprehensive literature search of peer-reviewed publications and grey literature on opioid overdose and its associations with HIV in five countries of Central Asia (Kazakhstan, Kyrgyzstan, Tajikistan, Turkmenistan and Uzbekistan) as well as on policies and programs that address these co-occurring epidemics. Results Regional data indicate high rates of fatal and non-fatal overdose among PWID. Evidence suggests mortality rates from overdose exceed HIV/AIDS as the leading cause of death among PWID. The syndemic framework suggests multiple macro-level and micro-level environmental risk factors that drive the co-occurring epidemics of HIV and overdose. This framework identifies several interacting biological and behavioral risks that result in additive effects for HIV and overdose. Conclusion The high rates of overdose and its associations with HIV underscore the need for a syndemic approach that considers overdose on a par with HIV. Such an approach should focus on the biological, behavioral and structural interactions between these epidemics to reduce social suffering, morbidity and mortality among PWID in Central Asia. PMID:23954070

  20. A framework for biodynamic feedthrough analysis--part I: theoretical foundations.

    PubMed

    Venrooij, Joost; van Paassen, Marinus M; Mulder, Mark; Abbink, David A; Mulder, Max; van der Helm, Frans C T; Bulthoff, Heinrich H

    2014-09-01

    Biodynamic feedthrough (BDFT) is a complex phenomenon, which has been studied for several decades. However, there is little consensus on how to approach the BDFT problem in terms of definitions, nomenclature, and mathematical descriptions. In this paper, a framework for biodynamic feedthrough analysis is presented. The goal of this framework is two-fold. First, it provides some common ground between the seemingly large range of different approaches existing in the BDFT literature. Second, the framework itself allows for gaining new insights into BDFT phenomena. It will be shown how relevant signals can be obtained from measurement, how different BDFT dynamics can be derived from them, and how these different dynamics are related. Using the framework, BDFT can be dissected into several dynamical relationships, each relevant in understanding BDFT phenomena in more detail. The presentation of the BDFT framework is divided into two parts. This paper, Part I, addresses the theoretical foundations of the framework. Part II, which is also published in this issue, addresses the validation of the framework. The work is presented in two separate papers to allow for a detailed discussion of both the framework's theoretical background and its validation.
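
    One kind of relationship such a framework derives, a frequency-domain dynamic between a measured disturbance input and a measured output, can be sketched with an FFT-based estimate at the excitation frequency. The signals, the 2 Hz single-frequency excitation, and the 0.5-gain, 0.3-rad-lag response below are invented for illustration and are not the paper's data or method.

```python
import numpy as np

# Illustrative signals: a single-frequency platform disturbance u and a
# hypothetical involuntary control-device response y (attenuated, phase-lagged).
fs, n = 100, 1000                               # sample rate (Hz), sample count
t = np.arange(n) / fs
u = np.sin(2 * np.pi * 2.0 * t)                 # measured disturbance input
y = 0.5 * np.sin(2 * np.pi * 2.0 * t - 0.3)     # measured output

U, Y = np.fft.rfft(u), np.fft.rfft(y)
f = np.fft.rfftfreq(n, 1 / fs)
k = np.argmin(np.abs(f - 2.0))                  # FFT bin at the excitation frequency
H = Y[k] / U[k]                                 # feedthrough dynamics at 2 Hz
print(round(abs(H), 3), round(float(np.angle(H)), 3))   # -> 0.5 -0.3
```

    Repeating the estimate over many excitation frequencies yields the kind of dynamical relationship the framework dissects BDFT into.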

  1. Setting conservation priorities.

    PubMed

    Wilson, Kerrie A; Carwardine, Josie; Possingham, Hugh P

    2009-04-01

    A generic framework for setting conservation priorities based on the principles of classic decision theory is provided. This framework encapsulates the key elements of any problem, including the objective, the constraints, and knowledge of the system. Within the context of this framework the broad array of approaches for setting conservation priorities are reviewed. While some approaches prioritize assets or locations for conservation investment, it is concluded here that prioritization is incomplete without consideration of the conservation actions required to conserve the assets at particular locations. The challenges associated with prioritizing investments through time in the face of threats (and also spatially and temporally heterogeneous costs) can be aided by proper problem definition. Using the authors' general framework for setting conservation priorities, multiple criteria can be rationally integrated and where, how, and when to invest conservation resources can be scheduled. Trade-offs are unavoidable in priority setting when there are multiple considerations, and budgets are almost always finite. The authors discuss how trade-offs, risks, uncertainty, feedbacks, and learning can be explicitly evaluated within their generic framework for setting conservation priorities. Finally, they suggest ways that current priority-setting approaches may be improved.
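
    The core of a decision-theoretic prioritization, an objective maximized over actions subject to a budget constraint, can be sketched as a greedy benefit-per-cost allocation. The action names, costs and benefit values are invented for illustration; the paper's framework is more general (it also covers scheduling, uncertainty and feedbacks).

```python
def prioritize(actions, budget):
    """Greedy allocation for a budget-constrained objective: rank candidate
    conservation actions by expected benefit per unit cost, then fund them
    in order until the budget is exhausted."""
    ranked = sorted(actions, key=lambda a: a["benefit"] / a["cost"], reverse=True)
    funded, remaining = [], budget
    for a in ranked:
        if a["cost"] <= remaining:
            funded.append(a["name"])
            remaining -= a["cost"]
    return funded

# Hypothetical actions: cost in $k, benefit in expected persistence units
actions = [
    {"name": "fence riparian zone",   "cost": 30, "benefit": 12},
    {"name": "invasive weed control", "cost": 10, "benefit": 8},
    {"name": "land purchase",         "cost": 80, "benefit": 20},
    {"name": "predator control",      "cost": 25, "benefit": 15},
]
funded = prioritize(actions, budget=60)
print(funded)   # -> ['invasive weed control', 'predator control']
```

    Note how the cheapest high-ratio actions win under a finite budget even though "land purchase" has the largest absolute benefit, which is exactly the kind of trade-off the authors argue must be made explicit.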

  2. Thermostating extended Lagrangian Born-Oppenheimer molecular dynamics.

    PubMed

    Martínez, Enrique; Cawkwell, Marc J; Voter, Arthur F; Niklasson, Anders M N

    2015-04-21

    Extended Lagrangian Born-Oppenheimer molecular dynamics is developed and analyzed for applications in canonical (NVT) simulations. Three different approaches are considered: the Nosé and Andersen thermostats and Langevin dynamics. We have tested the temperature distribution under different conditions of self-consistent field (SCF) convergence and time step and compared the results to analytical predictions. We find that the simulations based on the extended Lagrangian Born-Oppenheimer framework provide accurate canonical distributions even under approximate SCF convergence, often requiring only a single diagonalization per time step, whereas regular Born-Oppenheimer formulations exhibit unphysical fluctuations unless a sufficiently high degree of convergence is reached at each time step. The thermostated extended Lagrangian framework thus offers an accurate approach to sample processes in the canonical ensemble at a fraction of the computational cost of regular Born-Oppenheimer molecular dynamics simulations.
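
    The canonical-sampling target the paper's thermostats aim for can be illustrated with a conventional Langevin integrator on a toy system: the time-averaged kinetic temperature should match kT. This BAOAB-splitting sketch uses a 1-D harmonic oscillator rather than a self-consistent-field model, so it shows the sampling criterion, not the extended-Lagrangian machinery itself.

```python
import numpy as np

def langevin_baoab(steps=200_000, dt=0.05, gamma=1.0, kT=1.0, seed=0):
    """BAOAB-splitting Langevin dynamics for a 1-D harmonic oscillator
    (m = k = 1). Returns the time-averaged kinetic temperature <v^2>,
    which should approach kT if the canonical ensemble is sampled."""
    rng = np.random.default_rng(seed)
    x, v = 1.0, 0.0
    c1 = np.exp(-gamma * dt)
    c2 = np.sqrt(kT * (1.0 - c1 ** 2))
    v2_sum = 0.0
    for _ in range(steps):
        v -= 0.5 * dt * x                         # B: half kick (force = -x)
        x += 0.5 * dt * v                         # A: half drift
        v = c1 * v + c2 * rng.standard_normal()   # O: friction + noise
        x += 0.5 * dt * v                         # A: half drift
        v -= 0.5 * dt * x                         # B: half kick
        v2_sum += v * v
    return v2_sum / steps

temp = langevin_baoab()
print(temp)   # close to kT = 1.0
```

    The paper's point is that the extended-Lagrangian electronic degrees of freedom preserve this canonical distribution even under approximate SCF convergence, where standard Born-Oppenheimer dynamics would drift.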

  3. Analysing task design and students' responses to context-based problems through different analytical frameworks

    NASA Astrophysics Data System (ADS)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks as well as students' responses to exemplary tasks to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) at the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both conventional and context-based problems. Design and methods: A typical chemistry examination test has been analysed, first the test items themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using the HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable for identifying students' strategies, which mainly focus on recall of memorized facts when solving chemistry test items. Almost all test items also assessed lower-order thinking. 
    The combination of the frameworks with the chemistry syllabus has proved successful for analysing both the test items and students' responses in a systematic way. The frameworks can therefore be applied in the design of new tasks, the analysis and assessment of students' responses, and as a tool for teachers to scaffold students in their problem-solving process. Conclusions: This paper gives implications for practice and for future research, both to develop new context-based problems in a structured way and to provide analytical tools for investigating students' higher-order thinking in their responses to these tasks.

  4. Trajectory-Oriented Approach to Managing Traffic Complexity: Operational Concept and Preliminary Metrics Definition

    NASA Technical Reports Server (NTRS)

    Idris, Husni; Vivona, Robert; Garcia-Chico, Jose L.

    2008-01-01

    This document describes preliminary research on a distributed, trajectory-oriented approach for traffic complexity management. The approach is to manage traffic complexity in a distributed control environment, based on preserving trajectory flexibility and minimizing constraints. In particular, the document presents an analytical framework to study trajectory flexibility and the impact of trajectory constraints on it. The document proposes preliminary flexibility metrics that can be interpreted and measured within the framework.

  5. A clinically driven variant prioritization framework outperforms purely computational approaches for the diagnostic analysis of singleton WES data.

    PubMed

    Stark, Zornitza; Dashnow, Harriet; Lunke, Sebastian; Tan, Tiong Y; Yeung, Alison; Sadedin, Simon; Thorne, Natalie; Macciocca, Ivan; Gaff, Clara; Oshlack, Alicia; White, Susan M; James, Paul A

    2017-11-01

    Rapid identification of clinically significant variants is key to the successful application of next-generation sequencing technologies in clinical practice. The Melbourne Genomics Health Alliance (MGHA) variant prioritization framework employs a gene prioritization index based on clinician-generated a priori gene lists, and a variant prioritization index (VPI) based on rarity, conservation and protein effect. We used data from 80 patients who underwent singleton whole exome sequencing (WES) to test the ability of the framework to rank causative variants highly, and compared it against the performance of other gene and variant prioritization tools. Causative variants were identified in 59 of the patients. Using the MGHA prioritization framework, the average rank of the causative variant was 2.24, with 76% ranked as the top priority variant and 90% ranked within the top five. Using clinician-generated gene lists resulted in ranking causative variants an average of 8.2 positions higher than prioritization based on variant properties alone. This clinically driven prioritization approach significantly outperformed purely computational tools, placing a greater proportion of causative variants first or within the top five (permutation P-value=0.001). Clinicians included 40 of the 49 WES diagnoses in their a priori list of differential diagnoses (81%). The lists generated by PhenoTips and Phenomizer contained 14 (29%) and 18 (37%) of these diagnoses respectively. These results highlight the benefits of clinically led variant prioritization in increasing the efficiency of singleton WES data analysis and have important implications for developing models for the funding and delivery of genomic services.
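
    The two-index idea, gene-list membership first and then a score built from variant properties, can be sketched as a tuple-keyed sort. The gene names, the 0..1 annotation values and the simple additive score are hypothetical; this is not the MGHA VPI formula.

```python
def rank_variants(variants, priority_genes):
    """Rank variants by (1) membership of the gene in a clinician-generated
    a priori gene list, then (2) a score built from rarity, conservation
    and predicted protein effect."""
    def score(v):
        return v["rarity"] + v["conservation"] + v["protein_effect"]
    return sorted(
        variants,
        key=lambda v: (v["gene"] in priority_genes, score(v)),
        reverse=True,
    )

# Invented variants with normalized 0..1 annotation scores
variants = [
    {"gene": "TTN",   "rarity": 0.9, "conservation": 0.4, "protein_effect": 0.3},
    {"gene": "MYH7",  "rarity": 0.8, "conservation": 0.9, "protein_effect": 0.9},
    {"gene": "BRCA2", "rarity": 0.5, "conservation": 0.6, "protein_effect": 0.2},
]
ranked = rank_variants(variants, priority_genes={"MYH7", "BRCA2"})
print([v["gene"] for v in ranked])   # -> ['MYH7', 'BRCA2', 'TTN']
```

    TTN scores higher than BRCA2 on variant properties alone but ranks below it, illustrating how the clinician-generated list dominates the ordering.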

  6. Simultaneous learning of instantaneous and time-delayed genetic interactions using novel information theoretic scoring technique

    PubMed Central

    2012-01-01

    Background Understanding gene interactions is a fundamental question in systems biology. Currently, modeling of gene regulation using the Bayesian Network (BN) formalism assumes that genes interact either instantaneously or with a certain amount of time delay. In reality, however, biological regulations, both instantaneous and time-delayed, occur simultaneously; a framework that can detect and model both types of interaction at once would represent gene regulatory networks more accurately. Results In this paper, we introduce a framework based on the Bayesian Network (BN) formalism that can represent both instantaneous and time-delayed interactions between genes simultaneously. A novel scoring metric with firm mathematical underpinnings is also proposed that, unlike other recent methods, can score both kinds of interaction concurrently and takes into account the reality that multiple regulators can regulate a gene jointly, rather than in an isolated pair-wise manner. Further, a gene regulatory network (GRN) inference method employing an evolutionary search that makes use of the framework and the scoring metric is also presented. Conclusion By taking into consideration the biological fact that both instantaneous and time-delayed regulations can occur among genes, our approach models gene interactions with greater accuracy. The proposed framework is efficient and can be used to infer gene networks having multiple orders of instantaneous and time-delayed regulations simultaneously. Experiments are carried out using three different synthetic networks (with three different mechanisms for generating synthetic data) as well as real-life networks of Saccharomyces cerevisiae, E. coli and cyanobacteria gene expression data. The results show the effectiveness of our approach. PMID:22691450
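
    The distinction between instantaneous (lag 0) and time-delayed regulation can be illustrated outside the BN formalism with a simple lagged-correlation scan over candidate delays. The synthetic regulator/target series and the absolute-Pearson score below are illustrative stand-ins for the paper's information-theoretic scoring metric.

```python
import numpy as np

def best_lag(regulator, target, max_lag=5):
    """Return the time delay (0 = instantaneous) at which the regulator's
    expression series best aligns with the target's, by absolute Pearson
    correlation over candidate lags."""
    scores = {}
    for lag in range(max_lag + 1):
        r = regulator[: len(regulator) - lag]
        scores[lag] = abs(np.corrcoef(r, target[lag:])[0, 1])
    return max(scores, key=scores.get)

# Synthetic expression series: the target follows the regulator with lag 3
rng = np.random.default_rng(0)
reg = rng.random(200)
tgt = np.empty(200)
tgt[:3] = rng.random(3)
tgt[3:] = 0.9 * reg[:-3] + 0.1 * rng.random(197)

lag = best_lag(reg, tgt)
print(lag)   # -> 3
```

    A framework of the kind the paper proposes must, in effect, make this choice jointly over all regulators of a gene rather than one pair at a time.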

  7. Integrated Communication in Multinational Coalition Operations Within a Comprehensive Approach. Framework Concept

    DTIC Science & Technology

    2010-10-22

    A conceptual framework for integrating communication in international crisis management operations. Based on the acknowledgment that coalitions are challenged with achieving both cohesive and coherent...

  8. Food Practices and School Connectedness: A Whole-School Approach

    ERIC Educational Resources Information Center

    Neely, Eva; Walton, Mat; Stephens, Christine

    2016-01-01

    Purpose: The health-promoting schools (HPSs) framework has emerged as a promising model for promoting school connectedness in the school setting. The purpose of this paper is to explore the potential for food practices to promote school connectedness within a HPSs framework. Design/methodology/approach: This study explores food practices within a…

  9. The Strategic Evaluation of Regional Development in Higher Education

    ERIC Educational Resources Information Center

    Kettunen, Juha

    2004-01-01

    The study analyses the role of regional development in higher education using the approach of the balanced scorecard, which provides a framework for organizations to describe and communicate their strategy. It turns out that the balanced scorecard is not only an approach for implementing the strategy, but it also provides a general framework for…

  10. Making a Good Group Decision (Low Risk) in Singapore Under an Environment That Has Time and Cost Constraints

    DTIC Science & Technology

    2014-09-01

    A decision-making framework to eliminate bias and promote effective communication, using a collaborative approach built on systems engineering and decision-making...

  11. Developing and Managing University-Industry Research Collaborations through a Process Methodology/Industrial Sector Approach

    ERIC Educational Resources Information Center

    Philbin, Simon P.

    2010-01-01

    A management framework has been successfully utilized at Imperial College London in the United Kingdom to improve the process for developing and managing university-industry research collaborations. The framework has been part of a systematic approach to increase the level of research contracts from industrial sources, to strengthen the…

  12. A Framework for Understanding Young Children with Severe Multiple Disabilities: The van Dijk Approach to Assessment.

    ERIC Educational Resources Information Center

    Nelson, Catherine; van Dijk, Jan; McDonnell, Andrea P.; Thompson, Kristina

    2002-01-01

    This article describes a framework for assessing young children with severe multiple disabilities. The assessment is child-led and examines underlying processes of learning, including biobehavioral state, orienting response, learning channels, approach-withdrawal, memory, interactions, communication, and problem solving. Case studies and a sample…

  13. The Appeal of Soap Opera.

    ERIC Educational Resources Information Center

    Kielwasser, Alfred P.; Wolf, Michelle A.

    This paper provides a framework for developing an approach to understanding soap opera's appeal as a direct function of both the genre's form and of its fans' viewing behavior. The paper suggests that while this analysis is largely critical, other studies from both critical and social scientific approaches can be based upon the framework and…

  14. Action Research Study. A Framework To Help Move Teachers toward an Inquiry-Based Science Teaching Approach.

    ERIC Educational Resources Information Center

    Staten, Mary E.

    This action research study developed a framework for moving teachers toward an inquiry-based approach to teaching science, emphasizing elements, strategies, and supports necessary to encourage and sustain teachers' use of inquiry-based science instruction. The study involved a literature review, participant observation, focus group discussions,…

  15. A Manual to Identify Sources of Fluvial Sediment | Science ...

    EPA Pesticide Factsheets

    Sedimentation is one of the main causes of stream/river aquatic life use impairments in R3. Currently states lack standard guidance on appropriate tools available to quantify sediment sources and develop sediment budgets in TMDL Development. Methods for distinguishing sediment types for TMDL development will focus stream restoration and soil conservation efforts in strategic locations in a watershed and may better target appropriate BMPs to achieve sediment load reductions. Properly identifying sediment sources in a TMDL will also help focus NPDES permitting, stream restoration activities and other TMDL implementation efforts. This project will focus on developing a framework that will be published as a guidance document that outlines steps and approaches to identify the significant sources of fine-grained sediment in 303D listed watersheds. In this framework, the sediment-fingerprinting and sediment budget approaches will be emphasized.
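
    At its core, sediment fingerprinting is an un-mixing problem: solve for the source proportions that reproduce the tracer concentrations measured in stream sediment. A minimal least-squares sketch follows; the two sources, two tracers and their concentration values are invented, not taken from the manual.

```python
import numpy as np

# Illustrative mean tracer concentrations for two candidate sources
# (columns: cropland topsoil, channel bank) measured on two tracers (rows).
sources = np.array([
    [12.0, 3.0],
    [ 1.5, 6.0],
])
mixture = np.array([8.4, 3.3])   # tracer concentrations in the stream sediment

# Least-squares un-mixing with a sum-to-one constraint appended as an extra row
A = np.vstack([sources, np.ones(2)])
b = np.append(mixture, 1.0)
proportions, *_ = np.linalg.lstsq(A, b, rcond=None)
print(proportions.round(2))   # -> [0.6 0.4]
```

    Practical applications add more tracers than sources, non-negativity constraints, and uncertainty analysis, but the mixing-model structure is the same.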

  16. Evidence-based decision making : developing a knowledge base for successful program outcomes in transportation asset management.

    DOT National Transportation Integrated Search

    2015-12-01

    MAP-21 and AASHTO's framework for transportation asset management (TAM) offer opportunities to use more rigorous approaches to collect and apply evidence within a TAM context. This report documents the results of a study funded by the Georgia D...

  17. Modeling Learning Processes in Lexical CALL.

    ERIC Educational Resources Information Center

    Goodfellow, Robin; Laurillard, Diana

    1994-01-01

    Studies the performance of a novice Spanish student using a Computer-assisted language learning (CALL) system designed for vocabulary enlargement. Results indicate that introspective evidence may be used to validate performance data within a theoretical framework that characterizes the learning approach as "surface" or "deep." (25 references)…

  18. Food parenting and children's dietary behaviours: Approaching an integrated theoretical framework

    USDA-ARS?s Scientific Manuscript database

    We explored the differential influences of parental feeding styles and food parenting practices on children's dietary intake. Simple knowledge-based parent change interventions have generally not been shown to influence children's dietary intake. As a result, increasing attention has been given to t...

  19. Flow Restoration in the Columbia River Basin: An Evaluation of a Flow Restoration Accounting Framework

    NASA Astrophysics Data System (ADS)

    McCoy, Amy L.; Holmes, S. Rankin; Boisjolie, Brett A.

    2018-03-01

    Securing environmental flows in support of freshwater biodiversity is an evolving field of practice. An example of a large-scale program dedicated to restoring environmental flows is the Columbia Basin Water Transactions Program in the Pacific Northwest region of North America, which has been restoring flows in dewatered tributary habitats for imperiled salmon species over the past decade. This paper discusses a four-tiered flow restoration accounting framework for tracking the implementation and impacts of water transactions as an effective tool for adaptive management. The flow restoration accounting framework provides compliance and flow accounting information to monitor transaction efficacy. We review the implementation of the flow restoration accounting framework to demonstrate (a) the extent of water transactions that have been implemented over the past decade, (b) the volumes of restored flow in meeting flow targets for restoring habitat for anadromous fish species, and (c) an example of aquatic habitat enhancement that resulted from Columbia Basin Water Transactions Program investments. Project results show that from 2002 to 2015, the Columbia Basin Water Transactions Program has completed more than 450 water rights transactions, restoring approximately 1.59 million megaliters to date, with an additional 10.98 million megaliters of flow protected for use over the next 100 years. This has resulted in the watering of over 2414 stream kilometers within the Columbia Basin. We conclude with a discussion of the insights gained through the implementation of the flow restoration accounting framework. Understanding the approach and efficacy of a monitoring framework applied across a large river basin can be informative to emerging flow-restoration and adaptive management efforts in areas of conservation concern.

  20. Flow Restoration in the Columbia River Basin: An Evaluation of a Flow Restoration Accounting Framework.

    PubMed

    McCoy, Amy L; Holmes, S Rankin; Boisjolie, Brett A

    2018-03-01

    Securing environmental flows in support of freshwater biodiversity is an evolving field of practice. An example of a large-scale program dedicated to restoring environmental flows is the Columbia Basin Water Transactions Program in the Pacific Northwest region of North America, which has been restoring flows in dewatered tributary habitats for imperiled salmon species over the past decade. This paper discusses a four-tiered flow restoration accounting framework for tracking the implementation and impacts of water transactions as an effective tool for adaptive management. The flow restoration accounting framework provides compliance and flow accounting information to monitor transaction efficacy. We review the implementation of the flow restoration accounting framework to demonstrate (a) the extent of water transactions that have been implemented over the past decade, (b) the volumes of restored flow in meeting flow targets for restoring habitat for anadromous fish species, and (c) an example of aquatic habitat enhancement that resulted from Columbia Basin Water Transactions Program investments. Project results show that from 2002 to 2015, the Columbia Basin Water Transactions Program has completed more than 450 water rights transactions, restoring approximately 1.59 million megaliters to date, with an additional 10.98 million megaliters of flow protected for use over the next 100 years. This has resulted in the watering of over 2414 stream kilometers within the Columbia Basin. We conclude with a discussion of the insights gained through the implementation of the flow restoration accounting framework. Understanding the approach and efficacy of a monitoring framework applied across a large river basin can be informative to emerging flow-restoration and adaptive management efforts in areas of conservation concern.

  1. Clinical data integration model. Core interoperability ontology for research using primary care data.

    PubMed

    Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A

    2015-01-01

    This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However, its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural/terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified, and thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM.
A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.

  2. Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia B.; Adler, Robert; Hong, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur

    2010-01-01

    A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that the landslide forecasting may be more feasible at a regional scale. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship and results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements of the algorithm framework, but also highlights several remaining challenges for the algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited nonlandslide event data for more comprehensive evaluation. 
Additional factors that may improve algorithm performance accuracy include incorporating additional triggering factors such as tectonic activity, anthropogenic impacts and soil moisture into the algorithm calculation. Despite these limitations, the methodology presented in this regional evaluation is both straightforward to calculate and easy to interpret, making results transferable between regions and allowing findings to be placed within an inter-comparison framework. The regional algorithm scenario represents an important step in advancing regional and global-scale landslide hazard assessment and forecasting.
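    A rainfall intensity-duration triggering relationship of the kind described above is typically a power law, I = c · D^(−β). The sketch below uses Caine's (1980) global coefficients purely as placeholder values (the study fits its own regional curve) and assumes the susceptibility input is a simple boolean flag per grid cell:

    ```python
    def threshold_intensity(duration_h, c=14.82, beta=0.39):
        """Rainfall intensity (mm/h) above which landsliding becomes likely
        for a storm of the given duration (h). c and beta are Caine's (1980)
        global values, used here only for illustration."""
        return c * duration_h ** (-beta)

    def landslide_nowcast(intensity_mmh, duration_h, susceptible):
        """Flag a cell only when it is both mapped as susceptible AND the
        satellite-estimated rainfall exceeds the intensity-duration curve."""
        return susceptible and intensity_mmh >= threshold_intensity(duration_h)
    ```

    Combining the two conditions is what keeps short, intense storms over flat or stable terrain from producing spurious alerts.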

  3. EDGAR: A software framework for the comparative analysis of prokaryotic genomes

    PubMed Central

    Blom, Jochen; Albaum, Stefan P; Doppmeier, Daniel; Pühler, Alfred; Vorhölter, Frank-Jörg; Zakrzewski, Martha; Goesmann, Alexander

    2009-01-01

    Background The introduction of next generation sequencing approaches has caused a rapid increase in the number of completely sequenced genomes. As one result of this development, it is now feasible to analyze large groups of related genomes in a comparative approach. A main task in comparative genomics is the identification of orthologous genes in different genomes and the classification of genes as core genes or singletons. Results To support these studies EDGAR – "Efficient Database framework for comparative Genome Analyses using BLAST score Ratios" – was developed. EDGAR is designed to automatically perform genome comparisons in a high throughput approach. Comparative analyses for 582 genomes across 75 genus groups taken from the NCBI genomes database were conducted with the software and the results were integrated into an underlying database. To demonstrate a specific application case, we analyzed ten genomes of the bacterial genus Xanthomonas, for which phylogenetic studies were awkward due to divergent taxonomic systems. The resultant phylogeny EDGAR provided was consistent with outcomes from traditional approaches performed recently and moreover, it was possible to root each strain with unprecedented accuracy. Conclusion EDGAR provides novel analysis features and significantly simplifies the comparative analysis of related genomes. The software supports a quick survey of evolutionary relationships and simplifies the process of obtaining new biological insights into the differential gene content of kindred genomes. Visualization features, like synteny plots or Venn diagrams, are offered to the scientific community through a web-based and therefore platform independent user interface, where the precomputed data sets can be browsed. PMID:19457249
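    EDGAR's central idea, normalizing each BLAST hit by the query's self-hit and thresholding the resulting score ratio, can be sketched as follows. The 0.3 cutoff and the three class labels are illustrative: EDGAR itself estimates its cutoff from the score-ratio distribution of the dataset being analyzed.

    ```python
    def score_ratio(hit_bitscore, self_bitscore):
        """BLAST Score Ratio: a hit's bit score normalized by the bit score
        of the query aligned against itself, giving a value in roughly [0, 1]."""
        return hit_bitscore / self_bitscore

    def classify_gene(srv_by_genome, cutoff=0.3):
        """Classify one gene given its best score ratio in every other genome.

        srv_by_genome: {genome_name: best score ratio for this gene}
        """
        n_hits = sum(1 for srv in srv_by_genome.values() if srv >= cutoff)
        if n_hits == len(srv_by_genome):
            return "core"        # ortholog present in every compared genome
        if n_hits == 0:
            return "singleton"   # no ortholog found anywhere else
        return "dispensable"     # present in some genomes, absent in others
    ```

    Normalizing by the self-hit makes the cutoff comparable across genes of very different lengths, which a raw bit-score threshold would not be.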

  4. A similarity learning approach to content-based image retrieval: application to digital mammography.

    PubMed

    El-Naqa, Issam; Yang, Yongyi; Galatsanos, Nikolas P; Nishikawa, Robert M; Wernick, Miles N

    2004-10-01

    In this paper, we describe an approach to content-based retrieval of medical images from a database, and provide a preliminary demonstration of our approach as applied to retrieval of digital mammograms. Content-based image retrieval (CBIR) refers to the retrieval of images from a database using information derived from the images themselves, rather than solely from accompanying text indices. In the medical-imaging context, the ultimate aim of CBIR is to provide radiologists with a diagnostic aid in the form of a display of relevant past cases, along with proven pathology and other suitable information. CBIR may also be useful as a training tool for medical students and residents. The goal of information retrieval is to recall from a database information that is relevant to the user's query. The most challenging aspect of CBIR is the definition of relevance (similarity), which is used to guide the retrieval machine. In this paper, we pursue a new approach, in which similarity is learned from training examples provided by human observers. Specifically, we explore the use of neural networks and support vector machines to predict the user's notion of similarity. Within this framework we propose using a hierarchical learning approach, which consists of a cascade of a binary classifier and a regression module to optimize retrieval effectiveness and efficiency. We also explore how to incorporate online human interaction to achieve relevance feedback in this learning framework. Our experiments are based on a database consisting of 76 mammograms, all of which contain clustered microcalcifications (MCs). Our goal is to retrieve mammogram images containing similar MC clusters to that in a query. The performance of the retrieval system is evaluated using precision-recall curves computed using a cross-validation procedure. 
Our experimental results demonstrate that: 1) the learning framework can accurately predict the perceptual similarity reported by human observers, thereby serving as a basis for CBIR; 2) the learning-based framework can significantly outperform a simple distance-based similarity metric; 3) the use of the hierarchical two-stage network can improve retrieval performance; 4) relevance feedback can be effectively incorporated into this learning framework to improve retrieval precision through online interaction with users; and 5) the images retrieved by the network can have predictive value for the disease condition of the query.
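The hierarchical two-stage design described above, a cheap binary stage that rejects clearly dissimilar pairs followed by a regression stage that scores only the survivors, can be sketched generically. The stage models are plain callables in this sketch; in the paper they are learned predictors (support vector machines or neural networks) trained on observer-reported similarity:

```python
class SimilarityCascade:
    def __init__(self, is_candidate, similarity, threshold=0.5):
        self.is_candidate = is_candidate  # stage 1: features -> [0, 1] score
        self.similarity = similarity      # stage 2: features -> similarity
        self.threshold = threshold

    def score(self, features):
        # Stage 1 acts as a gate: most database images are rejected here,
        # so the costlier regression stage runs on only a few candidates.
        if self.is_candidate(features) < self.threshold:
            return 0.0
        return self.similarity(features)

    def retrieve(self, cases, top_k=5):
        """cases: iterable of (case_id, features); returns top-ranked ids."""
        scored = sorted(((self.score(f), cid) for cid, f in cases),
                        reverse=True)
        return [cid for s, cid in scored[:top_k] if s > 0.0]
```

With toy callables such as `is_candidate=lambda f: f[0]` and `similarity=lambda f: f[1]`, a case with a low stage-1 score is dropped even when its stage-2 score would have been high, which is exactly the efficiency/effectiveness trade the cascade makes.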

  5. Bioethics for clinicians: 15. Quality end-of-life care

    PubMed Central

    Singer, P A; MacDonald, N

    1998-01-01

    A physician who receives a call from the emergency department to see a patient with heart failure will have a clear framework within which to approach this problem. The thesis of this article is that physicians do not have an analogous conceptual framework for approaching end-of-life care. The authors present and describe a framework for end-of-life care with 3 main elements: control of pain and other symptoms, the use of life-sustaining treatments and support of those who are dying and their families. This 3-part framework can be used by clinicians at the bedside to focus their effort in improving the quality of end-of-life care. PMID:9700330

  6. Tailored and Integrated Web-Based Tools for Improving Psychosocial Outcomes of Cancer Patients: The DoTTI Development Framework

    PubMed Central

    Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William

    2014-01-01

    Background Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. Objective The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. Methods The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. Results The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. 
Conclusions This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases. PMID:24641991

  7. Volumetric image classification using homogeneous decomposition and dictionary learning: A study using retinal optical coherence tomography for detecting age-related macular degeneration.

    PubMed

    Albarrak, Abdulrahman; Coenen, Frans; Zheng, Yalin

    2017-01-01

    Three-dimensional (3D) (volumetric) diagnostic imaging techniques are indispensable with respect to the diagnosis and management of many medical conditions. However there is a lack of automated diagnosis techniques to facilitate such 3D image analysis (although some support tools do exist). This paper proposes a novel framework for volumetric medical image classification founded on homogeneous decomposition and dictionary learning. In the proposed framework each image (volume) is recursively decomposed until homogeneous regions are arrived at. Each region is represented using a Histogram of Oriented Gradients (HOG) which is transformed into a set of feature vectors. The Gaussian Mixture Model (GMM) is then used to generate a "dictionary" and the Improved Fisher Kernel (IFK) approach is used to encode feature vectors so as to generate a single feature vector for each volume, which can then be fed into a classifier generator. The principal advantage offered by the framework is that it does not require the detection (segmentation) of specific objects within the input data. The nature of the framework is fully described. A wide range of experiments was conducted with which to analyse the operation of the proposed framework and these are also reported fully in the paper. Although the proposed approach is generally applicable to 3D volumetric images, the focus for the work is 3D retinal Optical Coherence Tomography (OCT) images in the context of the diagnosis of Age-related Macular Degeneration (AMD). The results indicate that excellent diagnostic predictions can be produced using the proposed framework. Copyright © 2016 Elsevier Ltd. All rights reserved.
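    The recursive homogeneous decomposition that starts the pipeline can be sketched in 2-D (the paper operates on 3-D OCT volumes, and its homogeneity test, thresholds, and splitting scheme may differ from this simplified version):

    ```python
    def variance(block):
        vals = [v for row in block for v in row]
        mean = sum(vals) / len(vals)
        return sum((v - mean) ** 2 for v in vals) / len(vals)

    def decompose(block, max_var=1.0):
        """Recursively quarter a 2-D intensity block until every leaf is
        homogeneous (variance <= max_var) or too small to split further.
        Returns the leaf blocks, each of which would then be described by a
        Histogram of Oriented Gradients feature vector."""
        h, w = len(block), len(block[0])
        if variance(block) <= max_var or h < 2 or w < 2:
            return [block]
        mh, mw = h // 2, w // 2
        leaves = []
        for rows in (block[:mh], block[mh:]):
            for cols in (slice(0, mw), slice(mw, None)):
                leaves.extend(decompose([row[cols] for row in rows], max_var))
        return leaves
    ```

    Because splitting stops wherever the data is already uniform, the decomposition adapts its granularity to the image content, which is what lets the framework avoid segmenting specific anatomical objects.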

  8. Closed-Loop Lifecycle Management of Service and Product in the Internet of Things: Semantic Framework for Knowledge Integration.

    PubMed

    Yoo, Min-Jung; Grozel, Clément; Kiritsis, Dimitris

    2016-07-08

    This paper describes our conceptual framework of closed-loop lifecycle information sharing for product-service in the Internet of Things (IoT). The framework is based on the ontology model of product-service and a type of IoT message standard, Open Messaging Interface (O-MI) and Open Data Format (O-DF), which ensures data communication. (1) BACKGROUND: Based on an existing product lifecycle management (PLM) methodology, we enhanced the ontology model for the purpose of integrating efficiently the product-service ontology model that was newly developed; (2) METHODS: The IoT message transfer layer is vertically integrated into a semantic knowledge framework inside which a Semantic Info-Node Agent (SINA) uses the message format as a common protocol of product-service lifecycle data transfer; (3) RESULTS: The product-service ontology model facilitates information retrieval and knowledge extraction during the product lifecycle, while making more information available for the sake of service business creation. The vertical integration of IoT message transfer, encompassing all semantic layers, helps achieve a more flexible and modular approach to knowledge sharing in an IoT environment; (4) CONTRIBUTION: A semantic data annotation applied to IoT can contribute to enhancing collected data types, which entails a richer knowledge extraction. The ontology-based PLM model enables as well the horizontal integration of heterogeneous PLM data while breaking traditional vertical information silos; (5) CONCLUSION: The framework was applied to a fictive case study with an electric car service for the purpose of demonstration. For the purpose of demonstrating the feasibility of the approach, the semantic model is implemented in Sesame APIs, which play the role of an Internet-connected Resource Description Framework (RDF) database.

  9. Closed-Loop Lifecycle Management of Service and Product in the Internet of Things: Semantic Framework for Knowledge Integration

    PubMed Central

    Yoo, Min-Jung; Grozel, Clément; Kiritsis, Dimitris

    2016-01-01

    This paper describes our conceptual framework of closed-loop lifecycle information sharing for product-service in the Internet of Things (IoT). The framework is based on the ontology model of product-service and a type of IoT message standard, Open Messaging Interface (O-MI) and Open Data Format (O-DF), which ensures data communication. (1) Background: Based on an existing product lifecycle management (PLM) methodology, we enhanced the ontology model for the purpose of integrating efficiently the product-service ontology model that was newly developed; (2) Methods: The IoT message transfer layer is vertically integrated into a semantic knowledge framework inside which a Semantic Info-Node Agent (SINA) uses the message format as a common protocol of product-service lifecycle data transfer; (3) Results: The product-service ontology model facilitates information retrieval and knowledge extraction during the product lifecycle, while making more information available for the sake of service business creation. The vertical integration of IoT message transfer, encompassing all semantic layers, helps achieve a more flexible and modular approach to knowledge sharing in an IoT environment; (4) Contribution: A semantic data annotation applied to IoT can contribute to enhancing collected data types, which entails a richer knowledge extraction. The ontology-based PLM model enables as well the horizontal integration of heterogeneous PLM data while breaking traditional vertical information silos; (5) Conclusion: The framework was applied to a fictive case study with an electric car service for the purpose of demonstration. For the purpose of demonstrating the feasibility of the approach, the semantic model is implemented in Sesame APIs, which play the role of an Internet-connected Resource Description Framework (RDF) database. PMID:27399717

  10. A novel framework to evaluate pedestrian safety at non-signalized locations.

    PubMed

    Fu, Ting; Miranda-Moreno, Luis; Saunier, Nicolas

    2018-02-01

    This paper proposes a new framework to evaluate pedestrian safety at non-signalized crosswalk locations. In the proposed framework, the yielding maneuver of a driver in response to a pedestrian is split into the reaction and braking time. Hence, the relationship between the distance required for a yielding maneuver and the approaching vehicle speed depends on the reaction time of the driver and the deceleration rate that the vehicle can achieve. The proposed framework is represented in the distance-velocity (DV) diagram and referred to as the DV model. The interactions between approaching vehicles and pedestrians showing the intention to cross are divided into three categories: i) situations where the vehicle cannot make a complete stop, ii) situations where the vehicle's ability to stop depends on the driver reaction time, and iii) situations where the vehicle can make a complete stop. Based on these classifications, non-yielding maneuvers are classified as "non-infraction non-yielding" maneuvers, "uncertain non-yielding" maneuvers and "non-yielding" violations, respectively. From the pedestrian perspective, crossing decisions are classified as dangerous crossings, risky crossings and safe crossings accordingly. The yielding compliance and yielding rate, as measures of the yielding behavior, are redefined based on these categories. Time to crossing and deceleration rate required for the vehicle to stop are used to measure the probability of collision. Finally, the framework is demonstrated through a case study in evaluating pedestrian safety at three different types of non-signalized crossings: a painted crosswalk, an unprotected crosswalk, and a crosswalk controlled by stop signs. Results from the case study suggest that the proposed framework works well in describing pedestrian-vehicle interactions which helps in evaluating pedestrian safety at non-signalized crosswalk locations. Copyright © 2017 Elsevier Ltd. All rights reserved.
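    The geometry behind the DV diagram is the standard stopping-distance relation d = v·t_r + v²/(2a): distance covered during the driver's reaction plus the braking distance. A minimal sketch of the three-way classification follows; the reaction-time bounds and deceleration rate are illustrative defaults, not the paper's calibrated parameters:

    ```python
    def required_stopping_distance(v, reaction_time, decel):
        """Distance (m) needed to stop from speed v (m/s): reaction travel
        plus the braking distance v^2 / (2 * decel)."""
        return v * reaction_time + v * v / (2.0 * decel)

    def classify_approach(distance_m, v, t_fast=1.0, t_slow=2.5, decel=3.4):
        """Place an approaching vehicle in one of the three DV-diagram zones."""
        if distance_m < required_stopping_distance(v, t_fast, decel):
            return "cannot stop"              # even a fast reaction fails
        if distance_m < required_stopping_distance(v, t_slow, decel):
            return "depends on reaction time"  # outcome hinges on the driver
        return "can stop"                      # a full stop is achievable
    ```

    Plotting the two `required_stopping_distance` curves against speed reproduces the DV diagram: the band between them is the "uncertain non-yielding" region.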

  11. An examination of the spatial variability of the United States surface water balance using the Budyko relationship for current and projected climates

    NASA Astrophysics Data System (ADS)

    Ficklin, D. L.; Abatzoglou, J. T.

    2017-12-01

    The spatial variability in the balance between surface runoff (Q) and evapotranspiration (ET) is critical for understanding water availability. The Budyko framework suggests that this balance is solely a function of aridity. Observed deviations from this framework for individual watersheds, however, can vary significantly, resulting in uncertainty in using the Budyko framework in ungauged catchments and under future climate and land use scenarios. Here, we model the spatial variability in the partitioning of precipitation into Q and ET using a set of climatic, physiographic, and vegetation metrics for 211 near-natural watersheds across the contiguous United States (CONUS) within Budyko's framework through the free parameter ω. Using a generalized additive model, we found that precipitation seasonality, the ratio of soil water holding capacity to precipitation, topographic slope, and the fraction of precipitation falling as snow explained 81.2% of the variability in ω. This ω model applied to the Budyko framework explained 97% of the spatial variability in long-term Q for an independent set of near-natural watersheds. The developed ω model was also used to estimate the entire CONUS surface water balance for both contemporary and mid-21st century conditions. The contemporary CONUS surface water balance compared favorably to more sophisticated land-surface modeling efforts. For mid-21st century conditions, the model simulated an increase in the fraction of precipitation used by ET across the CONUS with declines in Q for much of the eastern CONUS and mountainous watersheds across the western US. The Budyko framework using the modeled ω lends itself to an alternative approach for assessing the potential response of catchment water balance to climate change to complement other approaches.
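    A Budyko curve with a free parameter ω is commonly implemented with Fu's equation, ET/P = 1 + φ − (1 + φ^ω)^{1/ω}, where φ = PET/P is the aridity index; the sketch below assumes that functional form, which the abstract does not state explicitly:

    ```python
    def evaporative_fraction(aridity, omega):
        """Fu's form of the Budyko curve: ET/P as a function of aridity
        (PET/P) and the catchment parameter omega (> 1). Larger omega sends
        more precipitation to ET at a given aridity."""
        return 1.0 + aridity - (1.0 + aridity ** omega) ** (1.0 / omega)

    def partition_precipitation(precip_mm, pet_mm, omega):
        """Split long-term precipitation (mm) into ET and runoff Q (mm)."""
        et = evaporative_fraction(pet_mm / precip_mm, omega) * precip_mm
        return et, precip_mm - et
    ```

    The form respects the Budyko limits: at φ = 0 it returns 0 (no evaporative demand, no ET), and as φ grows it approaches 1 (nearly all precipitation evaporates), with ω controlling how quickly the curve bends between the two.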

  12. Coalescent: an open-source and scalable framework for exact calculations in coalescent theory

    PubMed Central

    2012-01-01

    Background Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics. There is no scalable GUI-based user application either. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies and computing the exact probability of data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement and attach dynamically a number of output processors. The user application defines jobs in a plug-in like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts and running jobs can be cancelled where applicable. Conclusions Coalescent theory plays an increasingly important role in analysing molecular population genetic data. Models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach. PMID:23033878
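    For the infinite-alleles model, the exact probability of an unordered allelic configuration has a closed form, the Ewens sampling formula, which is representative of the exact calculations such a framework supports (this minimal implementation is not the framework's own code):

    ```python
    from collections import Counter
    from math import factorial

    def ewens_probability(allele_counts, theta):
        """Exact probability of an unordered allelic configuration under the
        infinite-alleles model (Ewens sampling formula).

        allele_counts: multiset of observed allele frequencies, e.g. [2, 1, 1]
        theta: scaled mutation rate
        """
        n = sum(allele_counts)
        rising = 1.0
        for i in range(n):               # rising factorial theta^(n)
            rising *= theta + i
        prob = factorial(n) / rising
        # a_j = number of allele types observed exactly j times
        for j, a_j in Counter(allele_counts).items():
            prob *= theta ** a_j / (j ** a_j * factorial(a_j))
        return prob
    ```

    As a sanity check, for a sample of two genes the two possible configurations [2] and [1, 1] have probabilities 1/(θ+1) and θ/(θ+1), which sum to one.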

  13. Why the South African NQF Failed: Lessons for Countries Wanting to Introduce National Qualifications Frameworks

    ERIC Educational Resources Information Center

    Allais, Stephanie Matseleng

    2007-01-01

    This article examines the South African National Qualifications Framework as a case study of a particular approach to the design of qualifications frameworks, which revolves around the specification of learning outcomes separate from educational institutions or programmes. It shows how an outcomes-led qualifications framework was seen as a…

  14. Adolescent pregnancies in the Amazon Basin of Ecuador: a rights and gender approach to adolescents' sexual and reproductive health

    PubMed Central

    Goicolea, Isabel

    2010-01-01

    In the Andean region of Latin America over one million adolescent girls get pregnant every year. Adolescent pregnancy (AP) has been associated with adverse health and social outcomes, but it has also been favorably viewed as a pathway to adulthood. AP can also be conceptualized as a marker of inequity, since it disproportionately affects girls from the poorest households and those who have not been able to attend school. Using results from a study carried out in the Amazon Basin of Ecuador, this paper explores APs and adolescents' sexual and reproductive health from a rights and gender approach. The paper points out the main features of a rights and gender approach, and how it can be applied to explore APs. Afterward it describes the methodologies (quantitative and qualitative) and main results of the study, framing the findings within the rights and gender approach. Finally, some implications that could be generalizable to global research on APs are highlighted. The application of the rights and gender framework to explore APs contributes to a more integral view of the issue. The rights and gender framework stresses the importance of the interaction between rights-holders and duty-bearers on the realization of sexual and reproductive rights, and acknowledges the importance of gender–power relations on sexual and reproductive decisions. A rights and gender approach could lead to more integral and constructive interventions, and it could also be useful when exploring other sexual and reproductive health matters. PMID:20596248

  15. Adolescent pregnancies in the Amazon Basin of Ecuador: a rights and gender approach to adolescents' sexual and reproductive health.

    PubMed

    Goicolea, Isabel

    2010-06-24

    In the Andean region of Latin America over one million adolescent girls get pregnant every year. Adolescent pregnancy (AP) has been associated with adverse health and social outcomes, but it has also been favorably viewed as a pathway to adulthood. AP can also be conceptualized as a marker of inequity, since it disproportionately affects girls from the poorest households and those who have not been able to attend school. Using results from a study carried out in the Amazon Basin of Ecuador, this paper explores APs and adolescents' sexual and reproductive health from a rights and gender approach. The paper points out the main features of a rights and gender approach, and how it can be applied to explore APs. Afterward it describes the methodologies (quantitative and qualitative) and main results of the study, framing the findings within the rights and gender approach. Finally, some implications that could be generalizable to global research on APs are highlighted. The application of the rights and gender framework to explore APs contributes to a more integral view of the issue. The rights and gender framework stresses the importance of the interaction between rights-holders and duty-bearers on the realization of sexual and reproductive rights, and acknowledges the importance of gender-power relations on sexual and reproductive decisions. A rights and gender approach could lead to more integral and constructive interventions, and it could also be useful when exploring other sexual and reproductive health matters.

  16. Robust Decision Making Approach to Managing Water Resource Risks (Invited)

    NASA Astrophysics Data System (ADS)

    Lempert, R.

    2010-12-01

    The IPCC and US National Academies of Science have recommended iterative risk management as the best approach for water management and many other types of climate-related decisions. Such an approach does not rely on a single set of judgments at any one time but rather actively updates and refines strategies as new information emerges. In addition, the approach emphasizes that a portfolio of different types of responses, rather than any single action, often provides the best means to manage uncertainty. Implementing an iterative risk management approach can however prove difficult in actual decision support applications. This talk will suggest that robust decision making (RDM) provides a particularly useful set of quantitative methods for implementing iterative risk management. This RDM approach is currently being used in a wide variety of water management applications. RDM employs three key concepts that differentiate it from most types of probabilistic risk analysis: 1) characterizing uncertainty with multiple views of the future (which can include sets of probability distributions) rather than a single probabilistic best-estimate, 2) employing a robustness rather than an optimality criterion to assess alternative policies, and 3) organizing the analysis with a vulnerability and response option framework, rather than a predict-then-act framework. This talk will summarize the RDM approach, describe its use in several different types of water management applications, and compare the results to those obtained with other methods.
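
The robustness criterion in concept 2 can be illustrated with a minimal minimax-regret sketch. All policy names and cost figures below are made up for illustration and are not from the talk; the point is only that a portfolio can be the robust choice without being optimal in any single future.

```python
# Sketch of RDM's robustness criterion: instead of picking the policy that
# is optimal under a single best-estimate future, pick the one that
# minimizes the maximum regret across many plausible futures.
# All numbers are illustrative.

# cost[policy][future]: total cost of each water-management option
# under each plausible future (e.g. wet/moderate/dry climate scenarios)
cost = {
    "build_reservoir":   {"wet": 30, "moderate": 40, "dry": 55},
    "demand_management": {"wet": 20, "moderate": 35, "dry": 70},
    "portfolio_mix":     {"wet": 28, "moderate": 38, "dry": 58},
}

futures = ["wet", "moderate", "dry"]

# Regret of a policy in a future = its cost minus the best achievable cost there.
best_in_future = {f: min(cost[p][f] for p in cost) for f in futures}
max_regret = {
    p: max(cost[p][f] - best_in_future[f] for f in futures) for p in cost
}

# The robust choice minimizes worst-case regret, even though it is not
# the cheapest option in any single future.
robust = min(max_regret, key=max_regret.get)
print(robust, max_regret)
```

Note that `portfolio_mix` is never the cheapest option in any one future, yet it has the smallest worst-case regret, which is exactly the behavior the abstract attributes to portfolios of responses.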

  17. Multilevel analysis of sports video sequences

    NASA Astrophysics Data System (ADS)

    Han, Jungong; Farin, Dirk; de With, Peter H. N.

    2006-01-01

    We propose a fully automatic and flexible framework for analysis and summarization of tennis broadcast video sequences, using visual features and specific game-context knowledge. Our framework can analyze a tennis video sequence at three levels, which provides a broad range of different analysis results. The proposed framework includes novel pixel-level and object-level tennis video processing algorithms, such as a moving-player detection algorithm taking both the color and the court (playing-field) information into account, and a player-position tracking algorithm based on a 3-D camera model. Additionally, we employ scene-level models for detecting events, like service, base-line rally and net-approach, based on a number of real-world visual features. The system can summarize three forms of information: (1) all court-view playing frames in a game, (2) the moving trajectory and real speed of each player, as well as the relative position between the player and the court, (3) the semantic event segments in a game. The proposed framework is flexible in choosing the level of analysis that is desired. It is effective because the framework makes use of several visual cues obtained from the real-world domain to model important events like service, thereby increasing the accuracy of the scene-level analysis. The paper presents attractive experimental results highlighting the system's efficiency and analysis capabilities.
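
One pixel-level step the abstract mentions, selecting court-view frames by color, can be sketched very simply. The color range and threshold below are assumptions for illustration, not the paper's actual algorithm, which also uses court-geometry information.

```python
# Toy illustration of court-view frame classification: a frame counts as
# "court view" when enough of its pixels fall inside the court's color
# range. The RGB range and the 40% threshold are assumed values.

def is_court_view(frame, court_color_range=((90, 40, 30), (160, 110, 90)),
                  min_ratio=0.4):
    """frame: iterable of (r, g, b) pixel tuples."""
    lo, hi = court_color_range
    pixels = list(frame)
    in_range = sum(
        all(lo[c] <= px[c] <= hi[c] for c in range(3)) for px in pixels
    )
    return in_range / len(pixels) >= min_ratio

# A toy 10-pixel "frame": 6 court-colored pixels, 4 bright non-court pixels.
court_px, other_px = (120, 70, 50), (250, 250, 250)
frame = [court_px] * 6 + [other_px] * 4
print(is_court_view(frame))  # ratio 0.6 >= 0.4 -> True
```

A real implementation would operate on decoded video frames (e.g. arrays of pixels) and typically learn the dominant court color per broadcast rather than hard-coding it.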

  18. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    PubMed

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, such as medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
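
The core idea of comparing series by the events they share can be sketched as follows. The thresholding rule for event detection and the overlap-based matching rule are simplifying assumptions, not the paper's actual reference-model approach.

```python
# Minimal sketch of event-based time series comparison: identify events as
# contiguous supra-threshold regions, then score similarity of two series
# by how many events they have in common (events "match" when their time
# spans overlap). Threshold and matching rule are assumed simplifications.

def extract_events(series, threshold):
    """Return (start, end) index pairs of contiguous runs above threshold."""
    events, start = [], None
    for i, x in enumerate(series):
        if x > threshold and start is None:
            start = i
        elif x <= threshold and start is not None:
            events.append((start, i - 1))
            start = None
    if start is not None:
        events.append((start, len(series) - 1))
    return events

def overlaps(a, b):
    return a[0] <= b[1] and b[0] <= a[1]

def event_similarity(s1, s2, threshold):
    e1, e2 = extract_events(s1, threshold), extract_events(s2, threshold)
    common = sum(any(overlaps(a, b) for b in e2) for a in e1)
    total = max(len(e1), len(e2))
    return common / total if total else 1.0

s1 = [0, 0, 5, 6, 0, 0, 7, 0]   # events at (2, 3) and (6, 6)
s2 = [0, 4, 5, 0, 0, 0, 8, 9]   # events at (1, 2) and (6, 7)
print(event_similarity(s1, s2, threshold=3))  # both events overlap -> 1.0
```

In the paper's setting the events would be domain-specific regions of interest (e.g. characteristic EEG or stabilometric episodes) matched against learned reference models rather than by simple span overlap.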

  19. An Integrated Assessment Framework for land subsidence in Delta cities

    NASA Astrophysics Data System (ADS)

    Bucx, T.; van Ruiten, K.; Erkens, G.

    2013-12-01

    In many delta cities land subsidence exceeds absolute sea level rise by up to a factor of ten. Without change, parts of Jakarta, Ho Chi Minh City, Bangkok and numerous other delta (and coastal) cities will sink below sea level. Increased flooding and other widespread impacts of land subsidence already cause billions of dollars of damage per year to roads, embankments, subsurface infrastructure and housing, and the potential damage from increased flood risk is of a similar magnitude. A major cause of severe land subsidence is excessive groundwater extraction related to rapid urbanization and population growth. A major rethink is needed to resolve the 'hidden' but urgent threat of subsidence from a multi-sectoral perspective. A comprehensive approach is presented to address land subsidence for more sustainable and resilient urban development. Land subsidence is an issue that involves many policy fields, complex technical aspects and governance. An integrated approach is needed in order to manage subsidence and to develop appropriate strategies and measures that are effective and efficient in both the short and long term. Urban (ground)water management, adaptive flood risk management and related spatial planning strategies should be taken into account. This presentation will introduce and illustrate an Integrated Assessment Framework (IAF) for land subsidence that has been developed in the European FP7 project Subcoast. This framework is based on an integrated (multi-sectoral) approach and can be used to gain insight into the complex aspects of subsidence, to raise awareness and to support decision making on appropriate adaptation strategies and measures. The IAF addresses all aspects of subsidence, from primary causes, vulnerability, impacts and risks to responses and solutions. It also takes into account the three spatial layers (Occupation, Network and Base layer), governance aspects and several scenarios (economic and/or climate change). The main questions to be addressed in an integrated approach are: what are the main causes, what is the current subsidence rate and what are future scenarios (and the interaction with other major environmental issues), where are the vulnerable areas, what are the impacts and risks, how can adverse impacts be mitigated or compensated for, and who is involved and responsible for acting? In five case studies a quick assessment of land subsidence is performed based on this Integrated Assessment Framework, covering the mega-cities Jakarta, Ho Chi Minh City, Dhaka, New Orleans and Bangkok. Results of these case studies will be presented in order to further develop and support a (generic) approach to dealing with subsidence in current and future subsidence-prone areas. (Figure: Integrated Assessment Framework by Deltares)
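
The opening claim, that subsidence can exceed absolute sea level rise by up to a factor of ten, is a back-of-envelope comparison of rates. The city-scale rates below are rough illustrative orders of magnitude, not results from the IAF case studies.

```python
# Illustrative comparison of subsidence rates with absolute sea-level rise:
# relative elevation loss is the sum of the two, and where subsidence runs
# at many times the SLR rate it dominates the loss. Rates are assumed
# illustrative values, not case-study results.

sea_level_rise_mm_yr = 3          # approximate global mean SLR
subsidence_mm_yr = {              # illustrative city-scale peak rates
    "Jakarta (worst districts)": 100,
    "Ho Chi Minh City": 40,
    "Bangkok (historic peak)": 30,
}

for city, sub in subsidence_mm_yr.items():
    relative_loss = sub + sea_level_rise_mm_yr  # mm/yr of relative elevation lost
    factor = sub / sea_level_rise_mm_yr
    print(f"{city}: {relative_loss} mm/yr relative loss "
          f"({factor:.0f}x absolute sea-level rise)")
```

This is why the abstract argues that adaptation planning focused only on sea level rise misses the dominant driver in subsidence-prone deltas.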

  20. A deliberative framework to identify the need for real-life evidence building of new cancer drugs after interim funding decision.

    PubMed

    Leung, Leanne; de Lemos, Mário L; Kovacic, Laurel

    2017-01-01

    Background: With the rising cost of new oncology treatments, it is no longer sustainable to base initial drug funding decisions primarily on prospective clinical trials, as their performance in real-life populations is often difficult to determine. In British Columbia, one approach to evidence building is to retrospectively analyse patient outcomes using observational research on an ad hoc basis. Methods: The deliberative framework was constructed in three stages: framework design, framework validation and treatment programme characterization, and key informant interviews. Framework design was informed by a literature review and analyses of provincial and national decision-making processes. Treatment programmes funded between 2010 and 2013 were used for framework validation. A selection concordance rate of 80% amongst three reviewers was considered to validate the framework. Key informant interviews were conducted to determine the utility of the deliberative framework. Results: A multi-domain deliberative framework with 15 assessment parameters was developed. A selection concordance rate of 84.2% was achieved for content validation of the framework. Nine treatment programmes from five different tumour groups were selected for retrospective outcomes analysis. Five contributory factors to funding uncertainties were identified. Key informants agreed that the framework is a comprehensive tool that targets the key areas involved in the funding decision-making process. Conclusions: The oncology-based deliberative framework can be routinely used to assess treatment programmes from the major tumour sites for retrospective outcomes analysis. Key informants indicated that it is a value-added tool that will provide insight into the current prospective funding model.
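
The selection-concordance check in the Methods can be sketched as a simple agreement proportion: for each programme, the three reviewers independently decide whether it needs retrospective outcomes analysis, and concordance is the share of programmes on which all three agree. The reviewer votes below are made up for illustration.

```python
# Sketch of the concordance-rate computation: the 80% validation threshold
# from the Methods is met when at least 80% of programmes receive unanimous
# reviewer decisions. Programme names and votes are hypothetical.

votes = {  # programme -> (reviewer1, reviewer2, reviewer3); True = select
    "prog_A": (True, True, True),
    "prog_B": (False, False, False),
    "prog_C": (True, False, True),    # reviewers disagree
    "prog_D": (True, True, True),
    "prog_E": (False, False, False),
}

agree = sum(len(set(v)) == 1 for v in votes.values())  # unanimous programmes
concordance = agree / len(votes)
print(f"concordance rate: {concordance:.1%}")  # 4/5 unanimous -> 80.0%
```

Under this toy tally the 80% threshold is exactly met; the study's reported 84.2% sits above it, which is what validated the framework.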
