Sample records for integrated analytical framework

  1. The path dependency theory: analytical framework to study institutional integration. The case of France.

    PubMed

    Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique

    2010-06-30

    The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. A qualitative approach was used. Analyses were based on semi-structured interviews with actors of all levels of decision-making, observations of advisory board meetings, and administrative documents. Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France.

  2. Using Learning Analytics for Preserving Academic Integrity

    ERIC Educational Resources Information Center

    Amigud, Alexander; Arnedo-Moreno, Joan; Daradoumis, Thanasis; Guerrero-Roldan, Ana-Elena

    2017-01-01

    This paper presents the results of integrating learning analytics into the assessment process to enhance academic integrity in the e-learning environment. The goal of this research is to evaluate the computational-based approach to academic integrity. The machine-learning based framework learns students' patterns of language use from data,…

  3. The path dependency theory: analytical framework to study institutional integration. The case of France

    PubMed Central

    Trouvé, Hélène; Couturier, Yves; Etheridge, Francis; Saint-Jean, Olivier; Somme, Dominique

    2010-01-01

    Background The literature on integration indicates the need for an enhanced theorization of institutional integration. This article proposes path dependence as an analytical framework to study the systems in which integration takes place. Purpose PRISMA proposes a model for integrating health and social care services for older adults. This model was initially tested in Quebec. The PRISMA France study gave us an opportunity to analyze institutional integration in France. Methods A qualitative approach was used. Analyses were based on semi-structured interviews with actors of all levels of decision-making, observations of advisory board meetings, and administrative documents. Results Our analyses revealed the complexity and fragmentation of institutional integration. The path dependency theory, which analyzes the change capacity of institutions by taking into account their historic structures, allows analysis of this situation. The path dependency to the Bismarckian system and the incomplete reforms of gerontological policies generate the coexistence and juxtaposition of institutional systems. In such a context, no institution has sufficient ability to determine gerontology policy and build institutional integration by itself. Conclusion Using path dependence as an analytical framework helps to understand the reasons why institutional integration is critical to organizational and clinical integration, and the complex construction of institutional integration in France. PMID:20689740

  4. Integrated corridor management initiative : demonstration phase evaluation - final national evaluation framework.

    DOT National Transportation Integrated Search

    2012-05-01

    This report provides an analytical framework for evaluating the two field deployments under the United States Department of Transportation (U.S. DOT) Integrated Corridor Management (ICM) Initiative Demonstration Phase. The San Diego Interstate 15 cor...

  5. Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework

    ERIC Educational Resources Information Center

    Ranjan, Jayanthi; Bhatnagar, Vishal

    2011-01-01

    Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM) and to establish a framework for integrating all the three to each other. The paper also seeks to establish a KM and aCRM based framework using data mining (DM) techniques, which…

  6. Methods for integrating moderation and mediation: a general analytical framework using moderated path analysis.

    PubMed

    Edwards, Jeffrey R; Lambert, Lisa Schurer

    2007-03-01

    Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated and the mediated effects under investigation. This article presents a general analytical framework for combining moderation and mediation that integrates moderated regression analysis and path analysis. This framework clarifies how moderator variables influence the paths that constitute the direct, indirect, and total effects of mediated models. The authors empirically illustrate this framework and give step-by-step instructions for estimation and interpretation. They summarize the advantages of their framework over current approaches, explain how it subsumes moderated mediation and mediated moderation, and describe how it can accommodate additional moderator and mediator variables, curvilinear relationships, and structural equation models with latent variables. (c) 2007 APA, all rights reserved.
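
The two-stage approach this abstract describes can be sketched with ordinary least squares alone. In the sketch below, the simulated data, coefficient values, and variable names are illustrative, not taken from the article; it shows how a first-stage moderated regression and a second-stage path model combine into a conditional indirect effect:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated data for a first-stage moderated mediation model:
# X -> M -> Y, with Z moderating the X -> M path.
# (Variable names and coefficients are illustrative, not from the article.)
X = rng.normal(size=n)
Z = rng.normal(size=n)
M = 0.5 * X + 0.3 * Z + 0.4 * X * Z + rng.normal(size=n)
Y = 0.6 * M + 0.2 * X + rng.normal(size=n)

# First stage: moderated regression M ~ 1 + X + Z + X*Z
A = np.column_stack([np.ones(n), X, Z, X * Z])
a = np.linalg.lstsq(A, M, rcond=None)[0]  # [a0, a1, a2, a3]

# Second stage: path model Y ~ 1 + M + X (direct effect c')
B = np.column_stack([np.ones(n), M, X])
b = np.linalg.lstsq(B, Y, rcond=None)[0]  # [b0, b1, c]

def indirect_effect(z):
    """Conditional indirect effect of X on Y through M at moderator value z."""
    return (a[1] + a[3] * z) * b[1]

# The indirect effect is a function of the moderator, which is the point
# of combining moderated regression with path analysis:
for z in (-1.0, 0.0, 1.0):
    print(f"z = {z:+.1f}: indirect effect = {indirect_effect(z):.3f}")
```

The framework in the article generalizes this idea to moderators on any path and to bootstrapped confidence intervals; the sketch only shows the point estimate.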

  7. Thinking graphically: Connecting vision and cognition during graph comprehension.

    PubMed

    Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A

    2008-03-01

    Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described. PsycINFO Database Record (c) 2008 APA, all rights reserved

  8. Principles of Catholic Social Teaching, Critical Pedagogy, and the Theory of Intersectionality: An Integrated Framework to Examine the Roles of Social Status in the Formation of Catholic Teachers

    ERIC Educational Resources Information Center

    Eick, Caroline Marie; Ryan, Patrick A.

    2014-01-01

    This article discusses the relevance of an analytic framework that integrates principles of Catholic social teaching, critical pedagogy, and the theory of intersectionality to explain attitudes toward marginalized youth held by Catholic students preparing to become teachers. The framework emerges from five years of action research data collected…

  9. A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis

    ERIC Educational Resources Information Center

    Schiazza, Daniela Marie

    2013-01-01

    The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…

  10. Two logics of policy intervention in immigrant integration: an institutionalist framework based on capabilities and aspirations.

    PubMed

    Lutz, Philipp

    2017-01-01

    The effectiveness of immigrant integration policies has gained considerable attention across Western democracies dealing with ethnically and culturally diverse societies. However, the findings on what type of policy produces more favourable integration outcomes remain inconclusive. The conflation of normative and analytical assumptions on integration is a major challenge for causal analysis of integration policies. This article applies actor-centered institutionalism as a new framework for the analysis of immigrant integration outcomes in order to separate two different mechanisms of policy intervention. Conceptualising integration outcomes as a function of capabilities and aspirations allows separating assumptions on the policy intervention in assimilation and multiculturalism as the two main types of policy approaches. The article illustrates that assimilation is an incentive-based policy and primarily designed to increase immigrants' aspirations, whereas multiculturalism is an opportunity-based policy and primarily designed to increase immigrants' capabilities. Conceptualising causal mechanisms of policy intervention clarifies the link between normative concepts of immigrant integration and analytical concepts of policy effectiveness.

  11. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature, although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning; identifies the key functions; and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  12. Functional Analytic Psychotherapy Is a Framework for Implementing Evidence-Based Practices: The Example of Integrated Smoking Cessation and Depression Treatment

    ERIC Educational Resources Information Center

    Holman, Gareth; Kohlenberg, Robert J.; Tsai, Mavis; Haworth, Kevin; Jacobson, Emily; Liu, Sarah

    2012-01-01

    Depression and cigarette smoking are recurrent, interacting problems that co-occur at high rates and--especially when depression is chronic--are difficult to treat and associated with costly health consequences. In this paper we present an integrative therapeutic framework for concurrent treatment of these problems based on evidence-based…

  13. A Strategy for Incorporating Learning Analytics into the Design and Evaluation of a K-12 Science Curriculum

    ERIC Educational Resources Information Center

    Monroy, Carlos; Rangel, Virginia Snodgrass; Whitaker, Reid

    2014-01-01

    In this paper, we discuss a scalable approach for integrating learning analytics into an online K-12 science curriculum. A description of the curriculum and the underlying pedagogical framework is followed by a discussion of the challenges to be tackled as part of this integration. We include examples of data visualization based on teacher usage…

  14. Thinking Graphically: Connecting Vision and Cognition during Graph Comprehension

    ERIC Educational Resources Information Center

    Ratwani, Raj M.; Trafton, J. Gregory; Boehm-Davis, Deborah A.

    2008-01-01

    Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive…

  15. Institutional Racist Melancholia: A Structural Understanding of Grief and Power in Schooling

    ERIC Educational Resources Information Center

    Vaught, Sabina E.

    2012-01-01

    In this article, Sabina Vaught undertakes the theoretical and analytical project of conceptually integrating "Whiteness as property", a key structural framework of Critical Race Theory (CRT), and "melancholia", a framework originally emerging from psychoanalysis. Specifically, Vaught engages "Whiteness as property" as…

  16. Analytical method of waste allocation in waste management systems: Concept, method and case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergeron, Francis C., E-mail: francis.b.c@videotron.ca

    Waste is no longer a rejected item to be disposed of but is increasingly a secondary resource to exploit, influencing waste allocation among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the “analytical method of the waste allocation process” (AMWAP), based on the concept of the “waste allocation process” defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, first, a descriptive model that focuses on the description and classification of the WM system. It includes, second, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system, which may be integrated into the framework of impact assessment methods and environmental systems analysis tools. Its originality comes from the interdisciplinary analysis of the WAP used to develop the conceptual framework. AMWAP is applied in the framework of an illustrative case study on the household WM system of Geneva (Switzerland). It demonstrates that this method provides in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste management in Geneva.

  17. Quantifying risks with exact analytical solutions of derivative pricing distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin

    2017-04-01

    Derivative (i.e., option) pricing is essential for modern financial instrumentation. Despite previous efforts, the exact analytical forms of the derivative pricing distributions are still challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain the exact analytical solutions of the statistical distribution for bond and bond option pricing under the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option pricing, characterized by the distribution tail, and their associations to value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
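
For context, the Vasicek model admits a well-known closed form for zero-coupon bond prices. The sketch below uses that textbook formula, not the paper's path-integral machinery, and the parameter values are illustrative:

```python
import math

def vasicek_bond_price(r0, t, a, b, sigma):
    """Zero-coupon bond price P(0, t) under the Vasicek short-rate model
    dr = a*(b - r) dt + sigma dW. This is the standard closed form, shown
    for context; it is not the paper's path-integral derivation."""
    B = (1.0 - math.exp(-a * t)) / a
    A = math.exp((b - sigma**2 / (2.0 * a**2)) * (B - t)
                 - sigma**2 * B**2 / (4.0 * a))
    return A * math.exp(-B * r0)

# Illustrative parameters: 3% short rate, 4% long-run mean, 5-year bond
price = vasicek_bond_price(r0=0.03, t=5.0, a=0.5, b=0.04, sigma=0.01)
print(f"P(0, 5) = {price:.4f}")
```

The paper's contribution is the full statistical distribution around such expected prices (and hence tail measures like VaR), which the closed-form mean alone does not give.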

  18. Competency Analytics Tool: Analyzing Curriculum Using Course Competencies

    ERIC Educational Resources Information Center

    Gottipati, Swapna; Shankararaman, Venky

    2018-01-01

    The applications of learning outcomes and competency frameworks have brought better clarity to engineering programs in many universities. Several frameworks have been proposed to integrate outcomes and competencies into course design, delivery and assessment. However, in many cases, competencies are course-specific and their overall impact on the…

  19. A Methodological Framework to Analyze Stakeholder Preferences and Propose Strategic Pathways for a Sustainable University

    ERIC Educational Resources Information Center

    Turan, Fikret Korhan; Cetinkaya, Saadet; Ustun, Ceyda

    2016-01-01

    Building sustainable universities calls for participative management and collaboration among stakeholders. Combining analytic hierarchy and network processes (AHP/ANP) with statistical analysis, this research proposes a framework that can be used in higher education institutions for integrating stakeholder preferences into strategic decisions. The…
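
The AHP step of such a framework reduces, computationally, to extracting a priority vector from a pairwise comparison matrix and checking its consistency. The matrix values below are invented for illustration and are not from the study:

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical criteria on Saaty's
# 1-9 scale (values invented for illustration, not from the study).
A = np.array([
    [1.0, 3.0, 5.0],
    [1.0 / 3.0, 1.0, 2.0],
    [1.0 / 5.0, 1.0 / 2.0, 1.0],
])

# AHP priorities: principal right eigenvector, normalized to sum to 1.
vals, vecs = np.linalg.eig(A)
k = int(np.argmax(vals.real))
w = np.abs(vecs[:, k].real)
w = w / w.sum()

# Consistency check (Saaty): CI = (lambda_max - n) / (n - 1),
# with random index RI = 0.58 for n = 3; CR < 0.1 is acceptable.
n = A.shape[0]
CI = (vals[k].real - n) / (n - 1)
CR = CI / 0.58
print("weights:", np.round(w, 3), " CR:", round(CR, 3))
```

ANP extends this by allowing dependence among criteria (a supermatrix instead of a single matrix), which is where the stakeholder-network aspect of the study comes in.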

  20. Integration of targeted health interventions into health systems: a conceptual framework for analysis.

    PubMed

    Atun, Rifat; de Jongh, Thyra; Secci, Federica; Ohiri, Kelechi; Adeyi, Olusoji

    2010-03-01

    The benefits of integrating programmes that emphasize specific interventions into health systems to improve health outcomes have been widely debated. This debate has been driven by narrow binary considerations of integrated (horizontal) versus non-integrated (vertical) programmes, and characterized by polarization of views with protagonists for and against integration arguing the relative merits of each approach. The presence of both integrated and non-integrated programmes in many countries suggests benefits to each approach. While the terms 'vertical' and 'integrated' are widely used, they each describe a range of phenomena. In practice the dichotomy between vertical and horizontal is not rigid and the extent of verticality or integration varies between programmes. However, systematic analysis of the relative merits of integration in various contexts and for different interventions is complicated as there is no commonly accepted definition of 'integration'-a term loosely used to describe a variety of organizational arrangements for a range of programmes in different settings. We present an analytical framework which enables deconstruction of the term integration into multiple facets, each corresponding to a critical health system function. Our conceptual framework builds on theoretical propositions and empirical research in innovation studies, and in particular adoption and diffusion of innovations within health systems, and builds on our own earlier empirical research. It brings together the critical elements that affect adoption, diffusion and assimilation of a health intervention, and in doing so enables systematic and holistic exploration of the extent to which different interventions are integrated in varied settings and the reasons for the variation. 
The conceptual framework and the analytical approach we propose are intended to facilitate analysis in evaluative and formative studies of-and policies on-integration, for use in systematically comparing and contrasting health interventions in a country or in different settings to generate meaningful evidence to inform policy.

  21. ClimateSpark: An in-memory distributed computing framework for big climate data analytics

    NASA Astrophysics Data System (ADS)

    Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei

    2018-06-01

    The unprecedented growth of climate data creates new opportunities for climate studies, and yet big climate data pose a grand challenge for climatologists to manage and analyze efficiently. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index is built for the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to develop a web portal to facilitate the interaction among climatologists, climate data, analytic operations and computing resources (e.g., using SQL query and Scala/Python notebook). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multi-dimensional, array-based datasets in various geoscience domains.
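
The chunking-plus-index idea in the abstract can be sketched in a few lines: keep lightweight spatiotemporal metadata per chunk and read only the chunks that intersect a query. The metadata layout and file names below are hypothetical, not ClimateSpark's actual structures:

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    """Metadata for one array chunk (illustrative; ClimateSpark's actual
    index structure is not reproduced here)."""
    path: str
    lat: tuple   # (min, max) degrees
    lon: tuple   # (min, max) degrees
    time: tuple  # (start_day, end_day)

def overlaps(a, b):
    """True if closed intervals a and b intersect."""
    return a[0] <= b[1] and b[0] <= a[1]

def select_chunks(index, lat, lon, time):
    """Return only the chunks whose spatiotemporal extent intersects the
    query, so non-matching chunks are never read or preprocessed."""
    return [c for c in index
            if overlaps(c.lat, lat) and overlaps(c.lon, lon)
            and overlaps(c.time, time)]

index = [
    Chunk("t2m_000.nc", (0, 30), (0, 90), (0, 30)),
    Chunk("t2m_001.nc", (30, 60), (0, 90), (0, 30)),
    Chunk("t2m_002.nc", (0, 30), (0, 90), (30, 60)),
]
hits = select_chunks(index, lat=(10, 20), lon=(40, 50), time=(0, 10))
print([c.path for c in hits])
```

In the actual system this filtering happens inside Spark before task scheduling, which is what yields the data-locality gains the abstract reports.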

  22. Methods for Integrating Moderation and Mediation: A General Analytical Framework Using Moderated Path Analysis

    ERIC Educational Resources Information Center

    Edwards, Jeffrey R.; Lambert, Lisa Schurer

    2007-01-01

    Studies that combine moderation and mediation are prevalent in basic and applied psychology research. Typically, these studies are framed in terms of moderated mediation or mediated moderation, both of which involve similar analytical approaches. Unfortunately, these approaches have important shortcomings that conceal the nature of the moderated…

  23. Earthdata Cloud Analytics Project

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Lynnes, Chris

    2018-01-01

    This presentation describes a nascent project at NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observation System Data and Information Systems) data to the cloud is to position the data next to enormous computing capacity to allow end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and the integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.

  24. Understanding public perceptions of biotechnology through the "Integrative Worldview Framework".

    PubMed

    De Witt, Annick; Osseweijer, Patricia; Pierce, Robin

    2015-07-03

    Biotechnological innovations prompt a range of societal responses that demand understanding. Research has shown such responses are shaped by individuals' cultural worldviews. We aim to demonstrate how the Integrative Worldview Framework (IWF) can be used for analyzing perceptions of biotechnology, by reviewing (1) research on public perceptions of biotechnology and (2) analyses of the stakeholder debate on the bio-based economy, using the IWF as an analytical lens. This framework operationalizes the concept of worldview and distinguishes between traditional, modern, and postmodern worldviews, among others. Applied to these literatures, the framework illuminates how these worldviews underlie major societal responses, thereby providing a unifying understanding of the literature on perceptions of biotechnology. We conclude that the IWF has relevance for informing research on perceptions of socio-technical change, generating insight into the paradigmatic gaps in social science, and facilitating reflexive and inclusive policy-making and debates on these timely issues. © The Author(s) 2015.

  25. An Analytical Framework for the Cross-Country Comparison of Higher Education Governance

    ERIC Educational Resources Information Center

    Dobbins, Michael; Knill, Christoph; Vogtle, Eva Maria

    2011-01-01

    In this article we provide an integrated framework for the analysis of higher education governance which allows us to more systematically trace the changes that European higher education systems are currently undergoing. We argue that, despite highly insightful previous analyses, there is a need for more specific empirically observable indicators…

  26. High School Students' Informal Reasoning on a Socio-Scientific Issue: Qualitative and Quantitative Analyses

    ERIC Educational Resources Information Center

    Wu, Ying-Tien; Tsai, Chin-Chung

    2007-01-01

    Recently, the significance of learners' informal reasoning on socio-scientific issues has received increasing attention among science educators. To gain deeper insights into this important issue, an integrated analytic framework was developed in this study. With this framework, 71 Grade 10 students' informal reasoning about nuclear energy usage…

  27. Advantages of Integrative Data Analysis for Developmental Research

    ERIC Educational Resources Information Center

    Bainter, Sierra A.; Curran, Patrick J.

    2015-01-01

    Amid recent progress in cognitive development research, high-quality data resources are accumulating, and data sharing and secondary data analysis are becoming increasingly valuable tools. Integrative data analysis (IDA) is an exciting analytical framework that can enhance secondary data analysis in powerful ways. IDA pools item-level data across…

  28. A Framework for Integrating Environmental Justice in Regulatory Analysis

    PubMed Central

    Nweke, Onyemaechi C.

    2011-01-01

    With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235

  29. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    PubMed

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanism of actions, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. 
Our integrative analytical approach provides novel means for a systematic integrative analysis of heterogeneous data types in the development of complex botanicals such as polyphenols for eventual clinical and translational applications.

  30. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets, correlated by functional relatedness, to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the data sets is made possible by four technologies: cloud computing, relational database processing, NoSQL database support, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. It enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at the span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
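    A minimal sketch of the kind of component-level reliability-index computation the abstract mentions is shown below. The capacity and demand figures, units, and sample sizes are invented for illustration; the paper's actual models are not specified in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical strain-derived load demand (S) for one bridge component,
# as might be aggregated from a near real-time sensor stream.
demand = rng.normal(loc=120.0, scale=15.0, size=10_000)   # kN, illustrative

# Hypothetical component capacity (R) statistics from inspection records.
capacity_mean, capacity_std = 200.0, 10.0                 # kN, illustrative

# First-order (Cornell) reliability index:
#   beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)
mu_s = demand.mean()
sigma_s = demand.std(ddof=1)
beta = (capacity_mean - mu_s) / np.hypot(capacity_std, sigma_s)
print(f"reliability index beta = {beta:.2f}")
```

    With these invented numbers the index comes out around 4.4; in a real deployment the demand statistics would be refreshed from the streaming sensor data and compared against a target index per component.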

  11. Integrated Analytic and Linearized Inverse Kinematics for Precise Full Body Interactions

    NASA Astrophysics Data System (ADS)

    Boulic, Ronan; Raunhardt, Daniel

    Despite the large success of games grounded in movement-based interactions, the current state of full-body motion capture technologies still prevents the exploitation of precise interactions with complex environments. This paper focuses on ensuring a precise spatial correspondence between the user and the avatar. We build upon our past effort in human postural control with a Prioritized Inverse Kinematics framework. One of its key advantages is to ease the dynamic combination of postural and collision-avoidance constraints. However, its reliance on a linearized approximation of the problem makes it vulnerable to the well-known full-extension singularity of the limbs. In such a context, tracking performance is reduced and/or less believable intermediate postural solutions are produced. We address this issue by introducing a new type of analytic constraint that smoothly integrates within the Prioritized Inverse Kinematics framework. The paper first recalls the background of full-body 3D interactions and the advantages and drawbacks of the linearized IK solution. Then the Flexion-EXTension constraint (FLEXT for short) is introduced for the partial position control of limb-like articulated structures. Comparative results illustrate the interest of this new type of integrated analytical and linearized IK control.

  12. Design of a framework for modeling, integration and simulation of physiological models.

    PubMed

    Erson, E Zeynep; Cavuşoğlu, M Cenk

    2012-09-01

    Multiscale modeling and integration of physiological models carry challenges due to the complex nature of physiological processes. High coupling within and among scales presents a significant challenge in constructing and integrating multiscale physiological models. In order to deal with such challenges in a systematic way, there is a significant need for an information technology framework, together with related analytical and computational tools, that will facilitate integration of models and simulations of complex biological systems. The Physiological Model Simulation, Integration and Modeling Framework (Phy-SIM) is an information technology framework providing the tools to facilitate development, integration and simulation of integrated models of human physiology. Phy-SIM brings software-level solutions to the challenges raised by the complex nature of physiological systems. The aim of Phy-SIM, and of this paper, is to lay a foundation with new approaches such as information flow and modular representation of physiological models. The ultimate goal is to enhance the development of both the models and the integration approaches of multiscale physiological processes; this paper therefore focuses on the design approaches that would achieve such a goal. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  13. Social Capital Theory: A Cross-Cutting Analytic for Teacher/Therapist Work in Integrating Children's Services?

    ERIC Educational Resources Information Center

    Forbes, Joan; McCartney, Elspeth

    2010-01-01

    Reviewing relevant policy, this article argues that the current "integration interlude" is concerned with reformation of work relations to create new forms of "social capital". The conceptual framework of social capital has been used by government policy-makers and academic researchers to examine different types, configurations…

  14. Toward an Analytic Framework of Interdisciplinary Reasoning and Communication (IRC) Processes in Science

    NASA Astrophysics Data System (ADS)

    Shen, Ji; Sung, Shannon; Zhang, Dongmei

    2015-11-01

    Students need to think and work across disciplinary boundaries in the twenty-first century. However, it is unclear what interdisciplinary thinking means and how to analyze interdisciplinary interactions in teamwork. In this paper, drawing on multiple theoretical perspectives and empirical analysis of discourse contents, we formulate a theoretical framework that helps analyze interdisciplinary reasoning and communication (IRC) processes in interdisciplinary collaboration. Specifically, we propose four interrelated IRC processes: integration, translation, transfer, and transformation, and develop a corresponding analytic framework. We apply the framework to analyze two meetings of a project that aims to develop interdisciplinary science assessment items. The results illustrate that the framework can help interpret the interdisciplinary meeting dynamics and patterns. Our coding process and results also suggest that these IRC processes can be further examined in terms of interconnected sub-processes. We also discuss the implications of using the framework in conceptualizing, practicing, and researching interdisciplinary learning and teaching in science education.

  15. What is Informal Learning and What are its Antecedents? An Integrative and Meta-Analytic Review

    DTIC Science & Technology

    2014-07-01

    formal training. Unfortunately, theory and research surrounding informal learning remain fragmented. Given that there has been little systematic...future-oriented. Applying this framework, the construct domain of informal learning in organizations is articulated. Second, an interactionist theory...theoretical framework and outline an agenda for future theory development, research, and application of informal learning principles in organizations

  16. Integrated corridor management analysis, modeling and simulation (AMS) methodology.

    DOT National Transportation Integrated Search

    2008-03-01

    This AMS Methodologies Document provides a discussion of potential ICM analytical approaches for the assessment of generic corridor operations. The AMS framework described in this report identifies strategies and procedures for tailoring AMS general ...

  17. How to evaluate population management? Transforming the Care Continuum Alliance population health guide toward a broadly applicable analytical framework.

    PubMed

    Struijs, Jeroen N; Drewes, Hanneke W; Heijink, Richard; Baan, Caroline A

    2015-04-01

    Many countries face the persistent twin challenge of providing high-quality care while keeping health systems affordable and accessible. As a result, interest in more efficient strategies to improve population health is increasing. One potentially successful strategy is population management (PM). PM strives to address the health needs of the at-risk population and the chronically ill at all points along the health continuum by integrating services across health care, prevention, social care and welfare. The population health guide of the Care Continuum Alliance (CCA), which recently changed its name to the Population Health Alliance (PHA), provides a useful instrument for implementing and evaluating such innovative approaches. This framework was developed specifically for PM and describes the core elements of the PM concept in six consecutive, interrelated steps. The aim of this article is to transform the CCA framework into an analytical framework. We refined the quantitative methods and operationalized a set of indicators to measure the impact of PM in terms of the Triple Aim (population health, quality of care and cost per capita). Additionally, we added a qualitative part to gain insight into the implementation process of PM. This resulted in a broadly applicable analytical framework based on a mixed-methods approach. In the coming years, the analytical framework will be applied within the Dutch Monitor Population Management to derive transferable 'lessons learned' and to methodologically underpin the concept of PM. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  18. the Integrated Science Textbooks in China

    ERIC Educational Resources Information Center

    Wei, Bing; Li, Yue; Chen, Bo

    2013-01-01

    This study aimed to examine the representations of nature of science (NOS) in the eight histories of science selected from three series of integrated science textbooks used in junior high school in China. Ten aspects of NOS were adopted in the analytical framework. It was found that NOS had not been well treated in the selected histories of…

  19. Technical Note for 8D Likelihood Effective Higgs Couplings Extraction Framework in the Golden Channel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yi; Di Marco, Emanuele; Lykken, Joe

    2014-10-17

    In this technical note we present details on various aspects of the framework introduced in arXiv:1401.2077, aimed at extracting effective Higgs couplings in the $h \to 4\ell$ 'golden channel'. Since it is the primary feature of the framework, we focus in particular on the convolution integral which takes us from 'truth' level to 'detector' level, and on the numerical and analytic techniques used to obtain it. We also briefly discuss other aspects of the framework.

  20. The Relational Impact of Multiple Sclerosis: An Integrative Review of the Literature Using a Cognitive Analytic Framework.

    PubMed

    Blundell Jones, Joanna; Walsh, Sue; Isaac, Claire

    2017-12-01

    This integrative literature review uses cognitive analytic therapy (CAT) theory to examine the impact of a chronic illness, multiple sclerosis (MS), on relationships and mental health. Electronic searches were conducted in six medical and social science databases. Thirty-eight articles met inclusion criteria, and also satisfied quality criteria. Articles revealed that MS-related demands change care needs and alter relationships. Using a CAT framework, the MS literature was analysed, and five key patterns of relating to oneself and to others were identified. A diagrammatic formulation is proposed that interconnects these patterns with wellbeing and suggests potential "exits" to improve mental health, for example, assisting families to minimise overprotection. Application of CAT analysis to the literature clarifies relational processes that may affect mental health among individuals with MS, which hopefully will inform how services assist in reducing unhelpful patterns and improve coping. Further investigation of the identified patterns is needed.

  1. Optimizing cosmological surveys in a crowded market

    NASA Astrophysics Data System (ADS)

    Bassett, Bruce A.

    2005-04-01

    Optimizing the major next-generation cosmological surveys (such as SNAP, KAOS, etc.) is a key problem, given our ignorance of the physics underlying cosmic acceleration and the plethora of surveys planned. We propose a Bayesian design framework which (1) maximizes the discrimination power of a survey without assuming any underlying dark-energy model, (2) finds the best niche survey geometry given current data and future competing experiments, (3) maximizes the cross section for serendipitous discoveries and (4) can be adapted to answer specific questions (such as “is dark energy dynamical?”). Integrated parameter-space optimization (IPSO) is a design framework that integrates projected parameter errors over an entire dark energy parameter space and then extremizes a figure of merit (such as Shannon entropy gain, which we show is stable to off-diagonal covariance matrix perturbations) as a function of survey parameters using analytical, grid or MCMC techniques. We discuss examples where the optimization can be performed analytically. IPSO is thus a general, model-independent and scalable framework that allows us to appropriately use prior information to design the best possible surveys.
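    A toy version of the grid-based IPSO procedure might look like the sketch below. The Fisher matrices and the single survey parameter (the fraction of observing time spent at high redshift) are invented for illustration; only the Gaussian entropy-gain figure of merit, ΔH = ½ ln[det(C_prior)/det(C_post)], follows the abstract.

```python
import numpy as np

# Illustrative priors and per-unit-time Fisher information for two
# dark-energy parameters (w0, wa); none of these numbers come from
# any real survey.
prior_cov = np.diag([1.0, 4.0])
F_low  = np.array([[50.0, 10.0], [10.0,  5.0]])   # low-redshift observing
F_high = np.array([[20.0, 15.0], [15.0, 30.0]])   # high-redshift observing

def entropy_gain(f):
    """Shannon entropy gain 0.5*ln(det(C_prior)/det(C_post)) when a
    fraction f of the time goes to the high-redshift mode."""
    fisher = np.linalg.inv(prior_cov) + (1 - f) * F_low + f * F_high
    post_cov = np.linalg.inv(fisher)
    return 0.5 * np.log(np.linalg.det(prior_cov) / np.linalg.det(post_cov))

# Grid extremization of the figure of merit over the survey parameter.
grid = np.linspace(0.0, 1.0, 101)
gains = [entropy_gain(f) for f in grid]
best = grid[int(np.argmax(gains))]
print(f"best high-z fraction = {best:.2f}")
```

    With these invented matrices the optimum lands at an interior mixture of the two modes, rather than at either pure strategy, which is the kind of niche-finding behaviour the abstract describes.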

  2. Integrated Talent Management Enterprise as a Framework for Future Army Talent Management

    DTIC Science & Technology

    2015-06-12

    http://usacac.army.mil/CAC2/MilitaryReview/Archives/English ...have pockets of innovative TM practices that it should bolster? 04. What tools (big data, predictive analytics, etc.) and techniques (customized

  3. A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses

    PubMed Central

    Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert

    2011-01-01

    Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies. PMID:22167325

  4. Metal-Organic Framework Modified Glass Substrate for Analysis of Highly Volatile Chemical Warfare Agents by Paper Spray Mass Spectrometry.

    PubMed

    Dhummakupt, Elizabeth S; Carmany, Daniel O; Mach, Phillip M; Tovar, Trenton M; Ploskonka, Ann M; Demond, Paul S; DeCoste, Jared B; Glaros, Trevor

    2018-03-07

    Paper spray mass spectrometry has been shown to successfully analyze chemical warfare agent (CWA) simulants. However, due to the volatility differences between the simulants and real G-series (i.e., sarin, soman) CWAs, analysis from an untreated paper substrate proved difficult. To extend the analytical lifetime of these G-agents, metal-organic frameworks (MOFs) were successfully integrated onto the paper spray substrates to increase adsorption and desorption. In this study, several MOFs and nanoparticles were tested to extend the analytical lifetimes of sarin, soman, and cyclosarin on paper spray substrates. It was found that the addition of either UiO-66 or HKUST-1 to the paper substrate increased the analytical lifetime of the G-agents from less than 5 min detectability to at least 50 min.

  5. Predicted range expansion of Chinese tallow tree (Triadica sebifera) in forestlands of the southern United States

    Treesearch

    Hsiao-Hsuan Wang; William Grant; Todd Swannack; Jianbang Gan; William Rogers; Tomasz Koralewski; James Miller; John W. Taylor Jr.

    2011-01-01

    We present an integrated approach for predicting future range expansion of an invasive species (Chinese tallow tree) that incorporates statistical forecasting and analytical techniques within a spatially explicit, agent-based, simulation framework.

  6. Social Exclusion and Education Inequality: Towards an Integrated Analytical Framework for the Urban-Rural Divide in China

    ERIC Educational Resources Information Center

    Wang, Li

    2012-01-01

    The aim of this paper is to build a capability-based framework, drawing upon the strengths of other approaches, which is applicable to the complexity of the urban-rural divide in education in China. It starts with a brief introduction to the capability approach. This is followed by a discussion of how the rights-based approach and resource-based…

  7. Discourse Markers in Chinese Conversational Narrative

    ERIC Educational Resources Information Center

    Xiao, Yang

    2010-01-01

    This study examines the indexicality of discourse markers (DMs) in Chinese conversational narrative. Drawing upon theoretical and methodological principles related to narrative dimensions (Ochs & Capps, 2001), narrative desires (Ochs, 1997, 2004), and narrative positioning (Bamberg, 1997), this work proposes an integrated analytical framework for…

  8. Preparing Pre-Service Teachers to Integrate Technology: An Analysis of the Emphasis on Digital Competence in Teacher Education Curricula

    ERIC Educational Resources Information Center

    Instefjord, Elen; Munthe, Elaine

    2016-01-01

    This article focuses on integration of digital competence in curriculum documents for teacher education in Norway. A model inspired by the work of Zhao, Pugh, Sheldon and Byers, as well as Krumsvik and Mishra and Koehler, has been developed as an analytical framework. Teachers' digital competence is here understood as comprising three knowledge…

  9. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.

  10. Boom Minimization Framework for Supersonic Aircraft Using CFD Analysis

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Rallabhandi, Sriram K.

    2010-01-01

    A new framework is presented for shape optimization using analytical shape functions and high-fidelity computational fluid dynamics (CFD) via Cart3D. The focus of the paper is the system-level integration of several key enabling analysis tools and automation methods to perform shape optimization and reduce sonic boom footprint. A boom mitigation case study subject to performance, stability and geometrical requirements is presented to demonstrate a subset of the capabilities of the framework. Lastly, a design space exploration is carried out to assess the key parameters and constraints driving the design.

  11. Triple collinear emissions in parton showers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Höche, Stefan; Prestel, Stefan

    2017-10-01

    A framework to include triple collinear splitting functions into parton showers is presented, and the implementation of flavor-changing NLO splitting kernels is discussed as a first application. The correspondence between the Monte-Carlo integration and the analytic computation of NLO DGLAP evolution kernels is made explicit for both timelike and spacelike parton evolution. Numerical simulation results are obtained with two independent implementations of the new algorithm, using the two independent event generation frameworks Pythia and Sherpa.

  12. Tiered Approach to Resilience Assessment.

    PubMed

    Linkov, Igor; Fox-Lent, Cate; Read, Laura; Allen, Craig R; Arnott, James C; Bellini, Emanuele; Coaffee, Jon; Florin, Marie-Valentine; Hatfield, Kirk; Hyde, Iain; Hynes, William; Jovanovic, Aleksandar; Kasperson, Roger; Katzenberger, John; Keys, Patrick W; Lambert, James H; Moss, Richard; Murdoch, Peter S; Palma-Oliveira, Jose; Pulwarty, Roger S; Sands, Dale; Thomas, Edward A; Tye, Mari R; Woods, David

    2018-04-25

    Regulatory agencies have long adopted a three-tier framework for risk assessment. We build on this structure to propose a tiered approach for resilience assessment that can be integrated into the existing regulatory processes. Comprehensive approaches to assessing resilience at appropriate and operational scales, reconciling analytical complexity as needed with stakeholder needs and resources available, and ultimately creating actionable recommendations to enhance resilience are still lacking. Our proposed framework consists of tiers by which analysts can select resilience assessment and decision support tools to inform associated management actions relative to the scope and urgency of the risk and the capacity of resource managers to improve system resilience. The resilience management framework proposed is not intended to supplant either risk management or the many existing efforts of resilience quantification method development, but instead provide a guide to selecting tools that are appropriate for the given analytic need. The goal of this tiered approach is to intentionally parallel the tiered approach used in regulatory contexts so that resilience assessment might be more easily and quickly integrated into existing structures and with existing policies. Published 2018. This article is a U.S. government work and is in the public domain in the USA.

  13. ENVIRONMENTAL SYSTEMS MANAGEMENT AS APPLIED TO WATERSHEDS, UTILIZING REMOTE SENSING, DECISION SUPPORT AND VISUALIZATION

    EPA Science Inventory

    Environmental Systems Management as a conceptual framework and as a set of interdisciplinary analytical approaches will be described within the context of sustainable watershed management, within divergent complex ecosystems. A specific subset of integrated tools is deployed to...

  14. Individual behavioral phenotypes: an integrative meta-theoretical framework. Why "behavioral syndromes" are not analogs of "personality".

    PubMed

    Uher, Jana

    2011-09-01

    Animal researchers are increasingly interested in individual differences in behavior. Their interpretation as meaningful differences in behavioral strategies stable over time and across contexts, adaptive, heritable, and acted upon by natural selection has triggered new theoretical developments. However, the analytical approaches used to explore behavioral data still address population-level phenomena, and statistical methods suitable to analyze individual behavior are rarely applied. I discuss fundamental investigative principles and analytical approaches to explore whether, in what ways, and under which conditions individual behavioral differences are actually meaningful. I elaborate the meta-theoretical ideas underlying common theoretical concepts and integrate them into an overarching meta-theoretical and methodological framework. This unravels commonalities and differences, and shows that assumptions of analogy to concepts of human personality are not always warranted and that some theoretical developments may be based on methodological artifacts. Yet, my results also highlight possible directions for new theoretical developments in animal behavior research. Copyright © 2011 Wiley Periodicals, Inc.

  15. A one health framework for estimating the economic costs of zoonotic diseases on society.

    PubMed

    Narrod, Clare; Zinsstag, Jakob; Tiongco, Marites

    2012-06-01

    This article presents an integrated epidemiological and economic framework for assessing zoonoses using a "one health" concept. The framework allows for an understanding of the cross-sector economic impact of zoonoses using modified risk analysis and detailing a range of analytical tools. The goal of the framework is to link the analysis outputs of animal and human disease transmission models, economic impact models and evaluation of risk management options to gain improved understanding of factors affecting the adoption of risk management strategies so that investment planning includes the most promising interventions (or sets of interventions) in an integrated fashion. A more complete understanding of the costs of the disease and the costs and benefits of control measures would promote broader implementation of the most efficient and effective control measures, contributing to improved animal and human health, better livelihood outcomes for the poor and macroeconomic growth.

  16. Earth Science Data Fusion with Event Building Approach

    NASA Technical Reports Server (NTRS)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing and analytics. The key components of the NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast into-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized IO), and on-line data processing control and analytics services. The NAIADS project leverages the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merged SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis data will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.

  17. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options.

    PubMed

    Reed, M S; Podesta, G; Fazey, I; Geeson, N; Hessel, R; Hubacek, K; Letson, D; Nainggolan, D; Prell, C; Rickenbach, M G; Ritsema, C; Schwilch, G; Stringer, L C; Thomas, A D

    2013-10-01

    Experts working on behalf of international development organisations need better tools to assist land managers in developing countries maintain their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change.

  18. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options

    PubMed Central

    Reed, M.S.; Podesta, G.; Fazey, I.; Geeson, N.; Hessel, R.; Hubacek, K.; Letson, D.; Nainggolan, D.; Prell, C.; Rickenbach, M.G.; Ritsema, C.; Schwilch, G.; Stringer, L.C.; Thomas, A.D.

    2013-01-01

    Experts working on behalf of international development organisations need better tools to assist land managers in developing countries maintain their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change. PMID:25844020

  19. Communicable disease control programmes and health systems: an analytical approach to sustainability.

    PubMed

    Shigayeva, Altynay; Coker, Richard J

    2015-04-01

    There is renewed concern over the sustainability of disease control programmes, and a re-emergence of policy recommendations to integrate programmes with general health systems. However, the conceptualization of this issue has received remarkably little critical attention. Additionally, the study of programmatic sustainability presents methodological challenges. In this article, we propose a conceptual framework to support analyses of the sustainability of communicable disease programmes. Through this work, we also aim to clarify the link between the notions of integration and sustainability. As part of the development of the conceptual framework, we conducted a systematic review of the peer-reviewed literature on concepts, definitions, analytical approaches and empirical studies of sustainability in health systems. The conceptual proposals identified for the analysis of sustainability in health systems lack an explicit conceptualization of what a health system is. Drawing upon theoretical concepts originating in the sustainability sciences and our review here, we conceptualize a communicable disease programme as a component of a health system, which is viewed as a complex adaptive system. We propose five programmatic characteristics that may explain a potential for sustainability: leadership, capacity, interactions (notions of integration), flexibility/adaptability and performance. Though integration of elements of a programme with other system components is important, its role in sustainability is context specific and difficult to predict. The proposed framework might serve as a basis for further empirical evaluations in understanding the complex interplay between programmes and broader health systems in the development of sustainable responses to communicable diseases. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.

  20. An Analytical Framework for Internationalization through English-Taught Degree Programs: A Dutch Case Study

    ERIC Educational Resources Information Center

    Kotake, Masako

    2017-01-01

    The growing importance of internationalization and the global dominance of English in higher education put pressure on non-English-speaking countries to expand English-taught degree programs (ETDPs). Strategic considerations are necessary to successfully integrate ETDPs into existing programs and to optimize the effects of…

  1. Total analysis systems with Thermochromic Etching Discs technology.

    PubMed

    Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel

    2014-12-16

    A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features such as track independency, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities for designing new compact disc-based total analysis systems applicable in chemistry and the life sciences. In this paper, the analytical implementation of TED is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is thus demonstrated, describing how to exploit this tool to develop truly integrated analytical systems that provide solutions within the point-of-care framework.

  2. Integration within the Felsenstein equation for improved Markov chain Monte Carlo methods in population genetics

    PubMed Central

    Hey, Jody; Nielsen, Rasmus

    2007-01-01

    In 1988, Felsenstein described a framework for assessing the likelihood of a genetic data set in which all of the possible genealogical histories of the data are considered, each in proportion to their probability. Although not analytically solvable, several approaches, including Markov chain Monte Carlo methods, have been developed to find approximate solutions. Here, we describe an approach in which Markov chain Monte Carlo simulations are used to integrate over the space of genealogies, whereas other parameters are integrated out analytically. The result is an approximation to the full joint posterior density of the model parameters. For many purposes, this function can be treated as a likelihood, thereby permitting likelihood-based analyses, including likelihood ratio tests of nested models. Several examples, including an application to the divergence of chimpanzee subspecies, are provided. PMID:17301231
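The core idea above (Markov chain Monte Carlo over one hard-to-integrate quantity, with remaining parameters integrated out analytically) can be illustrated with a toy conjugate model rather than genealogies. Everything below, including the data and the prior choice, is illustrative and is not the authors' actual method:

```python
import math
import random

random.seed(1)

# Toy analog: MCMC samples one quantity (here, the mean mu) while a nuisance
# parameter (sigma^2) is integrated out analytically.
# Data: y_i ~ Normal(mu, sigma^2). With a Jeffreys-style prior on sigma^2,
# sigma^2 integrates out in closed form, leaving a marginal (Student-t-like)
# log-likelihood for mu alone.

data = [1.2, 0.8, 1.5, 0.9, 1.1, 1.4, 0.7, 1.3]

def marginal_loglik(mu, y):
    """log p(y | mu) with sigma^2 integrated out analytically:
    proportional to -(n/2) * log(sum_i (y_i - mu)^2)."""
    n = len(y)
    ss = sum((yi - mu) ** 2 for yi in y)
    return -(n / 2.0) * math.log(ss)

# Random-walk Metropolis over mu only.
mu, samples = 0.0, []
cur = marginal_loglik(mu, data)
for step in range(20000):
    prop = mu + random.gauss(0.0, 0.3)
    new = marginal_loglik(prop, data)
    if math.log(random.random()) < new - cur:
        mu, cur = prop, new
    if step >= 5000:          # discard burn-in
        samples.append(mu)

post_mean = sum(samples) / len(samples)
print(round(post_mean, 2))   # close to the sample mean of the data
```

The payoff mirrors the abstract's point: the sampler explores a one-dimensional space instead of the joint (mu, sigma^2) space, because part of the integration was done on paper.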

  3. Triple collinear emissions in parton showers

    DOE PAGES

    Hoche, Stefan; Prestel, Stefan

    2017-10-17

    A framework to include triple collinear splitting functions into parton showers is presented, and the implementation of flavor-changing next-to-leading-order (NLO) splitting kernels is discussed as a first application. The correspondence between the Monte Carlo integration and the analytic computation of NLO DGLAP evolution kernels is made explicit for both timelike and spacelike parton evolution. Finally, numerical simulation results are obtained with two independent implementations of the new algorithm, using the two independent event generation frameworks PYTHIA and SHERPA.

  4. Integrating Solar PV in Utility System Operations: Analytical Framework and Arizona Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Jing; Botterud, Audun; Mills, Andrew

    2015-06-01

    A systematic framework is proposed to estimate the impact on operating costs due to uncertainty and variability in renewable resources. The framework quantifies the integration costs associated with subhourly variability and uncertainty as well as day-ahead forecasting errors in solar PV (photovoltaic) power. A case study illustrates how changes in system operations may affect these costs for a utility in the southwestern United States (Arizona Public Service Company). We conduct an extensive sensitivity analysis under different assumptions about balancing reserves, system flexibility, fuel prices, and forecasting errors. We find that high solar PV penetrations may lead to operational challenges, particularly during low-load and high-solar periods. Increased system flexibility is essential for minimizing integration costs and maintaining reliability. In a set of sensitivity cases where such flexibility is provided, in part, by flexible operation of nuclear power plants, the estimated integration costs vary between $1.0 and $4.4/MWh-PV for a PV penetration level of 17%. The integration costs are primarily due to higher needs for hour-ahead balancing reserves to address the increased subhourly variability and uncertainty in the PV resource. (C) 2015 Elsevier Ltd. All rights reserved.
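The integration-cost metric itself is simple to state: the extra operating cost attributable to PV variability and forecast error, divided by the PV energy delivered. The study's $1.0-$4.4/MWh-PV figures come from full production-cost simulations; the sketch below only shows the final ratio, with entirely hypothetical numbers:

```python
# Illustrative arithmetic only; all inputs are hypothetical, not the study's data.
cost_actual_usd = 1_215_000_000   # annual operating cost with PV variability and forecast errors
cost_perfect_usd = 1_200_000_000  # counterfactual cost with a perfectly predictable PV profile
pv_energy_mwh = 5_000_000         # annual PV generation (MWh)

# Integration cost = extra operating cost per MWh of PV energy delivered.
integration_cost = (cost_actual_usd - cost_perfect_usd) / pv_energy_mwh
print(f"${integration_cost:.1f}/MWh-PV")  # $3.0/MWh-PV
```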

  5. A Simple Demonstration of Concrete Structural Health Monitoring Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Sankaran; Agarwal, Vivek; Cai, Guowei

    Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements, to produce high-confidence actionable information regarding structural integrity that supports operational and maintenance decisions. This ongoing research project seeks to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in a nuclear power plant subjected to physical, chemical, environmental, and mechanical degradation. The proposed framework consists of four elements: damage modeling, monitoring, data analytics, and uncertainty quantification. This report describes a proof-of-concept example on a small concrete slab subjected to a freeze-thaw experiment that explores techniques in each of the four elements of the framework and their integration. An experimental set-up at Vanderbilt University's Laboratory for Systems Integrity and Reliability is used to research an effective combination of full-field techniques that include infrared thermography, digital image correlation, and ultrasonic measurement. The measured data are linked to the probabilistic framework: the thermography, digital image correlation, and ultrasonic measurement data are used for Bayesian calibration of model parameters, for diagnosis of damage, and for prognosis of future damage. The proof-of-concept demonstration presented in this report highlights the significance of each element of the framework and their integration.

  6. Integral Twist Actuation of Helicopter Rotor Blades for Vibration Reduction

    NASA Technical Reports Server (NTRS)

    Shin, SangJoon; Cesnik, Carlos E. S.

    2001-01-01

    Active integral twist control for vibration reduction of helicopter rotors during forward flight is investigated. The twist deformation is obtained using embedded anisotropic piezocomposite actuators. An analytical framework is developed to examine integrally-twisted blades and their aeroelastic response during different flight conditions: frequency-domain analysis for hover, and time-domain analysis for forward flight. Both stem from the same three-dimensional electroelastic beam formulation with geometrical exactness, and are coupled with a finite-state dynamic inflow aerodynamics model. A prototype Active Twist Rotor (ATR) blade was designed with this framework using Active Fiber Composites as the actuator. The ATR prototype blade was successfully tested under non-rotating conditions. Hover testing was conducted to evaluate structural integrity and dynamic response. In both conditions, very good correlation was obtained against the analysis. Finally, a four-bladed ATR system was built and tested to demonstrate the concept in forward flight. This experiment was conducted at the NASA Langley Transonic Dynamics Tunnel and represents the first Mach-scaled fully-active-twist rotor system to undergo forward flight testing. In parallel, the impact upon the fixed- and rotating-system loads is estimated by the analysis. While discrepancies are found in the amplitude of the loads under actuation, the predicted trend of load variation with respect to control phase correlates well. It was also shown, both experimentally and numerically, that the ATR blade design has the potential for hub vibratory load reduction of up to 90% using individual-blade-control actuation. Using the numerical framework, system identification is performed to estimate the harmonic transfer functions. The linear time-periodic system can be represented by a linear time-invariant system under the three modes of blade actuation: collective, longitudinal cyclic, and lateral cyclic.
A vibration-minimizing controller is designed based on this result, which implements a classical disturbance-rejection algorithm with some modifications. The controller is simulated numerically, and more than 90% of the 4P hub vibratory load is eliminated. By accomplishing the experimental and analytical steps described in this thesis, the present concept is found to be a viable candidate for future-generation low-vibration helicopters. Also, the analytical framework is shown to be well suited for exploring active blade designs, for predicting aeroelastic behavior, and as a simulation tool for closed-loop controllers.

  7. Tracking the debate around marine protected areas: key issues and the BEG framework.

    PubMed

    Thorpe, Andy; Bavinck, Maarten; Coulthard, Sarah

    2011-04-01

    Marine conservation is often criticized for a mono-disciplinary approach, which delivers fragmented solutions to complex problems with differing interpretations of success. As a means of reflecting on the breadth and range of scientific research on the management of the marine environment, this paper develops an analytical framework to gauge the foci of policy documents and published scientific work on Marine Protected Areas (MPAs). We evaluate the extent to which MPA research articles delineate objectives around three domains: biological-ecological [B]; economic-social [E]; and governance-management [G]. This permits us to develop an analytic [BEG] framework which we then test on a sample of selected journal article cohorts. While the framework reveals the dominance of biologically focussed research [B], analysis also reveals a growing frequency of governance/management terminology in the literature over the last 15 years, which may be indicative of a shift towards more integrated consideration of governance concerns. However, consideration of the economic/social domain appears to lag behind biological and governance concerns in both frequency and presence in the MPA literature.
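The domain-coding step the abstract describes can be sketched as simple keyword tagging: assign an article to each BEG domain whose vocabulary it uses. The keyword lists below are invented for illustration and are not the authors' coding scheme:

```python
# Minimal BEG-style coding sketch: tag a text with the domains whose
# (illustrative, made-up) keyword lists it matches.
DOMAINS = {
    "B": ["species", "habitat", "biodiversity", "ecosystem", "fish"],
    "E": ["economic", "livelihood", "income", "social", "community"],
    "G": ["governance", "management", "policy", "enforcement", "stakeholder"],
}

def beg_profile(text):
    """Return the set of BEG domains whose keywords appear in the text."""
    lowered = text.lower()
    return {d for d, kws in DOMAINS.items() if any(k in lowered for k in kws)}

abstract = ("We assess habitat recovery inside the reserve and evaluate "
            "co-management arrangements with local stakeholders.")
print(sorted(beg_profile(abstract)))  # ['B', 'G']
```

A real coding exercise would of course use trained reviewers or a validated classifier rather than substring matching, but the output, a per-article domain profile, is what the frequency analysis in the paper aggregates.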

  8. CryoTran user's manual, version 1.0

    NASA Technical Reports Server (NTRS)

    Cowgill, Glenn R.; Chato, David J.; Saad, Ehab

    1989-01-01

    The development of cryogenic fluid management systems for space operation is a major portion of the efforts of the Cryogenic Fluids Technology Office (CFTO) at the NASA Lewis Research Center. Analytical models are a necessary part of experimental programs: they are used to verify the results of experiments and also serve as predictors for parametric studies. The CryoTran computer program is a bridge to obtaining such analytical results. The object of CryoTran is to coordinate these separate analyses into an integrated framework with a user-friendly interface and a common cryogenic property database. CryoTran is an integrated software system designed to help solve a diverse set of problems involving cryogenic fluid storage and transfer in both ground and low-g environments.

  9. Linking climate change and fish conservation efforts using spatially explicit decision support tools

    Treesearch

    Douglas P. Peterson; Seth J. Wenger; Bruce E. Rieman; Daniel J. Isaak

    2013-01-01

    Fisheries professionals are increasingly tasked with incorporating climate change projections into their decisions. Here we demonstrate how a structured decision framework, coupled with analytical tools and spatial data sets, can help integrate climate and biological information to evaluate management alternatives. We present examples that link downscaled climate...

  10. Understanding Career Decision Self-Efficacy: A Meta-Analytic Approach

    ERIC Educational Resources Information Center

    Choi, Bo Young; Park, Heerak; Yang, Eunjoo; Lee, Seul Ki; Lee, Yedana; Lee, Sang Min

    2012-01-01

    This study used meta-analysis to investigate the relationships between career decision self-efficacy (CDSE) and its relevant variables. The authors aimed to integrate the mixed results reported by previous empirical studies and obtain a clearer understanding of CDSE's role within the framework of social cognitive career theory (SCCT). For purposes…

  11. An Integrated Customer Knowledge Management Framework for Academic Libraries

    ERIC Educational Resources Information Center

    Daneshgar, Farhad; Parirokh, Mehri

    2012-01-01

    The ability of academic libraries to produce timely and effective responses to various environmental changes constitutes a major challenge for them to enhance their survival rate and maintain growth in competitive environments. This article provides a conceptual model as an analytical tool for both improving current services as well as creating…

  12. Telecollaboration in the Secondary Language Classroom: Case Study of Adolescent Interaction and Pedagogical Integration

    ERIC Educational Resources Information Center

    Ware, Paige; Kessler, Greg

    2016-01-01

    This study builds on research examining the in-school technology practices of adolescent language learners by exploring the patterns of classroom literacy practices that emerge when a telecollaborative project is introduced into a conventional secondary language classroom. We draw on the conceptual frameworks and discourse analytical tools…

  13. Quo vadis, analytical chemistry?

    PubMed

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, as well as the more accurate general and specific paradigms that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  14. Co-governing decentralised water systems: an analytical framework.

    PubMed

    Yu, C; Brown, R; Morison, P

    2012-01-01

    Current discourses in urban water management emphasise a diversity of water sources and scales of infrastructure for resilience and adaptability. During the last two decades, in particular, various small-scale systems have emerged and developed, so that the debate has largely moved from centralised versus decentralised water systems toward governing integrated and networked systems of provision and consumption in which small-scale technologies are embedded in large-scale centralised infrastructures. However, while centralised systems have established boundaries of ownership and management, decentralised water systems (such as stormwater harvesting technologies at the street and allotment/house scales) do not; therefore, the viability of adoption and/or continued use of decentralised water systems is challenged. This paper brings together insights from the literature on public sector governance, co-production and the social practices model to develop an analytical framework for co-governing such systems. The framework provides urban water practitioners with guidance when designing co-governance arrangements for decentralised water systems so that these systems continue to exist, and become widely adopted, within the established urban water regime.

  15. Analytical mass formula and nuclear surface properties in the ETF approximation. Part I: symmetric nuclei

    NASA Astrophysics Data System (ADS)

    Aymard, François; Gulminelli, Francesca; Margueron, Jérôme

    2016-08-01

    The problem of determining the nuclear surface energy is addressed within the framework of the extended Thomas-Fermi (ETF) approximation using Skyrme functionals. We propose an analytical model for the density profiles with variationally determined diffuseness parameters. In this first paper, we consider the case of symmetric nuclei. In this situation, the ETF functional can be integrated exactly, leading to an analytical formula expressing the surface energy as a function of the couplings of the energy functional. The importance of non-local terms is stressed, and it is shown that they cannot be deduced simply from the local part of the functional, as was suggested in previous works.

  16. Harmonizing community-based health worker programs for HIV: a narrative review and analytic framework.

    PubMed

    De Neve, Jan-Walter; Boudreaux, Chantelle; Gill, Roopan; Geldsetzer, Pascal; Vaikath, Maria; Bärnighausen, Till; Bossert, Thomas J

    2017-07-03

    Many countries have created community-based health worker (CHW) programs for HIV. In most of these countries, several national and non-governmental initiatives have been implemented, raising questions of how well these different approaches address health problems and use health resources in a compatible way. While these questions have led to a general policy initiative to promote harmonization across programs, there is a need for countries to develop a more coherent and organized approach to CHW programs and to generate evidence about the most efficient and effective strategies to ensure their optimal, sustained performance. We conducted a narrative review of the existing published and gray literature on the harmonization of CHW programs. We searched for and noted evidence on definitions, models, and/or frameworks of harmonization; theoretical arguments or hypotheses about the effects of CHW program fragmentation; and empirical evidence. Based on this evidence, we defined harmonization, introduced three priority areas for harmonization, and identified a conceptual framework for analyzing harmonization of CHW programs that can be used to support their expanding role in HIV service delivery. We identified and described the major issues and relationships surrounding the harmonization of CHW programs, including key characteristics, facilitators, and barriers for each of the priority areas of harmonization, and used our analytic framework to map overarching findings. We apply this approach to CHW programs supporting HIV services across four countries in Southern Africa in a separate article. There is a large number and immense diversity of CHW programs for HIV. This includes integration of HIV components into countries' existing national programs along with the development of multiple, stand-alone CHW programs.
We defined (i) coordination among stakeholders, (ii) integration into the broader health system, and (iii) assurance of a CHW program's sustainability to be the priority areas of harmonization. While harmonization is likely a complex political process, with, in many cases, incremental steps toward improvement, a wide range of facilitators are available to decision-makers. These can be categorized using an analytic framework assessing the (i) health issue, (ii) intervention itself, (iii) stakeholders, (iv) health system, and (v) broad context. There is a need to address fragmentation of CHW programs to advance and sustain CHW roles and responsibilities for HIV. This study provides a narrative review and analytic framework for understanding the process by which harmonization of CHW programs might be achieved and for testing the assumption that harmonization is needed to improve CHW performance.

  17. Integrating research tools to support the management of social-ecological systems under climate change

    USGS Publications Warehouse

    Miller, Brian W.; Morisette, Jeffrey T.

    2014-01-01

    Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.

  18. A Digital Mixed Methods Research Design: Integrating Multimodal Analysis with Data Mining and Information Visualization for Big Data Analytics

    ERIC Educational Resources Information Center

    O'Halloran, Kay L.; Tan, Sabine; Pham, Duc-Son; Bateman, John; Vande Moere, Andrew

    2018-01-01

    This article demonstrates how a digital environment offers new opportunities for transforming qualitative data into quantitative data in order to use data mining and information visualization for mixed methods research. The digital approach to mixed methods research is illustrated by a framework which combines qualitative methods of multimodal…

  19. Learning on the Job: A Cultural Historical Activity Theory Approach to Initial Teacher Education across Four Secondary School Subject Departments

    ERIC Educational Resources Information Center

    Douglas, Alaster Scott

    2011-01-01

    This article considers how one may integrate ethnographic data generation with research questions and an analytic framework that are strongly theoretically informed by Cultural Historical Activity Theory (CHAT). Generating data through participant observation of school-based, student teacher education activity and interviewing all those involved…

  20. Validation of the OpCost logging cost model using contractor surveys

    Treesearch

    Conor K. Bell; Robert F. Keefe; Jeremy S. Fried

    2017-01-01

    OpCost is a harvest and fuel treatment operations cost model developed to function as both a standalone tool and an integrated component of the Bioregional Inventory Originated Simulation Under Management (BioSum) analytical framework for landscape-level analysis of forest management alternatives. OpCost is an updated implementation of the Fuel Reduction Cost Simulator...

  1. Operationalizing the Learning Health Care System in an Integrated Delivery System

    PubMed Central

    Psek, Wayne A.; Stametz, Rebecca A.; Bailey-Davis, Lisa D.; Davis, Daniel; Darer, Jonathan; Faucett, William A.; Henninger, Debra L.; Sellers, Dorothy C.; Gerrity, Gloria

    2015-01-01

    Introduction: The Learning Health Care System (LHCS) model seeks to utilize sophisticated technologies and competencies to integrate clinical operations, research and patient participation in order to continuously generate knowledge, improve care, and deliver value. Transitioning from concept to practical application of an LHCS presents many challenges but can yield opportunities for continuous improvement. There is limited literature and practical experience available in operationalizing the LHCS in the context of an integrated health system. At Geisinger Health System (GHS) a multi-stakeholder group is undertaking to enhance organizational learning and develop a plan for operationalizing the LHCS system-wide. We present a framework for operationalizing continuous learning across an integrated delivery system and lessons learned through the ongoing planning process. Framework: The framework focuses attention on nine key LHCS operational components: Data and Analytics; People and Partnerships; Patient and Family Engagement; Ethics and Oversight; Evaluation and Methodology; Funding; Organization; Prioritization; and Deliverables. Definitions, key elements and examples for each are presented. The framework is purposefully broad for application across different organizational contexts. Conclusion: A realistic assessment of the culture, resources and capabilities of the organization related to learning is critical to defining the scope of operationalization. Engaging patients in clinical care and discovery, including quality improvement and comparative effectiveness research, requires a defensible ethical framework that undergirds a system of strong but flexible oversight. Leadership support is imperative for advancement of the LHCS model. Findings from our ongoing work within the proposed framework may inform other organizations considering a transition to an LHCS. PMID:25992388

  2. Semiclassical description of resonance-assisted tunneling in one-dimensional integrable models

    NASA Astrophysics Data System (ADS)

    Le Deunff, Jérémy; Mouchet, Amaury; Schlagheck, Peter

    2013-10-01

    Resonance-assisted tunneling is investigated within the framework of one-dimensional integrable systems. We present a systematic recipe, based on Hamiltonian normal forms, to construct one-dimensional integrable models that exhibit resonance island chain structures with accurately controlled sizes and positions of the islands. Using complex classical trajectories that evolve along suitably defined paths in the complex time domain, we construct a semiclassical theory of the resonance-assisted tunneling process. This semiclassical approach yields a compact analytical expression for tunneling-induced level splittings, which is found to be in very good agreement with the exact splittings obtained through numerical diagonalization.

  3. Transient Effects in Planar Solidification of Dilute Binary Alloys

    NASA Technical Reports Server (NTRS)

    Mazuruk, Konstantin; Volz, Martin P.

    2008-01-01

    The initial transient during planar solidification of dilute binary alloys is studied in the framework of the boundary integral method, which leads to a nonlinear Volterra integral equation as the governing equation. An analytical solution of this equation is obtained for the case of a constant growth rate, which recovers the well-known Tiller formula for the solute transient. The more physically relevant case of a constant temperature ramp-down has been studied both numerically and analytically. In particular, an asymptotic analytical solution is obtained for the initial transient behavior. A numerical technique to solve the nonlinear Volterra equation is developed, and the solution is obtained for a family of the governing parameters. Under rapid solidification conditions, growth rate spikes have been observed even for the infinite-kinetics model. When recirculating fluid flow is included in the analysis, the spike feature is dramatically diminished. Finally, we have investigated planar solidification with a fluctuating temperature field as a possible mechanism for the frequently observed solute trapping bands.
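The numerical strategy the abstract alludes to, marching a discretized Volterra equation forward in time, can be sketched for a generic linear second-kind equation u(t) = f(t) + ∫₀ᵗ K(t, s) u(s) ds. The actual solidification kernel is nonlinear and problem-specific and is not reproduced here; this is only the time-marching pattern:

```python
import math

def solve_volterra(f, K, t_max, n):
    """Trapezoidal forward march for u(t) = f(t) + int_0^t K(t, s) u(s) ds.
    At each step the implicit diagonal term is solved for algebraically."""
    h = t_max / n
    t = [i * h for i in range(n + 1)]
    u = [f(t[0])]                      # u(0) = f(0), since the integral vanishes
    for i in range(1, n + 1):
        acc = 0.5 * K(t[i], t[0]) * u[0]
        for j in range(1, i):
            acc += K(t[i], t[j]) * u[j]
        # Trapezoid rule gives u_i implicitly; isolate it on the left.
        u_i = (f(t[i]) + h * acc) / (1.0 - 0.5 * h * K(t[i], t[i]))
        u.append(u_i)
    return t, u

# Sanity check: u = 1 + int_0^t u ds is u' = u with u(0) = 1, i.e. u(t) = e^t.
t, u = solve_volterra(lambda s: 1.0, lambda ti, s: 1.0, 1.0, 200)
print(abs(u[-1] - math.e) < 1e-3)  # True
```

A nonlinear kernel (as in the solidification problem) replaces the algebraic solve at each step with a scalar root-find, but the forward-marching structure is the same.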

  4. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse, complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. This platform utilizes several enterprise-grade software design concepts and standards, such as an extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).

  5. A general framework for parametric survival analysis.

    PubMed

    Crowther, Michael J; Lambert, Paul C

    2014-12-30

    Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
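The combined analytic/numerical approach can be sketched in miniature: specify a flexible log-hazard function, integrate it numerically to obtain the cumulative hazard H(t), and recover survival as S(t) = exp(-H(t)). The sketch below (not the authors' Stata implementation) uses a Weibull log-hazard with illustrative parameters, because that case has a closed form and so doubles as a check; a restricted-cubic-spline log-hazard would slot into `log_hazard()` unchanged:

```python
import math

def log_hazard(t, lam=0.5, k=1.5):
    """Weibull log-hazard: log(k * lam * (lam * t)^(k - 1)).
    In the flexible framework this would be a spline in log(t) instead."""
    return math.log(k * lam) + (k - 1) * math.log(lam * t)

def cumulative_hazard(t, n=2000):
    """Trapezoidal integration of h(u) = exp(log_hazard(u)) over (0, t]."""
    h = t / n
    total = 0.0
    for i in range(1, n + 1):
        a, b = max((i - 1) * h, 1e-12), i * h   # avoid log(0) at the origin
        total += 0.5 * h * (math.exp(log_hazard(a)) + math.exp(log_hazard(b)))
    return total

t = 2.0
S_numeric = math.exp(-cumulative_hazard(t))
S_exact = math.exp(-(0.5 * t) ** 1.5)   # Weibull survival, closed form
print(abs(S_numeric - S_exact) < 1e-3)  # True
```

The point made in the abstract is that when part of H(t) is available analytically (e.g. beyond the boundary knots of a restricted cubic spline), only the remainder needs numerical quadrature, which speeds up and stabilizes likelihood evaluation.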

  6. The Ophidia framework: toward cloud-based data analytics for climate change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni

    2015-04-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amount of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), large scale data analytics on CMIP5 data in NetCDF format, Climate and Forecast (CF) convention compliant (ExArch). Two use cases regarding the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects will be presented during the talk. In the former case (EUBrazilCC) the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter one (OFIDIA) the data analytics framework is being exploited to provide operational support regarding processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among the others, parallel data analysis, metadata management, virtual file system tasks, maps generation, rolling of datasets, import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24-nodes (16-cores/node) of the Athena HPC cluster. 
Moreover, a cloud-based release tested with OpenNebula is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.
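
    As a toy illustration of the kind of server-side datacube operator described above, the sketch below averages a (time, lat, lon) cube over its time axis. Real Ophidia operators are MPI-parallel and array-based; this serial stand-in, with invented data, only mirrors the semantics of a time reduction.

```python
# Reduce a (time, lat, lon) data cube along the time axis by averaging,
# mimicking the semantics of a datacube "reduce" primitive.
def reduce_time_mean(cube):
    nt = len(cube)
    nlat, nlon = len(cube[0]), len(cube[0][0])
    return [[sum(cube[t][i][j] for t in range(nt)) / nt
             for j in range(nlon)] for i in range(nlat)]

cube = [[[1.0, 2.0], [3.0, 4.0]],
        [[3.0, 6.0], [5.0, 8.0]]]   # (time=2, lat=2, lon=2)
print(reduce_time_mean(cube))       # [[2.0, 4.0], [4.0, 6.0]]
```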

  7. Multidisciplinary optimization in aircraft design using analytic technology models

    NASA Technical Reports Server (NTRS)

    Malone, Brett; Mason, W. H.

    1991-01-01

    An approach to multidisciplinary optimization is presented which combines the Global Sensitivity Equation method, parametric optimization, and analytic technology models. The result is a powerful yet simple procedure for identifying key design issues. It can be used both to investigate technology integration issues very early in the design cycle and to establish the information flow framework between disciplines for use in multidisciplinary optimization projects that use much more computationally intensive representations of each technology. To illustrate the approach, an examination of the optimization of a short-takeoff heavy transport aircraft is presented for numerous combinations of performance and technology constraints.

  8. Self-contained image mapping of placental vasculature in 3D ultrasound-guided fetoscopy.

    PubMed

    Yang, Liangjing; Wang, Junchen; Ando, Takehiro; Kubota, Akihiro; Yamashita, Hiromasa; Sakuma, Ichiro; Chiba, Toshio; Kobayashi, Etsuko

    2016-09-01

    Surgical navigation technology directed at fetoscopic procedures is relatively underdeveloped compared with other forms of endoscopy. The narrow fetoscopic field of view and the vast vascular network on the placenta make examination and photocoagulation treatment of twin-to-twin transfusion syndrome challenging. Though ultrasonography is used for intraoperative guidance, its navigational ability is not fully exploited. This work aims to integrate 3D ultrasound imaging and endoscopic vision seamlessly for placental vasculature mapping through a self-contained framework without external navigational devices. This is achieved through the development, integration, and experimental evaluation of novel navigational modules. First, a framework design that addresses the current limitations based on identified gaps is conceptualized. Second, navigational modules are integrated, including (1) ultrasound-based localization, (2) image alignment, and (3) vision-based tracking to update the scene texture map. This updated texture map is projected onto an ultrasound-constructed 3D model for photorealistic texturing of the 3D scene, creating a panoramic view of the moving fetoscope. In addition, a collaborative scheme for the integration of the modular workflow system is proposed to schedule updates in a systematic fashion. Finally, experiments are carried out to evaluate each modular variation and an integrated collaborative scheme of the framework. The modules and the collaborative scheme are evaluated through a series of phantom experiments with controlled trajectories for repeatability. The collaborative framework demonstrated the best accuracy (5.2% RMS error) compared with all three single-module variations during the experiment. Validation on an ex vivo monkey placenta shows visual continuity of the freehand fetoscopic panorama. The proposed collaborative framework and the evaluation study of its variations provide analytical insights for effective integration of ultrasonography and endoscopy. This contributes to the development of navigation techniques in fetoscopic procedures and can potentially be extended to other applications in intraoperative imaging.

  9. Normal stress differences from Oldroyd 8-constant framework: Exact analytical solution for large-amplitude oscillatory shear flow

    NASA Astrophysics Data System (ADS)

    Saengow, C.; Giacomin, A. J.

    2017-12-01

    The Oldroyd 8-constant framework for continuum constitutive theory contains a rich diversity of popular special cases for polymeric liquids. In this paper, we use part of our exact solution for shear stress to arrive at unique exact analytical solutions for the normal stress difference responses to large-amplitude oscillatory shear (LAOS) flow. The nonlinearity of the polymeric liquids, triggered by LAOS, causes these responses at even multiples of the test frequency. We call responses at a frequency higher than twice the test frequency higher harmonics. We find the new exact analytical solutions to be compact and intrinsically beautiful. These solutions reduce to those of our previous work on the special case of the corotational Maxwell fluid. Our solutions also agree with our new truncated Goddard integral expansion for the special case of the corotational Jeffreys fluid. The limiting behaviors of these exact solutions also yield new explicit expressions. Finally, we use our exact solutions to see how η∞ affects the normal stress differences in LAOS.
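
    The even-harmonic structure noted above can be illustrated with a toy calculation. The sketch below uses the slow-flow (second-order fluid) approximation N1 ~ Psi1 * gdot(t)**2, not the paper's exact Oldroyd 8-constant solution, and Fourier-decomposes the response to show that only the zeroth and second harmonics of the test frequency appear.

```python
import cmath, math

def harmonic_amplitudes(signal, nmax):
    """Magnitude of DFT coefficients 0..nmax over one full period."""
    N = len(signal)
    return [abs(sum(signal[k] * cmath.exp(-2j * math.pi * n * k / N)
                    for k in range(N))) / N for n in range(nmax + 1)]

# One period of N1(t) = Psi1 * (g0 * w * cos(w t))**2: squaring the
# shear rate yields a mean term plus a 2w term -- even harmonics only.
g0, w, psi1 = 1.0, 1.0, 2.0
N = 256
n1 = [psi1 * (g0 * w * math.cos(2 * math.pi * k / N)) ** 2 for k in range(N)]
amps = harmonic_amplitudes(n1, 4)
print([round(a, 6) for a in amps])  # [1.0, 0.0, 0.5, 0.0, 0.0]
```

    The exact solutions in the paper add higher even harmonics (fourth, sixth, ...) whose amplitudes depend on the Oldroyd constants; the quadratic toy model above produces only the mean and second harmonic.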

  10. Cross-Layer Modeling Framework for Energy-Efficient Resilience

    DTIC Science & Technology

    2014-04-01

    functional block diagram of the software architecture of PEARL, which stands for: Power Efficient and Resilient Embedded Processing with Real-Time ... (DVFS). The goal of the run-time manager is to minimize power consumption, while maintaining system resilience targets (on average) and meeting ... real-time performance targets. The integrated performance, power and resilience models are nothing but the analytical modeling toolkit described in

  11. FIA BioSum: a tool to evaluate financial costs, opportunities and effectiveness of fuel treatments.

    Treesearch

    Jeremy Fried; Glenn Christensen

    2004-01-01

    FIA BioSum, a tool developed by the USDA Forest Service's Forest Inventory and Analysis (FIA) Program, generates reliable cost estimates, identifies opportunities, and evaluates the effectiveness of fuel treatments in forested landscapes. BioSum is an analytic framework that integrates a suite of widely used computer models with a foundation of attribute-rich,...

  12. Unleashing Our Untapped Domestic Collection is the Key to Prevention

    DTIC Science & Technology

    2007-09-01

    Information Center (NCIC), Uniform Crime Reporting (UCR), and Integrated Automated Fingerprint Identification System (IAFIS) fingerprint ... The Blue Ocean Strategy Canvas, as described by Kim and Mauborgne, is an analytical framework that is both diagnostic and action oriented ... The authors argue the value of a strategy canvas is its ability to capture the current state, provide an understanding of various factors impacting

  13. Primary care and reform of health systems: a framework for the analysis of Latin American experiences.

    PubMed

    Frenk, J; González-Block, M A

    1992-03-01

    The article first proposes a framework within which to assess the potential of health sector reforms in Latin America for primary health care (PHC). Two dimensions are recognized: the scope and content of the reforms, and the means of participation that are put into play. This framework is then complemented by a critique of the often-sought but little-analyzed PHC reform strategies of decentralization and health sector integration. The analytical framework is next directed to the financing of health services, a chief aspect of any reform aiming toward PHC. Two facets of health service finance are distinguished: its formal aspect as a means for economic subsistence and growth, and its substantive aspect as a means to promote the rational use of services and thus the improvement of health. Once finance is understood in this microeconomic perspective, the focus shifts to the analysis of health care reforms at the macro, health policy level. The article concludes by positing that PHC is in essence a new health care paradigm, oriented by the values of universality, redistribution, integration, plurality, quality, and efficiency.

  14. Biosimilars: The US Regulatory Framework.

    PubMed

    Christl, Leah A; Woodcock, Janet; Kozlowski, Steven

    2017-01-14

    With the passage of the Biologics Price Competition and Innovation Act of 2009, the US Food and Drug Administration established an abbreviated pathway for developing and licensing biosimilar and interchangeable biological products. The regulatory framework and the technical requirements of the US biosimilars program involve a stepwise approach that relies heavily on analytical methods to demonstrate through a "totality of the evidence" that a proposed product is biosimilar to its reference product. By integrating analytical, pharmacological, and clinical data, each of which has limitations, a high level of confidence can be reached regarding clinical performance. Although questions and concerns about the biosimilars pathway remain and may slow uptake, a robust scientific program has been put in place. With three biosimilars already licensed and numerous development programs under way, clinicians can expect to see many new biosimilars come onto the US market in the coming decade. [Note added in proof: Since the writing of this article, a fourth biosimilar has been approved.].

  15. Industrial Internet of Things-Based Collaborative Sensing Intelligence: Framework and Research Challenges.

    PubMed

    Chen, Yuanfang; Lee, Gyu Myoung; Shu, Lei; Crespi, Noel

    2016-02-06

    The development of an efficient and cost-effective solution to a complex problem (e.g., dynamic detection of toxic gases) is an important research issue in industrial applications of the Internet of Things (IoT). An industrial intelligent ecosystem enables the collection of massive data from various devices (e.g., sensor-embedded wireless devices) dynamically collaborating with humans. Effective collaborative analytics based on the massive data collected from humans and devices is essential to improving the efficiency of industrial production and services. In this study, we propose a collaborative sensing intelligence (CSI) framework that combines collaborative intelligence and industrial sensing intelligence. The proposed CSI facilitates cooperative analytics by integrating massive spatio-temporal data from different sources and time points. The key challenges and open issues in deploying the CSI for intelligent and efficient industrial production and services are also discussed.

  17. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources, such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, and Data Base Management Systems (DBMS), in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, a general one and an aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improving the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
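
    The AHP step of such an evaluation can be sketched as follows. The snippet is a generic illustration, not the authors' implementation, and the pairwise-comparison matrix is invented: criterion weights are derived as the principal eigenvector of the matrix via power iteration, and Saaty's consistency ratio checks whether the judgments are acceptably coherent.

```python
def ahp_weights(M, iters=100):
    """Principal eigenvector of a pairwise-comparison matrix
    (power iteration), normalized to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

def consistency_ratio(M, w):
    """CR = CI / RI; judgments are conventionally acceptable if CR < 0.1."""
    n = len(M)
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n          # estimate of lambda_max
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random indices
    return ci / ri

M = [[1, 3, 5],                              # invented judgments over
     [1 / 3, 1, 3],                          # three framework criteria
     [1 / 5, 1 / 3, 1]]
w = ahp_weights(M)
print(w, consistency_ratio(M, w))
```

    In the paper's approach, weights like these feed the importance side of the QFD matrices; here they simply rank the three hypothetical criteria.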

  18. From an Analytical Framework for Understanding the Innovation Process in Higher Education to an Emerging Research Field of Innovations in Higher Education

    ERIC Educational Resources Information Center

    Cai, Yuzhuo

    2017-01-01

    While studies dealing with issues related to innovations in higher education proliferate, there has been little consensus on key concepts and central issues for research. To respond to the challenges, this paper calls for developing a new research field--studies on innovations in higher education, by integrating two disciplines, namely innovation…

  19. Integrating the analytic hierarchy process and importance-performance analysis into ISO 14001 framework for assessing campus sustainability

    NASA Astrophysics Data System (ADS)

    Pramono, Susatyo N. W.; Ulkhaq, M. Mujiya; Trianto, Reza; Setiowati, Priska R.; Rasyida, Dyah R.; Setyorini, Nadia A.; Jauhari, Wakhid A.

    2017-11-01

    The role of higher education in promoting sustainability has become an emerging international issue, reflected in numerous declarations and commitments on the need for sustainability in higher education. As a result, an increasing number of higher education institutions have embarked on projects and initiatives to incorporate sustainability into their systems. Institutions can implement the ISO 14001 framework, which is recognized as a guide for organizations that aim to implement an environmental management system in pursuit of sustainability. This research extends previous work on assessing campus sustainability using the ISO 14001 framework by integrating the analytic hierarchy process (AHP) and importance-performance analysis (IPA). IPA is included because many organizations are constrained by limited resources and must decide how those resources are best deployed to attain their goals. The ISO 14001 self-assessment scores serve as the performance part of the IPA, and the AHP results as the importance part. A case study is conducted at Diponegoro University in Semarang, Indonesia. The results indicate that only two main elements of ISO 14001 are located in the second quadrant of the IPA, i.e., high performance and high importance. The results also provide a basis for the university to identify, prioritize, and improve sustainability-related programs and to ensure that valuable resources are allocated to the most effective areas.
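
    The IPA step can be sketched as a simple quadrant classification. In the snippet below (illustrative only: element names, scores, and quadrant labels follow the common IPA convention rather than the paper), each ISO 14001 element is placed by comparing its AHP importance and self-assessed performance against the respective means.

```python
def ipa_quadrants(items):
    """Classify (name, importance, performance) triples into the four
    classic IPA quadrants, splitting at the mean of each axis."""
    imp_mean = sum(i for _, i, _ in items) / len(items)
    perf_mean = sum(p for _, _, p in items) / len(items)
    out = {}
    for name, imp, perf in items:
        if imp >= imp_mean and perf >= perf_mean:
            out[name] = "keep up the good work"   # high imp, high perf
        elif imp >= imp_mean:
            out[name] = "concentrate here"        # high imp, low perf
        elif perf >= perf_mean:
            out[name] = "possible overkill"       # low imp, high perf
        else:
            out[name] = "low priority"
    return out

# Hypothetical ISO 14001 elements: (name, AHP weight, self-assessment)
items = [("policy", 0.30, 0.8), ("planning", 0.25, 0.4),
         ("operation", 0.25, 0.7), ("review", 0.20, 0.3)]
print(ipa_quadrants(items))
```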

  20. New York State energy-analytic information system: first-stage implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allentuck, J.; Carroll, O.; Fiore, L.

    1979-09-01

    So that energy policy by state government may be formulated within the constraints imposed by policy determined at the national level, yet reflect the diverse interests of its citizens, large quantities of data and sophisticated analytic capabilities are required. This report presents the design of an energy-information/analytic system for New York State, the data for a base year, 1976, and projections of these data. At the county level, 1976 energy supply-demand data and electric generating plant data are provided as well. Data-base management is based on System 2000. Three computerized models provide the system's basic analytic capacity. The Brookhaven Energy System Network Simulator provides an integrating framework, while a price-response model and a weather-sensitive energy demand model furnish a short-term energy response estimation capability. The operation of these computerized models is described. 62 references, 25 figures, 39 tables.

  1. A systematic review of socio-economic assessments in support of coastal zone management (1992-2011).

    PubMed

    Le Gentil, Eric; Mongruel, Rémi

    2015-02-01

    Cooperation between the social and natural sciences has become essential in order to encompass all the dimensions of coastal zone management. Socio-economic approaches are increasingly recommended to complement integrated assessment in support of these initiatives. A systematic review of the academic literature was carried out in order to analyze the main types of socio-economic assessments used to inform the coastal zone management process as well as their effectiveness. A corpus of 1682 articles published between 1992 and 2011 was identified by means of the representative coverage approach, from which 170 were selected by applying inclusion/exclusion criteria and then classified using a content analysis methodology. The percentage of articles that mention the use of socio-economic assessment in support of coastal zone management initiatives is increasing but remains relatively low. The review examines the links between the issues addressed by integrated assessments and the chosen analytical frameworks as well as the various economic assessment methods which are used in the successive steps of the coastal zone management process. The results show that i) analytical frameworks such as 'risk and vulnerability', 'DPSIR', 'valuation', 'ecosystem services' and 'preferences' are likely to lead to effective integration of social sciences in coastal zone management research while 'integration', 'sustainability' and 'participation' remain difficult to operationalize, ii) risk assessments are insufficiently implemented in developing countries, and iii) indicator systems in support of multi-criteria analyses could be used during more stages of the coastal zone management process. Finally, it is suggested that improved collaboration between science and management would require that scientists currently involved in coastal zone management processes further educate themselves in integrated assessment approaches and participatory methodologies. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Analytics4Action Evaluation Framework: A Review of Evidence-Based Learning Analytics Interventions at the Open University UK

    ERIC Educational Resources Information Center

    Rienties, Bart; Boroowa, Avinash; Cross, Simon; Kubiak, Chris; Mayles, Kevin; Murphy, Sam

    2016-01-01

    There is an urgent need to develop an evidence-based framework for learning analytics whereby stakeholders can manage, evaluate, and make decisions about which types of interventions work well and under which conditions. In this article, we will work towards developing a foundation of an Analytics4Action Evaluation Framework (A4AEF) that is…

  3. ToxPi Graphical User Interface 2.0: Dynamic exploration, visualization, and sharing of integrated data models.

    PubMed

    Marvel, Skylar W; To, Kimberly; Grimm, Fabian A; Wright, Fred A; Rusyn, Ivan; Reif, David M

    2018-03-05

    Drawing integrated conclusions from diverse source data requires synthesis across multiple types of information. The ToxPi (Toxicological Prioritization Index) is an analytical framework developed to enable the integration of multiple sources of evidence by transforming data into integrated, visual profiles. Methodological improvements have advanced ToxPi and expanded its applicability, necessitating a new, consolidated software platform that provides this functionality while preserving flexibility for future updates. We detail the implementation of a new graphical user interface for ToxPi that provides interactive visualization, analysis, reporting, and portability. The interface is deployed as a stand-alone, platform-independent Java application, with a modular design to accommodate future analytics. The new ToxPi interface introduces several features, from flexible data import formats (including legacy formats that permit backward compatibility) to similarity-based clustering and options for high-resolution graphical output. We present the new ToxPi interface for dynamic exploration, visualization, and sharing of integrated data models. The ToxPi interface is freely available from http://toxpi.org as a single compressed download that includes the main Java executable, all libraries, example data files, and a complete user manual.
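
    The integration idea behind ToxPi, weighting normalized data slices into a single profile score, can be sketched as below. This is a simplification of the actual ToxPi scoring, not the tool's implementation, and the slice names, weights, and values are invented for illustration.

```python
def toxpi(profiles, weights):
    """Scale each data slice to [0, 1] across all entities, then score
    each entity as the weight-normalized sum of its scaled slices."""
    slices = list(weights)
    lo = {s: min(p[s] for p in profiles.values()) for s in slices}
    hi = {s: max(p[s] for p in profiles.values()) for s in slices}
    total_w = sum(weights.values())
    scores = {}
    for name, p in profiles.items():
        scaled = {s: (p[s] - lo[s]) / (hi[s] - lo[s]) if hi[s] > lo[s] else 0.0
                  for s in slices}
        scores[name] = sum(weights[s] * scaled[s] for s in slices) / total_w
    return scores

# Hypothetical evidence slices for two chemicals
profiles = {"chemA": {"in_vitro": 0.9, "exposure": 0.2, "qsar": 0.5},
            "chemB": {"in_vitro": 0.1, "exposure": 0.8, "qsar": 0.5}}
weights = {"in_vitro": 2.0, "exposure": 1.0, "qsar": 1.0}
print(toxpi(profiles, weights))
```

    In the GUI, each weighted slice becomes a wedge of the circular profile, so the same numbers drive both the ranking and the visual comparison.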

  4. An Introduction to MAMA (Meta-Analysis of MicroArray data) System.

    PubMed

    Zhang, Zhe; Fenstermacher, David

    2005-01-01

    Analyzing microarray data across multiple experiments has proven advantageous. To support this kind of analysis, we are developing a software system called MAMA (Meta-Analysis of MicroArray data). MAMA utilizes a client-server architecture with a relational database on the server side for the storage of microarray datasets collected from various resources. The client side is an application running on the end user's computer that allows the user to manipulate microarray data and analytical results locally. The MAMA implementation will integrate several analytical methods, including meta-analysis, within an open-source framework, offering other developers the flexibility to plug in additional statistical algorithms.

  5. Episodic Laryngeal Breathing Disorders: Literature Review and Proposal of Preliminary Theoretical Framework.

    PubMed

    Shembel, Adrianna C; Sandage, Mary J; Verdolini Abbott, Katherine

    2017-01-01

    The purposes of this literature review were (1) to identify and assess frameworks for clinical characterization of episodic laryngeal breathing disorders (ELBD) and their subtypes, (2) to integrate concepts from these frameworks into a novel theoretical paradigm, and (3) to provide a preliminary algorithm to classify clinical features of ELBD for future study of its clinical manifestations and underlying pathophysiological mechanisms. This is a literature review. Peer-reviewed literature from 1983 to 2015 pertaining to models for ELBD was searched using Pubmed, Ovid, Proquest, Cochrane Database of Systematic Reviews, and Google Scholar. Theoretical models for ELBD were identified, evaluated, and integrated into a novel comprehensive framework. Consensus across three salient models provided a working definition and inclusionary criteria for ELBD within the new framework. Inconsistencies and discrepancies within the models provided an analytic platform for future research. Comparison among the three conceptual models, (1) irritable larynx syndrome, (2) dichotomous triggers, and (3) periodic occurrence of laryngeal obstruction, showed that the models uniformly consider ELBD to involve episodic laryngeal obstruction causing dyspnea. The models differed in their description of the source of dyspnea, in their inclusion of corollary behaviors, in their inclusion of other laryngeal-based behaviors (e.g., cough), and in the types of triggers. The proposed integrated theoretical framework for ELBD provides a preliminary systematic platform for the identification of key clinical feature patterns indicative of ELBD and associated clinical subgroups. This algorithmic paradigm should evolve with better understanding of this spectrum of disorders and its underlying pathophysiological mechanisms. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  6. Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus

    NASA Astrophysics Data System (ADS)

    Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

    Interactive data analytics play an increasingly vital role in generating new, critical insights into the complex dynamics of the energy-water nexus (EWN) and its interactions with climate variability and change. Integrating impacts, adaptation, and vulnerability (IAV) science with emerging and increasingly critical data science capabilities offers promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. When fully built out, the system will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, and dynamic time warping; support uncertainty visualization and the exploration of data provenance; and support machine learning discoveries to render diverse types of geospatial data and facilitate interactive analysis. Key components in the system architecture include NASA's WebWorldWind, the Globus toolkit, PostgreSQL, and other custom-built software modules.
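
    One of the toolbox methods listed, dynamic time warping, can be sketched in a few lines. This textbook version (not the platform's implementation) computes an elastic distance between two series, so that time-shifted but similarly shaped signals, e.g. two seasonal demand curves, compare as close.

```python
def dtw(a, b):
    """Dynamic time warping distance between two numeric sequences,
    filled in by the standard O(n*m) dynamic program."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

print(dtw([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]))  # 0.0: same shape, shifted
```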

  7. Integral Equations in Computational Electromagnetics: Formulations, Properties and Isogeometric Analysis

    NASA Astrophysics Data System (ADS)

    Lovell, Amy Elizabeth

    Computational electromagnetics (CEM) provides numerical methods to simulate electromagnetic waves interacting with their environment. Boundary integral equation (BIE) based methods, which solve Maxwell's equations in homogeneous or piecewise homogeneous media, are both efficient and accurate, especially for scattering and radiation problems. Development and analysis of electromagnetic BIEs have been a very active topic in CEM research. Indeed, there are still many open problems that need to be addressed or further studied. A short and important list includes (1) closed-form or quasi-analytical solutions to time-domain integral equations, (2) catastrophic cancellations at low frequencies, (3) ill-conditioning due to high mesh density, multi-scale discretization, and growing electrical size, and (4) lack of flexibility due to re-meshing when increasing numbers of forward numerical simulations are involved in the electromagnetic design process. This dissertation addresses several of these aspects of boundary integral equations in computational electromagnetics. The first contribution of the dissertation is to construct quasi-analytical solutions to time-dependent boundary integral equations using a direct approach. Direct inverse Fourier transform of the time-harmonic solutions is not stable due to the non-existence of the inverse Fourier transform of spherical Hankel functions. Using new addition theorems for the time-domain Green's function and dyadic Green's functions, time-domain integral equations governing transient scattering problems of spherical objects are solved directly and stably for the first time. Additionally, the direct time-dependent solutions, together with the newly proposed time-domain dyadic Green's functions, can enrich the time-domain spherical multipole theory. The second contribution is to create a novel method of moments (MoM) framework to solve electromagnetic boundary integral equations on subdivision surfaces. The aim is to avoid the meshing and re-meshing stages and so accelerate the design process when the geometry needs to be updated. Two schemes to construct basis functions on the subdivision surface have been explored: one uses div-conforming basis functions, and the other creates a rigorous iso-geometric approach based on subdivision basis functions with better smoothness properties. This new framework provides better accuracy, more stability, and higher flexibility. The third contribution is a new stable integral equation formulation that avoids catastrophic cancellations due to low-frequency breakdown or dense-mesh breakdown. Many conventional integral equations and their associated post-processing operations suffer from numerical catastrophic cancellations, which can lead to ill-conditioning of the linear systems or serious accuracy problems; examples include low-frequency breakdown and dense-mesh breakdown. Another instability may come from nontrivial null spaces of the integral operators involved, which may be related to spurious resonance or topology breakdown. This dissertation presents several sets of new boundary integral equations and studies their analytical properties. The first proposed formulation leads to scalar boundary integral equations where only scalar unknowns are involved. Besides the requirements of gaining more stability and better conditioning in the resulting linear systems, multi-physics simulation is another driving force for new formulations; formulations based on scalar and vector potentials (rather than the electromagnetic fields) have been studied for this purpose. These contributions address different stages of the boundary integral equation workflow in a largely independent manner; e.g., the isogeometric analysis framework can be used to solve different boundary integral equations, and time-dependent solutions to integral equations from different formulations can be obtained through the same proposed methodology.

  8. Cultural unconscious in research: integrating multicultural and depth paradigms in qualitative research.

    PubMed

    Yakushko, Oksana; Miles, Pekti; Rajan, Indhushree; Bujko, Biljana; Thomas, Douglas

    2016-11-01

    Culturally focused research has gained momentum in many disciplines, including psychology. However, much of this research fails to pay attention to the unconscious dynamics that underlie the study of culture and culturally influenced human beings. Such dynamics may be especially significant when issues of marginalization and oppression are present. Therefore, this paper seeks to contribute a framework for understanding cultural dynamics, especially unconscious cultural dynamics, within depth psychological qualitative research influenced by Jungian and post-Jungian scholarship. Inquiry that is approached with a commitment to making the unconscious conscious seeks to empower and liberate not only the subject/object studied but also the researchers themselves. Following a brief review of multiculturalism in the context of analytically informed psychology, this paper offers several case examples that focus on researchers' integration of awareness of the cultural unconscious in their study of cultural beings and topics. © 2016, The Society of Analytical Psychology.

  9. A new analytical framework of 'continuum of prevention and care' to maximize HIV case detection and retention in care in Vietnam

    PubMed Central

    2012-01-01

    Background The global initiative ‘Treatment 2.0’ calls for expanding the evidence base of optimal HIV service delivery models to maximize HIV case detection and retention in care. However, limited systematic assessment has been conducted in countries with concentrated HIV epidemics. We aimed to assess HIV service availability and service connectedness in Vietnam. Methods We developed a new analytical framework of the continuum of prevention and care (COPC). Using the framework, we examined HIV service delivery in Vietnam. Specifically, we analyzed HIV service availability, including geographical distribution and decentralization, and service connectedness across multiple services and dimensions. We then identified system-related strengths and constraints in improving HIV case detection and retention in care. This was accomplished by reviewing related published and unpublished documents, including existing service delivery data. Results Identified strengths included: decentralized HIV outpatient clinics that offer comprehensive care at the district level, particularly in high HIV burden provinces; functional chronic care management for antiretroviral treatment (ART) with the involvement of people living with HIV and links to community- and home-based care; HIV testing and counseling integrated into tuberculosis and antenatal care services in districts supported by donor-funded projects; and extensive peer outreach networks that reduce barriers for the most-at-risk populations to access services. 
Constraints included: fragmented local coordination mechanisms for HIV-related health services; lack of systems to monitor the expansion of HIV outpatient clinics that offer comprehensive care; underdevelopment of pre-ART care; insufficient linkage from HIV testing and counseling to pre-ART care; inadequate access to HIV-related services in districts not supported by donor-funded projects, particularly in middle and low burden provinces and in mountainous remote areas; and no systematic monitoring of referral services. Conclusions Our COPC analytical framework was instrumental in identifying system-related strengths and constraints that contribute to HIV case detection and retention in care. The national HIV program plans to strengthen provincial programming by re-defining various service linkages and accelerating the transition from a project-based approach to integrated service delivery in line with the ‘Treatment 2.0’ initiative. PMID:23272730

  10. A new analytical framework of 'continuum of prevention and care' to maximize HIV case detection and retention in care in Vietnam.

    PubMed

    Fujita, Masami; Poudel, Krishna C; Do, Thi Nhan; Bui, Duc Duong; Nguyen, Van Kinh; Green, Kimberly; Nguyen, Thi Minh Thu; Kato, Masaya; Jacka, David; Cao, Thi Thanh Thuy; Nguyen, Thanh Long; Jimba, Masamine

    2012-12-29

    The global initiative 'Treatment 2.0' calls for expanding the evidence base of optimal HIV service delivery models to maximize HIV case detection and retention in care. However, limited systematic assessment has been conducted in countries with concentrated HIV epidemics. We aimed to assess HIV service availability and service connectedness in Vietnam. We developed a new analytical framework of the continuum of prevention and care (COPC). Using the framework, we examined HIV service delivery in Vietnam. Specifically, we analyzed HIV service availability, including geographical distribution and decentralization, and service connectedness across multiple services and dimensions. We then identified system-related strengths and constraints in improving HIV case detection and retention in care. This was accomplished by reviewing related published and unpublished documents, including existing service delivery data. Identified strengths included: decentralized HIV outpatient clinics that offer comprehensive care at the district level, particularly in high HIV burden provinces; functional chronic care management for antiretroviral treatment (ART) with the involvement of people living with HIV and links to community- and home-based care; HIV testing and counseling integrated into tuberculosis and antenatal care services in districts supported by donor-funded projects; and extensive peer outreach networks that reduce barriers for the most-at-risk populations to access services. 
Constraints included: fragmented local coordination mechanisms for HIV-related health services; lack of systems to monitor the expansion of HIV outpatient clinics that offer comprehensive care; underdevelopment of pre-ART care; insufficient linkage from HIV testing and counseling to pre-ART care; inadequate access to HIV-related services in districts not supported by donor-funded projects, particularly in middle and low burden provinces and in mountainous remote areas; and no systematic monitoring of referral services. Our COPC analytical framework was instrumental in identifying system-related strengths and constraints that contribute to HIV case detection and retention in care. The national HIV program plans to strengthen provincial programming by re-defining various service linkages and accelerating the transition from a project-based approach to integrated service delivery in line with the 'Treatment 2.0' initiative.

  11. 77 FR 47767 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security U.S. Customs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-10

    ... Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records AGENCY: Privacy Office... Homeland Security/U.S. Customs and Border Protection, DHS/CBP--017 Analytical Framework for Intelligence... Analytical Framework for Intelligence (AFI) System of Records'' from one or more provisions of the Privacy...

  12. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the task is challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, and the analytics require complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers, a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time and simplifying data analytical procedures for geoscientists. PMID:25742012

  13. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the task is challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, and the analytics require complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers, a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time and simplifying data analytical procedures for geoscientists.
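The MapReduce pattern at the core of the framework above can be illustrated with a minimal sketch: map each scattered observation to a grid-cell key, shuffle by key, then reduce each cell's values to a mean. The observations, cell size, and function names below are invented for illustration, not taken from the paper.

```python
from collections import defaultdict

def map_phase(observations, cell_size=1.0):
    """Map step: emit (grid_cell_key, value) pairs for each (lat, lon, value)."""
    for lat, lon, value in observations:
        key = (int(lat // cell_size), int(lon // cell_size))
        yield key, value

def shuffle(pairs):
    """Shuffle step: group all values by their cell key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce step: collapse each cell's value list to its mean."""
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

# three hypothetical point observations (lat, lon, value)
obs = [(10.2, 20.1, 1.0), (10.7, 20.9, 3.0), (11.5, 20.2, 5.0)]
cell_means = reduce_phase(shuffle(map_phase(obs)))
```

In a real deployment such as the one described, the map and reduce functions run in parallel across a cluster and the shuffle is performed by the MapReduce runtime rather than in-process.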

  14. A Scalable Data Integration and Analysis Architecture for Sensor Data of Pediatric Asthma.

    PubMed

    Stripelis, Dimitris; Ambite, José Luis; Chiang, Yao-Yi; Eckel, Sandrah P; Habre, Rima

    2017-04-01

    According to the Centers for Disease Control, there are 6.8 million children living with asthma in the United States. Despite the importance of the disease, the available prognostic tools are not sufficient for biomedical researchers to thoroughly investigate the potential risks of the disease at scale. To overcome these challenges, we present a big data integration and analysis infrastructure developed by our Data and Software Coordination and Integration Center (DSCIC) of the NIBIB-funded Pediatric Research using Integrated Sensor Monitoring Systems (PRISMS) program. Our goal is to help biomedical researchers efficiently predict and prevent asthma attacks. The PRISMS-DSCIC is responsible for collecting, integrating, storing, and analyzing real-time environmental, physiological and behavioral data obtained from heterogeneous sensor and traditional data sources. Our architecture is based on the Apache Kafka, Spark and Hadoop frameworks and the PostgreSQL DBMS. A main contribution of this work is extending the Spark framework with a mediation layer, based on logical schema mappings and query rewriting, to facilitate data analysis over a consistent harmonized schema. The system provides both batch and stream analytic capabilities over the massive data generated by wearable and fixed sensors.
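The mediation idea described above can be sketched in a few lines: per-source logical schema mappings rewrite heterogeneous sensor records into one harmonized schema, so analysis queries are phrased once. All field names, source names, and values below are invented examples, not the PRISMS schema.

```python
# Hypothetical per-source mappings: source field name -> harmonized field name.
MAPPINGS = {
    "sensor_a": {"pm25_ugm3": "pm25", "ts": "timestamp"},
    "sensor_b": {"fine_particulate": "pm25", "time_utc": "timestamp"},
}

def harmonize(source, record):
    """Rewrite a source-specific record into the harmonized schema."""
    mapping = MAPPINGS[source]
    return {target: record[field] for field, target in mapping.items()}

def query_pm25_above(records, threshold):
    """A query written once against the harmonized schema only."""
    return [r for r in records if r["pm25"] > threshold]

harmonized = [
    harmonize("sensor_a", {"pm25_ugm3": 12.0, "ts": "2017-01-01T00:00"}),
    harmonize("sensor_b", {"fine_particulate": 35.5, "time_utc": "2017-01-01T00:05"}),
]
high = query_pm25_above(harmonized, 25.0)
```

In the actual system this rewriting happens inside a Spark mediation layer over streaming and batch data; the sketch only shows the schema-mapping logic itself.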

  15. A tiered, integrated biological and chemical monitoring framework for contaminants of emerging concern in aquatic ecosystems.

    PubMed

    Maruya, Keith A; Dodder, Nathan G; Mehinto, Alvine C; Denslow, Nancy D; Schlenk, Daniel; Snyder, Shane A; Weisberg, Stephen B

    2016-07-01

    The chemical-specific risk-based paradigm that informs monitoring and assessment of environmental contaminants does not apply well to the many thousands of new chemicals that are being introduced into ambient receiving waters. We propose a tiered framework that incorporates bioanalytical screening tools and diagnostic nontargeted chemical analysis to more effectively monitor for contaminants of emerging concern (CECs). The framework is based on a comprehensive battery of in vitro bioassays to first screen for a broad spectrum of CECs and nontargeted analytical methods to identify bioactive contaminants missed by the currently favored targeted analyses. Water quality managers in California have embraced this strategy with plans to further develop and test this framework in regional and statewide pilot studies on waterbodies that receive discharge from municipal wastewater treatment plants and stormwater runoff. In addition to directly informing decisions, the data obtained using this framework can be used to construct and validate models that better predict CEC occurrence and toxicity. The adaptive interplay among screening results, diagnostic assessment and predictive modeling will allow managers to make decisions based on the most current and relevant information, instead of extrapolating from parameters with questionable linkage to CEC impacts. Integr Environ Assess Manag 2016;12:540-547. © 2015 SETAC.

  16. Spacecraft drag-free technology development: On-board estimation and control synthesis

    NASA Technical Reports Server (NTRS)

    Key, R. W.; Mettler, E.; Milman, M. H.; Schaechter, D. B.

    1982-01-01

    Estimation and control methods for a Drag-Free spacecraft are discussed. The functional and analytical synthesis of on-board estimators and controllers for an integrated attitude and translation control system is presented. The framework for detailed definition and design of the baseline drag-free system is created. The techniques for solution of self-gravity and electrostatic charging problems are generally applicable, as is the control system development.

  17. The Emergence of Compositional Communication in a Synthetic Ethology Framework

    DTIC Science & Technology

    2005-08-12

    "Integrating Language and Cognition: A Cognitive Robotics Approach", invited contribution to IEEE Computational Intelligence Magazine. The first two...papers address the main topic of investigation of the research proposal. In particular, we have introduced a simple structured meaning-signal mapping...Cavalli-Sforza (1982) to investigate analytically the evolution of structured communication codes. Let x ∈ [0,1] be the proportion of individuals in a

  18. Controlling air pollution in a city: A perspective from SOAR-PESTLE analysis.

    PubMed

    Gheibi, Mohammad; Karrabi, Mohsen; Mohammadi, Ali; Dadvar, Azin

    2018-04-16

    Strengths, opportunities, aspirations, and results (SOAR) analysis is a strategic planning framework that helps organizations focus on their current strengths and opportunities to create a vision of future aspirations and the results they will bring. PESTLE is an analytical framework for understanding external influences on a business. This research paper describes a field study and interviews of city hall managers from the city of Mashhad, Iran, conducted to investigate the application of SOAR and PESTLE frameworks for managing Mashhad's air pollution. Strategies are prioritized by the technique for order of preference by similarity to ideal solution (TOPSIS), Shannon entropy (SE), and analytic network process (ANP) multicriteria decision-making (MCDM) methods, considering economic conditions, managers' opinions, consensus, city council approvals, and national documents. The results of this research study show that creating centralized databases, supporting local governments, and developing smart city infrastructure, with weights of 0.194, 0.182, and 0.161, respectively, are the highest ranked strategies for managing air pollution in Mashhad. It can also be concluded that citizen involvement is key to achieving success in the employment of any management strategy. Integr Environ Assess Manag 2018;00:000-000. © 2018 SETAC.
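The TOPSIS step in the workflow above is mechanical enough to sketch compactly: vector-normalize each criterion, weight it, then rank alternatives by their closeness to the ideal solution. This sketch assumes all criteria are benefit criteria; the decision matrix and weights are invented, not the study's actual data.

```python
import math

def topsis(matrix, weights):
    """Rank alternatives by closeness to the ideal solution (benefit criteria)."""
    n_crit = len(weights)
    # vector-normalize each criterion column, then apply criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    ideal = [max(col) for col in zip(*v)]   # positive ideal solution
    anti = [min(col) for col in zip(*v)]    # negative ideal solution
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)       # distance to ideal
        d_neg = math.dist(row, anti)        # distance to anti-ideal
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient in [0, 1]
    return scores

# three hypothetical strategies scored on two criteria, weights summing to 1
scores = topsis([[7, 9], [8, 5], [5, 6]], weights=[0.6, 0.4])
best = scores.index(max(scores))
```

In the paper's setting, the criterion weights themselves come from Shannon entropy and ANP rather than being fixed by hand as they are here.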

  19. Integrated pathway-based transcription regulation network mining and visualization based on gene expression profiles.

    PubMed

    Kibinge, Nelson; Ono, Naoaki; Horie, Masafumi; Sato, Tetsuo; Sugiura, Tadao; Altaf-Ul-Amin, Md; Saito, Akira; Kanaya, Shigehiko

    2016-06-01

    Conventionally, workflows examining transcription regulation networks from gene expression data involve distinct analytical steps. There is a need for pipelines that unify data mining and inference deduction into a singular framework to enhance interpretation and hypothesis generation. We propose a workflow that merges network construction with gene expression data mining, focusing on regulation processes in the context of transcription factor driven gene regulation. The pipeline implements pathway-based modularization of expression profiles into functional units to improve biological interpretation. The integrated workflow was implemented as a web application (TransReguloNet) with functions that enable pathway visualization and comparison of transcription factor activity between sample conditions defined in the experimental design. The pipeline merges differential expression, network construction, pathway-based abstraction, clustering and visualization. The framework was applied in analysis of actual expression datasets related to lung, breast and prostate cancer. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. A proposed framework for the interpretation of biomonitoring data

    PubMed Central

    Boogaard, Peter J; Money, Chris D

    2008-01-01

    Biomonitoring, the determination of chemical substances in human body fluids or tissues, is applied more and more frequently. At the same time, detection limits are decreasing steadily. As a consequence, many data with potential relevance for public health are generated, although they do not necessarily allow interpretation in terms of health relevance. The European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) formed a dedicated task force to build a framework for the interpretation of biomonitoring data. The framework that was developed evaluates biomonitoring data based on their analytical integrity, their ability to describe dose (toxicokinetics), their ability to relate to effects, and an overall evaluation and weight-of-evidence analysis. This framework was subsequently evaluated with a number of case studies and was shown to provide a rational basis to advance discussions on human biomonitoring, allowing better use and application of this type of data in human health risk assessment. PMID:18541066

  1. Critical thinking: a two-phase framework.

    PubMed

    Edwards, Sharon L

    2007-09-01

    This article provides a comprehensive review of how a two-phase framework can promote and engage nurses in the concepts of critical thinking. Nurse education is required to integrate critical thinking into its teaching strategies, as critical thinking is widely recognised as an important part of student nurses becoming analytical, qualified practitioners. The two-phase framework can be incorporated in the classroom using enquiry-based scenarios, or used to investigate situations that arise from practice, for reflection, analysis, theorising or to explore issues. This paper proposes a two-phase framework for incorporation in the classroom and practice to promote critical thinking. Phase 1 attempts to make it easier for nurses to organise and expound the often complex and abstract ideas that arise when using critical thinking, and to identify more than one solution to a problem by using a variety of cues to facilitate action. Phase 2 encourages nurses to be accountable and responsible, to justify decisions, and to be creative and innovative in implementing change.

  2. Analyzing Data Generated Through Deliberative Dialogue: Bringing Knowledge Translation Into Qualitative Analysis.

    PubMed

    Plamondon, Katrina M; Bottorff, Joan L; Cole, Donald C

    2015-11-01

    Deliberative dialogue (DD) is a knowledge translation strategy that can serve to generate rich data and bridge health research with action. An intriguing alternative to other modes of generating data, the purposeful and evidence-informed conversations characteristic of DD generate data inclusive of collective interpretations. These data are thus dialogic, presenting complex challenges for qualitative analysis. In this article, we discuss the nature of data generated through DD, orienting ourselves toward a theoretically grounded approach to analysis. We offer an integrated framework for analysis, balancing analytical strategies of categorizing and connecting with the use of empathetic and suspicious interpretive lenses. In this framework, data generation and analysis occur in concert, alongside engaging participants and synthesizing evidence. An example of application is provided, demonstrating nuances of the framework. We conclude with reflections on the strengths and limitations of the framework, suggesting how it may be relevant in other qualitative health approaches. © The Author(s) 2015.

  3. Back-support large laser mirror unit: mounting modeling and analysis

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Zhang, Zheng; Long, Kai; Liu, Tianye; Li, Jun; Liu, Changchun; Xiong, Zhao; Yuan, Xiaodong

    2018-01-01

    In high-power laser systems, the surface wavefront of large optics is closely linked to the structural design and mounting method. The back-support transport mirror design is presently being investigated in China's high-power laser system as a means to hold the optical component firmly while minimizing the distortion of its reflecting surface. We have proposed a comprehensive analytical framework integrating numerical modeling and precise metrology for evaluation of the mirror's mounting performance, treating the surface distortion as a key decision variable. The combination of numerical simulation and field tests demonstrates that the comprehensive analytical framework provides a detailed and accurate approach to evaluating the performance of the transport mirror. It is also verified that the back-support transport mirror is effectively compatible with state-of-the-art optical quality specifications. This study will pave the way for future research to solidify the design of back-support large laser optics in China's next-generation inertial confinement fusion facility.

  4. An analytical framework for whole-genome sequence association studies and its implications for autism spectrum disorder.

    PubMed

    Werling, Donna M; Brand, Harrison; An, Joon-Yong; Stone, Matthew R; Zhu, Lingxue; Glessner, Joseph T; Collins, Ryan L; Dong, Shan; Layer, Ryan M; Markenscoff-Papadimitriou, Eirene; Farrell, Andrew; Schwartz, Grace B; Wang, Harold Z; Currall, Benjamin B; Zhao, Xuefang; Dea, Jeanselle; Duhn, Clif; Erdman, Carolyn A; Gilson, Michael C; Yadav, Rachita; Handsaker, Robert E; Kashin, Seva; Klei, Lambertus; Mandell, Jeffrey D; Nowakowski, Tomasz J; Liu, Yuwen; Pochareddy, Sirisha; Smith, Louw; Walker, Michael F; Waterman, Matthew J; He, Xin; Kriegstein, Arnold R; Rubenstein, John L; Sestan, Nenad; McCarroll, Steven A; Neale, Benjamin M; Coon, Hilary; Willsey, A Jeremy; Buxbaum, Joseph D; Daly, Mark J; State, Matthew W; Quinlan, Aaron R; Marth, Gabor T; Roeder, Kathryn; Devlin, Bernie; Talkowski, Michael E; Sanders, Stephan J

    2018-05-01

    Genomic association studies of common or rare protein-coding variation have established robust statistical approaches to account for multiple testing. Here we present a comparable framework to evaluate rare and de novo noncoding single-nucleotide variants, insertions/deletions, and all classes of structural variation from whole-genome sequencing (WGS). Integrating genomic annotations at the level of nucleotides, genes, and regulatory regions, we define 51,801 annotation categories. Analyses of 519 autism spectrum disorder families did not identify association with any categories after correction for 4,123 effective tests. Without appropriate correction, biologically plausible associations are observed in both cases and controls. Despite excluding previously identified gene-disrupting mutations, coding regions still exhibited the strongest associations. Thus, in autism, the contribution of de novo noncoding variation is probably modest in comparison to that of de novo coding variants. Robust results from future WGS studies will require large cohorts and comprehensive analytical strategies that consider the substantial multiple-testing burden.
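The category-wide testing logic described above can be illustrated with a simplified sketch: for each annotation category, compare de novo variant counts in cases versus controls with an exact two-sided binomial test (null hypothesis: a variant is equally likely to come from a case or a control), then apply a Bonferroni-style correction for the number of effective tests. The counts and category names are invented; this is not the study's actual statistical model, which is considerably richer.

```python
from math import comb

def binomial_two_sided(k, n, p=0.5):
    """Exact two-sided binomial p-value: sum of all outcomes as or less likely."""
    probs = [comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n + 1)]
    return sum(pr for pr in probs if pr <= probs[k] + 1e-12)

# hypothetical (case_count, control_count) per annotation category
categories = {"promoter": (30, 18), "utr3": (25, 24)}
n_effective_tests = 4123  # number of effective tests, as in the study above

significant = {}
for name, (cases, controls) in categories.items():
    p = binomial_two_sided(cases, cases + controls)
    if p < 0.05 / n_effective_tests:  # Bonferroni-corrected threshold
        significant[name] = p
```

With these toy counts nothing survives correction, mirroring the study's point that apparently plausible enrichments (30 vs. 18 looks suggestive) are expected by chance across thousands of categories.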

  5. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  6. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felberg, Lisa E.; Brookes, David H.; Yap, Eng-Hui

    2016-11-02

    We present the open-source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized Poisson-Boltzmann equation. The PB-AM software package includes the generation of output files appropriate for visualization using VMD, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to the larger group of scientists, educators and students who are familiar with the APBS framework.
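For context, the equation PB-AM solves analytically is the linearized Poisson-Boltzmann equation, which in the solvent region takes the screened (Yukawa) form; for the simplest case of a single sphere the classic Debye-Hückel solution is (standard notation, not PB-AM's internal conventions):

```latex
% Linearized PB equation in the solvent, \kappa = inverse Debye length
\nabla^{2}\phi(\mathbf{r}) = \kappa^{2}\,\phi(\mathbf{r})
% Debye-Hückel potential outside a sphere of radius a carrying charge q
\phi(r) = \frac{q}{4\pi\epsilon}\,\frac{e^{-\kappa (r-a)}}{(1+\kappa a)\,r},
\qquad r \ge a
```

PB-AM generalizes this single-sphere solution to many mutually polarizing spherical molecules via multipole expansions, which is what makes the fully analytical treatment possible.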

  7. Multiaxis sensing using metal organic frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talin, Albert Alec; Allendorf, Mark D.; Leonard, Francois

    2017-01-17

    A sensor device including a sensor substrate; and a thin film comprising a porous metal organic framework (MOF) on the substrate that presents more than one transduction mechanism when exposed to an analyte. A method including exposing a porous metal organic framework (MOF) on a substrate to an analyte; and identifying more than one transduction mechanism in response to the exposure to the analyte.

  8. The MOOC and Learning Analytics Innovation Cycle (MOLAC): A Reflective Summary of Ongoing Research and Its Challenges

    ERIC Educational Resources Information Center

    Drachsler, H.; Kalz, M.

    2016-01-01

    The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research in the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, the data collection and analytics…

  9. The full spectrum of climate change adaptation: testing an analytical framework in Tyrolean mountain agriculture (Austria).

    PubMed

    Grüneis, Heidelinde; Penker, Marianne; Höferl, Karl-Michael

    2016-01-01

    Our scientific view of climate change adaptation (CCA) is unsatisfying in many ways: it is often dominated by a modernistic perspective of planned pro-active adaptation, with a selective focus on measures directly responding to climate change impacts, and it is thus far from the real-life conditions of those who are actually affected by climate change. Farmers have to adapt to multiple changes simultaneously. Therefore, empirical climate change adaptation research also needs a more integrative perspective on real-life climate change adaptations. This must also consider "hidden" adaptations, which are not explicitly and directly motivated by CCA but actually contribute to the sector's adaptability to climate change. The aim of the present study is to develop and test an analytic framework that contributes to a broader understanding of CCA and bridges the gap between scientific expertise and practical action. The framework distinguishes three types of CCA according to their climate-related motivations: explicit adaptations, multi-purpose adaptations, and hidden adaptations. Although agriculture is among the sectors most affected by climate change, results from the case study of Tyrolean mountain agriculture show that climate change is ranked behind other more pressing "real-life challenges" such as changing agricultural policies or market conditions. We identified numerous hidden adaptations that make a valuable contribution when dealing with climate change impacts. We conclude that these hidden adaptations have to be considered not only to get an integrative and more realistic view of CCA; they also provide a great opportunity for linking adaptation strategies to farmers' realities.

  10. Uncertainty evaluation of nuclear reaction model parameters using integral and microscopic measurements. Covariances evaluation with CONRAD code

    NASA Astrophysics Data System (ADS)

    de Saint Jean, C.; Habert, B.; Archier, P.; Noguere, G.; Bernard, D.; Tommasi, J.; Blaise, P.

    2010-10-01

    In the [eV;MeV] energy range, modelling of neutron-induced reactions is based on nuclear reaction models with parameters. Estimation of covariances on cross sections or on nuclear reaction model parameters is a recurrent puzzle in nuclear data evaluation. Major breakthroughs have been requested by nuclear reactor physicists to assess proper uncertainties to be used in applications. In this paper, mathematical methods developed in the CONRAD code[2] are presented to explain the treatment of all types of uncertainties, including experimental ones (statistical and systematic), and to propagate them to nuclear reaction model parameters or cross sections. The marginalization procedure is thus exposed using analytical or Monte-Carlo solutions. Furthermore, one major drawback identified by reactor physicists is that integral or analytical experiments (reactor mock-ups or simple integral experiments, e.g. ICSBEP, …) were not taken into account sufficiently early in the evaluation process to remove discrepancies. In this paper, we describe a mathematical framework to properly take this kind of information into account.
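The Monte-Carlo flavor of the propagation described above can be sketched minimally: sample model parameters from their assumed uncertainties, evaluate the cross-section model on an energy grid for each sample, and estimate the resulting cross-section covariance matrix. The one-group model sigma(E) = a/sqrt(E) + b, the energy grid, and the parameter uncertainties below are all invented for illustration; CONRAD's actual models and marginalization are far more elaborate.

```python
import random

random.seed(0)  # deterministic for reproducibility

def sigma(E, a, b):
    """Toy cross-section model: 1/v term plus a constant background."""
    return a / E ** 0.5 + b

energies = [1.0, 10.0, 100.0]   # illustrative energy grid (eV)
a0, b0 = 5.0, 0.2               # best-estimate model parameters (invented)
sd_a, sd_b = 0.5, 0.05          # assumed 1-sigma parameter uncertainties

# propagate parameter uncertainty to the cross section by sampling
samples = []
for _ in range(5000):
    a = random.gauss(a0, sd_a)
    b = random.gauss(b0, sd_b)
    samples.append([sigma(E, a, b) for E in energies])

# sample mean and covariance of the cross section over the energy grid
n = len(samples)
means = [sum(s[j] for s in samples) / n for j in range(len(energies))]
cov = [[sum((s[i] - means[i]) * (s[j] - means[j]) for s in samples) / (n - 1)
        for j in range(len(energies))] for i in range(len(energies))]
```

Because the 1/v term dominates at low energy, the propagated variance is largest at the low end of the grid, illustrating how parameter covariances map into strongly energy-dependent cross-section covariances.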

  11. Nonlinear viscoelastic characterization of structural adhesives

    NASA Technical Reports Server (NTRS)

    Rochefort, M. A.; Brinson, H. F.

    1983-01-01

    Measurements of the nonlinear viscoelastic behavior of two adhesives, FM-73 and FM-300, are presented and discussed. Analytical methods to quantify the measurements are given and fitted into the framework of an accelerated testing and analysis procedure. The single-integral model used is shown to function well and is analogous to a time-temperature stress-superposition procedure (TTSSP). Advantages and disadvantages of the creep power-law method used in this study are given.
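The single-integral model referred to above is commonly written in Schapery's form with a power-law transient creep compliance; the standard textbook statement is shown here for orientation (notation ours, not necessarily the paper's exact fit):

```latex
% Schapery single-integral nonlinear viscoelastic representation
\varepsilon(t) = g_{0}\,D_{0}\,\sigma(t)
  + g_{1}\!\int_{0}^{t} \Delta D\big(\psi - \psi'\big)\,
    \frac{d\,[\,g_{2}\,\sigma(\tau)\,]}{d\tau}\,d\tau,
\qquad \Delta D(\psi) = D_{1}\,\psi^{n}
```

Here $D_0$ and $\Delta D$ are the instantaneous and transient creep compliances, $g_0$, $g_1$, $g_2$ are stress-dependent nonlinearizing parameters, and $\psi$ is the reduced time through which temperature-stress shifting (the TTSSP analogy mentioned above) enters.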

  12. Extracting Effective Higgs Couplings in the Golden Channel

    DOE PAGES

    Chen, Yi; Vega-Morales, Roberto

    2014-04-08

    Kinematic distributions in Higgs decays to four charged leptons, the so-called 'golden channel', are a powerful probe of the tensor structure of its couplings to neutral electroweak gauge bosons. In this study we construct the first part of a comprehensive analysis framework designed to maximize the information contained in this channel in order to perform direct extraction of the various possible Higgs couplings. We first complete an earlier analytic calculation of the leading-order fully differential cross sections for the golden channel signal and background to include the 4e and 4μ final states with interference between identical final states. We also examine the relative fractions of the different possible combinations of scalar-tensor couplings by integrating the fully differential cross section over all kinematic variables, and show various doubly differential spectra for both the signal and background. From these analytic expressions we then construct a 'generator level' analysis framework based on the maximum likelihood method. We then demonstrate the ability of our framework to perform multi-parameter extractions of all the possible effective couplings of a spin-0 scalar to pairs of neutral electroweak gauge bosons, including any correlations. Furthermore, this framework provides a powerful method for studying these couplings and can be readily adapted to include the relevant detector and systematic effects, which we demonstrate in an accompanying study to follow.

  13. Integrating uncertainty into public energy research and development decisions

    NASA Astrophysics Data System (ADS)

    Anadón, Laura Díaz; Baker, Erin; Bosetti, Valentina

    2017-05-01

    Public energy research and development (R&D) is recognized as a key policy tool for transforming the world's energy system in a cost-effective way. However, managing the uncertainty surrounding technological change is a critical challenge for designing robust and cost-effective energy policies. The design of such policies is particularly important if countries are going to both meet the ambitious greenhouse-gas emissions reductions goals set by the Paris Agreement and achieve the required harmonization with the broader set of objectives dictated by the Sustainable Development Goals. The complexity of informing energy technology policy requires, and is producing, a growing collaboration between different academic disciplines and practitioners. Three analytical components have emerged to support the integration of technological uncertainty into energy policy: expert elicitations, integrated assessment models, and decision frameworks. Here we review efforts to incorporate all three approaches to facilitate public energy R&D decision-making under uncertainty. We highlight emerging insights that are robust across elicitations, models, and frameworks, relating to the allocation of public R&D investments, and identify gaps and challenges that remain.
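As a minimal illustration of the decision-framework component, consider allocating a fixed R&D budget across two technologies whose elicited probability of success saturates with funding. All names and numbers below are hypothetical, not drawn from the paper:

```python
# Expected-benefit maximization over discrete budget allocations.
# techs maps name -> (max success probability, half-saturation funding k,
# benefit if the technology succeeds); all values illustrative.
BUDGET = 10
techs = {
    "solar": (0.8, 4.0, 100.0),
    "storage": (0.6, 2.0, 150.0),
}

def expected_benefit(alloc):
    total = 0.0
    for name, funding in alloc.items():
        p_max, k, benefit = techs[name]
        p = p_max * funding / (funding + k)   # saturating elicited curve
        total += p * benefit
    return total

# Brute-force search over integer splits of the budget
best = max(
    ({"solar": a, "storage": BUDGET - a} for a in range(BUDGET + 1)),
    key=expected_benefit,
)
print(best, round(expected_benefit(best), 2))
```

Expert elicitations supply the success-probability curves, integrated assessment models supply the benefit estimates, and the decision framework is the optimization over allocations.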

  14. xQuake: A Modern Approach to Seismic Network Analytics

    NASA Astrophysics Data System (ADS)

    Johnson, C. E.; Aikin, K. E.

    2017-12-01

    While seismic networks have expanded over the past few decades, and social needs for accurate and timely information have increased dramatically, approaches to the operational needs of both global and regional seismic observatories have been slow to adopt new technologies. This presentation describes the xQuake system, which provides a fresh approach to seismic network analytics based on complexity theory and an adaptive architecture of streaming connected microservices, as diverse data (picks, beams, and other data) flow into a final, curated catalog of events. The foundation for xQuake is the xGraph (executable graph) framework, which is essentially a self-organizing graph database. An xGraph instance provides both the analytics and the data storage capabilities at the same time. Much of the analytics, such as synthetic annealing in the detection process and an evolutionary programming approach for event evolution, draws from the recent GLASS 3.0 seismic associator developed by and for the USGS National Earthquake Information Center (NEIC). In some respects xQuake is reminiscent of the Earthworm system, in that it comprises processes interacting through store-and-forward rings; not surprising, as the first author was the lead architect of the original Earthworm project when it was known as "Rings and Things". While Earthworm components can easily be integrated into the xGraph processing framework, the architecture and analytics are more current (e.g., using a Kafka broker for store-and-forward rings). The xQuake system is being released under an unrestricted open source license to encourage and enable the seismic community's support in further development of its capabilities.

  15. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

    Modern printed wiring board (PWB) design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic, structured methodology.

  16. Comparative SWOT analysis of strategic environmental assessment systems in the Middle East and North Africa region.

    PubMed

    Rachid, G; El Fadel, M

    2013-08-15

    This paper presents a SWOT analysis of SEA systems in the Middle East and North Africa region through a comparative examination of the status, application and structure of existing systems based on country-specific legal, institutional and procedural frameworks. The analysis is coupled with the multi-attribute decision making (MADM) method within an analytical framework that involves both performance analysis based on predefined evaluation criteria and countries' self-assessment of their SEA systems through open-ended surveys. The results show a heterogeneous status with generally delayed progress, characterized by varied levels of weakness embedded in the legal and administrative frameworks and poor integration with the decision-making process. Capitalizing on available opportunities, the paper highlights measures to enhance the development and enactment of SEA in the region. Copyright © 2013 Elsevier Ltd. All rights reserved.
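The performance-analysis step of a MADM evaluation can be sketched as a weighted sum over predefined criteria. The country labels, criteria, weights, and scores below are hypothetical, not the study's data:

```python
# Weighted-sum multi-attribute scoring of SEA systems (illustrative only)
criteria_weights = {"legal": 0.4, "institutional": 0.35, "procedural": 0.25}

scores = {  # country label: criterion -> score on a 0-10 scale
    "A": {"legal": 7, "institutional": 5, "procedural": 6},
    "B": {"legal": 4, "institutional": 6, "procedural": 8},
    "C": {"legal": 8, "institutional": 7, "procedural": 5},
}

def weighted_score(country):
    return sum(criteria_weights[c] * s for c, s in scores[country].items())

# Rank the systems from strongest to weakest aggregate performance
ranking = sorted(scores, key=weighted_score, reverse=True)
print([(c, round(weighted_score(c), 2)) for c in ranking])
```

More elaborate MADM methods change how the weights and scores are derived, but the aggregation-and-ranking step has this shape.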

  17. Pattern search in multi-structure data: a framework for the next-generation evidence-based medicine

    NASA Astrophysics Data System (ADS)

    Sukumar, Sreenivas R.; Ainsworth, Keela C.

    2014-03-01

    With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. Addressing this need, we pose and answer the following questions: (i) How can we jointly analyze and explore measurement data in context with qualitative domain knowledge? (ii) How can we search for and hypothesize patterns (not known a priori) from such multi-structure data? (iii) How can we build predictive models by integrating weakly associated multi-relational multi-structure data? We propose a framework towards answering these questions. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.

  18. Lectindb: a plant lectin database.

    PubMed

    Chandra, Nagasuma R; Kumar, Nirmal; Jeyakani, Justin; Singh, Desh Deepak; Gowda, Sharan B; Prathima, M N

    2006-10-01

    Lectins, a class of carbohydrate-binding proteins, are now widely recognized to play a range of crucial roles in many cell-cell recognition events, triggering several important cellular processes. They encompass different members that are diverse in their sequences, structures, binding site architectures, quaternary structures, carbohydrate affinities, and specificities, as well as their larger biological roles and potential applications. It is not surprising, therefore, that the vast amount of experimental data on lectins available in the literature is so diverse that it becomes difficult and time-consuming, if not impossible, to comprehend the advances in various areas and obtain the maximum benefit. To achieve an effective use of all these data toward understanding lectin function and possible applications, an organization of these seemingly independent data into a common framework is essential. An integrated knowledge base (Lectindb, http://nscdb.bic.physics.iisc.ernet.in), together with appropriate analytical tools, has therefore been developed, initially for plant lectins, by collating and integrating diverse data. The database has been implemented using MySQL on a Linux platform and web-enabled using PERL-CGI and Java tools. Data for each lectin pertain to taxonomic, biochemical, domain architecture, molecular sequence, and structural details, as well as carbohydrate and hence blood group specificities. Extensive links have also been provided to relevant bioinformatics resources and analytical tools. The availability of diverse data integrated into a common framework is expected to be of high value not only for basic studies in lectin biology but also for pursuing several applications in biotechnology, immunology, and clinical practice, using these molecules.

  19. Analytical mass formula and nuclear surface properties in the ETF approximation. Part II: asymmetric nuclei

    NASA Astrophysics Data System (ADS)

    Aymard, François; Gulminelli, Francesca; Margueron, Jérôme

    2016-08-01

    We have recently addressed the problem of the determination of the nuclear surface energy for symmetric nuclei in the framework of the extended Thomas-Fermi (ETF) approximation using Skyrme functionals. We presently extend this formalism to the case of asymmetric nuclei and the question of the surface symmetry energy. We propose an approximate expression for the diffuseness and the surface energy. These quantities are analytically related to the parameters of the energy functional. In particular, the influence of the different equation of state parameters can be explicitly quantified. Detailed analyses of the different energy components (local/non-local, isoscalar/isovector, surface/curvature and higher order) are also performed. Our analytical solution of the ETF integral improves previous models and leads to a precision of better than 200 keV per nucleon in the determination of the nuclear binding energy for dripline nuclei.
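The volume/surface/symmetry decomposition the paper analyzes has a classic analytical ancestor in the Bethe-Weizsäcker liquid-drop formula. As a simple numerical illustration (standard textbook coefficients in MeV; this is NOT the paper's ETF functional):

```python
import math

# Bethe-Weizsacker semi-empirical mass formula with common textbook
# coefficients (MeV): volume, surface, Coulomb, asymmetry, pairing.
A_V, A_S, A_C, A_A, A_P = 15.75, 17.8, 0.711, 23.7, 11.18

def binding_energy(A, Z):
    N = A - Z
    # Pairing term: + for even-even, - for odd-odd, 0 otherwise
    if Z % 2 == 0 and N % 2 == 0:
        pairing = A_P / math.sqrt(A)
    elif Z % 2 == 1 and N % 2 == 1:
        pairing = -A_P / math.sqrt(A)
    else:
        pairing = 0.0
    return (A_V * A
            - A_S * A ** (2 / 3)                  # surface term
            - A_C * Z * (Z - 1) / A ** (1 / 3)    # Coulomb term
            - A_A * (A - 2 * Z) ** 2 / A          # symmetry term
            + pairing)

B = binding_energy(56, 26)   # 56Fe, an even-even nucleus
print(f"B/A for 56Fe = {B / 56:.2f} MeV")   # near the ~8.8 MeV maximum
```

The ETF approach derives analogues of the surface and surface-symmetry coefficients directly from the Skyrme energy functional rather than fitting them to masses.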

  20. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver.

    PubMed

    Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa

    2017-06-05

    We present the open-source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files appropriate for visualization using Visual Molecular Dynamics, a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators, and students who are more familiar with the APBS framework. © 2016 Wiley Periodicals, Inc.

  1. A proposed analytic framework for determining the impact of an antimicrobial resistance intervention.

    PubMed

    Grohn, Yrjo T; Carson, Carolee; Lanzas, Cristina; Pullum, Laura; Stanhope, Michael; Volkova, Victoriya

    2017-06-01

    Antimicrobial use (AMU) is increasingly threatened by antimicrobial resistance (AMR). The FDA is implementing risk mitigation measures promoting prudent AMU in food animals. Their evaluation is crucial: the AMU/AMR relationship is complex, and a suitable framework to analyze interventions is unavailable. Systems science analysis, depicting variables and their associations, would help integrate mathematics and epidemiology to evaluate the relationship, and would identify informative data and models with which to evaluate interventions. This report of the National Institute for Mathematical and Biological Synthesis AMR Working Group proposes a systems framework to address the methodological gap linking livestock AMU and AMR in foodborne bacteria. It could evaluate how AMU (and interventions) impact AMR. We will evaluate pharmacokinetic/pharmacodynamic modeling techniques for projecting AMR selection pressure on enteric bacteria. We study two methods: modeling phenotypic AMR changes in bacteria in the food supply, and evolutionary genotypic analyses determining the molecular changes underlying phenotypic AMR. Systems science analysis integrates these methods, showing how resistance in the food supply is explained by AMU and by concurrent factors influencing the whole system. This process is updated with data and techniques to improve prediction and inform improvements in AMU/AMR surveillance. Our proposed framework reflects both the AMR system's complexity and the desire for simple, reliable conclusions.

  2. EpiK: A Knowledge Base for Epidemiological Modeling and Analytics of Infectious Diseases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, S. M. Shamimul; Fox, Edward A.; Bisset, Keith

    Computational epidemiology seeks to develop computational methods to study the distribution and determinants of health-related states or events (including disease), and the application of this study to the control of diseases and other health problems. Recent advances in computing and data sciences have led to the development of innovative modeling environments to support this important goal. The datasets used to drive the dynamic models, as well as the data produced by these models, present unique challenges owing to their size, heterogeneity and diversity. These datasets form the basis of effective and easy-to-use decision support and analytical environments. As a result, it is important to develop scalable data management systems to store, manage and integrate these datasets. In this paper, we develop EpiK—a knowledge base that facilitates the development of decision support and analytical environments to support epidemic science. An important goal is to develop a framework that links the input as well as output datasets to facilitate effective spatio-temporal and social reasoning that is critical in planning and intervention analysis before and during an epidemic. The data management framework links modeling workflow data and its metadata using a controlled vocabulary. The metadata captures information about storage, the mapping between the linked model and the physical layout, and relationships to support services. EpiK is designed to support agent-based modeling and analytics frameworks—aggregate models can be seen as special cases and are thus supported. We use semantic web technologies to create a representation of the datasets that encapsulates both the location and the schema heterogeneity. The choice of RDF as a representation language is motivated by the diversity and growth of the datasets that need to be integrated. 
A query bank is developed—the queries capture a broad range of questions that can be posed and answered during a typical case study pertaining to disease outbreaks. The queries are constructed using the SPARQL Protocol and RDF Query Language (SPARQL) over EpiK. EpiK can hide schema and location heterogeneity while efficiently supporting queries that span the computational epidemiology modeling pipeline, from model construction to simulation output. Finally, we show that the performance of benchmark queries varies significantly with respect to the choice of hardware underlying the database and the resource description framework (RDF) engine.

  3. EpiK: A Knowledge Base for Epidemiological Modeling and Analytics of Infectious Diseases

    DOE PAGES

    Hasan, S. M. Shamimul; Fox, Edward A.; Bisset, Keith; ...

    2017-11-06

    Computational epidemiology seeks to develop computational methods to study the distribution and determinants of health-related states or events (including disease), and the application of this study to the control of diseases and other health problems. Recent advances in computing and data sciences have led to the development of innovative modeling environments to support this important goal. The datasets used to drive the dynamic models, as well as the data produced by these models, present unique challenges owing to their size, heterogeneity and diversity. These datasets form the basis of effective and easy-to-use decision support and analytical environments. As a result, it is important to develop scalable data management systems to store, manage and integrate these datasets. In this paper, we develop EpiK—a knowledge base that facilitates the development of decision support and analytical environments to support epidemic science. An important goal is to develop a framework that links the input as well as output datasets to facilitate effective spatio-temporal and social reasoning that is critical in planning and intervention analysis before and during an epidemic. The data management framework links modeling workflow data and its metadata using a controlled vocabulary. The metadata captures information about storage, the mapping between the linked model and the physical layout, and relationships to support services. EpiK is designed to support agent-based modeling and analytics frameworks—aggregate models can be seen as special cases and are thus supported. We use semantic web technologies to create a representation of the datasets that encapsulates both the location and the schema heterogeneity. The choice of RDF as a representation language is motivated by the diversity and growth of the datasets that need to be integrated. 
A query bank is developed—the queries capture a broad range of questions that can be posed and answered during a typical case study pertaining to disease outbreaks. The queries are constructed using the SPARQL Protocol and RDF Query Language (SPARQL) over EpiK. EpiK can hide schema and location heterogeneity while efficiently supporting queries that span the computational epidemiology modeling pipeline, from model construction to simulation output. Finally, we show that the performance of benchmark queries varies significantly with respect to the choice of hardware underlying the database and the resource description framework (RDF) engine.

  4. Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making - Proceedings of a Workshop

    USGS Publications Warehouse

    Hogan, Dianna; Arthaud, Greg; Pattison, Malka; Sayre, Roger G.; Shapiro, Carl

    2010-01-01

    The analytical framework for understanding ecosystem services in conservation, resource management, and development decisions is multidisciplinary, encompassing a combination of the natural and social sciences. This report summarizes a workshop on 'Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making,' which focused on the analytical process and on identifying research priorities for assessing ecosystem services, their production and use, their spatial and temporal characteristics, their relationship with natural systems, and their interdependencies. Attendees discussed research directions and solutions to key challenges in developing the analytical framework. The discussion was divided into two sessions: (1) the measurement framework: quantities and values, and (2) the spatial framework: mapping and spatial relationships. This workshop was the second of three preconference workshops associated with ACES 2008 (A Conference on Ecosystem Services): Using Science for Decision Making in Dynamic Systems. These three workshops were designed to explore the ACES 2008 theme on decision making and how the concept of ecosystem services can be more effectively incorporated into conservation, restoration, resource management, and development decisions. Preconference workshop 1, 'Developing a Vision: Incorporating Ecosystem Services into Decision Making,' was held on April 15, 2008, in Cambridge, MA. In preconference workshop 1, participants addressed what would have to happen for ecosystem services to be used more routinely and effectively in conservation, restoration, resource management, and development decisions, and they identified some key challenges in developing the analytical framework. 
Preconference workshop 3, 'Developing an Institutional Framework: Incorporating Ecosystem Services into Decision Making,' was held on October 30, 2008, in Albuquerque, NM; participants examined the relationship between the institutional framework and the use of ecosystem services in decision making.

  5. An integral equation-based numerical solver for Taylor states in toroidal geometries

    NASA Astrophysics Data System (ADS)

    O'Neil, Michael; Cerfon, Antoine J.

    2018-04-01

    We present an algorithm for the numerical calculation of Taylor states in toroidal and toroidal-shell geometries using an analytical framework developed for the solution to the time-harmonic Maxwell equations. Taylor states are a special case of what are known as Beltrami fields, or linear force-free fields. The scheme of this work relies on the generalized Debye source representation of Maxwell fields and an integral representation of Beltrami fields which immediately yields a well-conditioned second-kind integral equation. This integral equation has a unique solution whenever the Beltrami parameter λ is not a member of a discrete, countable set of resonances which physically correspond to spontaneous symmetry breaking. Several numerical examples relevant to magnetohydrodynamic equilibria calculations are provided. Lastly, our approach easily generalizes to arbitrary geometries, both bounded and unbounded, and of varying genus.
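A Beltrami field satisfies curl(B) = λB. The following sanity check verifies this numerically for a textbook linear force-free field, B = (sin λz, cos λz, 0), using central finite differences; it is an illustration of the defining property, not the paper's toroidal integral-equation solver:

```python
import math

lam = 2.0   # the Beltrami parameter lambda (illustrative value)

def B(x, y, z):
    # Simple analytic Beltrami field: curl(B) = lam * B holds exactly
    return (math.sin(lam * z), math.cos(lam * z), 0.0)

def curl(F, p, h=1e-6):
    def d(comp, axis):
        # Central-difference partial derivative of F[comp] w.r.t. axis
        q1, q2 = list(p), list(p)
        q1[axis] += h
        q2[axis] -= h
        return (F(*q1)[comp] - F(*q2)[comp]) / (2 * h)
    return (d(2, 1) - d(1, 2),   # dBz/dy - dBy/dz
            d(0, 2) - d(2, 0),   # dBx/dz - dBz/dx
            d(1, 0) - d(0, 1))   # dBy/dx - dBx/dy

p = (0.3, -0.2, 0.7)
residual = max(abs(c - lam * b) for c, b in zip(curl(B, p), B(*p)))
print(f"max |curl B - lam B| = {residual:.1e}")
```

The solver in the paper works the other way around: given λ and the geometry, it constructs a field with this property via a second-kind integral equation.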

  6. Cumulative biological impacts framework for solar energy projects in the California Desert

    USGS Publications Warehouse

    Davis, Frank W.; Kreitler, Jason R.; Soong, Oliver; Stoms, David M.; Dashiell, Stephanie; Hannah, Lee; Wilkinson, Whitney; Dingman, John

    2013-01-01

    This project developed analytical approaches, tools and geospatial data to support conservation planning for renewable energy development in the California deserts. Research focused on geographical analysis to avoid, minimize and mitigate the cumulative biological effects of utility-scale solar energy development. A hierarchical logic model was created to map the compatibility of new solar energy projects with current biological conservation values. The research indicated that the extent of compatible areas is much greater than the estimated land area required to achieve 2040 greenhouse gas reduction goals. Species distribution models were produced for 65 animal and plant species that were of potential conservation significance to the Desert Renewable Energy Conservation Plan process. These models mapped historical and projected future habitat suitability using 270-meter-resolution climate grids. The results were integrated into analytical frameworks to locate potential sites for offsetting project impacts and evaluating the cumulative effects of multiple solar energy projects. Examples applying these frameworks in the Western Mojave Desert ecoregion show the potential of these publicly available tools to assist regional planning efforts. Results also highlight the necessity to explicitly consider projected land use change and climate change when prioritizing areas for conservation and mitigation offsets. Project data, software and model results are all available online.

  7. Knowledge and power in the technology classroom: a framework for studying teachers and students in action

    NASA Astrophysics Data System (ADS)

    Danielsson, Anna T.; Berge, Maria; Lidar, Malena

    2018-03-01

    The purpose of this paper is to develop and illustrate an analytical framework for exploring how relations between knowledge and power are constituted in science and technology classrooms. In addition, the empirical purpose of this paper is to explore how disciplinary knowledge and knowledge-making are constituted in teacher-student interactions. In our analysis we focus on how instances of teacher-student interaction can be understood as simultaneously contributing to meaning-making and producing power relations. The analytical framework we have developed makes use of practical epistemological analysis in combination with a Foucauldian conceptualisation of power, assuming that privileging of educational content needs to be understood as integral to the execution of power in the classroom. The empirical data consist of video-recorded teaching episodes taken from a teaching sequence of three 1-h lessons in one Swedish technology classroom with sixteen 13-14-year-old students. In the analysis we have identified how different epistemological moves contribute to the normalisation and exclusion of knowledge as well as ways of knowledge-making. Further, by looking at how the teacher communicates what counts as (ir)relevant knowledge or (ir)relevant ways of acquiring knowledge we are able to describe what kind of technology student is made desirable in the analysed classroom.

  8. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management

    PubMed Central

    Convertino, Matteo; Valverde, L. James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi-Criteria Decision Analysis (MCDA) model that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerges from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. 
The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of threatened and endangered species. The PDA approach demonstrates the advantages of integrated, top-down management, versus bottom-up management approaches. PMID:23823331

  9. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management.

    PubMed

    Convertino, Matteo; Valverde, L James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi-Criteria Decision Analysis (MCDA) model that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerges from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. 
The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of threatened and endangered species. The PDA approach demonstrates the advantages of integrated, top-down management, versus bottom-up management approaches.
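The Pareto-frontier construction described above can be sketched as a non-domination filter: from candidate restoration portfolios scored on (cost, ecological benefit), keep only those that no other portfolio beats on both criteria. The portfolio names and numbers are hypothetical, not the study's data:

```python
# name: (cost, benefit); lower cost and higher benefit are preferred
portfolios = {
    "P1": (10, 30), "P2": (15, 55), "P3": (20, 50),
    "P4": (25, 70), "P5": (30, 72), "P6": (12, 25),
}

def dominates(b, a):
    """True if portfolio b dominates a: no worse on both criteria,
    strictly better on at least one."""
    (cb, bb), (ca, ba) = portfolios[b], portfolios[a]
    return cb <= ca and bb >= ba and (cb < ca or bb > ba)

# The Pareto frontier is the set of non-dominated portfolios
frontier = sorted(p for p in portfolios
                  if not any(dominates(q, p) for q in portfolios if q != p))
print(frontier)
```

In the full framework, the MCDA benefit scores and the climate-scenario predictions feed the (cost, benefit) pairs, and decision-makers choose among the frontier portfolios.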

  10. Development of a category 2 approach system model

    NASA Technical Reports Server (NTRS)

    Johnson, W. A.; Mcruer, D. T.

    1972-01-01

    An analytical model is presented which provides, as its primary output, the probability of a successful Category II approach. Typical applications are included using several example systems (manual and automatic) which are subjected to random gusts and deterministic wind shear. The primary purpose of the approach system model is to establish a structure containing the system elements, command inputs, disturbances, and their interactions in an analytical framework so that the relative effects of changes in the various system elements on precision of control and available margins of safety can be estimated. The model is intended to provide insight for the design and integration of suitable autopilot, display, and navigation elements; and to assess the interaction of such elements with the pilot/copilot.
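A minimal illustration of the model's probabilistic output: if the lateral touchdown dispersion under gusts and wind shear is Gaussian with mean mu and standard deviation sigma, the probability of touching down inside an allowable window [a, b] is Phi((b - mu)/sigma) - Phi((a - mu)/sigma). The numbers below are hypothetical, not the report's system parameters:

```python
import math

def Phi(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

mu, sigma = 5.0, 12.0   # wind shear biases the mean off centerline (ft)
a, b = -27.0, 27.0      # allowable lateral window at touchdown (ft)

p_success = Phi((b - mu) / sigma) - Phi((a - mu) / sigma)
print(f"P(successful approach) = {p_success:.3f}")
```

The full model composes such margins over several control variables (glideslope, localizer, sink rate) and over the pilot/autopilot/display loop, but each margin check reduces to a probability-in-window computation of this kind.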

  11. Analysis of System-Wide Investment in the National Airspace System: A Portfolio Analytical Framework and an Example

    NASA Technical Reports Server (NTRS)

    Bhadra, Dipasis; Morser, Frederick R.

    2006-01-01

    In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework for undertaking projects that may address some of the noted deficiencies. By drawing upon well-developed theories from corporate finance, an analytical framework is offered that can be used for choosing the FAA's investments, taking into account risk, expected returns, and inherent dependencies across NAS programs. The framework can be expanded to incorporate multiple assets and realistic parameter values in drawing an efficient risk-return frontier for the entire set of FAA investment programs.
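The risk-return frontier borrowed from corporate finance can be sketched for two programs: sweep the allocation weight and trace portfolio volatility against expected return. The return, risk, and correlation figures are hypothetical, not FAA program data:

```python
import math

# Two-asset mean-variance sketch (all numbers illustrative)
r1, s1 = 0.08, 0.20    # expected return / volatility, program 1
r2, s2 = 0.04, 0.08    # expected return / volatility, program 2
rho = 0.3              # correlation between program outcomes

frontier = []
for i in range(11):
    w = i / 10                                   # weight on program 1
    ret = w * r1 + (1 - w) * r2
    var = ((w * s1) ** 2 + ((1 - w) * s2) ** 2
           + 2 * w * (1 - w) * s1 * s2 * rho)    # portfolio variance
    frontier.append((round(math.sqrt(var), 4), round(ret, 4)))

min_risk = min(frontier)   # lowest-volatility (sigma, return) mix on the grid
print(min_risk)
```

With many programs the same variance formula generalizes to a covariance matrix, and the efficient frontier is traced by optimizing return at each risk level.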

  12. Integration among databases and data sets to support productive nanotechnology: Challenges and recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karcher, Sandra; Willighagen, Egon L.; Rumble, John

    Many groups within the broad field of nanoinformatics are already developing data repositories and analytical tools driven by their individual organizational goals. Integrating these data resources across disciplines and with non-nanotechnology resources can support multiple objectives by enabling the reuse of the same information. Integration can also serve as the impetus for novel scientific discoveries by providing the framework to support deeper data analyses. This article discusses current data integration practices in nanoinformatics and in comparable mature fields, as well as nanotechnology-specific challenges impacting data integration. Based on results from a nanoinformatics-community-wide survey, recommendations for achieving integration of existing operational nanotechnology resources are presented. Nanotechnology-specific data integration challenges, if effectively resolved, can foster the application and validation of nanotechnology within and across disciplines. This paper is one of a series of articles by the Nanomaterial Data Curation Initiative that address data issues such as data curation workflows, data completeness and quality, curator responsibilities, and metadata.

  13. Analytic hierarchy process helps select site for limestone quarry expansion in Barbados.

    PubMed

    Dey, Prasanta Kumar; Ramcharan, Eugene K

    2008-09-01

    Site selection is a key activity for quarry expansion to support cement production, and is governed by factors such as resource availability, logistics, costs, and socio-economic-environmental factors. Adequate consideration of all the factors facilitates both industrial productivity and sustainable economic growth. This study illustrates the site selection process that was undertaken for the expansion of limestone quarry operations to support cement production in Barbados. First, alternate sites with adequate resources to support a 25-year development horizon were identified. Second, technical and socio-economic-environmental factors were identified. Third, a database was developed for each site with respect to each factor. Fourth, a hierarchical model in an analytic hierarchy process (AHP) framework was developed. Fifth, the relative ranking of the alternate sites was derived through pairwise comparison at all levels and subsequent synthesis of the results across the hierarchy using computer software (Expert Choice). The study reveals that an integrated framework using the AHP can help select a site for the quarry expansion project in Barbados.
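The pairwise-comparison step of the AHP can be sketched in a few lines. This is a minimal illustration using the common row geometric-mean approximation to Saaty's eigenvector method; the comparison matrix below is invented, not the study's data.

```python
import math

def ahp_weights(M):
    """Approximate AHP priority weights from a pairwise comparison matrix
    using the row geometric-mean method (normalized to sum to 1)."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise comparisons of three candidate sites on one criterion:
# M[i][j] = strength of preference for site i over site j (Saaty's 1-9 scale),
# with reciprocals below the diagonal.
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(M)
print([round(x, 3) for x in w])  # weights sum to 1; the first site ranks highest
```

In a full AHP study these local weights would be synthesized across every level of the hierarchy, and a consistency ratio would be checked before accepting the judgments.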

  14. Metal-Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform.

    PubMed

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T; Ohodnicki, Paul R

    2018-02-23

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and high performance of nanoporous metal-organic framework (MOF)-based optical gas sensors, which enable detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive indices as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of etched optical fiber through a simple solution method which is critical for manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (

  15. An integrated WebGIS framework for volunteered geographic information and social media in soil and water conservation.

    PubMed

    Werts, Joshua D; Mikhailova, Elena A; Post, Christopher J; Sharp, Julia L

    2012-04-01

    Volunteered geographic information and social networking in a WebGIS has the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.

  16. An Integrated WebGIS Framework for Volunteered Geographic Information and Social Media in Soil and Water Conservation

    NASA Astrophysics Data System (ADS)

    Werts, Joshua D.; Mikhailova, Elena A.; Post, Christopher J.; Sharp, Julia L.

    2012-04-01

    Volunteered geographic information and social networking in a WebGIS has the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.

  17. Modelling electro-active polymers with a dispersion-type anisotropy

    NASA Astrophysics Data System (ADS)

    Hossain, Mokarram; Steinmann, Paul

    2018-02-01

    We propose a novel constitutive framework for electro-active polymers (EAPs) that can take into account anisotropy with a chain dispersion. Particle-filled EAPs have become promising candidates for enhanced actuation behaviour. Recent studies suggest that particle-filled EAPs, which can be cured under an electric field during manufacturing, do not necessarily form perfect anisotropic composites, but rather create composites with dispersed chains. Hence, in this contribution, an electro-mechanically coupled constitutive model is devised that considers the chain dispersion with a probability distribution function in an integral form. To obtain relevant quantities in discrete form, numerical integration over the unit sphere is utilized. Necessary constitutive equations are derived exploiting the basic laws of thermodynamics, resulting in a thermodynamically consistent formulation. To demonstrate the performance of the proposed electro-mechanically coupled framework, we analytically solve a non-homogeneous boundary value problem, the extension and inflation of an axisymmetric cylindrical tube under electro-mechanically coupled load. The results capture various electro-mechanical couplings with the formulation proposed for EAP composites.
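The numerical integration over the unit sphere mentioned in the abstract can be illustrated with a simple midpoint-rule quadrature on a latitude-longitude grid. This is only a sketch; production implementations of dispersion-type models typically use more efficient spherical quadrature schemes (e.g. Lebedev or Bazant-Oh points).

```python
import math

def sphere_average(f, n_theta=200, n_phi=200):
    """Approximate (1/4*pi) * integral of f(m) over the unit sphere,
    where m is a unit direction vector, using a midpoint lat-long grid."""
    total = 0.0
    dtheta, dphi = math.pi / n_theta, 2 * math.pi / n_phi
    for i in range(n_theta):
        theta = (i + 0.5) * dtheta
        for j in range(n_phi):
            phi = (j + 0.5) * dphi
            m = (math.sin(theta) * math.cos(phi),
                 math.sin(theta) * math.sin(phi),
                 math.cos(theta))
            # sin(theta) is the spherical surface-element weight
            total += f(m) * math.sin(theta) * dtheta * dphi
    return total / (4 * math.pi)

# Sanity check: for a uniform (isotropic) chain distribution, the averaged
# squared alignment of a chain direction with the z-axis is exactly 1/3.
avg = sphere_average(lambda m: m[2] ** 2)
print(round(avg, 4))
```

In a dispersion-type anisotropy model, `f` would be the probability density of chain orientations multiplied by the chain-level response, so the average yields the homogenized contribution of the dispersed fibre family.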

  18. Physical properties of biological entities: an introduction to the ontology of physics for biology.

    PubMed

    Cook, Daniel L; Bookstein, Fred L; Gennari, John H

    2011-01-01

    As biomedical investigators strive to integrate data and analyses across spatiotemporal scales and biomedical domains, they have recognized the benefits of formalizing languages and terminologies via computational ontologies. Although ontologies for biological entities (molecules, cells, organs) are well-established, there are no principled ontologies of the physical properties (energies, volumes, flow rates) of those entities. In this paper, we introduce the Ontology of Physics for Biology (OPB), a reference ontology of classical physics designed for annotating biophysical content of growing repositories of biomedical datasets and analytical models. The OPB's semantic framework, traceable to James Clerk Maxwell, encompasses modern theories of system dynamics and thermodynamics, and is implemented as a computational ontology that references available upper ontologies. In this paper we focus on the OPB classes that are designed for annotating physical properties encoded in biomedical datasets and computational models, and we discuss how the OPB framework will facilitate biomedical knowledge integration. © 2011 Cook et al.

  19. Screening of groundwater remedial alternatives for brownfield sites: a comprehensive method integrated MCDA with numerical simulation.

    PubMed

    Li, Wei; Zhang, Min; Wang, Mingyu; Han, Zhantao; Liu, Jiankai; Chen, Zhezhou; Liu, Bo; Yan, Yan; Liu, Zhu

    2018-06-01

    Brownfield site pollution and remediation is an urgent environmental issue worldwide. The screening and assessment of remedial alternatives is especially complex owing to the multiple criteria involved, spanning technique, economy, and policy. To help decision-makers select remedial alternatives efficiently, the criteria framework developed by the U.S. EPA is improved and a comprehensive method that integrates multiple criteria decision analysis (MCDA) with numerical simulation is presented in this paper. The criteria framework is modified and classified into three categories: qualitative, semi-quantitative, and quantitative criteria. The MCDA method AHP-PROMETHEE (analytical hierarchy process-preference ranking organization method for enrichment evaluation) is used to determine the priority ranking of the remedial alternatives, and a solute transport simulation is conducted to assess remedial efficiency. A case study is presented to demonstrate the screening method at a brownfield site in Cangzhou, northern China. The results show that the systematic method provides a reliable way to quantify the priority of the remedial alternatives.
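The PROMETHEE ranking step can be sketched as follows. This is a minimal PROMETHEE II illustration with the simplest ("usual") preference function; the alternatives, scores, and weights are invented for illustration, not the paper's data.

```python
def promethee_net_flows(scores, weights):
    """PROMETHEE II net outranking flows with the 'usual' preference function:
    P = 1 when alternative a strictly beats b on a criterion, else 0.
    scores[a][k] is alternative a's score on criterion k (all maximized)."""
    n = len(scores)

    def pi(a, b):
        # Weighted aggregated preference of a over b.
        return sum(w for w, sa, sb in zip(weights, scores[a], scores[b]) if sa > sb)

    flows = []
    for a in range(n):
        plus = sum(pi(a, b) for b in range(n) if b != a) / (n - 1)   # leaving flow
        minus = sum(pi(b, a) for b in range(n) if b != a) / (n - 1)  # entering flow
        flows.append(plus - minus)                                   # net flow
    return flows

# Hypothetical scores for three remedial alternatives on three criteria,
# with weights that could come from a prior AHP step.
scores = [[0.7, 0.5, 0.9],   # alternative A
          [0.6, 0.8, 0.4],   # alternative B
          [0.8, 0.3, 0.6]]   # alternative C
weights = [0.5, 0.3, 0.2]
print(promethee_net_flows(scores, weights))
```

The alternative with the highest net flow is ranked first; real applications would use richer preference functions (linear, Gaussian) with indifference and preference thresholds per criterion.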

  20. Using a fuzzy comprehensive evaluation method to determine product usability: A proposed theoretical framework.

    PubMed

    Zhou, Ronggang; Chan, Alan H S

    2017-01-01

    In order to compare existing usability data to ideal goals or to data for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments during the evaluation process. This paper presents a universal method of usability evaluation combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction. With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed using the fuzzy comprehensive evaluation model to characterize fuzzy human judgments. Then, with the use of AHP, the weights of the usability components were elicited from these experts. Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process.
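The aggregation step of a fuzzy comprehensive evaluation can be sketched as a weighted combination of a fuzzy membership matrix. The weights, appraisal grades, and membership degrees below are invented for illustration; they are not values from the paper.

```python
def fuzzy_comprehensive(weights, R):
    """Fuzzy comprehensive evaluation with the weighted-average operator:
    combine criterion weights W with a fuzzy membership matrix R
    (rows = criteria, columns = appraisal grades) as B = W . R,
    then normalize B so the grade memberships sum to 1."""
    grades = len(R[0])
    B = [sum(w * row[g] for w, row in zip(weights, R)) for g in range(grades)]
    total = sum(B)
    return [b / total for b in B]

# Hypothetical AHP-derived weights for effectiveness, efficiency, satisfaction.
weights = [0.5, 0.3, 0.2]
# Hypothetical membership degrees over grades (poor, fair, good),
# as might be elicited from an expert panel.
R = [[0.1, 0.3, 0.6],
     [0.2, 0.5, 0.3],
     [0.1, 0.4, 0.5]]
print([round(b, 3) for b in fuzzy_comprehensive(weights, R)])
```

The resulting vector gives the product's membership in each appraisal grade; a single usability index can then be read off, for example, as the grade with maximum membership or as a weighted score over the grades.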

  1. Modeling energy/economy interactions for conservation and renewable energy-policy analysis

    NASA Astrophysics Data System (ADS)

    Groncki, P. J.

    Energy policy and the implications for policy analysis and methodological tools are discussed. The evolution of one methodological approach is reported: the combined modeling system, its component models and their evolution in response to changing analytic needs, and the development of the integrated framework. The analyses performed over the past several years are summarized. The current philosophy behind energy policy is discussed and compared to recent history. Implications for current policy analysis and methodological approaches are drawn.

  2. Understanding the micro and macro politics of health: Inequalities, intersectionality & institutions - A research agenda.

    PubMed

    Gkiouleka, Anna; Huijts, Tim; Beckfield, Jason; Bambra, Clare

    2018-03-01

    This essay brings together intersectionality and institutional approaches to health inequalities, suggesting an integrative analytical framework that accounts for the complexity of the intertwined influence of both individual social positioning and institutional stratification on health. This essay therefore advances the emerging scholarship on the relevance of intersectionality to health inequalities research. We argue that intersectionality provides a strong analytical tool for an integrated understanding of health inequalities beyond the purely socioeconomic by addressing the multiple layers of privilege and disadvantage, including race, migration and ethnicity, gender and sexuality. We further demonstrate how integrating intersectionality with institutional approaches allows for the study of institutions as heterogeneous entities that impact on the production of social privilege and disadvantage beyond just socioeconomic (re)distribution. This leads to an understanding of the interaction of the macro and the micro facets of the politics of health. Finally, we set out a research agenda considering the interplay/intersections between individuals and institutions and involving a series of methodological implications for research - arguing that quantitative designs can incorporate an intersectional institutional approach. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. An information driven strategy to support multidisciplinary design

    NASA Technical Reports Server (NTRS)

    Rangan, Ravi M.; Fulton, Robert E.

    1990-01-01

    The design of complex engineering systems such as aircraft, automobiles, and computers is primarily a cooperative multidisciplinary design process involving interactions between several design agents. The common thread underlying this multidisciplinary design activity is the information exchange between the various groups and disciplines. The integrating component in such environments is the common data and the dependencies that exist between such data. This may be contrasted with classical multidisciplinary analysis problems where there is coupling between distinct design parameters. For example, these may be expressed as mathematically coupled relationships between aerodynamic and structural interactions in aircraft structures, between thermal and structural interactions in nuclear plants, and between control considerations and structural interactions in flexible robots. Such relationships provide analytically based frameworks leading to optimization problem formulations. However, in multidisciplinary design problems, information-based interactions become more critical. Often, the relationships between different design parameters are not amenable to analytical characterization. Under such circumstances, information-based interactions provide the best integration paradigm; that is, there is a need to model the data entities and their dependencies between design parameters originating from different design agents. The modeling of such data interactions and dependencies forms the basis for integrating the various design agents.

  4. Gating Mechanisms of Mechanosensitive Channels of Large Conductance, I: A Continuum Mechanics-Based Hierarchical Framework

    PubMed Central

    Chen, Xi; Cui, Qiang; Tang, Yuye; Yoo, Jejoong; Yethiraj, Arun

    2008-01-01

    A hierarchical simulation framework that integrates information from molecular dynamics (MD) simulations into a continuum model is established to study the mechanical response of mechanosensitive channel of large-conductance (MscL) using the finite element method (FEM). The proposed MD-decorated FEM (MDeFEM) approach is used to explore the detailed gating mechanisms of the MscL in Escherichia coli embedded in a palmitoyloleoylphosphatidylethanolamine lipid bilayer. In Part I of this study, the framework of MDeFEM is established. The transmembrane and cytoplasmic helices are taken to be elastic rods, the loops are modeled as springs, and the lipid bilayer is approximated by a three-layer sheet. The mechanical properties of the continuum components, as well as their interactions, are derived from molecular simulations based on atomic force fields. In addition, analytical closed-form continuum model and elastic network model are established to complement the MDeFEM approach and to capture the most essential features of gating. In Part II of this study, the detailed gating mechanisms of E. coli-MscL under various types of loading are presented and compared with experiments, structural model, and all-atom simulations, as well as the analytical models established in Part I. It is envisioned that such a hierarchical multiscale framework will find great value in the study of a variety of biological processes involving complex mechanical deformations such as muscle contraction and mechanotransduction. PMID:18390626

  5. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework.

    PubMed

    Teeguarden, Justin G; Tan, Yu-Mei; Edwards, Stephen W; Leonard, Jeremy A; Anderson, Kim A; Corley, Richard A; Kile, Molly L; Simonich, Staci M; Stone, David; Tanguay, Robert L; Waters, Katrina M; Harper, Stacey L; Williams, David E

    2016-05-03

    Driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the "systems approaches" used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. Aggregate exposure pathways offer an intuitive framework to organize exposure data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source to outcome continuum for more meaningful integration of exposure assessment and hazard identification. Together, the two frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decision making.

  6. Cumulative risk assessment for combined health effects from chemical and nonchemical stressors.

    PubMed

    Sexton, Ken; Linder, Stephen H

    2011-12-01

    Cumulative risk assessment is a science policy tool for organizing and analyzing information to examine, characterize, and possibly quantify combined threats from multiple environmental stressors. We briefly survey the state of the art regarding cumulative risk assessment, emphasizing challenges and complexities of moving beyond the current focus on chemical mixtures to incorporate nonchemical stressors, such as poverty and discrimination, into the assessment paradigm. Theoretical frameworks for integrating nonchemical stressors into cumulative risk assessments are discussed, the impact of geospatial issues on interpreting results of statistical analyses is described, and four assessment methods are used to illustrate the diversity of current approaches. Prospects for future progress depend on adequate research support as well as development and verification of appropriate analytic frameworks.

  7. Cumulative Risk Assessment for Combined Health Effects From Chemical and Nonchemical Stressors

    PubMed Central

    Linder, Stephen H.

    2011-01-01

    Cumulative risk assessment is a science policy tool for organizing and analyzing information to examine, characterize, and possibly quantify combined threats from multiple environmental stressors. We briefly survey the state of the art regarding cumulative risk assessment, emphasizing challenges and complexities of moving beyond the current focus on chemical mixtures to incorporate nonchemical stressors, such as poverty and discrimination, into the assessment paradigm. Theoretical frameworks for integrating nonchemical stressors into cumulative risk assessments are discussed, the impact of geospatial issues on interpreting results of statistical analyses is described, and four assessment methods are used to illustrate the diversity of current approaches. Prospects for future progress depend on adequate research support as well as development and verification of appropriate analytic frameworks. PMID:21551386

  8. ‘More health for the money’: an analytical framework for access to health care through microfinance and savings groups

    PubMed Central

    Saha, Somen

    2014-01-01

    The main contributors to inequities in health relate to widespread poverty. Health cannot be achieved without addressing the social determinants of health, and the answer does not lie in the health sector alone. One of the potential pathways to address vulnerabilities linked to poverty, social exclusion, and empowerment of women is aligning health programmes with empowerment interventions linked to access to capital through microfinance and self-help groups. This paper presents a framework to analyse combined health and financial interventions through microfinance programmes in reducing barriers to access health care. If properly designed and ethically managed, such integrated programmes can provide more health for the money spent on health care. PMID:25364028

  9. 'More health for the money': an analytical framework for access to health care through microfinance and savings groups.

    PubMed

    Saha, Somen

    2014-10-01

    The main contributors to inequities in health relate to widespread poverty. Health cannot be achieved without addressing the social determinants of health, and the answer does not lie in the health sector alone. One of the potential pathways to address vulnerabilities linked to poverty, social exclusion, and empowerment of women is aligning health programmes with empowerment interventions linked to access to capital through microfinance and self-help groups. This paper presents a framework to analyse combined health and financial interventions through microfinance programmes in reducing barriers to access health care. If properly designed and ethically managed, such integrated programmes can provide more health for the money spent on health care.

  10. A Discounting Framework for Choice With Delayed and Probabilistic Rewards

    PubMed Central

    Green, Leonard; Myerson, Joel

    2005-01-01

    When choosing between delayed or uncertain outcomes, individuals discount the value of such outcomes on the basis of the expected time to or the likelihood of their occurrence. In an integrative review of the expanding experimental literature on discounting, the authors show that although the same form of hyperbola-like function describes discounting of both delayed and probabilistic outcomes, a variety of recent findings are inconsistent with a single-process account. The authors also review studies that compare discounting in different populations and discuss the theoretical and practical implications of the findings. The present effort illustrates the value of studying choice involving both delayed and probabilistic outcomes within a general discounting framework that uses similar experimental procedures and a common analytical approach. PMID:15367080
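The hyperbola-like discounting function discussed in this literature is commonly written V = A / (1 + kD) for a delayed reward, with the odds against receipt, theta = (1 - p) / p, substituted for delay in the probabilistic case. A minimal sketch with hypothetical parameter values:

```python
def hyperbolic_value(amount, delay, k):
    """Hyperbolic discounting of a delayed reward: V = A / (1 + k*D)."""
    return amount / (1 + k * delay)

def probability_discounted_value(amount, p, h):
    """Same functional form for probabilistic rewards, with the odds against
    receipt, theta = (1 - p) / p, playing the role of delay: V = A / (1 + h*theta)."""
    theta = (1 - p) / p
    return amount / (1 + h * theta)

# A $100 reward delayed 30 days, with a hypothetical discount rate k = 0.05/day.
print(round(hyperbolic_value(100, 30, 0.05), 2))            # 40.0
# The same reward with a 50% chance of receipt (odds against = 1), h = 1.
print(round(probability_discounted_value(100, 0.5, 1.0), 2))  # 50.0
```

Fitting k and h separately to choice data is what allows researchers to test whether delay and probability discounting reflect a single underlying process or, as the review argues, distinct ones.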

  11. 77 FR 33683 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security, U.S. Customs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-07

    ... Border Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records AGENCY... Framework for Intelligence (AFI) System of Records'' and this proposed rulemaking. In this proposed... Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records.'' AFI enhances DHS's...

  12. Analytical Overview of the European and Russian Qualifications Frameworks with a Focus on Doctoral Degree Level

    ERIC Educational Resources Information Center

    Chigisheva, Oksana; Bondarenko, Anna; Soltovets, Elena

    2017-01-01

    The paper provides analytical insights into highly acute issues concerning preparation and adoption of Qualifications Frameworks being an adequate response to the growing interactions at the global labor market and flourishing of knowledge economy. Special attention is paid to the analyses of transnational Meta Qualifications Frameworks (A…

  13. Degrees of School Democracy: A Holistic Framework

    ERIC Educational Resources Information Center

    Woods, Philip A.; Woods, Glenys J.

    2012-01-01

    This article outlines an analytical framework that enables analysis of degrees of democracy in a school or other organizational setting. It is founded in a holistic conception of democracy, which is a model of working together that aspires to truth, goodness, and meaning and the participation of all. We suggest that the analytical framework can be…

  14. Metabolic Model-Based Integration of Microbiome Taxonomic and Metabolomic Profiles Elucidates Mechanistic Links between Ecological and Metabolic Variation.

    PubMed

    Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M; Young, Vincent B; Jansson, Janet K; Fredricks, David N; Borenstein, Elhanan

    2016-01-01

    Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites' abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. 
Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism.
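The core prediction step described above, estimating community-wide metabolic potential from taxon abundances and genome-derived capacities, can be caricatured in a few lines. This is a toy sketch with invented taxa, metabolites, and capacity scores, not the authors' model:

```python
def community_metabolic_potential(abundances, net_capacity):
    """Predict a community-level score for each metabolite as the
    abundance-weighted sum of each taxon's net capacity for it
    (synthesis potential minus degradation potential)."""
    return {
        met: sum(abundances[taxon] * cap for taxon, cap in taxa_caps.items())
        for met, taxa_caps in net_capacity.items()
    }

# Hypothetical relative abundances from 16S community profiling.
abundances = {"L_crispatus": 0.7, "G_vaginalis": 0.2, "P_bivia": 0.1}
# Hypothetical net capacities inferred from each taxon's genome content.
net_capacity = {
    "lactate":    {"L_crispatus": 1.0, "G_vaginalis": 0.2, "P_bivia": 0.0},
    "putrescine": {"L_crispatus": -0.1, "G_vaginalis": 0.5, "P_bivia": 0.8},
}
print(community_metabolic_potential(abundances, net_capacity))
```

In the actual framework these predicted potentials are compared (e.g. by correlation) against measured metabolite variation across samples, and metabolites whose predictions track or anti-track the measurements flag community-driven versus environmentally controlled metabolism.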

  15. Metabolic Model-Based Integration of Microbiome Taxonomic and Metabolomic Profiles Elucidates Mechanistic Links between Ecological and Metabolic Variation

    PubMed Central

    Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M.; Young, Vincent B.; Jansson, Janet K.; Fredricks, David N.

    2016-01-01

    ABSTRACT Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites’ abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. 
IMPORTANCE Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism. PMID:27239563

  16. MinOmics, an Integrative and Immersive Tool for Multi-Omics Analysis.

    PubMed

    Maes, Alexandre; Martinez, Xavier; Druart, Karen; Laurent, Benoist; Guégan, Sean; Marchand, Christophe H; Lemaire, Stéphane D; Baaden, Marc

    2018-06-21

    Proteomic and transcriptomic technologies have produced massive biological datasets whose interpretation requires sophisticated computational strategies. Efficient and intuitive real-time analysis remains challenging. We use proteomic data on 1417 proteins of the green microalga Chlamydomonas reinhardtii to investigate physicochemical parameters governing the selectivity of three cysteine-based redox post-translational modifications (PTMs): glutathionylation (SSG), nitrosylation (SNO) and disulphide bonds (SS) reduced by thioredoxins. We aim to understand the underlying molecular mechanisms and structural determinants through integration of redox proteome data from the gene to the structural level. Our interactive visual analytics approach on an 8.3 m² display wall of 25 MPixel resolution features stereoscopic three-dimensional (3D) representation performed by UnityMol WebGL. Virtual reality headsets complement the range of usage configurations for fully immersive tasks. Our experiments confirm that fast access to a rich cross-linked database is necessary for immersive analysis of structural data. We emphasize the possibility of displaying complex data structures and relationships in 3D, intrinsic to molecular structure visualization but less common for omics-network analysis. Our setup is powered by MinOmics, an integrated analysis pipeline and visualization framework dedicated to multi-omics analysis. MinOmics integrates data from various sources into a materialized physical repository. We evaluate its performance, a key design criterion for the framework.

  17. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  18. Multiple methods for multiple futures: Integrating qualitative scenario planning and quantitative simulation modeling for natural resource decision making

    USGS Publications Warehouse

    Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.

    2017-01-01

    Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.

  19. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.

  20. METEOR: An Enterprise Health Informatics Environment to Support Evidence-Based Medicine.

    PubMed

    Puppala, Mamta; He, Tiancheng; Chen, Shenyi; Ogunti, Richard; Yu, Xiaohui; Li, Fuhai; Jackson, Robert; Wong, Stephen T C

    2015-12-01

    The aim of this paper is to propose the design and implementation of a next-generation enterprise analytics platform developed at the Houston Methodist Hospital (HMH) system to meet the market and regulatory needs of the healthcare industry. For this goal, we developed an integrated clinical informatics environment, i.e., the Methodist environment for translational enhancement and outcomes research (METEOR). The framework of METEOR consists of two components: the enterprise data warehouse (EDW) and a software intelligence and analytics (SIA) layer for enabling a wide range of clinical decision support systems that can be used directly by outcomes researchers and clinical investigators to facilitate data access for the purposes of hypothesis testing, cohort identification, data mining, risk prediction, and clinical research training. Data and usability analysis were performed on METEOR components as a preliminary evaluation, which successfully demonstrated that METEOR addresses significant niches in the clinical informatics area and provides a powerful means for data integration and efficient access in supporting clinical and translational research. METEOR EDW and informatics applications improved outcomes, enabled coordinated care, and supported health analytics and clinical research at HMH. The twin pressures of cost containment in the healthcare market and new federal regulations and policies have led to the prioritization of the meaningful use of electronic health records in the United States. The EDW and SIA layers on top of the EDW are becoming an essential strategic tool for healthcare institutions and integrated delivery networks in order to support evidence-based medicine at the enterprise level.

  1. Stability analysis of magnetized neutron stars - a semi-analytic approach

    NASA Astrophysics Data System (ADS)

    Herbrik, Marlene; Kokkotas, Kostas D.

    2017-04-01

    We implement a semi-analytic approach for stability analysis, addressing the ongoing uncertainty about the stability and structure of neutron star magnetic fields. Applying the energy variational principle, a model system is displaced from its equilibrium state. The related energy density variation is set up analytically, whereas its volume integration is carried out numerically. This facilitates the consideration of more realistic neutron star characteristics within the model compared to analytical treatments. At the same time, our method retains the possibility to yield general information about neutron star magnetic field and composition structures that are likely to be stable. In contrast to numerical studies, classes of parametrized systems can be studied at once, finally constraining realistic configurations for interior neutron star magnetic fields. We apply the stability analysis scheme to polytropic and non-barotropic neutron stars with toroidal, poloidal and mixed fields, testing their stability in a Newtonian framework. Furthermore, we provide the analytical scheme for dropping the Cowling approximation in an axisymmetric system and investigate its impact. Our results confirm the instability of simple magnetized neutron star models as well as a stabilization tendency in the case of mixed fields and stratification. These findings agree with analytical studies whose spectrum of model systems we extend by lifting former simplifications.
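    The energy variational criterion underlying such a scheme has a standard Newtonian form. As a hedged summary (the textbook statement of the principle, not the paper's specific expressions): a magnetized equilibrium is stable if the second-order energy variation is non-negative for every admissible displacement field,

```latex
% Energy variational principle (schematic): \mathbf{F} is the linearized
% force operator acting on the displacement field \boldsymbol{\xi}.
\delta W[\boldsymbol{\xi}]
  = -\frac{1}{2}\int_V \boldsymbol{\xi}\cdot\mathbf{F}[\boldsymbol{\xi}]\,\mathrm{d}V
  \;\ge\; 0
  \qquad \text{for all admissible } \boldsymbol{\xi}
  \;\;\Longrightarrow\;\; \text{stability}.
```

    In the semi-analytic approach described above, the integrand is set up analytically while the volume integral is evaluated numerically.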

  2. An Illumination- and Temperature-Dependent Analytical Model for Copper Indium Gallium Diselenide (CIGS) Solar Cells

    DOE PAGES

    Sun, Xingshu; Silverman, Timothy; Garris, Rebekah; ...

    2016-07-18

    In this study, we present a physics-based analytical model for copper indium gallium diselenide (CIGS) solar cells that describes the illumination- and temperature-dependent current-voltage (I-V) characteristics and accounts for the statistical shunt variation of each cell. The model is derived by solving the drift-diffusion transport equation so that its parameters are physical and, therefore, can be obtained from independent characterization experiments. The model is validated against CIGS I-V characteristics as a function of temperature and illumination intensity. This physics-based model can be integrated into a large-scale simulation framework to optimize the performance of solar modules, as well as predict the long-term output yields of photovoltaic farms under different environmental conditions.

  3. An integrated analytical approach for characterizing an organic residue from an archaeological glass bottle recovered in Pompeii (Naples, Italy).

    PubMed

    Ribechini, Erika; Modugno, Francesca; Baraldi, Cecilia; Baraldi, Pietro; Colombini, Maria Perla

    2008-01-15

    Within the framework of an Italian research project aimed at studying organic residues found in archaeological objects from the Roman period, the chemical composition of the contents of several glass vessels recovered from archaeological sites from the Vesuvian area (Naples, Italy) was investigated. In particular, this paper deals with the study of an organic material found in a glass bottle from the archaeological site of Pompeii using a multi-analytical approach, including FT-IR, direct exposure mass spectrometry (DE-MS) and GC-MS techniques. The overall results suggest the occurrence of a lipid material of vegetable origin. The hypothesis that the native lipid material had been subjected to a chemical transformation procedure before being used is presented and discussed.

  4. Advances in spatial epidemiology and geographic information systems.

    PubMed

    Kirby, Russell S; Delmelle, Eric; Eberth, Jan M

    2017-01-01

    The field of spatial epidemiology has evolved rapidly in the past 2 decades. This study serves as a brief introduction to spatial epidemiology and the use of geographic information systems in applied research in epidemiology. We highlight technical developments and opportunities to apply spatial analytic methods in epidemiologic research, focusing on methodologies involving geocoding, distance estimation, residential mobility, record linkage and data integration, spatial and spatio-temporal clustering, small area estimation, and Bayesian applications to disease mapping. The articles included in this issue incorporate many of these methods into their study designs and analytical frameworks. It is our hope that these studies will spur further development and utilization of spatial analysis and geographic information systems in epidemiologic research. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Towards sustainable infrastructure management: knowledge-based service-oriented computing framework for visual analytics

    NASA Astrophysics Data System (ADS)

    Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd

    2009-05-01

    Infrastructure management (and its associated processes) is complex to understand and perform, which makes efficient, effective, and informed decision making difficult. Management is a multi-faceted operation that requires robust data fusion, visualization, and decision making. In order to protect and build sustainable critical assets, we present our ongoing multi-disciplinary, large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system, with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry and local and federal government agencies. IRSV is being designed to address the following essential needs: 1) better understanding and enforcement of the complex inspection process, bridging the gap between evidence gathering and decision making through an ontological knowledge engineering system; 2) aggregation, representation, and fusion of complex multi-layered heterogeneous data (e.g., infrared imaging, aerial photos, and ground-mounted LIDAR) with domain application knowledge to support a machine-understandable recommendation system; 3) robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) integration of these needs through a flexible Service-Oriented Architecture (SOA) framework to compose and provide services on demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring, both periodically (annually, monthly, or even daily if needed) and after extreme events.

  6. Integrated Data & Analysis in Support of Informed and Transparent Decision Making

    NASA Astrophysics Data System (ADS)

    Guivetchi, K.

    2012-12-01

    The California Water Plan includes a framework for improving water reliability, environmental stewardship, and economic stability through two initiatives - integrated regional water management to make better use of local water sources by integrating multiple aspects of managing water and related resources; and maintaining and improving statewide water management systems. The Water Plan promotes ways to develop a common approach for data standards and for understanding, evaluating, and improving regional and statewide water management systems, and for common ways to evaluate and select from alternative management strategies and projects. The California Water Plan acknowledges that planning for the future is uncertain and that change will continue to occur. It is not possible to know for certain how population growth, land use decisions, water demand patterns, environmental conditions, the climate, and many other factors that affect water use and supply may change by 2050. To anticipate change, our approach to water management and planning for the future needs to consider and quantify uncertainty, risk, and sustainability. There is a critical need for information sharing and information management to support over-arching and long-term water policy decisions that cross-cut multiple programs across many organizations and provide a common and transparent understanding of water problems and solutions. Achieving integrated water management with multiple benefits requires a transparent description of dynamic linkages between water supply, flood management, water quality, land use, environmental water, and many other factors. Water Plan Update 2013 will include an analytical roadmap for improving data, analytical tools, and decision-support to advance integrated water management at statewide and regional scales. It will include recommendations for linking collaborative processes with technical enhancements, providing effective analytical tools, and improving and sharing data and information. Specifically, this includes achieving better integration and consistency with other planning activities; obtaining consensus on quantitative deliverables; building a common conceptual understanding of the water management system; developing common schematics of the water management system; establishing modeling protocols and standards; and improving transparency and exchange of Water Plan information.

  7. Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation.

    PubMed

    Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y

    2016-03-01

    Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need to have analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream and upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks that are available in five countries: Canada, China, Russia, Spain, and United States using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack in network completeness, analytical capabilities or both. To address this limitation, we outline a general framework to build as complete as possible digital river networks and to integrate them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.

  8. Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation

    NASA Astrophysics Data System (ADS)

    Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y.

    2016-03-01

    Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need to have analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream and upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks that are available in five countries: Canada, China, Russia, Spain, and United States using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack in network completeness, analytical capabilities or both. To address this limitation, we outline a general framework to build as complete as possible digital river networks and to integrate them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.

  9. Transfer of analytical procedures: a panel of strategies selected for risk management, with emphasis on an integrated equivalence-based comparative testing approach.

    PubMed

    Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A

    2011-09-10

    In 2001, a multidisciplinary team of analytical scientists and statisticians at Sanofi-aventis published a methodology which has since governed the transfer of release monographs from R&D sites to Manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and the new regulatory framework, in particular the risk management perspective introduced by ICH Q9. Although some alternative strategies have been introduced in our practices, the comparative testing strategy, based on equivalence testing as the statistical approach, remains the standard for assays bearing on very critical quality attributes. It is conducted with the aim of controlling the most important consumer's risk, which arises at two levels of analytical decisions in transfer studies: the risk, for the receiving laboratory, of making poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting such a receiving laboratory despite insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process established within our company for better integration of the transfer study into the method life-cycle, as well as proposals of generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend towards increased efficiency in transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.

  10. A framework for analyzing the impact of data integrity/quality on electricity market operations

    NASA Astrophysics Data System (ADS)

    Choi, Dae Hyun

    This dissertation examines the impact of data integrity/quality in the supervisory control and data acquisition (SCADA) system on real-time locational marginal price (LMP) in electricity market operations. Measurement noise and/or manipulated sensor errors in a SCADA system may mislead system operators about real-time conditions in a power system, which, in turn, may impact the price signals in real-time power markets. This dissertation serves as a first attempt to analytically investigate the impact of bad/malicious data on electric power market operations. In future power system operations, which will probably involve many more sensors, the impact of sensor data integrity/quality on grid operations will become increasingly important. The first part of this dissertation studies, from a market participant's perspective, a new class of malicious data attacks on state estimation, which subsequently influences the result of the newly emerging look-ahead dispatch models in the real-time power market. In comparison with prior work on cyber-attacks on static dispatch, where no inter-temporal ramping constraint is considered, we propose a novel attack strategy, named the ramp-induced data (RID) attack, with which the attacker can manipulate the limits of ramp constraints of generators in look-ahead dispatch. It is demonstrated that the proposed attack can lead to financial profits via malicious capacity withholding of selected generators, while remaining undetected by the existing bad data detection algorithm embedded in today's state estimation software. In the second part, we investigate, from a system operator's perspective, the sensitivity of locational marginal price (LMP) with respect to data corruption-induced state estimation error in the real-time power market. Two data corruption scenarios are considered, in which corrupted continuous data (e.g., the power injection/flow and voltage magnitude) falsify the power flow estimate, whereas corrupted discrete data (e.g., the on/off status of a circuit breaker) falsify the network topology estimate, thus leading to the distortion of LMP. We present an analytical framework to quantify real-time LMP sensitivity subject to continuous and discrete data corruption via state estimation. The proposed framework offers system operators an analytical tool to identify buses and transmission lines that are economically sensitive to data corruption, as well as to find sensors that impact LMP changes significantly. This dissertation serves as a first step towards rigorous understanding of the fundamental coupling among the cyber, physical, and economic layers of operations in the future smart grid.

  11. Symposium on Integrating the Science of Environmental Justice into Decision-Making at the Environmental Protection Agency: An Overview

    PubMed Central

    Payne-Sturges, Devon; Garcia, Lisa; Lee, Charles; Zenick, Hal; Grevatt, Peter; Sanders, William H.; Case, Heather; Dankwa-Mullan, Irene

    2011-01-01

    In March 2010, the Environmental Protection Agency (EPA) collaborated with government and nongovernmental organizations to host a groundbreaking symposium, “Strengthening Environmental Justice Research and Decision Making: A Symposium on the Science of Disproportionate Environmental Health Impacts.” The symposium provided a forum for discourse on the state of scientific knowledge about factors identified by EPA that may contribute to higher burdens of environmental exposure or risk in racial/ethnic minorities and low-income populations. Also featured were discussions on how environmental justice considerations may be integrated into EPA's analytical and decision-making frameworks and on research needs for advancing the integration of environmental justice into environmental policymaking. We summarize key discussions and conclusions from the symposium and briefly introduce the articles in this issue. PMID:22028456

  12. Comments on "A Closed-Form Solution to Tensor Voting: Theory and Applications".

    PubMed

    Maggiori, Emmanuel; Lotito, Pablo; Manterola, Hugo Luis; del Fresno, Mariana

    2014-12-01

    We comment on a paper that describes a closed-form formulation of Tensor Voting, a technique to perceptually group clouds of points, usually applied to infer features in images. The authors proved an analytic solution to the technique, a highly relevant contribution considering that the original formulation required numerical integration, a time-consuming task. Their work constitutes the first closed-form expression for the Tensor Voting framework. In this work we first observe that the proposed formulation leads to unexpected results which do not satisfy the constraints for a Tensor Voting output, and hence cannot be interpreted. Given that the closed-form expression is said to be an analytically equivalent solution, unexpected outputs should not be encountered unless there are flaws in the proof. We analyzed the underlying math to identify the causes of these unexpected results. In this commentary we show that their proposal does not in fact provide a proper analytic solution to Tensor Voting, and we indicate the flaws in the proof.

  13. From famine to food crisis: what history can teach us about local and global subsistence crises.

    PubMed

    Vanhaute, Eric

    2011-01-01

    The number of famine-prone regions in the world has been shrinking for centuries. It is currently mainly limited to sub-Saharan Africa. Yet the impact of endemic hunger has not declined, and the early twenty-first century seems to be faced with a new threat: global subsistence crises. In this essay I question the concepts of famine and food crisis from different analytical angles: historical and contemporary famine research, food regime theory, and peasant studies. I will argue that only a more integrated historical framework of analysis can surpass dualistic interpretations grounded in Eurocentric modernization paradigms. This article successively debates historical and contemporary famine research, the contemporary food regime and the new global food crisis, the lessons from Europe's 'grand escape' from hunger, and the peasantry and 'depeasantization' as central analytical concepts. Dualistic histories of food and famine have been dominating developmentalist stories for too long. This essay shows how a blending of historical and contemporary famine research, food regime theory and new peasant studies can foster a more integrated perspective.

  14. Authentic Oral Language Production and Interaction in CALL: An Evolving Conceptual Framework for the Use of Learning Analytics within the SpeakApps Project

    ERIC Educational Resources Information Center

    Nic Giolla Mhichíl, Mairéad; van Engen, Jeroen; Ó Ciardúbháin, Colm; Ó Cléircín, Gearóid; Appel, Christine

    2014-01-01

    This paper sets out to construct and present the evolving conceptual framework of the SpeakApps projects to consider the application of learning analytics to facilitate synchronous and asynchronous oral language skills within this CALL context. Drawing from both the CALL and wider theoretical and empirical literature of learner analytics, the…

  15. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    PubMed

    Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
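    The additive-utility model at the heart of Multi-Attribute Utility Theory can be sketched in a few lines. The criteria, weights, candidate data streams, and scores below are all hypothetical illustrations, not those of the cited framework:

```python
# Minimal additive-MAUT sketch for ranking candidate biosurveillance data
# streams; all names and numbers are assumed for illustration only.

# Criteria weights (sum to 1): assumed relative importance of each attribute.
weights = {"timeliness": 0.4, "coverage": 0.35, "cost": 0.25}

# Normalized single-attribute utilities in [0, 1] for each hypothetical
# candidate data stream (higher is better, including for cost).
streams = {
    "clinical_reports": {"timeliness": 0.6, "coverage": 0.9, "cost": 0.4},
    "news_scraping":    {"timeliness": 0.9, "coverage": 0.5, "cost": 0.8},
    "lab_submissions":  {"timeliness": 0.3, "coverage": 0.8, "cost": 0.6},
}

def additive_utility(scores, weights):
    """Additive MAUT model: overall utility = sum of weight * utility."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank streams by overall utility, best first.
ranking = sorted(streams,
                 key=lambda s: additive_utility(streams[s], weights),
                 reverse=True)
print(ranking)
```

    More elaborate MAUT variants use nonlinear single-attribute utility functions and multiplicative aggregation; the additive form shown here is the simplest and assumes the criteria are preferentially independent.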

  16. Using a fuzzy comprehensive evaluation method to determine product usability: A proposed theoretical framework

    PubMed Central

    Zhou, Ronggang; Chan, Alan H. S.

    2016-01-01

    BACKGROUND: In order to compare existing usability data to ideal goals or to that for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments during the evaluation process. OBJECTIVE: This paper presents a universal method of usability evaluation combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. METHODS: With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed using the fuzzy comprehensive evaluation model to characterize fuzzy human judgments. Then, with the use of AHP, the weights of the usability components were elicited from these experts. RESULTS AND CONCLUSIONS: Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process. PMID:28035943
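The core aggregation step of a fuzzy comprehensive evaluation can be illustrated compactly. The weight vector and membership matrix below are invented for illustration; in the paper the weights come from AHP and the memberships from an expert panel:

```python
# Sketch of one fuzzy comprehensive evaluation step (hypothetical numbers).
# w: weights of the three usability components (as AHP would produce);
# R: each component's fuzzy membership over appraisal grades (poor/fair/good).

w = [0.5, 0.3, 0.2]            # effectiveness, efficiency, satisfaction
R = [
    [0.1, 0.3, 0.6],           # effectiveness memberships
    [0.2, 0.5, 0.3],           # efficiency memberships
    [0.1, 0.4, 0.5],           # satisfaction memberships
]

# Weighted-average composition operator: b_j = sum_i w_i * r_ij
B = [sum(w[i] * R[i][j] for i in range(len(w))) for j in range(3)]
print(B)  # overall membership of the product over the three grades
```

The resulting vector B expresses the product's overall usability as degrees of membership in each appraisal grade rather than as a single crisp score, which is what lets the method carry judgment uncertainty through the evaluation.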

  17. Real-Time Analytics for the Healthcare Industry: Arrhythmia Detection.

    PubMed

    Agneeswaran, Vijay Srinivas; Mukherjee, Joydeb; Gupta, Ashutosh; Tonpay, Pranay; Tiwari, Jayati; Agarwal, Nitin

    2013-09-01

    It is time for the healthcare industry to move from the era of "analyzing our health history" to the age of "managing the future of our health." In this article, we illustrate the importance of real-time analytics across the healthcare industry by providing a generic mechanism to reengineer traditional analytics expressed in the R programming language into Storm-based real-time analytics code. This is a powerful abstraction, since most data scientists use R to write the analytics and are not clear on how to make them work in real time on high-velocity data. Our paper focuses on the applications necessary to a healthcare analytics scenario, specifically on the importance of electrocardiogram (ECG) monitoring. A physician can use our framework to compare ECG reports by categorization and consequently detect arrhythmia. The framework reads the ECG signals and uses a machine learning-based categorizer that runs within a Storm environment to compare different ECG signals. The paper also presents performance studies of the framework to illustrate the throughput and accuracy trade-off in real-time analytics.
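The categorization step can be pictured with a toy stand-in. The paper's actual categorizer is a machine-learning model running inside Storm; the nearest-centroid rule and the centroids below are assumptions made purely for illustration:

```python
# Toy stand-in for the ECG categorizer (not the paper's Storm-based model):
# summarize an incoming ECG window by simple features, then assign the label
# of the nearest precomputed class centroid.

def features(window):
    """Mean and variance of one ECG sample window."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    return (mean, var)

# Hypothetical centroids in (mean, variance) feature space.
centroids = {"normal": (0.0, 1.0), "arrhythmia": (0.0, 4.0)}

def categorize(window):
    f = features(window)
    return min(centroids,
               key=lambda k: sum((a - b) ** 2 for a, b in zip(f, centroids[k])))

print(categorize([0.1, -0.2, 0.15, -0.1]))
```

In the real pipeline each window would arrive as a tuple on a Storm stream and `categorize` would be the body of a bolt; the throughput/accuracy trade-off the paper studies comes from how much work such a bolt does per tuple.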

  18. The Geek Perspective: Answering the Call for Advanced Technology in Research Inquiry Related to Pediatric Brain Injury and Motor Disability.

    PubMed

    Wininger, Michael; Pidcoe, Peter

    2017-10-01

    The Academy of Pediatric Physical Therapy Research Summit IV issued a Call to Action for community-wide intensification of a research enterprise in inquiries related to pediatric brain injury and motor disability by way of technological integration. But the barriers can seem high, and the pathways to integrative clinical research can seem poorly marked. Here, we answer the Call by providing a framework for 3 objectives: (1) instrumentation, (2) biometrics and study design, and (3) data analytics. We identify emergent cases where this Call has been answered and advocate for others to echo the Call both in highly visible physical therapy venues and in forums where the audience is diverse.

  19. Crossover physics in the nonequilibrium dynamics of quenched quantum impurity systems.

    PubMed

    Vasseur, Romain; Trinh, Kien; Haas, Stephan; Saleur, Hubert

    2013-06-14

    A general framework is proposed to tackle analytically local quantum quenches in integrable impurity systems, combining a mapping onto a boundary problem with the form factor approach to boundary-condition-changing operators introduced by Lesage and Saleur [Phys. Rev. Lett. 80, 4370 (1998)]. We discuss how to compute exactly the following two central quantities of interest: the Loschmidt echo and the distribution of the work done during the quantum quench. Our results display an interesting crossover physics characterized by the energy scale T_b of the impurity, corresponding to the Kondo temperature. We discuss in detail the noninteracting case as a paradigm and benchmark for more complicated integrable impurity models and check our results using numerical methods.
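The two central quantities named here have standard definitions in the quench literature; in the usual conventions (which may differ from the paper's exact normalizations), for a quench from a pre-quench Hamiltonian H_0 with ground state |ψ_0⟩ to a post-quench Hamiltonian H:

```latex
\mathcal{L}(t) = \left|\langle \psi_0 |\, e^{iH_0 t}\, e^{-iHt}\, |\psi_0 \rangle\right|^2,
\qquad
P(W) = \frac{1}{2\pi}\int_{-\infty}^{\infty} dt\; e^{iWt}\,
\langle \psi_0 |\, e^{iH_0 t}\, e^{-iHt}\, |\psi_0 \rangle .
```

That is, the Loschmidt echo measures the overlap between the time-evolved state and the initial state, and the work distribution P(W) is the Fourier transform of the same characteristic function, so the two quantities are computed from a single amplitude.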

  20. Development of an Integrated Modeling Framework for Simulations of Coastal Processes in Deltaic Environments Using High-Performance Computing

    DTIC Science & Technology

    2008-01-01

    exceeds the local water depth. The approximation eliminates the vertical dimension of the elliptic equation that is normally required for the fully non...used for vertical resolution. The shallow water equations (SWE) are a set of non-linear hyperbolic equations. As the equations are derived under...linear standing wave with a wavelength of 10 m in a square 10 m by 10 m basin. The still water depth is 0.5 m. In order to compare with the analytical

  1. Dispersive analysis of ω/Φ → 3π, πγ*

    DOE PAGES

    Danilkin, Igor V.; Fernandez Ramirez, Cesar; Guo, Peng; ...

    2015-05-01

    The decays ω/Φ → 3π are considered in the dispersive framework that is based on the isobar decomposition and subenergy unitarity. The inelastic contributions are parametrized by the power series in a suitably chosen conformal variable that properly accounts for the analytic properties of the amplitude. The Dalitz plot distributions and integrated decay widths are presented. Our results indicate that the final-state interactions may be sizable. As a further application of the formalism we also compute the electromagnetic transition form factors of ω/Φ → π⁰γ*.

  2. Metal–Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T.

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and high performance of nanoporous metal–organic framework (MOF) based optical gas sensors, which enables detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive indices as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of etched optical fiber through a simple solution method, which is critical for manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (< tens of seconds) response time and excellent reversibility, which can be well correlated to the physisorption of gases into a nanoporous MOF. We propose a refractive index based sensing mechanism for the MOF-integrated optical fiber platform which results in an amplification of inherent optical absorption present within the MOF-based sensing layer with increasing values of effective refractive index associated with adsorption of gases.

  3. Metal–Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform

    DOE PAGES

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T.; ...

    2018-01-18

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and high performance of nanoporous metal–organic framework (MOF) based optical gas sensors, which enables detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive indices as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of etched optical fiber through a simple solution method, which is critical for manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (< tens of seconds) response time and excellent reversibility, which can be well correlated to the physisorption of gases into a nanoporous MOF. We propose a refractive index based sensing mechanism for the MOF-integrated optical fiber platform which results in an amplification of inherent optical absorption present within the MOF-based sensing layer with increasing values of effective refractive index associated with adsorption of gases.

  4. Energy-Efficient Integration of Continuous Context Sensing and Prediction into Smartwatches.

    PubMed

    Rawassizadeh, Reza; Tomitsch, Martin; Nourizadeh, Manouchehr; Momeni, Elaheh; Peery, Aaron; Ulanova, Liudmila; Pazzani, Michael

    2015-09-08

    As the availability and use of wearables increases, they are becoming a promising platform for context sensing and context analysis. Smartwatches are a particularly interesting platform for this purpose, as they offer salient advantages, such as their proximity to the human body. However, they also have limitations associated with their small form factor, such as processing power and battery life, which makes it difficult to simply transfer smartphone-based context sensing and prediction models to smartwatches. In this paper, we introduce an energy-efficient, generic, integrated framework for continuous context sensing and prediction on smartwatches. Our work extends previous approaches for context sensing and prediction on wrist-mounted wearables that perform predictive analytics outside the device. We offer a generic sensing module and a novel energy-efficient, on-device prediction module that is based on a semantic abstraction approach to convert sensor data into meaningful information objects, similar to human perception of a behavior. Through six evaluations, we analyze the energy efficiency of our framework modules, identify the optimal file structure for data access and demonstrate an increase in accuracy of prediction through our semantic abstraction method. The proposed framework is hardware independent and can serve as a reference model for implementing context sensing and prediction on small wearable devices beyond smartwatches, such as body-mounted cameras.
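The semantic abstraction idea, converting raw sensor samples into coarse, human-meaningful information objects before storage and prediction, can be sketched as follows. The thresholds, labels, and use of accelerometer magnitude are assumptions for illustration, not the paper's actual module:

```python
# Illustrative semantic abstraction (assumed thresholds and labels; not the
# paper's on-device module): a window of raw accelerometer magnitudes is
# reduced to a coarse activity label, which is far cheaper to store and to
# feed into an on-device predictor than the raw samples.

def abstract(magnitudes, low=1.1, high=1.8):
    """Map a window of acceleration magnitudes (in g) to an activity label."""
    avg = sum(magnitudes) / len(magnitudes)
    if avg < low:
        return "stationary"
    return "walking" if avg < high else "running"

print(abstract([1.0, 1.02, 0.98]))   # near 1 g at rest -> stationary
```

Replacing a stream of floats with a small vocabulary of labels is what makes continuous sensing and prediction fit within a smartwatch's battery and storage budget.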

  5. Energy-Efficient Integration of Continuous Context Sensing and Prediction into Smartwatches

    PubMed Central

    Rawassizadeh, Reza; Tomitsch, Martin; Nourizadeh, Manouchehr; Momeni, Elaheh; Peery, Aaron; Ulanova, Liudmila; Pazzani, Michael

    2015-01-01

    As the availability and use of wearables increases, they are becoming a promising platform for context sensing and context analysis. Smartwatches are a particularly interesting platform for this purpose, as they offer salient advantages, such as their proximity to the human body. However, they also have limitations associated with their small form factor, such as processing power and battery life, which makes it difficult to simply transfer smartphone-based context sensing and prediction models to smartwatches. In this paper, we introduce an energy-efficient, generic, integrated framework for continuous context sensing and prediction on smartwatches. Our work extends previous approaches for context sensing and prediction on wrist-mounted wearables that perform predictive analytics outside the device. We offer a generic sensing module and a novel energy-efficient, on-device prediction module that is based on a semantic abstraction approach to convert sensor data into meaningful information objects, similar to human perception of a behavior. Through six evaluations, we analyze the energy efficiency of our framework modules, identify the optimal file structure for data access and demonstrate an increase in accuracy of prediction through our semantic abstraction method. The proposed framework is hardware independent and can serve as a reference model for implementing context sensing and prediction on small wearable devices beyond smartwatches, such as body-mounted cameras. PMID:26370997

  6. An integrated new product development framework - an application on green and low-carbon products

    NASA Astrophysics Data System (ADS)

    Lin, Chun-Yu; Lee, Amy H. I.; Kang, He-Yau

    2015-03-01

    Companies need to be innovative to survive in today's competitive market; thus, new product development (NPD) has become very important. This research constructs an integrated NPD framework for developing new products. In stage one, customer attributes (CAs) and engineering characteristics (ECs) for developing products are collected, and fuzzy interpretive structural modelling (FISM) is applied to understand the relationships among these critical factors. Based on quality function deployment (QFD), a house of quality is then built, and fuzzy analytic network process (FANP) is adopted to calculate the relative importance of ECs. In stage two, fuzzy failure mode and effects analysis (FFMEA) is applied to understand the potential failures of the ECs and to determine the importance of ECs with respect to risk control. In stage three, a goal programming (GP) model is constructed to consider the outcome from the FANP-QFD, FFMEA and other objectives, in order to select the most important ECs. Due to pollution and global warming, environmental protection has become an important topic. With both governments and consumers developing environmental consciousness, successful green and low-carbon NPD provides an important competitive advantage, enabling the survival or renewal of firms. The proposed framework is implemented in a panel manufacturing firm for designing a green and low-carbon product.
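The final selection stage weighs FANP-QFD importance against FFMEA risk control. A simple weighted combination conveys the idea; the EC names, scores, and trade-off parameter below are hypothetical, and the paper's actual stage three uses a goal programming model rather than this scalarization:

```python
# Hedged sketch of combining two EC rankings (hypothetical values; the paper
# itself uses a goal programming model for this step). Each engineering
# characteristic (EC) has a market-importance score from FANP-QFD and a
# risk-control score from FFMEA, both normalized to sum to 1.

fanp_importance = {"EC1": 0.45, "EC2": 0.30, "EC3": 0.25}
ffmea_risk      = {"EC1": 0.20, "EC2": 0.50, "EC3": 0.30}
alpha = 0.6  # assumed trade-off between market importance and risk control

score = {ec: alpha * fanp_importance[ec] + (1 - alpha) * ffmea_risk[ec]
         for ec in fanp_importance}
print(max(score, key=score.get))  # top-priority EC under this trade-off
```

A goal programming formulation generalizes this by minimizing deviations from several targets at once (importance, risk, cost, carbon footprint) instead of collapsing them into one weighted score up front.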

  7. Framework for assessing key variable dependencies in loose-abrasive grinding and polishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, J.S.; Aikens, D.M.; Brown, N.J.

    1995-12-01

    This memo describes a framework for identifying all key variables that determine the figuring performance of loose-abrasive lapping and polishing machines. This framework is intended as a tool for prioritizing R&D issues, assessing the completeness of process models and experimental data, and for providing a mechanism to identify any assumptions in analytical models or experimental procedures. Future plans for preparing analytical models or performing experiments can refer to this framework in establishing the context of the work.

  8. RT-18: Value of Flexibility. Phase 1

    DTIC Science & Technology

    2010-09-25

    an analytical framework based on sound mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory...framework that is mathematically consistent, domain independent and applicable under varying information levels. This report presents our advances in...During this period, we also explored the development of an analytical framework based on sound mathematical constructs. A review of the current state

  9. Reasoning and Knowledge Acquisition Framework for 5G Network Analytics

    PubMed Central

    2017-01-01

    Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim of understanding the network status and predicting potential situations that might disrupt network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy in the inference of anomalous traffic volumes based on a simple configuration. PMID:29065473
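The rule-based reasoning step over predicted traffic can be reduced to a very small example. The tolerance value and function below are assumptions for illustration, standing in for the administrator-customized inference rules the paper describes:

```python
# Minimal stand-in for one customizable inference rule (assumed threshold;
# not the paper's rule engine): flag an anomalous traffic volume when the
# observation deviates from the model's prediction by more than a tolerance.

def infer(observed_mb, predicted_mb, tolerance=0.25):
    """Return 'anomalous' if relative deviation exceeds the tolerance."""
    deviation = abs(observed_mb - predicted_mb) / predicted_mb
    return "anomalous" if deviation > tolerance else "normal"

print(infer(observed_mb=900, predicted_mb=600))  # 50% above prediction
```

In the framework, a verdict like this would feed the situational awareness model's "projection" level and hand off to the separate decision-making process for proactive or reactive mitigation.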

  10. Reasoning and Knowledge Acquisition Framework for 5G Network Analytics.

    PubMed

    Sotelo Monge, Marco Antonio; Maestre Vidal, Jorge; García Villalba, Luis Javier

    2017-10-21

    Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim of understanding the network status and predicting potential situations that might disrupt network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy in the inference of anomalous traffic volumes based on a simple configuration.

  11. Global Simulation of Bioenergy Crop Productivity: Analytical Framework and Case Study for Switchgrass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, Shujiang; Kline, Keith L; Nair, S. Surendran

    A global energy crop productivity model that provides geospatially explicit quantitative details on biomass potential and factors affecting sustainability would be useful, but does not exist now. This study describes a modeling platform capable of meeting many challenges associated with global-scale agro-ecosystem modeling. We designed an analytical framework for bioenergy crops consisting of six major components: (i) standardized natural resources datasets, (ii) global field-trial data and crop management practices, (iii) simulation units and management scenarios, (iv) model calibration and validation, (v) high-performance computing (HPC) simulation, and (vi) simulation output processing and analysis. The HPC-Environmental Policy Integrated Climate (HPC-EPIC) model simulated a perennial bioenergy crop, switchgrass (Panicum virgatum L.), estimating feedstock production potentials and effects across the globe. This modeling platform can assess soil C sequestration, net greenhouse gas (GHG) emissions, nonpoint source pollution (e.g., nutrient and pesticide loss), and energy exchange with the atmosphere. It can be expanded to include additional bioenergy crops (e.g., miscanthus, energy cane, and agave) and food crops under different management scenarios. The platform and switchgrass field-trial dataset are available to support global analysis of biomass feedstock production potential and corresponding metrics of sustainability.

  12. ICSNPathway: identify candidate causal SNPs and pathways from genome-wide association study by one analytical framework.

    PubMed

    Zhang, Kunlin; Chang, Suhua; Cui, Sijia; Guo, Liyuan; Zhang, Liuyan; Wang, Jing

    2011-07-01

    Genome-wide association study (GWAS) is widely utilized to identify genes involved in human complex disease or some other trait. One key challenge for GWAS data interpretation is to identify causal SNPs and provide profound evidence on how they affect the trait. Currently, research is focusing on the identification of candidate causal variants from the most significant SNPs of GWAS, while there is a lack of support on biological mechanisms as represented by pathways. Although pathway-based analysis (PBA) has been designed to identify disease-related pathways by analyzing the full list of SNPs from GWAS, it does not emphasize interpreting causal SNPs. To our knowledge, so far there is no web server available to solve this challenge for GWAS data interpretation within one analytical framework. ICSNPathway is developed to identify candidate causal SNPs and their corresponding candidate causal pathways from GWAS by integrating linkage disequilibrium (LD) analysis, functional SNP annotation and PBA. ICSNPathway provides a feasible solution to bridge the gap between GWAS and disease mechanism study by generating hypotheses of SNP → gene → pathway(s). The ICSNPathway server is freely available at http://icsnpathway.psych.ac.cn/.
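The server's output is a chain of hypotheses of the form SNP → gene → pathway. A toy version of that chaining looks like this; the identifiers are made up, and the server's actual LD analysis, functional annotation, and pathway-based analysis are not reproduced here:

```python
# Toy illustration of the SNP -> gene -> pathway hypothesis chain (made-up
# IDs; ICSNPathway derives these mappings from LD analysis, functional SNP
# annotation and pathway-based analysis, none of which is reproduced here).

snp_to_gene = {"rs0001": "GENE_A", "rs0002": "GENE_B"}
gene_to_pathways = {
    "GENE_A": ["lipid_metabolism"],
    "GENE_B": ["immune_response"],
}

def hypotheses(snps):
    """Chain each candidate causal SNP through its gene to its pathway(s)."""
    return [(s, g, p) for s in snps
            if (g := snp_to_gene.get(s))
            for p in gene_to_pathways.get(g, [])]

print(hypotheses(["rs0001", "rs0002"]))
```

Each resulting triple is one testable mechanistic hypothesis linking an association signal to a biological pathway.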

  13. Interoperable Data Sharing for Diverse Scientific Disciplines

    NASA Astrophysics Data System (ADS)

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.

  14. Completing the link between exposure science and toxicology for improved environmental health decision making: The aggregate exposure pathway framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teeguarden, Justin G.; Tan, Yu -Mei; Edwards, Stephen W.

    Here, driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. Aggregate exposure pathways offer an intuitive framework to organize exposure data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source to outcome continuum for more meaningful integration of exposure assessment and hazard identification. Together, the two frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decision making.

  15. Completing the link between exposure science and toxicology for improved environmental health decision making: The aggregate exposure pathway framework

    DOE PAGES

    Teeguarden, Justin G.; Tan, Yu -Mei; Edwards, Stephen W.; ...

    2016-01-13

    Here, driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. Aggregate exposure pathways offer an intuitive framework to organize exposure data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source to outcome continuum for more meaningful integration of exposure assessment and hazard identification. Together, the two frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decision making.

  16. Sandplay therapy with couples within the framework of analytical psychology.

    PubMed

    Albert, Susan Carol

    2015-02-01

    Sandplay therapy with couples is discussed within an analytical framework. Guidelines are proposed as a means of developing this relatively new area within sandplay therapy, and as a platform to open a wider discussion to bring together sandplay therapy and couple therapy. Examples of sand trays created during couple therapy are also presented to illustrate the transformations during the therapeutic process. © 2015, The Society of Analytical Psychology.

  17. E-HOSPITAL - A Digital Workbench for Hospital Operations and Services Planning Using Information Technology and Algebraic Languages.

    PubMed

    Gartner, Daniel; Padman, Rema

    2017-01-01

    In this paper, we describe the development of a unified framework and a digital workbench for the strategic, tactical and operational hospital management plan driven by information technology and analytics. The workbench can be used not only by multiple stakeholders in the healthcare delivery setting, but also for pedagogical purposes on topics such as healthcare analytics, services management, and information systems. This tool combines the three classical hierarchical decision-making levels in one integrated environment. At each level, several decision problems can be chosen. Extensions of mathematical models from the literature are presented and incorporated into the digital platform. In a case study using real-world data, we demonstrate how we used the workbench to inform strategic capacity planning decisions in a multi-hospital, multi-stakeholder setting in the United Kingdom.

  18. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    In this paper we mainly study the existence of analytic normalization and the normal form of finite-dimensional complete analytic integrable dynamical systems. In more detail, we prove that any complete analytic integrable diffeomorphism F(x)=Bx+f(x) in (Cⁿ,0), with B having no eigenvalue of modulus 1 and f(x)=O(|x|²), is locally analytically conjugate to its normal form. Meanwhile, we also prove that any complete analytic integrable differential system ẋ=Ax+f(x) in (Cⁿ,0), with A having nonzero eigenvalues and f(x)=O(|x|²), is locally analytically conjugate to its normal form. Furthermore we prove that any complete analytic integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytic integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692, and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier order et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve the ones in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in the way that our paper presents the concrete expression of the normal form in a restricted case.
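For orientation, "locally analytically conjugate to its normal form" has a standard meaning in this setting (stated here in the usual conventions, which may differ in detail from the paper's): one seeks a convergent near-identity change of coordinates Φ carrying F to its normal form N,

```latex
\Phi \circ F = N \circ \Phi,
\qquad
\Phi(x) = x + O(|x|^2),
```

where N retains only the resonant terms of F (its Poincaré–Dulac normal form). The paper's contribution is that, for complete analytic integrable systems, this formal conjugacy Φ can actually be taken analytic.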

  19. Drawing Sensors with Ball-Milled Blends of Metal-Organic Frameworks and Graphite

    PubMed Central

    Ko, Michael; Aykanat, Aylin; Smith, Merry K.

    2017-01-01

    The synthetically tunable properties and intrinsic porosity of conductive metal-organic frameworks (MOFs) make them promising materials for transducing selective interactions with gaseous analytes in an electrically addressable platform. Consequently, conductive MOFs are valuable functional materials with high potential utility in chemical detection. The implementation of these materials, however, is limited by the available methods for device incorporation due to their poor solubility and moderate electrical conductivity. This manuscript describes a straightforward method for the integration of moderately conductive MOFs into chemiresistive sensors by mechanical abrasion. To improve electrical contacts, blends of MOFs with graphite were generated using a solvent-free ball-milling procedure. While most bulk powders of pure conductive MOFs were difficult to integrate into devices directly via mechanical abrasion, the compressed solid-state MOF/graphite blends were easily abraded onto the surface of paper substrates equipped with gold electrodes to generate functional sensors. This method was used to prepare an array of chemiresistors, from four conductive MOFs, capable of detecting and differentiating NH3, H2S and NO at parts-per-million concentrations. PMID:28946624

  20. Integrated catchment modelling within a strategic planning and decision making process: Werra case study

    NASA Astrophysics Data System (ADS)

    Dietrich, Jörg; Funke, Markus

    Integrated water resources management (IWRM) redefines conventional water management approaches through a closer cross-linkage between environment and society. The role of public participation and socio-economic considerations becomes more important within the planning and decision making process. In this paper we address aspects of the integration of catchment models into such a process taking the implementation of the European Water Framework Directive (WFD) as an example. Within a case study situated in the Werra river basin (Central Germany), a systems analytic decision process model was developed. This model uses the semantics of the Unified Modeling Language (UML) activity model. As an example application, the catchment model SWAT and the water quality model RWQM1 were applied to simulate the effect of phosphorus emissions from non-point and point sources on water quality. The decision process model was able to guide the participants of the case study through the interdisciplinary planning and negotiation of actions. Further improvements of the integration framework include tools for quantitative uncertainty analyses, which are crucial for real life application of models within an IWRM decision making toolbox. For the case study, the multi-criteria assessment of actions indicates that the polluter pays principle can be met at larger scales (sub-catchment or river basin) without significantly compromising cost efficiency for the local situation.

  1. A model of "integrated scientific method" and its application for the analysis of instruction

    NASA Astrophysics Data System (ADS)

    Rusbult, Craig Francis

A model of 'integrated scientific method' (ISM) was constructed as a framework for describing the process of science in terms of activities (formulating a research problem, and inventing and evaluating actions--such as selecting and inventing theories, evaluating theories, designing experiments, and doing experiments--intended to solve the problem) and evaluation criteria (empirical, conceptual, and cultural-personal). Instead of trying to define the scientific method, ISM is intended to serve as a flexible framework that--by varying the characteristics of its components, their integrated relationships, and their relative importance--can be used to describe a variety of scientific methods, and a variety of perspectives about what constitutes an accurate portrayal of scientific methods. This framework is outlined visually and verbally, followed by an elaboration of the framework and my own views about science, and an evaluation of whether ISM can serve as a relatively neutral framework for describing a wide range of science practices and science interpretations. ISM was used to analyze an innovative, guided inquiry classroom (taught by Susan Johnson, using Genetics Construction Kit software) in which students do simulated scientific research by solving classical genetics problems that require effect-to-cause reasoning and theory revision. The immediate goal of analysis was to examine the 'science experiences' of students, to determine how the 'structure of instruction' provides opportunities for these experiences. Another goal was to test and improve the descriptive and analytical utility of ISM. In developing ISM, a major objective was to make ISM educationally useful. A concluding discussion includes controversies about "the nature of science" and how to teach it, how instruction can expand opportunities for student experience, and how goal-oriented intentional learning (using ISM) might improve the learning, retention, and transfer of thinking skills.
Potential educational applications of ISM could involve its use for instructional analysis or design, or for teaching students in the classroom; or ISM and IDM (a closely related, generalized 'integrated design method') could play valuable roles in a 'wide spiral' curriculum designed for the coordinated teaching of thinking skills, including creativity and critical thinking, across a wide range of subjects.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha

    Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites’ abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. 
This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. IMPORTANCE: Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism.
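The framework's core evaluation step, comparing predicted metabolic potential with measured metabolite abundances across samples and flagging both well-predicted and negatively correlated metabolites, can be sketched as follows. The function name and correlation threshold are illustrative, not taken from the paper:

```python
import numpy as np

def evaluate_predictions(predicted, measured, threshold=0.5):
    """Correlate predicted community metabolic potential with measured
    metabolite abundances (rows: metabolites, columns: samples).
    Returns per-metabolite Pearson correlations plus the indices of
    well-predicted (r >= threshold) and negatively correlated
    (r <= -threshold) metabolites."""
    rs = np.array([np.corrcoef(p, m)[0, 1]
                   for p, m in zip(predicted, measured)])
    well_predicted = np.where(rs >= threshold)[0]
    reversed_sign = np.where(rs <= -threshold)[0]
    return rs, well_predicted, reversed_sign
```

Metabolites in the reversed-sign set are the candidates the abstract describes as environmental control points, where community potential and observed concentration move in opposite directions.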

  3. Integrating count and detection–nondetection data to model population dynamics

    USGS Publications Warehouse

    Zipkin, Elise F.; Rossman, Sam; Yackulic, Charles B.; Wiens, David; Thorson, James T.; Davis, Raymond J.; Grant, Evan H. Campbell

    2017-01-01

    There is increasing need for methods that integrate multiple data types into a single analytical framework as the spatial and temporal scale of ecological research expands. Current work on this topic primarily focuses on combining capture–recapture data from marked individuals with other data types into integrated population models. Yet, studies of species distributions and trends often rely on data from unmarked individuals across broad scales where local abundance and environmental variables may vary. We present a modeling framework for integrating detection–nondetection and count data into a single analysis to estimate population dynamics, abundance, and individual detection probabilities during sampling. Our dynamic population model assumes that site-specific abundance can change over time according to survival of individuals and gains through reproduction and immigration. The observation process for each data type is modeled by assuming that every individual present at a site has an equal probability of being detected during sampling processes. We examine our modeling approach through a series of simulations illustrating the relative value of count vs. detection–nondetection data under a variety of parameter values and survey configurations. We also provide an empirical example of the model by combining long-term detection–nondetection data (1995–2014) with newly collected count data (2015–2016) from a growing population of Barred Owl (Strix varia) in the Pacific Northwest to examine the factors influencing population abundance over time. Our model provides a foundation for incorporating unmarked data within a single framework, even in cases where sampling processes yield different detection probabilities. This approach will be useful for survey design and to researchers interested in incorporating historical or citizen science data into analyses focused on understanding how demographic rates drive population abundance.

  4. Integrating count and detection-nondetection data to model population dynamics.

    PubMed

    Zipkin, Elise F; Rossman, Sam; Yackulic, Charles B; Wiens, J David; Thorson, James T; Davis, Raymond J; Grant, Evan H Campbell

    2017-06-01

    There is increasing need for methods that integrate multiple data types into a single analytical framework as the spatial and temporal scale of ecological research expands. Current work on this topic primarily focuses on combining capture-recapture data from marked individuals with other data types into integrated population models. Yet, studies of species distributions and trends often rely on data from unmarked individuals across broad scales where local abundance and environmental variables may vary. We present a modeling framework for integrating detection-nondetection and count data into a single analysis to estimate population dynamics, abundance, and individual detection probabilities during sampling. Our dynamic population model assumes that site-specific abundance can change over time according to survival of individuals and gains through reproduction and immigration. The observation process for each data type is modeled by assuming that every individual present at a site has an equal probability of being detected during sampling processes. We examine our modeling approach through a series of simulations illustrating the relative value of count vs. detection-nondetection data under a variety of parameter values and survey configurations. We also provide an empirical example of the model by combining long-term detection-nondetection data (1995-2014) with newly collected count data (2015-2016) from a growing population of Barred Owl (Strix varia) in the Pacific Northwest to examine the factors influencing population abundance over time. Our model provides a foundation for incorporating unmarked data within a single framework, even in cases where sampling processes yield different detection probabilities. This approach will be useful for survey design and to researchers interested in incorporating historical or citizen science data into analyses focused on understanding how demographic rates drive population abundance. 
© 2017 by the Ecological Society of America.
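The integration idea in the two records above can be sketched as a joint likelihood over a shared latent abundance: observed counts follow binomial thinning of site abundance N, while detection-nondetection records are Bernoulli with probability 1 - (1 - p)^N. A minimal static sketch (function name and the Poisson truncation limit are illustrative; the published model additionally includes survival, reproduction, and immigration dynamics):

```python
import numpy as np
from scipy import stats

def joint_loglik(lam, p, counts, detections, n_max=100):
    """Joint log-likelihood for a shared-abundance model.
    lam: expected site abundance; p: per-individual detection probability.
    counts: observed counts at count-surveyed sites.
    detections: 0/1 records at detection/nondetection sites.
    Latent abundance N is marginalized over a truncated Poisson support."""
    Ns = np.arange(n_max + 1)
    prior = stats.poisson.pmf(Ns, lam)            # P(N)
    ll = 0.0
    for y in counts:                              # y | N ~ Binomial(N, p)
        ll += np.log(np.sum(stats.binom.pmf(y, Ns, p) * prior))
    for d in detections:                          # d | N ~ Bernoulli(1-(1-p)^N)
        p_det = 1.0 - (1.0 - p) ** Ns
        like = np.where(d == 1, p_det, 1.0 - p_det)
        ll += np.log(np.sum(like * prior))
    return ll
```

Because both data types share lam and p, maximizing this likelihood pools information from marked-free surveys of differing intensity, which is the central point of the papers.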

  5. Firing patterns in the adaptive exponential integrate-and-fire model.

    PubMed

    Naud, Richard; Marcille, Nicolas; Clopath, Claudia; Gerstner, Wulfram

    2008-11-01

    For simulations of large spiking neuron networks, an accurate, simple and versatile single-neuron modeling framework is required. Here we explore the versatility of a simple two-equation model: the adaptive exponential integrate-and-fire neuron. We show that this model generates multiple firing patterns depending on the choice of parameter values, and present a phase diagram describing the transition from one firing type to another. We give an analytical criterion to distinguish between continuous adaptation, initial bursting, regular bursting and two types of tonic spiking. Also, we report that the deterministic model is capable of producing irregular spiking when stimulated with constant current, indicating low-dimensional chaos. Lastly, the simple model is fitted to real experiments of cortical neurons under step current stimulation. The results provide support for the suitability of simple models such as the adaptive exponential integrate-and-fire neuron for large network simulations.
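The two equations of the AdEx model are simple enough to simulate directly. A minimal forward-Euler sketch using the commonly published parameter set (the spike-cutoff voltage, step size, and step-current amplitude are illustrative choices, not values from this paper):

```python
import numpy as np

def simulate_adex(I=1.0e-9, T=0.5, dt=1e-4):
    """Forward-Euler simulation of the adaptive exponential
    integrate-and-fire neuron under a constant step current I (amperes).
    Returns the list of spike times (seconds)."""
    C, gL, EL = 281e-12, 30e-9, -70.6e-3     # capacitance, leak, rest
    VT, dT = -50.4e-3, 2e-3                  # threshold, slope factor
    tau_w, a, b = 144e-3, 4e-9, 0.0805e-9    # adaptation parameters
    Vr, Vspike = -70.6e-3, 0.0               # reset and spike cutoff
    V, w = EL, 0.0
    spikes = []
    for step in range(int(T / dt)):
        dV = (-gL*(V - EL) + gL*dT*np.exp((V - VT)/dT) - w + I) / C
        dw = (a*(V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= Vspike:          # spike: reset voltage, increment adaptation
            spikes.append(step * dt)
            V = Vr
            w += b
    return spikes
```

Varying a, b, tau_w, and Vr moves the model between the firing regimes (tonic, adapting, bursting) catalogued in the paper's phase diagram.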

  6. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, D. P.; Lucas, P. L.; Häyhä, T.; Cornell, S. E.; Stafford-Smith, M.

    2015-09-01

    There is a need for further integrated research on developing a set of sustainable development objectives, based on the proposed framework of planetary boundaries indicators. The relevant research questions are divided in this paper into four key categories: the underlying processes and the selection of key indicators; the impacts of different exposure levels and the influence of connections between different types of impacts; a better understanding of different response strategies; and the available options to implement changes. Clearly, different categories of scientific disciplines and associated models exist that can contribute to the necessary analysis, noting that the distinctions between them are fuzzy. In the paper, we indicate both how different models relate to the four categories of questions and how further insights can be obtained by connecting the different disciplines (without necessarily fully integrating them). Research on integration can support planetary boundary quantification in a credible way, linking human drivers and social and biophysical impacts.

  7. Ensuring Food Integrity by Metrology and FAIR Data Principles

    PubMed Central

    Rychlik, Michael; Zappa, Giovanna; Añorga, Larraitz; Belc, Nastasia; Castanheira, Isabel; Donard, Olivier F. X.; Kouřimská, Lenka; Ogrinc, Nives; Ocké, Marga C.; Presser, Karl; Zoani, Claudia

    2018-01-01

    Food integrity is a general term for sound, nutritive, healthy, tasty, safe, authentic, traceable, as well as ethically, safely, environment-friendly, and sustainably produced foods. In order to verify these properties, analytical methods with a higher degree of accuracy, sensitivity, standardization and harmonization and a harmonized system for their application in analytical laboratories are required. In this view, metrology offers the opportunity to achieve these goals. In this perspective article the current global challenges in food analysis and the principles of metrology to fill these gaps are presented. Therefore, the pan-European project METROFOOD-RI within the framework of the European Strategy Forum on Research Infrastructures (ESFRI) was developed to establish a strategy to allow reliable and comparable analytical measurements in foods along the whole process line, from primary producers to consumers, and to make all data findable, accessible, interoperable, and re-usable according to the FAIR data principles. The initiative currently consists of 48 partners from 18 European countries and concluded its “Early Phase” as research infrastructure by organizing its future structure and presenting a proof of concept: preparing, distributing, and comprehensively analyzing three candidate Reference Materials (rice grain, rice flour, and oyster tissue) and establishing a system for compiling, processing, and storing the generated data, and for exchanging, comparing, and making them accessible in databases. PMID:29872651

  8. Ensuring Food Integrity by Metrology and FAIR Data Principles

    NASA Astrophysics Data System (ADS)

    Rychlik, Michael; Zappa, Giovanna; Añorga, Larraitz; Belc, Nastasia; Castanheira, Isabel; Donard, Olivier F. X.; Kouřimská, Lenka; Ogrinc, Nives; Ocké, Marga C.; Presser, Karl; Zoani, Claudia

    2018-05-01

    Food integrity is a general term for sound, nutritive, healthy, tasty, safe, authentic, traceable, as well as ethically, safely, environment-friendly and sustainably produced foods. In order to verify these properties, analytical methods with a higher degree of accuracy, sensitivity, standardization and harmonization and a harmonized system for their application in analytical laboratories are required. In this view, metrology offers the opportunity to achieve these goals. In this perspective article the current global challenges in food analysis and the principles of metrology to fill these gaps are presented. Therefore, the pan-European project METROFOOD-RI within the framework of the European Strategy Forum on Research Infrastructures (ESFRI) was developed to establish a strategy to allow reliable and comparable analytical measurements in foods along the whole process line, from primary producers to consumers, and to make all data findable, accessible, interoperable, and re-usable according to the FAIR data principles. The initiative currently consists of 48 partners from 18 European countries and concluded its “Early Phase” as research infrastructure by organizing its future structure and presenting a proof of concept: preparing, distributing, and comprehensively analyzing three candidate Reference Materials (rice grain, rice flour and oyster tissue) and establishing a system for compiling, processing, and storing the generated data, and for exchanging, comparing, and making them accessible in databases.

  9. Ensuring Food Integrity by Metrology and FAIR Data Principles.

    PubMed

    Rychlik, Michael; Zappa, Giovanna; Añorga, Larraitz; Belc, Nastasia; Castanheira, Isabel; Donard, Olivier F X; Kouřimská, Lenka; Ogrinc, Nives; Ocké, Marga C; Presser, Karl; Zoani, Claudia

    2018-01-01

    Food integrity is a general term for sound, nutritive, healthy, tasty, safe, authentic, traceable, as well as ethically, safely, environment-friendly, and sustainably produced foods. In order to verify these properties, analytical methods with a higher degree of accuracy, sensitivity, standardization and harmonization and a harmonized system for their application in analytical laboratories are required. In this view, metrology offers the opportunity to achieve these goals. In this perspective article the current global challenges in food analysis and the principles of metrology to fill these gaps are presented. Therefore, the pan-European project METROFOOD-RI within the framework of the European Strategy Forum on Research Infrastructures (ESFRI) was developed to establish a strategy to allow reliable and comparable analytical measurements in foods along the whole process line, from primary producers to consumers, and to make all data findable, accessible, interoperable, and re-usable according to the FAIR data principles. The initiative currently consists of 48 partners from 18 European countries and concluded its "Early Phase" as research infrastructure by organizing its future structure and presenting a proof of concept: preparing, distributing, and comprehensively analyzing three candidate Reference Materials (rice grain, rice flour, and oyster tissue) and establishing a system for compiling, processing, and storing the generated data, and for exchanging, comparing, and making them accessible in databases.

  10. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into an analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. Using KB-builder, we streamlined the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  11. Analytic Frameworks for Assessing Dialogic Argumentation in Online Learning Environments

    ERIC Educational Resources Information Center

    Clark, Douglas B; Sampson, Victor; Weinberger, Armin; Erkens, Gijsbert

    2007-01-01

    Over the last decade, researchers have developed sophisticated online learning environments to support students engaging in dialogic argumentation. This review examines five categories of analytic frameworks for measuring participant interactions within these environments focusing on (1) formal argumentation structure, (2) conceptual quality, (3)…

  12. Framework for developing a spatial walkability index (SWI) for the light-rail transit (LRT) stations in Kuala Lumpur city centre using analytical network process (ANP) and GIS

    NASA Astrophysics Data System (ADS)

    Naharudin, Nabilah; Ahamad, Mohd Sanusi S.; Sadullah, Ahmad Farhan Mohd

    2017-10-01

    In support of the nation's goal of developing a liveable city, the Malaysian government aims to improve mobility in Kuala Lumpur by providing good-quality transit services across the city. However, the public has started to demand more than just connectivity between two points: they want their transit journey to be comfortable and pleasant from the very first mile. The key here is the first and last mile (FLM) of the transit service, which defines the journey to access the station itself. The question is, does the FLM of existing transit services satisfy the public's needs? Many studies have therefore emerged that attempt to assess pedestrian-friendliness. While most are based on pedestrians' perceptions, other studies spatially measure connectivity and accessibility to various land uses and points of interest. Both can be good methods, but their integration could produce a better assessment; to date, however, only a few studies have attempted it. This paper proposes a framework to develop a Spatial Walkability Index (SWI) by integrating a multicriteria evaluation technique, the Analytical Network Process (ANP), with network analysis on a geographical information system (GIS) platform. First, ANP aggregates the degree of importance of each walkability criterion based on pedestrians' perceptions. Then, the network analysis uses the weighted criteria as attributes to find the walkable routes within a half-mile radius of each station. The index is calculated as the ratio of the total length of walkable routes to the available footpath. The final outcome, the percentage of walkable FLM transit routes for each station, is the SWI. It is expected that the developed framework can be applied in other cities across the globe and adapted to suit local demands and purposes.
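The index construction described above, ANP-weighted criteria scoring each route followed by a ratio of walkable route length to available footpath, can be sketched as below. The weights, threshold, and data structures are illustrative assumptions, not the authors' implementation (in practice the weights come from the ANP survey aggregation and the routes from GIS network analysis):

```python
def walkability_score(criteria, weights):
    # Weighted sum of normalized walkability criteria for one route;
    # weights would come from the ANP aggregation of pedestrian surveys.
    return sum(w * c for w, c in zip(weights, criteria))

def swi(routes, weights, threshold, footpath_total):
    """Spatial Walkability Index: percentage of walkable first/last-mile
    route length relative to the total available footpath length."""
    walkable_length = sum(r["length"] for r in routes
                          if walkability_score(r["criteria"], weights)
                          >= threshold)
    return 100.0 * walkable_length / footpath_total
```

For example, two routes of 200 m and 100 m against 400 m of footpath, where only the first route scores above the threshold, yield an SWI of 50%.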

  13. Investigation of microcantilever array with ordered nanoporous coatings for selective chemical detection

    NASA Astrophysics Data System (ADS)

    Lee, J.-H.; Houk, R. T. J.; Robinson, A.; Greathouse, J. A.; Thornberg, S. M.; Allendorf, M. D.; Hesketh, P. J.

    2010-04-01

    In this paper we demonstrate the potential for novel nanoporous framework materials (NFM) such as metal-organic frameworks (MOFs) to provide selectivity and sensitivity to a broad range of analytes including explosives, nerve agents, and volatile organic compounds (VOCs). NFM are highly ordered, crystalline materials with considerable synthetic flexibility resulting from the presence of both organic and inorganic components within their structure. Detection of chemical weapons of mass destruction (CWMD), explosives, toxic industrial chemicals (TICs), and volatile organic compounds (VOCs) using micro-electro-mechanical-systems (MEMS) devices, such as microcantilevers and surface acoustic wave sensors, requires the use of recognition layers to impart selectivity. Traditional organic polymers are dense, impeding analyte uptake and slowing sensor response. The nanoporosity and ultrahigh surface areas of NFM enhance transport into and out of the NFM layer, improving response times, and their ordered structure enables structural tuning to impart selectivity. Here we describe experiments and modeling aimed at creating NFM layers tailored to the detection of water vapor, explosives, CWMD, and VOCs, and their integration with the surfaces of MEMS devices. Force field models show that a high degree of chemical selectivity is feasible. For example, using a suite of MOFs it should be possible to select for explosives vs. CWMD, VM vs. GA (nerve agents), and anthracene vs. naphthalene (VOCs). We will also demonstrate the integration of various NFM with the surfaces of MEMS devices and describe new synthetic methods developed to improve the quality of NFM coatings. Finally, MOF-coated MEMS devices demonstrate how temperature can be tuned to improve response times, selectivity, and sensitivity.

  14. A Modular GIS-Based Software Architecture for Model Parameter Estimation using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.

    2012-12-01

    The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open source nature of the project will engender the development of additional model drivers by third-party scientists.
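The driver pattern described above, a fixed programming interface that each external forward model implements, can be sketched as an abstract base class. The method names are hypothetical illustrations, not the actual MAD-GIS (C#/MEF) API:

```python
from abc import ABC, abstractmethod

class ForwardModelDriver(ABC):
    """Hypothetical driver contract for plugging an external forward
    model (e.g. MODFLOW, HYDRUS) into a MAD-style inversion loop:
    write parameter realizations, run the model, read predictions."""

    @abstractmethod
    def write_inputs(self, parameter_field): ...

    @abstractmethod
    def run(self): ...

    @abstractmethod
    def read_outputs(self): ...

class DummyDriver(ForwardModelDriver):
    # Trivial stand-in: the "model output" is just the field mean,
    # showing how the inversion core stays ignorant of model internals.
    def write_inputs(self, parameter_field):
        self.field = parameter_field

    def run(self):
        self.result = sum(self.field) / len(self.field)

    def read_outputs(self):
        return self.result
```

The inversion core only ever calls the three interface methods, so swapping MODFLOW for another simulator means supplying a new driver, not changing the core.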

  15. Parallel Aircraft Trajectory Optimization with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Falck, Robert D.; Gray, Justin S.; Naylor, Bret

    2016-01-01

    Trajectory optimization is an integral component for the design of aerospace vehicles, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto based collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the non-linear analysis evaluations and the derivative computations themselves. The constraint aggregation results showed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.

  16. Integrated assessment of urban drainage system under the framework of uncertainty analysis.

    PubMed

    Dong, X; Chen, J; Zeng, S; Zhao, D

    2008-01-01

    Due to rapid urbanization and the large number of aging urban infrastructures in China, urban drainage systems face a dual pressure of construction and renovation nationwide. This leads to the need for an integrated assessment when an urban drainage system is under planning or re-design. In this paper, an integrated assessment methodology is proposed based upon the approaches of analytic hierarchy process (AHP), uncertainty analysis, mathematical simulation of urban drainage systems and fuzzy assessment. To illustrate this methodology, a case study in Shenzhen City of south China has been implemented to evaluate and compare two different urban drainage system renovation plans, i.e., the distributed plan and the centralized plan. By comparing their water quality impacts, ecological impacts, technological feasibility and economic costs, the integrated performance of the distributed plan is found to be both better and more robust. The proposed methodology is also found to be both effective and practical. (c) IWA Publishing 2008.
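The AHP step in the methodology above derives criterion weights from a reciprocal pairwise comparison matrix via its principal eigenvector, checked against Saaty's consistency ratio. A minimal sketch (the matrix below and the random-index table, limited here to small sizes, are illustrative):

```python
import numpy as np

def ahp_weights(A):
    """Priority weights from a reciprocal pairwise comparison matrix A
    (A[i, j] = relative importance of criterion i over j), computed as
    the normalized principal eigenvector, plus the consistency ratio."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                 # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                             # normalize to sum to 1
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty random indices
    return w, ci / ri                        # CR < 0.1 is acceptable
```

For a matrix rating water quality strongly over ecology and ecology over cost, the weights come out in that order, and a consistency ratio below 0.1 confirms the judgments are usable.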

  17. Technosocial Modeling of IED Threat Scenarios and Attacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitney, Paul D.; Brothers, Alan J.; Coles, Garill A.

    2009-03-23

    This paper describes an approach for integrating sociological and technical models to develop more complete threat assessments. Current approaches to analyzing and addressing threats tend to focus on the technical factors. This paper addresses development of predictive models that encompass behavioral as well as these technical factors. Using improvised explosive device (IED) attacks as motivation, this model supports identification of intervention activities 'left of boom' as well as prioritizing attack modalities. We show how Bayes nets integrate social factors associated with IED attacks into a general threat model containing technical and organizational steps from planning through obtaining the IED to initiation of the attack. The social models are computationally-based representations of relevant social science literature that describes human decision making and physical factors. When combined with technical models, the resulting model provides improved knowledge integration into threat assessment for monitoring. This paper discusses the construction of IED threat scenarios, integration of diverse factors into an analytical framework for threat assessment, indicator identification for future threats, and future research directions.
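The Bayes-net integration of social and technical factors described above can be illustrated with a toy two-parent network: a social node (intent) and a technical node (capability) jointly determine the probability of an attack. All probabilities below are invented for illustration, not values from the paper:

```python
def attack_probability(p_intent=0.1, p_capability=0.3,
                       p_attack_given=(0.8, 0.05, 0.1, 0.01)):
    """Marginal P(attack) in a toy two-parent Bayes net.
    p_attack_given holds P(attack | intent, capability) for the
    parent states (1,1), (1,0), (0,1), (0,0)."""
    a11, a10, a01, a00 = p_attack_given
    joint = [
        (p_intent * p_capability, a11),
        (p_intent * (1 - p_capability), a10),
        ((1 - p_intent) * p_capability, a01),
        ((1 - p_intent) * (1 - p_capability), a00),
    ]
    # Marginalize over the parent states.
    return sum(p_parents * p_attack for p_parents, p_attack in joint)
```

In a full model the parent probabilities would themselves be driven by observable indicators, which is what enables 'left of boom' intervention analysis: lowering p_intent or p_capability propagates directly into a lower attack probability.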

  18. An Analytical Framework for Evaluating E-Commerce Business Models and Strategies.

    ERIC Educational Resources Information Center

    Lee, Chung-Shing

    2001-01-01

    Considers electronic commerce as a paradigm shift, or a disruptive innovation, and presents an analytical framework based on the theories of transaction costs and switching costs. Topics include business transformation process; scale effect; scope effect; new sources of revenue; and e-commerce value creation model and strategy. (LRW)

  19. Reasoning across Ontologically Distinct Levels: Students' Understandings of Molecular Genetics

    ERIC Educational Resources Information Center

    Duncan, Ravit Golan; Reiser, Brian J.

    2007-01-01

    In this article we apply a novel analytical framework to explore students' difficulties in understanding molecular genetics--a domain that is particularly challenging to learn. Our analytical framework posits that reasoning in molecular genetics entails mapping across ontologically distinct levels--an information level containing the genetic…

  20. The Illness Narratives of Health Managers: Developing an Analytical Framework

    ERIC Educational Resources Information Center

    Exworthy, Mark

    2011-01-01

    This paper examines the personal experience of illness and healthcare by health managers through their illness narratives. By synthesising a wider literature of illness narratives and health management, an analytical framework is presented, which considers the impact of illness narratives, comprising the logic of illness narratives, the actors…

  1. A Data Protection Framework for Learning Analytics

    ERIC Educational Resources Information Center

    Cormack, Andrew

    2016-01-01

    Most studies on the use of digital student data adopt an ethical framework derived from human-subject research, based on the informed consent of the experimental subject. However, consent gives universities little guidance on using learning analytics as a routine part of educational provision: which purposes are legitimate and which analyses…

  2. A Multidisciplinary Analytical Framework for Studying Active Mobility Patterns

    NASA Astrophysics Data System (ADS)

    Orellana, D.; Hermida, C.; Osorio, P.

    2016-06-01

    Intermediate cities are urged to change and adapt their mobility systems from a high energy-demanding motorized model to a sustainable low-motorized model. In order to accomplish such a model, city administrations need to better understand active mobility patterns and their links to socio-demographic and cultural aspects of the population. During the last decade, researchers have demonstrated the potential of geo-location technologies and mobile devices to gather massive amounts of data for mobility studies. However, the analysis and interpretation of these data have been carried out by specialized research groups with relatively narrow approaches from different disciplines. Consequently, broader questions remain less explored, mainly those relating the spatial behaviour of individuals and populations to their geographic environment and the motivations and perceptions shaping such behaviour. Understanding sustainable mobility and exploring new research paths require an interdisciplinary approach given the complex nature of mobility systems and their social, economic and environmental impacts. Here, we introduce the elements of a multidisciplinary analytical framework for studying active mobility patterns, comprising three components: a) Methodological, b) Behavioural, and c) Perceptual. We demonstrate the applicability of the framework by analysing the mobility patterns of cyclists and pedestrians in an intermediate city, integrating a range of techniques including GPS tracking, spatial analysis, auto-ethnography, and perceptual mapping. The results demonstrated the existence of non-evident spatial behaviours and how perceptual features affect mobility. This knowledge is useful for developing policies and practices for sustainable mobility planning.

  3. An implementation of the maximum-caliber principle by replica-averaged time-resolved restrained simulations

    NASA Astrophysics Data System (ADS)

    Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo

    2018-05-01

    Inferential methods can be used to integrate experimental information and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.
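
    Schematically, a time-dependent replica-averaged restraint of the kind described here can be written as follows. This is a sketch in generic notation (force constant k, observables s_i, replica count N), not the authors' own formulation:

```latex
V(\mathbf{x}_1,\dots,\mathbf{x}_N;\,t)
  \;=\; \sum_{r=1}^{N} V_{\mathrm{ff}}(\mathbf{x}_r)
  \;+\; \frac{k}{2} \sum_{i}
  \left( \frac{1}{N}\sum_{r=1}^{N} s_i(\mathbf{x}_r) \;-\; s_i^{\mathrm{exp}}(t) \right)^{2}
```

    Here V_ff is the force-field energy of each replica and the harmonic term restrains the instantaneous replica average of each observable s_i to the time-resolved experimental value s_i^exp(t); with a static s_i^exp this reduces to the usual maximum-entropy restraint.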

  4. An implementation of the maximum-caliber principle by replica-averaged time-resolved restrained simulations.

    PubMed

    Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo

    2018-05-14

    Inferential methods can be used to integrate experimental information and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.

  5. Microcirculation and the physiome projects.

    PubMed

    Bassingthwaighte, James B

    2008-11-01

    The Physiome projects comprise a loosely knit worldwide effort to define the Physiome through databases and theoretical models, with the goal of better understanding the integrative functions of cells, organs, and organisms. The projects involve developing and archiving models, providing centralized databases, and linking experimental information and models from many laboratories into self-consistent frameworks. Increasingly accurate and complete models that embody quantitative biological hypotheses, adhere to high standards, and are publicly available and reproducible, together with refined and curated data, will enable biological scientists to advance integrative, analytical, and predictive approaches to the study of medicine and physiology. This review discusses the rationale and history of the Physiome projects, the role of theoretical models in the development of the Physiome, and the current status of efforts in this area addressing the microcirculation.

  6. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  7. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  8. A long-term study of ecological impacts of river channelization on the population of an endangered fish: Lessons learned for assessment and restoration

    USGS Publications Warehouse

    Roberts, James H.; Anderson, Gregory B.; Angermeier, Paul

    2016-01-01

    Projects to assess environmental impact or restoration success in rivers focus on project-specific questions but can also provide valuable insights for future projects. Both restoration actions and impact assessments can become “adaptive” by using the knowledge gained from long-term monitoring and analysis to revise the actions, monitoring, conceptual model, or interpretation of findings so that subsequent actions or assessments are better informed. Assessments of impact or restoration success are especially challenging when the indicators of interest are imperiled species and/or the impacts being addressed are complex. From 1997 to 2015, we worked closely with two federal agencies to monitor habitat availability for and population density of Roanoke logperch (Percina rex), an endangered fish, in a 24-km-long segment of the upper Roanoke River, VA. We primarily used a Before-After-Control-Impact analytical framework to assess potential impacts of a river channelization project on the P. rex population. In this paper, we summarize how our extensive monitoring facilitated the evolution of our (a) conceptual understanding of the ecosystem and fish population dynamics; (b) choices of ecological indicators and analytical tools; and (c) conclusions regarding the magnitude, mechanisms, and significance of observed impacts. Our experience with this case study taught us important lessons about how to adaptively develop and conduct a monitoring program, which we believe are broadly applicable to assessments of environmental impact and restoration success in other rivers. 
In particular, we learned that (a) pre-treatment planning can enhance monitoring effectiveness, help avoid unforeseen pitfalls, and lead to more robust conclusions; (b) developing adaptable conceptual and analytical models early was crucial to organizing our knowledge, guiding our study design, and analyzing our data; (c) catchment-wide processes that we did not monitor, or initially consider, had profound implications for interpreting our findings; and (d) using multiple analytical frameworks, with varying assumptions, led to clearer interpretation of findings than the use of a single framework alone. Broader integration of these guiding principles into monitoring studies, though potentially challenging, could lead to more scientifically defensible assessments of project effects.
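
    In its simplest mean-based form, the Before-After-Control-Impact comparison used in this study reduces to a difference-in-differences on the monitored indicator. The fish-density values below are invented for illustration; the actual study used richer models and indicators.

```python
def baci_effect(control_before, control_after, impact_before, impact_after):
    """Mean-based BACI (difference-in-differences) estimate of the impact:
    the change at the impact site minus the change at the control site."""
    mean = lambda xs: sum(xs) / len(xs)
    return ((mean(impact_after) - mean(impact_before))
            - (mean(control_after) - mean(control_before)))

# Hypothetical fish densities (individuals per 100 m^2) at sampled sites.
effect = baci_effect(
    control_before=[10, 12, 11], control_after=[9, 11, 10],
    impact_before=[8, 10, 9], impact_after=[5, 7, 6],
)
```

    A negative effect indicates a decline at the impact site beyond what the control site experienced; in practice this point estimate would be accompanied by a variance model and significance test.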

  9. Assessing Proposals for New Global Health Treaties: An Analytic Framework.

    PubMed

    Hoffman, Steven J; Røttingen, John-Arne; Frenk, Julio

    2015-08-01

    We have presented an analytic framework and 4 criteria for assessing when global health treaties have reasonable prospects of yielding net positive effects. First, there must be a significant transnational dimension to the problem being addressed. Second, the goals should justify the coercive nature of treaties. Third, proposed global health treaties should have a reasonable chance of achieving benefits. Fourth, treaties should be the best commitment mechanism among the many competing alternatives. Applying this analytic framework to 9 recent calls for new global health treaties revealed that none fully meet the 4 criteria. Efforts aiming to better use or revise existing international instruments may be more productive than is advocating new treaties.

  10. Assessing Proposals for New Global Health Treaties: An Analytic Framework

    PubMed Central

    Røttingen, John-Arne; Frenk, Julio

    2015-01-01

    We have presented an analytic framework and 4 criteria for assessing when global health treaties have reasonable prospects of yielding net positive effects. First, there must be a significant transnational dimension to the problem being addressed. Second, the goals should justify the coercive nature of treaties. Third, proposed global health treaties should have a reasonable chance of achieving benefits. Fourth, treaties should be the best commitment mechanism among the many competing alternatives. Applying this analytic framework to 9 recent calls for new global health treaties revealed that none fully meet the 4 criteria. Efforts aiming to better use or revise existing international instruments may be more productive than is advocating new treaties. PMID:26066926

  11. Hand synergies: Integration of robotics and neuroscience for understanding the control of biological and artificial hands

    PubMed Central

    Santello, Marco; Bianchi, Matteo; Gabiccini, Marco; Ricciardi, Emiliano; Salvietti, Gionata; Prattichizzo, Domenico; Ernst, Marc; Moscatelli, Alessandro; Jörntell, Henrik; Kappers, Astrid M.L.; Kyriakopoulos, Kostas; Albu-Schäffer, Alin; Castellini, Claudio; Bicchi, Antonio

    2017-01-01

    The term ‘synergy’ – from the Greek synergia – means ‘working together’. The concept of multiple elements working together towards a common goal has been extensively used in neuroscience to develop theoretical frameworks, experimental approaches, and analytical techniques to understand neural control of movement, and for applications for neuro-rehabilitation. In the past decade, roboticists have successfully applied the framework of synergies to create novel design and control concepts for artificial hands, i.e., robotic hands and prostheses. At the same time, robotic research on the sensorimotor integration underlying the control and sensing of artificial hands has inspired new research approaches in neuroscience, and has provided useful instruments for novel experiments. The ambitious goal of integrating expertise and research approaches in robotics and neuroscience to study the properties and applications of the concept of synergies is generating a number of multidisciplinary cooperative projects, among which the recently finished 4-year European project “The Hand Embodied” (THE). This paper reviews the main insights provided by this framework. Specifically, we provide an overview of neuroscientific bases of hand synergies and introduce how robotics has leveraged the insights from neuroscience for innovative design in hardware and controllers for biomedical engineering applications, including myoelectric hand prostheses, devices for haptics research, and wearable sensing of human hand kinematics. The review also emphasizes how this multidisciplinary collaboration has generated new ways to conceptualize a synergy-based approach for robotics, and provides guidelines and principles for analyzing human behavior and synthesizing artificial robotic systems based on a theory of synergies. PMID:26923030

  12. Hand synergies: Integration of robotics and neuroscience for understanding the control of biological and artificial hands.

    PubMed

    Santello, Marco; Bianchi, Matteo; Gabiccini, Marco; Ricciardi, Emiliano; Salvietti, Gionata; Prattichizzo, Domenico; Ernst, Marc; Moscatelli, Alessandro; Jörntell, Henrik; Kappers, Astrid M L; Kyriakopoulos, Kostas; Albu-Schäffer, Alin; Castellini, Claudio; Bicchi, Antonio

    2016-07-01

    The term 'synergy' - from the Greek synergia - means 'working together'. The concept of multiple elements working together towards a common goal has been extensively used in neuroscience to develop theoretical frameworks, experimental approaches, and analytical techniques to understand neural control of movement, and for applications for neuro-rehabilitation. In the past decade, roboticists have successfully applied the framework of synergies to create novel design and control concepts for artificial hands, i.e., robotic hands and prostheses. At the same time, robotic research on the sensorimotor integration underlying the control and sensing of artificial hands has inspired new research approaches in neuroscience, and has provided useful instruments for novel experiments. The ambitious goal of integrating expertise and research approaches in robotics and neuroscience to study the properties and applications of the concept of synergies is generating a number of multidisciplinary cooperative projects, among which the recently finished 4-year European project "The Hand Embodied" (THE). This paper reviews the main insights provided by this framework. Specifically, we provide an overview of neuroscientific bases of hand synergies and introduce how robotics has leveraged the insights from neuroscience for innovative design in hardware and controllers for biomedical engineering applications, including myoelectric hand prostheses, devices for haptics research, and wearable sensing of human hand kinematics. The review also emphasizes how this multidisciplinary collaboration has generated new ways to conceptualize a synergy-based approach for robotics, and provides guidelines and principles for analyzing human behavior and synthesizing artificial robotic systems based on a theory of synergies. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Integrated wildfire risk assessment: framework development and application on the Lewis and Clark National Forest in Montana, USA.

    PubMed

    Thompson, Matthew P; Scott, Joe; Helmbrecht, Don; Calkin, Dave E

    2013-04-01

    The financial, socioeconomic, and ecological impacts of wildfire continue to challenge federal land management agencies in the United States. In recent years, policymakers and managers have increasingly turned to the field of risk analysis to better manage wildfires and to mitigate losses to highly valued resources and assets (HVRAs). Assessing wildfire risk entails the interaction of multiple components, including integrating wildfire simulation outputs with geospatial identification of HVRAs and the characterization of fire effects to HVRAs. We present an integrated and systematic risk assessment framework that entails 3 primary analytical components: 1) stochastic wildfire simulation and burn probability modeling to characterize wildfire hazard, 2) expert-based modeling to characterize fire effects, and 3) multicriteria decision analysis to characterize preference structures across at-risk HVRAs. We demonstrate application of this framework for a wildfire risk assessment performed on the Little Belts Assessment Area within the Lewis and Clark National Forest in Montana, United States. We devote particular attention to our approach to eliciting and encapsulating expert judgment, in which we: 1) adhered to a structured process for using expert judgment in ecological risk assessment, 2) used as our expert base local resource scientists and fire/fuels specialists who have a direct connection to the specific landscape and HVRAs in question, and 3) introduced multivariate response functions to characterize fire effects to HVRAs that consider biophysical variables beyond fire behavior. We anticipate that this work will further the state of wildfire risk science and will lead to additional application of risk assessment to inform land management planning. Copyright © 2012 SETAC.
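
    The interaction of burn probabilities with expert-elicited response functions can be summarized as an expected net value change per HVRA. The intensity classes, probabilities, and response values below are illustrative placeholders, not values from the Little Belts assessment.

```python
# Expected net value change (eNVC) for one HVRA: sum over fire-intensity
# classes of annual burn probability times an expert-elicited response
# (relative change in HVRA value, negative for loss, positive for benefit).
def expected_nvc(burn_probs, response):
    """burn_probs[i]: annual probability of burning in intensity class i;
    response[i]: percent change in HVRA value for that class (-100..100)."""
    return sum(p * r for p, r in zip(burn_probs, response))

# Hypothetical low / moderate / high intensity classes for one HVRA.
envc = expected_nvc([0.010, 0.004, 0.001], [-10, -40, -90])
```

    Summing weighted eNVC values across HVRAs (with multicriteria weights expressing the preference structure) yields a landscape-level risk ranking.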

  14. Hand synergies: Integration of robotics and neuroscience for understanding the control of biological and artificial hands

    NASA Astrophysics Data System (ADS)

    Santello, Marco; Bianchi, Matteo; Gabiccini, Marco; Ricciardi, Emiliano; Salvietti, Gionata; Prattichizzo, Domenico; Ernst, Marc; Moscatelli, Alessandro; Jörntell, Henrik; Kappers, Astrid M. L.; Kyriakopoulos, Kostas; Albu-Schäffer, Alin; Castellini, Claudio; Bicchi, Antonio

    2016-07-01

    The term 'synergy' - from the Greek synergia - means 'working together'. The concept of multiple elements working together towards a common goal has been extensively used in neuroscience to develop theoretical frameworks, experimental approaches, and analytical techniques to understand neural control of movement, and for applications for neuro-rehabilitation. In the past decade, roboticists have successfully applied the framework of synergies to create novel design and control concepts for artificial hands, i.e., robotic hands and prostheses. At the same time, robotic research on the sensorimotor integration underlying the control and sensing of artificial hands has inspired new research approaches in neuroscience, and has provided useful instruments for novel experiments. The ambitious goal of integrating expertise and research approaches in robotics and neuroscience to study the properties and applications of the concept of synergies is generating a number of multidisciplinary cooperative projects, among which the recently finished 4-year European project "The Hand Embodied" (THE). This paper reviews the main insights provided by this framework. Specifically, we provide an overview of neuroscientific bases of hand synergies and introduce how robotics has leveraged the insights from neuroscience for innovative design in hardware and controllers for biomedical engineering applications, including myoelectric hand prostheses, devices for haptics research, and wearable sensing of human hand kinematics. The review also emphasizes how this multidisciplinary collaboration has generated new ways to conceptualize a synergy-based approach for robotics, and provides guidelines and principles for analyzing human behavior and synthesizing artificial robotic systems based on a theory of synergies.

  15. Selecting Essential Information for Biosurveillance—A Multi-Criteria Decision Analysis

    PubMed Central

    Generous, Nicholas; Margevicius, Kristen J.; Taylor-McCabe, Kirsten J.; Brown, Mac; Daniel, W. Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as “the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels.” However, the strategy does not specify how “essential information” is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as “essential”. The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs among the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of “essential information” for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system. PMID:24489748
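
    A minimal additive Multi-Attribute Utility scoring of candidate data streams might look like the sketch below. The criteria, weights, stream names, and utility values are hypothetical, not the study's elicited values.

```python
def maut_score(utilities, weights):
    """Additive multi-attribute utility: weighted sum of single-attribute
    utilities, each scaled to [0, 1]; the weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * u for w, u in zip(weights, utilities))

# Hypothetical criteria: timeliness, population coverage, acquisition cost.
weights = [0.5, 0.3, 0.2]

# Hypothetical single-attribute utilities for two candidate data streams.
clinic_reports = maut_score([0.9, 0.6, 0.7], weights)
news_scraping = maut_score([0.7, 0.9, 0.4], weights)
```

    Ranking streams by their aggregate utility makes the tradeoffs among criteria explicit; in a full elicitation the weights and single-attribute utility functions would come from structured stakeholder interviews.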

  16. Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention

    ERIC Educational Resources Information Center

    West, Deborah; Heath, David; Huijser, Henk

    2016-01-01

    This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…

  17. Organizational Culture and Organizational Effectiveness: A Meta-Analytic Investigation of the Competing Values Framework's Theoretical Suppositions

    ERIC Educational Resources Information Center

    Hartnell, Chad A.; Ou, Amy Yi; Kinicki, Angelo

    2011-01-01

    We apply Quinn and Rohrbaugh's (1983) competing values framework (CVF) as an organizing taxonomy to meta-analytically test hypotheses about the relationship between 3 culture types and 3 major indices of organizational effectiveness (employee attitudes, operational performance [i.e., innovation and product and service quality], and financial…

  18. Unintended Revelations in History Textbooks: The Precarious Authenticity and Historical Continuity of the Slovak Nation

    ERIC Educational Resources Information Center

    Šulíková, Jana

    2016-01-01

    Purpose: This article proposes an analytical framework that helps to identify and challenge misconceptions of ethnocentrism found in pre-tertiary teaching resources for history and the social sciences in numerous countries. Design: Drawing on nationalism studies, the analytical framework employs ideas known under the umbrella terms of…

  19. Abusive supervision and subordinates' organization deviance.

    PubMed

    Tepper, Bennett J; Henle, Christine A; Lambert, Lisa Schurer; Giacalone, Robert A; Duffy, Michelle K

    2008-07-01

    The authors developed an integrated model of the relationships among abusive supervision, affective organizational commitment, norms toward organization deviance, and organization deviance and tested the framework in 2 studies: a 2-wave investigation of 243 supervised employees and a cross-sectional study of 247 employees organized into 68 work groups. Path analytic tests of mediated moderation provide support for the prediction that the mediated effect of abusive supervision on organization deviance (through affective commitment) is stronger when employees perceive that their coworkers are more approving of organization deviance (Study 1) and when coworkers perform more acts of organization deviance (Study 2).

  20. An integrated model to simulate the scattering of ultrasounds by inclusions in steels.

    PubMed

    Darmon, Michel; Calmon, Pierre; Bèle, Bertrand

    2004-04-01

    We present a study performed to model and predict the ultrasonic response of alumina inclusions in steels. The Born and the extended quasistatic approximations have been applied and modified to improve their accuracy in the framework of this application. The modified Born approximation, called the "doubly distorted wave (D(2)W) Born approximation", which can deal with various inclusion shapes, has been selected for implementation in the CIVA software. The model's reliability has been evaluated by comparison with Ying and Truell's exact analytical solution. In parallel, measurements have been carried out on both natural and artificial alumina inclusions.

  1. Simplex-stochastic collocation method with improved scalability

    NASA Astrophysics Data System (ADS)

    Edeling, W. N.; Dwight, R. P.; Cinnella, P.

    2016-04-01

    The Simplex-Stochastic Collocation (SSC) method is a robust tool used to propagate uncertain input distributions through a computer code. However, it becomes prohibitively expensive for problems with more than five dimensions. The main purpose of this paper is to identify the bottlenecks and to improve upon this poor scalability. To do so, we propose an alternative interpolation stencil technique based upon the Set-Covering problem, and we integrate the SSC method into the High-Dimensional Model-Reduction framework. In addition, we address the issue of ill-conditioned sample matrices, and we present an analytical map to facilitate uniformly-distributed simplex sampling.

  2. Integrating Sediment Connectivity into Water Resources Management Through a Graph Theoretic, Stochastic Modeling Framework.

    NASA Astrophysics Data System (ADS)

    Schmitt, R. J. P.; Castelletti, A.; Bizzi, S.

    2014-12-01

    Understanding sediment transport processes at the river basin scale, their temporal spectra and spatial patterns, is key to identifying and minimizing the morphologic risks associated with channel adjustment processes. This work contributes a stochastic framework for modeling bed-load connectivity based on recent advances in the field (e.g., Bizzi & Lerner, 2013; Czuba & Foufoula-Georgiou, 2014). It presents river managers with novel indicators of reach-scale vulnerability to channel adjustment in large river networks with sparse hydrologic and sediment observations. The framework comprises three steps. First, based on a distributed hydrological model and remotely sensed information, the framework identifies a representative grain size class for each reach. Second, sediment residence time distributions are calculated for each reach in a Monte-Carlo approach applying standard sediment transport equations driven by local hydraulic conditions. Third, a network analysis defines the up- and downstream connectivity for various travel times, resulting in characteristic up/downstream connectivity signatures for each reach. Channel vulnerability indicators quantify the imbalance between up- and downstream connectivity for each travel time domain, representing the process-dependent latency of morphologic response. Last, based on the stochastic core of the model, a sensitivity analysis identifies drivers of change and major sources of uncertainty in order to target key detrimental processes and to guide effective gathering of additional data. The application, limitations, and integration into a decision analytic framework are demonstrated for a major part of the Red River Basin in Northern Vietnam (179,000 km²). Here, a plethora of anthropic alterations, ranging from large reservoir construction to land-use changes, results in major downstream deterioration and calls for concerted sediment management strategies to mitigate current and limit future morphologic alterations.
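
    The travel-time-bounded connectivity signature in the third step can be illustrated on a toy network. The reach labels, topology, and per-reach bed-load travel times below are invented for illustration; a real application would draw them from the hydrological model and transport equations.

```python
# Toy river network as a directed chain: each reach maps to its downstream
# neighbor (None at the outlet), with a bed-load travel time (years) per reach.
downstream = {"A": "C", "B": "C", "C": "D", "D": None}
travel_time = {"A": 2.0, "B": 1.0, "C": 3.0, "D": 4.0}

def reachable_downstream(reach, budget):
    """Reaches that bed load leaving `reach` enters within `budget` years."""
    out = []
    t = travel_time[reach]      # time to traverse the starting reach itself
    node = downstream[reach]
    while node is not None and t <= budget:
        out.append(node)
        t += travel_time[node]
        node = downstream[node]
    return out
```

    Computing such sets for a range of time budgets, both up- and downstream, gives each reach a connectivity signature whose imbalance can serve as a vulnerability indicator.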

  3. The Green's matrix and the boundary integral equations for analysis of time-harmonic dynamics of elastic helical springs.

    PubMed

    Sorokin, Sergey V

    2011-03-01

    Helical springs serve as vibration isolators in virtually any suspension system. Various exact and approximate methods may be employed to determine the eigenfrequencies of vibrations of these structural elements and their dynamic transfer functions. The method of boundary integral equations is a meaningful alternative for obtaining exact solutions to problems of the time-harmonic dynamics of elastic springs in the framework of Bernoulli-Euler beam theory. In this paper, the derivations of the Green's matrix, of Somigliana's identities, and of the boundary integral equations are presented. The vibrational power transmission in an infinitely long spring is analyzed by means of the Green's matrix. The eigenfrequencies and the dynamic transfer functions are found by solving the boundary integral equations. In the course of the analysis, the essential features and advantages of the method of boundary integral equations are highlighted. The reported analytical results may be used to study the time-harmonic motion in any wave guide governed by a system of linear differential equations in a single spatial coordinate along its axis. © 2011 Acoustical Society of America.
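
    For a straight Bernoulli-Euler beam (the limiting theory named in the abstract), the eigenfrequencies have a classical closed form that makes a useful sanity baseline; the helical spring itself couples several wave types and needs the boundary-integral treatment described above. A minimal sketch with illustrative material values:

```python
import math

def euler_bernoulli_eigenfrequencies(E, I, rho, A, L, n_modes=3):
    """Natural frequencies (Hz) of a simply supported Bernoulli-Euler beam:
    omega_n = (n*pi/L)^2 * sqrt(E*I/(rho*A)).  This is a straight-beam
    baseline only, not the coupled helical-spring equations of the paper."""
    return [((n * math.pi / L) ** 2) * math.sqrt(E * I / (rho * A)) / (2 * math.pi)
            for n in range(1, n_modes + 1)]

# illustrative case: steel rod, 1 m long, 10 mm diameter
d = 0.01
freqs = euler_bernoulli_eigenfrequencies(
    E=2.1e11, I=math.pi * d**4 / 64, rho=7850.0, A=math.pi * d**2 / 4, L=1.0)
```

Note the characteristic 1 : 4 : 9 spacing of the first three modes, which follows directly from the n² dependence.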

  4. CancerLectinDB: a database of lectins relevant to cancer.

    PubMed

    Damodaran, Deepa; Jeyakani, Justin; Chauhan, Alok; Kumar, Nirmal; Chandra, Nagasuma R; Surolia, Avadhesha

    2008-04-01

    The role of lectins in mediating cancer metastasis, apoptosis as well as various other signaling events has been well established in the past few years. Data on various aspects of the role of lectins in cancer are being accumulated at a rapid pace. The data on lectins available in the literature are so diverse that it becomes difficult and time-consuming, if not impossible, to comprehend the advances in various areas and obtain the maximum benefit. Not only do the lectins vary significantly in their individual functional roles, but they are also diverse in their sequences, structures, binding site architectures, quaternary structures, carbohydrate affinities and specificities as well as their potential applications. An organization of these seemingly independent data into a common framework is essential in order to achieve effective use of all the data towards understanding the roles of different lectins in different aspects of cancer and any resulting applications. An integrated knowledge base (CancerLectinDB) together with appropriate analytical tools has therefore been developed for lectins relevant to any aspect of cancer, by collating and integrating diverse data. This database is unique in providing sequence, structural, and functional annotations for lectins from all known sources in cancer and is expected to be a useful addition to the glycan-related resources now available to the community. The database has been implemented using MySQL on a Linux platform and web-enabled using Perl-CGI and Java tools. Data for individual lectins pertain to taxonomic, biochemical, domain architecture, molecular sequence and structural details as well as carbohydrate specificities. Extensive links have also been provided to relevant bioinformatics resources and analytical tools. Availability of diverse data integrated into a common framework is expected to be of high value for various studies on lectin cancer biology. CancerLectinDB can be accessed through http://proline.physics.iisc.ernet.in/cancerdb.

  5. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  6. A Decision Analytic Approach to Exposure-Based Chemical Prioritization

    PubMed Central

    Mitchell, Jade; Pabon, Nicolas; Collier, Zachary A.; Egeghy, Peter P.; Cohen-Hubal, Elaine; Linkov, Igor; Vallero, Daniel A.

    2013-01-01

    The manufacture of novel synthetic chemicals has increased in volume and variety, but often the environmental and health risks are not fully understood in terms of toxicity and, in particular, exposure. While efforts to assess risks have generally been effective when sufficient data are available, the hazard and exposure data necessary to assess risks adequately are unavailable for the vast majority of chemicals in commerce. The US Environmental Protection Agency has initiated the ExpoCast Program to develop tools for rapid chemical evaluation based on potential for exposure. In this context, a model is presented in which chemicals are evaluated based on inherent chemical properties and behaviorally-based usage characteristics over the chemical’s life cycle. These criteria are assessed and integrated within a decision analytic framework, facilitating rapid assessment and prioritization for future targeted testing and systems modeling. A case study outlines the prioritization process using 51 chemicals. The results show a preliminary relative ranking of chemicals based on exposure potential. The strength of this approach is the ability to integrate relevant statistical and mechanistic data with expert judgment, allowing for an initial tier assessment that can further inform targeted testing and risk management strategies. PMID:23940664
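
    The core of such a decision analytic ranking can be illustrated with a simple weighted-sum model. Chemical names, criterion scores, and weights below are invented for illustration; the actual ExpoCast prioritization integrates many more criteria along with expert judgment:

```python
def prioritize(chemicals, weights):
    """Sketch of a weighted-sum decision-analytic ranking.

    `chemicals` maps name -> criterion scores (already normalized to 0-1);
    `weights` are criterion weights summing to 1.  Names, scores, and
    weights here are hypothetical, not from the ExpoCast case study."""
    scored = {name: sum(w * s for w, s in zip(weights, scores))
              for name, scores in chemicals.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

chems = {
    "chem_A": [0.9, 0.2, 0.7],  # e.g. production volume, persistence, use proximity
    "chem_B": [0.3, 0.8, 0.4],
    "chem_C": [0.6, 0.6, 0.6],
}
ranking = prioritize(chems, weights=[0.5, 0.3, 0.2])
```

The output is a relative ranking by exposure potential, the kind of initial tier assessment the abstract describes, which would then inform targeted testing.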

  7. Energy-based culture medium design for biomanufacturing optimization: A case study in monoclonal antibody production by GS-NS0 cells.

    PubMed

    Quiroga-Campano, Ana L; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2018-03-02

    Demand for high-value biologics, a rapidly growing pipeline, and pressure from competition, time-to-market and regulators, necessitate novel biomanufacturing approaches, including Quality by Design (QbD) principles and Process Analytical Technologies (PAT), to facilitate accelerated, efficient and effective process development platforms that ensure consistent product quality and reduced lot-to-lot variability. Herein, QbD and PAT principles were incorporated within an innovative in vitro-in silico integrated framework for upstream process development (UPD). The central component of the UPD framework is a mathematical model that predicts dynamic nutrient uptake and average intracellular ATP content, based on biochemical reaction networks, to quantify and characterize energy metabolism and its adaptive response, metabolic shifts, to maintain ATP homeostasis. The accuracy and flexibility of the model depend on critical cell type/product/clone-specific parameters, which are experimentally estimated. The integrated in vitro-in silico platform and the model's predictive capacity reduced the burden, time and expense of experimentation, resulting in an optimal medium design compared to commercially available culture media (80% amino acid reduction) and a fed-batch feeding strategy that increased productivity by 129%. The framework represents a flexible and efficient tool that transforms, improves and accelerates conventional process development in biomanufacturing, with wide applications including stem cell-based therapies. Copyright © 2018. Published by Elsevier Inc.

  8. The Earth Data Analytic Services (EDAS) Framework

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.

  9. Understanding selective molecular recognition in integrated carbon nanotube-polymer sensors by simulating physical analyte binding on carbon nanotube-polymer scaffolds.

    PubMed

    Lin, Shangchao; Zhang, Jingqing; Strano, Michael S; Blankschtein, Daniel

    2014-08-28

    Macromolecular scaffolds made of polymer-wrapped single-walled carbon nanotubes (SWCNTs) have been explored recently (Zhang et al., Nature Nanotechnology, 2013) as a new class of molecular-recognition motifs. However, selective analyte recognition is still challenging and lacks the underlying fundamental understanding needed for its practical implementation in biological sensors. In this report, we combine coarse-grained molecular dynamics (CGMD) simulations, physical adsorption/binding theories, and photoluminescence (PL) experiments to provide molecular insight into the selectivity of such sensors towards a large set of biologically important analytes. We find that the physical binding affinities of the analytes on a bare SWCNT partially correlate with their distribution coefficients in a bulk water/octanol system, suggesting that the analyte hydrophobicity plays a key role in determining the binding affinities of the analytes considered, along with the various specific interactions between the analytes and the polymer anchor groups. Two distinct categories of analytes are identified to demonstrate a complex picture for the correlation between optical sensor signals and the simulated binding affinities. Specifically, a good correlation was found between the sensor signals and the physical binding affinities of the three hormones (estradiol, melatonin, and thyroxine), the neurotransmitter (dopamine), and the vitamin (riboflavin) to the SWCNT-polymer scaffold. The four amino acids (aspartate, glycine, histidine, and tryptophan) and the two monosaccharides (fructose and glucose) considered were identified as blank analytes which are unable to induce sensor signals. The results indicate great success of our physical adsorption-based model in explaining the ranking in sensor selectivities. The combined framework presented here can be used to screen and select polymers that can potentially be used for creating synthetic molecular recognition motifs.
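
    The reported partial correlation between simulated binding affinities and water/octanol distribution coefficients is the kind of relationship a rank correlation quantifies. A minimal Spearman correlation sketch on synthetic data (the numbers are invented, not the paper's):

```python
def spearman_rho(x, y):
    """Spearman rank correlation (assumes no tied values), the kind of
    check used to compare simulated binding affinities with distribution
    coefficients.  Input data below are synthetic, for illustration."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n ** 2 - 1))

affinity = [2.1, 3.5, 1.2, 4.8, 2.9]   # synthetic binding affinities
log_d    = [0.4, 1.1, -0.2, 0.8, 2.0]  # synthetic log distribution coefficients
rho = spearman_rho(affinity, log_d)
```

A rho near 1 would indicate that hydrophobicity alone predicts the ranking; the partial correlation reported in the abstract reflects the additional role of specific analyte-polymer interactions.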

  10. Conceptualizing community resilience to natural hazards - the emBRACE framework

    NASA Astrophysics Data System (ADS)

    Kruse, Sylvia; Abeling, Thomas; Deeming, Hugh; Fordham, Maureen; Forrester, John; Jülich, Sebastian; Nuray Karanci, A.; Kuhlicke, Christian; Pelling, Mark; Pedoth, Lydia; Schneiderbauer, Stefan

    2017-12-01

    The level of community is considered to be vital for building disaster resilience. Yet community resilience as a scientific concept often remains vaguely defined and lacks the guiding characteristics necessary for analysing and enhancing resilience on the ground. The emBRACE framework of community resilience presented in this paper provides a heuristic analytical tool for understanding, explaining and measuring community resilience to natural hazards. It was developed in an iterative process building on existing scholarly debates, on empirical case study work in five countries and on participatory consultation with community stakeholders, where the framework was applied and ground-tested in different contexts and for different hazard types. The framework conceptualizes resilience across three core domains: (i) resources and capacities, (ii) actions and (iii) learning. These three domains are conceptualized as intrinsically conjoined within a whole. Community resilience is influenced by these integral elements as well as by extra-community forces comprising disaster risk governance (laws, policies and responsibilities) on the one hand and, on the other, the general societal context, natural and human-made disturbances, and system change over time. The framework is a graphically rendered heuristic which, through application, can assist in guiding the assessment of community resilience in a systematic way and in identifying key drivers of and barriers to resilience that affect any particular hazard-exposed community.

  11. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teeguarden, Justin G.; Tan, Yu-Mei; Edwards, Stephen W.

    Driven by major scientific advances in analytical methods, biomonitoring, and computational exposure assessment, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the computationally enabled “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. The AEP framework offers an intuitive approach to successful organization of exposure science data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source to outcome continuum and setting the stage for more efficient integration of exposure science and toxicity testing information. Together these frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based or exposure-based decisions.

  12. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework

    PubMed Central

    Teeguarden, Justin. G.; Tan, Yu-Mei; Edwards, Stephen W.; Leonard, Jeremy A.; Anderson, Kim A.; Corley, Richard A.; Harding, Anna K; Kile, Molly L.; Simonich, Staci M; Stone, David; Tanguay, Robert L.; Waters, Katrina M.; Harper, Stacey L.; Williams, David E.

    2016-01-01

    Driven by major scientific advances in analytical methods, biomonitoring, computational tools, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the Aggregate Exposure Pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the Adverse Outcome Pathway (AOP) concept in the toxicological sciences. Aggregate exposure pathways offer an intuitive framework to organize exposure data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source to outcome continuum for more efficient integration of exposure assessment and hazard identification. Together, the two pathways form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decision making. PMID:26759916

  13. A Working Framework for Enabling International Science Data System Interoperability

    NASA Astrophysics Data System (ADS)

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.

  14. Path Integrals for Electronic Densities, Reactivity Indices, and Localization Functions in Quantum Systems

    PubMed Central

    Putz, Mihai V.

    2009-01-01

    The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density to be extended to many-electronic systems through the density functional closure relationship. Moreover, the use of the path integral formalism for electronic density prescription presents several advantages: it assures an inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for the time-space evolution of quantum information; resembles the Schrödinger equation; and allows a quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism are presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix and/or the canonical density are rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for Bohr’s quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along with the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as the Markovian generalizations of Becke-Edgecombe electronic localization functions – all advocate for the reliability of the PI formalism of quantum mechanics as a versatile tool, suited for analytical and/or computational modeling of a variety of fundamental physical and chemical reactivity concepts characterizing (density-driven) many-electronic systems. PMID:20087467

  15. Path integrals for electronic densities, reactivity indices, and localization functions in quantum systems.

    PubMed

    Putz, Mihai V

    2009-11-10

    The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density to be extended to many-electronic systems through the density functional closure relationship. Moreover, the use of the path integral formalism for electronic density prescription presents several advantages: it assures an inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for the time-space evolution of quantum information; resembles the Schrödinger equation; and allows a quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism are presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix and/or the canonical density are rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for Bohr's quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along with the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as the Markovian generalizations of Becke-Edgecombe electronic localization functions - all advocate for the reliability of the PI formalism of quantum mechanics as a versatile tool, suited for analytical and/or computational modeling of a variety of fundamental physical and chemical reactivity concepts characterizing (density-driven) many-electronic systems.
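
    The imaginary-time path-integral representation of the canonical density that both records above build on can be written, in standard textbook form (not the paper's specific specializations), as:

```latex
\rho(x_b, x_a; \beta) \;=\; \int_{x(0)=x_a}^{x(\hbar\beta)=x_b} \mathcal{D}x(\tau)\,
  \exp\!\left\{ -\frac{1}{\hbar} \int_0^{\hbar\beta}
  \left[ \frac{m}{2}\,\dot{x}^2(\tau) + V\!\big(x(\tau)\big) \right] d\tau \right\},
\qquad
Z(\beta) = \int dx\, \rho(x, x; \beta)
```

The normalized diagonal element, $n(x) = \rho(x, x; \beta)/Z(\beta)$, is the closure relationship that ties the path integral to the electronic density.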

  16. Sustainability performance evaluation: Literature review and future directions.

    PubMed

    Büyüközkan, Gülçin; Karabulut, Yağmur

    2018-07-01

    Current global economic activities are increasingly being perceived as unsustainable. Despite the high number of publications, sustainability science remains highly dispersed over diverse approaches and topics. This article aims to provide a structured overview of publications related to sustainability performance evaluation and to document the current state of the literature, categorize publications, analyze and link trends, as well as highlight gaps and provide research recommendations. A total of 128 articles published between 2007 and 2018 are identified. The results suggest that sustainability performance evaluation models should be more balanced, that suitable criteria and their interrelations should be well defined, and that the subjectivity of qualitative criteria inherent to sustainability indicators should be considered. To address this subjectivity, group decision-making techniques and other analytical methods that can deal with uncertainty, conflicting indicators, and linguistic evaluations can be used in future work. By presenting research gaps, this review stimulates researchers to establish practically applicable sustainability performance evaluation frameworks to help assess and compare the degree of sustainability, leading to more sustainable business practices. The review is unique in defining corporate sustainability performance evaluation for the first time, exploring the gap between sustainability accounting and sustainability assessment, and providing a structured overview of innovative research recommendations for integrating analytical assessment methods into conceptual sustainability frameworks. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Dissociable meta-analytic brain networks contribute to coordinated emotional processing.

    PubMed

    Riedel, Michael C; Yanes, Julio A; Ray, Kimberly L; Eickhoff, Simon B; Fox, Peter T; Sutherland, Matthew T; Laird, Angela R

    2018-06-01

    Meta-analytic techniques for mining the neuroimaging literature continue to exert an impact on our conceptualization of functional brain networks contributing to human emotion and cognition. Traditional theories regarding the neurobiological substrates contributing to affective processing are shifting from regional- towards more network-based heuristic frameworks. To elucidate differential brain network involvement linked to distinct aspects of emotion processing, we applied an emergent meta-analytic clustering approach to the extensive body of affective neuroimaging results archived in the BrainMap database. Specifically, we performed hierarchical clustering on the modeled activation maps from 1,747 experiments in the affective processing domain, resulting in five meta-analytic groupings of experiments demonstrating whole-brain recruitment. Behavioral inference analyses conducted for each of these groupings suggested dissociable networks supporting: (1) visual perception within primary and associative visual cortices, (2) auditory perception within primary auditory cortices, (3) attention to emotionally salient information within insular, anterior cingulate, and subcortical regions, (4) appraisal and prediction of emotional events within medial prefrontal and posterior cingulate cortices, and (5) induction of emotional responses within amygdala and fusiform gyri. These meta-analytic outcomes are consistent with a contemporary psychological model of affective processing in which emotionally salient information from perceived stimuli is integrated with previous experiences to engender a subjective affective response. This study highlights the utility of using emergent meta-analytic methods to inform and extend psychological theories and suggests that emotions are manifest as the eventual consequence of interactions between large-scale brain networks. © 2018 Wiley Periodicals, Inc.
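
    The hierarchical clustering step described above can be illustrated with a minimal single-linkage agglomerative procedure; real meta-analytic pipelines operate on modeled activation maps with specialized tools, so the toy feature vectors below are purely illustrative:

```python
import math

def agglomerative(points, n_clusters):
    """Minimal single-linkage agglomerative clustering (stdlib only).

    Illustrates the kind of hierarchical grouping applied to modeled
    activation maps; the data here are toy 2-D vectors, not brain maps."""
    clusters = [[i] for i in range(len(points))]

    def dist(a, b):  # single linkage: closest pair across two clusters
        return min(math.dist(points[i], points[j]) for i in a for j in b)

    while len(clusters) > n_clusters:
        # merge the closest pair of clusters
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

# two obvious groups of toy "experiment" feature vectors
pts = [(0.0, 0.1), (0.1, 0.0), (5.0, 5.1), (5.1, 5.0)]
groups = agglomerative(pts, n_clusters=2)
```

Cutting the merge tree at five clusters instead of two mirrors the paper's choice of five meta-analytic groupings.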

  18. Expanding Students' Analytical Frameworks through the Study of Graphic Novels

    ERIC Educational Resources Information Center

    Connors, Sean P.

    2015-01-01

    When teachers work with students to construct a metalanguage that they can draw on to describe and analyze graphic novels, and then invite students to apply that metalanguage in the service of composing multimodal texts of their own, teachers broaden students' analytical frameworks. In the process of doing so, teachers empower students. In this…

  19. PISA 2015 Assessment and Analytical Framework: Science, Reading, Mathematic, Financial Literacy and Collaborative Problem Solving

    ERIC Educational Resources Information Center

    OECD Publishing, 2017

    2017-01-01

    What is important for citizens to know and be able to do? The OECD Programme for International Student Assessment (PISA) seeks to answer that question through the most comprehensive and rigorous international assessment of student knowledge and skills. The PISA 2015 Assessment and Analytical Framework presents the conceptual foundations of the…

  20. Xpey' Relational Environments: an analytic framework for conceptualizing Indigenous health equity.

    PubMed

    Kent, Alexandra; Loppie, Charlotte; Carriere, Jeannine; MacDonald, Marjorie; Pauly, Bernie

    2017-12-01

    Both health equity research and Indigenous health research are driven by the goal of promoting equitable health outcomes among marginalized and underserved populations. However, the two fields often operate independently, without collaboration. As a result, Indigenous populations are underrepresented in health equity research relative to the disproportionate burden of health inequities they experience. In this methodological article, we present Xpey' Relational Environments, an analytic framework that maps some of the barriers and facilitators to health equity for Indigenous peoples. Health equity research needs to include a focus on Indigenous populations and Indigenized methodologies, a shift that could fill gaps in knowledge with the potential to contribute to 'closing the gap' in Indigenous health. With this in mind, the Equity Lens in Public Health (ELPH) research program adopted the Xpey' Relational Environments framework to add a focus on Indigenous populations to our research on the prioritization and implementation of health equity. The analytic framework introduced an Indigenized health equity lens to our methodology, which facilitated the identification of social, structural and systemic determinants of Indigenous health. To test the framework, we conducted a pilot case study of one of British Columbia's regional health authorities, which included a review of core policies and plans as well as interviews and focus groups with frontline staff, managers and senior executives. ELPH's application of Xpey' Relational Environments serves as an example of the analytic framework's utility for exploring and conceptualizing Indigenous health equity in BC's public health system. Future applications of the framework should be embedded in Indigenous research methodologies.

  1. Building on the EGIPPS performance assessment: the multipolar framework as a heuristic to tackle the complexity of performance of public service oriented health care organisations.

    PubMed

    Marchal, Bruno; Hoerée, Tom; da Silveira, Valéria Campos; Van Belle, Sara; Prashanth, Nuggehalli S; Kegels, Guy

    2014-04-17

    Performance of health care systems is a key concern of policy makers and health service managers all over the world. It is also a major challenge, given its multidimensional nature that easily leads to conceptual and methodological confusion. This is reflected by a scarcity of models that comprehensively analyse health system performance. In health, one of the most comprehensive performance frameworks was developed by the team of Leggat and Sicotte. Their framework integrates 4 key organisational functions (goal attainment, production, adaptation to the environment, and values and culture) and the tensions between these functions. We modified this framework to better fit the assessment of the performance of health organisations in the public service domain and propose an analytical strategy that takes into account the social complexity of health organisations. The resulting multipolar performance framework (MPF) is a meta-framework that facilitates the analysis of the relations and interactions between the multiple actors that influence the performance of health organisations. Using the MPF in a dynamic reiterative mode helps managers to identify not only the bottlenecks that hamper performance, but also the unintended effects and feedback loops that emerge. Similarly, it helps policymakers and programme managers at central level to better anticipate the potential results and side effects of, and required conditions for, health policies and programmes and to steer their implementation accordingly.

  2. Assessing coastal reclamation suitability based on a fuzzy-AHP comprehensive evaluation framework: A case study of Lianyungang, China.

    PubMed

    Feng, Lan; Zhu, Xiaodong; Sun, Xiang

    2014-12-15

Coastal reclamation suitability evaluation (CRSE) is a difficult, complex and protracted process requiring the evaluation of many different criteria. In this paper, an integrated framework combining a fuzzy comprehensive evaluation method with the analytic hierarchy process (AHP) was applied to evaluate coastal reclamation suitability for future sustainable development in the coastal area of Lianyungang, China. The evaluation classified 6.63%, 22.99%, 31.59% and 38.79% of the coastline as suitable, weakly suitable, unsuitable and forbidden, respectively. The results were verified against marine pollution data and were highly consistent with the observed water quality status. The fuzzy-AHP comprehensive evaluation method (FACEM) was found to be well suited to CRSE. The approach can also be applied to other coastal areas in China and thereby support better management of coastal reclamation and coastline protection projects. Copyright © 2014 Elsevier Ltd. All rights reserved.
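    The fuzzy comprehensive evaluation step described above can be sketched in a few lines: criterion weights (e.g. produced by an AHP step) are combined with a membership matrix over the four suitability classes used in the paper. The criteria, weights, and membership values below are hypothetical illustrations, not the study's data.

```python
# Minimal sketch of a fuzzy comprehensive evaluation step (hypothetical numbers).
# Each row of R gives one criterion's membership degrees in the four suitability
# classes from the paper: suitable, weakly suitable, unsuitable, forbidden.

CLASSES = ["suitable", "weakly suitable", "unsuitable", "forbidden"]

def fuzzy_evaluate(weights, R):
    """Weighted-average fuzzy operator: b_j = sum_i w_i * r_ij, normalized."""
    scores = [sum(w * row[j] for w, row in zip(weights, R))
              for j in range(len(R[0]))]
    total = sum(scores)
    return [s / total for s in scores]

# Hypothetical criteria: wind-wave energy, water quality, habitat value
weights = [0.5, 0.3, 0.2]            # e.g. derived from an AHP comparison step
R = [[0.6, 0.3, 0.1, 0.0],
     [0.2, 0.5, 0.2, 0.1],
     [0.1, 0.2, 0.4, 0.3]]

b = fuzzy_evaluate(weights, R)       # composite membership in each class
verdict = CLASSES[b.index(max(b))]   # class with the highest membership wins
```

    With these toy numbers the composite membership vector is (0.38, 0.34, 0.19, 0.09), so the segment is classed "suitable"; the maximum-membership rule is one common defuzzification choice.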

  3. Contrasting analytical and data-driven frameworks for radiogenomic modeling of normal tissue toxicities in prostate cancer.

    PubMed

    Coates, James; Jeyaseelan, Asha K; Ybarra, Norma; David, Marc; Faria, Sergio; Souhami, Luis; Cury, Fabio; Duclos, Marie; El Naqa, Issam

    2015-04-01

We explore analytical and data-driven approaches to investigate the integration of genetic variations (single nucleotide polymorphisms [SNPs] and copy number variations [CNVs]) with dosimetric and clinical variables in modeling radiation-induced rectal bleeding (RB) and erectile dysfunction (ED) in prostate cancer patients. Sixty-two patients who underwent curative hypofractionated radiotherapy (66 Gy in 22 fractions) between 2002 and 2010 were retrospectively genotyped for CNV and SNP rs5489 in the xrcc1 DNA repair gene. Fifty-four patients had full dosimetric profiles. Two parallel modeling approaches were compared to assess the risk of severe RB (Grade ⩾3) and ED (Grade ⩾1): a maximum-likelihood-estimated generalized Lyman-Kutcher-Burman (LKB) model and logistic regression. Statistical resampling based on cross-validation was used to evaluate model predictive power and generalizability to unseen data. Integration of the biological variables xrcc1 CNV and SNP improved the fit of the RB and ED analytical and data-driven models. Cross-validation of the generalized LKB models yielded increases in classification performance of 27.4% for RB and 14.6% for ED when xrcc1 CNV and SNP were included, respectively. Adding biological variables to logistic regression modeling improved classification performance over standard dosimetric models by 33.5% for the RB and 21.2% for the ED models. As a proof of concept, we demonstrated that the combination of genetic and dosimetric variables can provide significant improvement in NTCP prediction using analytical and data-driven approaches. The improvement in prediction performance was more pronounced in the data-driven approaches. Moreover, we have shown that CNVs, in addition to SNPs, may be useful structural genetic variants in predicting radiation toxicities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
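    The generalized LKB model referenced above maps a dose-volume histogram to a complication probability through the generalized equivalent uniform dose (gEUD) and a probit link. A minimal sketch follows; the DVH and the parameters (TD50, m, n) are hypothetical placeholders, not the study's fitted values.

```python
import math

def geud(dvh, n):
    """Generalized equivalent uniform dose from a (dose_Gy, frac_volume) DVH:
    gEUD = (sum_i v_i * D_i^(1/n))^n."""
    return sum(v * d ** (1.0 / n) for d, v in dvh) ** n

def lkb_ntcp(dvh, td50, m, n):
    """LKB NTCP: standard-normal CDF of t = (gEUD - TD50) / (m * TD50)."""
    t = (geud(dvh, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Hypothetical rectal DVH: 10% of the volume at 66 Gy, 30% at 50 Gy, 60% at 30 Gy
dvh = [(66.0, 0.10), (50.0, 0.30), (30.0, 0.60)]
p = lkb_ntcp(dvh, td50=80.0, m=0.15, n=0.09)   # small n: serial-organ behaviour
```

    A small volume parameter n makes gEUD track the maximum dose, the usual assumption for a serial organ such as the rectum; by construction NTCP reaches 0.5 when gEUD equals TD50.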

  4. Weathering the Storm: Developing a Spatial Data Infrastructure and Online Research Platform for Oil Spill Preparedness

    NASA Astrophysics Data System (ADS)

    Bauer, J. R.; Rose, K.; Romeo, L.; Barkhurst, A.; Nelson, J.; Duran-Sesin, R.; Vielma, J.

    2016-12-01

Efforts to prepare for and reduce the risk of hazards, from both natural and anthropogenic sources, that threaten our oceans and coasts require an understanding of the dynamics and interactions between the physical, ecological, and socio-economic systems. Understanding these coupled dynamics is essential as offshore oil & gas exploration and production continue to push into harsher, more extreme environments where risks and uncertainty increase. However, working with large, complex data from various sources and scales to assess the risks and potential impacts associated with offshore energy exploration and production poses several challenges to researchers. To address these challenges, an integrated assessment model (IAM) was developed at the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) that combines a spatial data infrastructure and an online research platform to manage, process, analyze, and share large, multidimensional datasets, research products, and the tools and models used to evaluate risk and reduce uncertainty for the entire offshore system, from the subsurface, through the water column, to coastal ecosystems and communities. Here, we will discuss the spatial data infrastructure and online research platform, NETL's Energy Data eXchange (EDX), that underpin the offshore IAM, providing information on how the framework combines multidimensional spatial data and spatio-temporal tools to evaluate risks across the complex matrix of potential environmental, social, and economic impacts stemming from modeled offshore hazard scenarios, such as oil spills or hurricanes. In addition, we will discuss the online analytics, tools, and visualization methods integrated into this framework that support availability of and access to data, and allow for the rapid analysis and effective communication of analytical results to aid a range of decision-making needs.

  5. A hybrid design methodology for structuring an Integrated Environmental Management System (IEMS) for shipping business.

    PubMed

    Celik, Metin

    2009-03-01

    The International Safety Management (ISM) Code defines a broad framework for the safe management and operation of merchant ships, maintaining high standards of safety and environmental protection. On the other hand, ISO 14001:2004 provides a generic, worldwide environmental management standard that has been utilized by several industries. Both the ISM Code and ISO 14001:2004 have the practical goal of establishing a sustainable Integrated Environmental Management System (IEMS) for shipping businesses. This paper presents a hybrid design methodology that shows how requirements from both standards can be combined into a single execution scheme. Specifically, the Analytic Hierarchy Process (AHP) and Fuzzy Axiomatic Design (FAD) are used to structure an IEMS for ship management companies. This research provides decision aid to maritime executives in order to enhance the environmental performance in the shipping industry.

  6. Integrating succession and community assembly perspectives

    PubMed Central

    Chang, Cynthia; HilleRisLambers, Janneke

    2016-01-01

    Succession and community assembly research overlap in many respects, such as through their focus on how ecological processes like dispersal, environmental filters, and biotic interactions influence community structure. Indeed, many recent advances have been made by successional studies that draw on modern analytical techniques introduced by contemporary community assembly studies. However, community assembly studies generally lack a temporal perspective, both on how the forces structuring communities might change over time and on how historical contingency (e.g. priority effects and legacy effects) and complex transitions (e.g. threshold effects) might alter community trajectories. We believe a full understanding of the complex interacting processes that shape community dynamics across large temporal scales can best be achieved by combining concepts, tools, and study systems into an integrated conceptual framework that draws upon both succession and community assembly theory. PMID:27785355

  7. [Governance of primary health-care-based health-care organization].

    PubMed

    Báscolo, Ernesto

    2010-01-01

An analytical framework was developed for explaining the conditions for the effectiveness of different strategies promoting integrated primary health-care (PHC) service-based systems in Latin America. Different modes of governance (clan, incentives and hierarchy) were characterised from a political economics viewpoint, representing alternative forms of regulation that promote innovation in health-service-providing organisations. The conditions necessary to guarantee each mode of governance's effectiveness are presented, as are their implications in terms of what is at stake. The institutional construction of an integrated health system is interpreted as the product of a social process in which different modes of governance are combined, each resolving a different normative aspect: the regulation of service provision (the hierarchical mode), the distribution of resources (the incentives mode) and the social values legitimising the process (the clan mode).

  8. Environmental management framework for wind farm siting: methodology and case study.

    PubMed

    Tegou, Leda-Ioanna; Polatidis, Heracles; Haralambopoulos, Dias A

    2010-11-01

    This paper develops an integrated framework to evaluate land suitability for wind farm siting that combines multi-criteria analysis (MCA) with geographical information systems (GIS); an application of the proposed framework for the island of Lesvos, Greece, is further illustrated. A set of environmental, economic, social, and technical constraints, based on recent Greek legislation, identifies the potential sites for wind power installation. Furthermore, the area under consideration is evaluated by a variety of criteria, such as wind power potential, land cover type, electricity demand, visual impact, land value, and distance from the electricity grid. The pair-wise comparison method in the context of the analytic hierarchy process (AHP) is applied to estimate the criteria weights in order to establish their relative importance in site evaluation. The overall suitability of the study region for wind farm siting is appraised through the weighted summation rule. Results showed that only a very small percentage of the total area of Lesvos could be suitable for wind farm installation, although favourable wind potential exists in many more areas of the island. Copyright 2010 Elsevier Ltd. All rights reserved.
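    The AHP step described above can be sketched as a principal-eigenvector computation on the pairwise comparison matrix (here via power iteration), followed by the weighted summation rule the authors use for overall suitability. The comparison judgments and criterion scores below are hypothetical, not the Lesvos study's values.

```python
def ahp_weights(M, iters=100):
    """Principal-eigenvector weights of a pairwise comparison matrix,
    computed by power iteration and normalized to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical 3-criterion judgment: wind potential is rated 3x as important
# as land cover type and 5x as important as visual impact (reciprocal matrix).
M = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 1/2.0, 1.0]]
w = ahp_weights(M)

# Weighted summation rule over normalized criterion scores for one cell
scores = [0.8, 0.5, 0.3]
suitability = sum(wi * si for wi, si in zip(w, scores))
```

    In a full application one would also check the consistency ratio of M before trusting the weights; that step is omitted here for brevity.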

  9. Towards an Analytical Framework for Understanding the Development of a Quality Assurance System in an International Joint Programme

    ERIC Educational Resources Information Center

    Zheng, Gaoming; Cai, Yuzhuo; Ma, Shaozhuang

    2017-01-01

    This paper intends to construct an analytical framework for understanding quality assurance in international joint programmes and to test it in a case analysis of a European--Chinese joint doctoral degree programme. The development of a quality assurance system for an international joint programme is understood as an institutionalization process…

  10. The Framework of Intervention Engine Based on Learning Analytics

    ERIC Educational Resources Information Center

    Sahin, Muhittin; Yurdugül, Halil

    2017-01-01

    Learning analytics primarily deals with the optimization of learning environments and the ultimate goal of learning analytics is to improve learning and teaching efficiency. Studies on learning analytics seem to have been made in the form of adaptation engine and intervention engine. Adaptation engine studies are quite widespread, but intervention…

  11. P3: a practice focused learning environment

    NASA Astrophysics Data System (ADS)

    Irving, Paul W.; Obsniuk, Michael J.; Caballero, Marcos D.

    2017-09-01

There has been an increased focus on the integration of practices into physics curricula, with a particular emphasis on integrating computation into the undergraduate curriculum of scientists and engineers. In this paper, we present a university-level, introductory physics course for science and engineering majors at Michigan State University called P3 (projects and practices in physics) that is centred around providing introductory physics students with the opportunity to appropriate various science and engineering practices. The P3 design integrates computation with analytical problem solving and is built upon a curriculum foundation of problem-based learning, the principles of constructive alignment, and the theoretical framework of community of practice. The design includes an innovative approach to computational physics instruction, instructional scaffolds, and a unique approach to assessment that enables instructors to guide students in the development of the practices of a physicist. We present the very positive student-related outcomes of the design, gathered via attitudinal and conceptual inventories and research interviews in which students reflected on their experiences in the P3 classroom.

  12. Integrated modeling applications for tokamak experiments with OMFIT

    NASA Astrophysics Data System (ADS)

    Meneghini, O.; Smith, S. P.; Lao, L. L.; Izacard, O.; Ren, Q.; Park, J. M.; Candy, J.; Wang, Z.; Luna, C. J.; Izzo, V. A.; Grierson, B. A.; Snyder, P. B.; Holland, C.; Penna, J.; Lu, G.; Raum, P.; McCubbin, A.; Orlov, D. M.; Belli, E. A.; Ferraro, N. M.; Prater, R.; Osborne, T. H.; Turnbull, A. D.; Staebler, G. M.

    2015-08-01

One Modeling Framework for Integrated Tasks (OMFIT) is a comprehensive integrated modeling framework developed to enable physics codes to interact in complicated workflows and to support scientists at all stages of the modeling cycle. OMFIT development follows a unique bottom-up approach, in which the framework design and capabilities organically evolve to support progressive integration of the components required to accomplish physics goals of increasing complexity. OMFIT provides a workflow for easily generating full kinetic equilibrium reconstructions that are constrained by magnetic and motional Stark effect measurements, and by kinetic profile information that includes fast-ion pressure modeled by a transport code. It was found that magnetic measurements can be used to quantify the amount of anomalous fast-ion diffusion present in DIII-D discharges, and provide an estimate consistent with what would be needed for transport simulations to match the measured neutron rates. OMFIT was used to streamline edge-stability analyses and to evaluate the effect of resonant magnetic perturbations (RMPs) on pedestal stability, which was found to be consistent with the experimental observations. The framework also supported the development of a five-dimensional numerical fluid model for estimating the effects of the interaction between magnetohydrodynamics (MHD) and microturbulence, and its systematic verification against analytic models. OMFIT was used for optimizing an innovative high-harmonic fast wave system proposed for DIII-D. For a parallel refractive index n∥ > 3, the conditions for strong electron Landau damping were found to be independent of the launched n∥ and poloidal angle. OMFIT has been the platform of choice for developing a neural-network based approach to efficiently perform a non-linear multivariate regression of local transport fluxes as a function of local dimensionless parameters. Transport predictions for thousands of DIII-D discharges showed excellent agreement with power balance calculations across the whole plasma radius and over a broad range of operating regimes. For predictive transport simulations, the framework made possible the design and automation of a workflow that enables self-consistent predictions of kinetic profiles and the plasma equilibrium. It is found that the feedback between the transport fluxes and the plasma equilibrium can significantly affect the kinetic profile predictions. Such a rich set of results provides tangible evidence of how bottom-up approaches can provide a fast track to integrated modeling solutions that are functional, cost-effective, and in sync with the research effort of the community.

  13. On the performance of voltage stepping for the simulation of adaptive, nonlinear integrate-and-fire neuronal networks.

    PubMed

    Kaabi, Mohamed Ghaith; Tonnelier, Arnaud; Martinez, Dominique

    2011-05-01

    In traditional event-driven strategies, spike timings are analytically given or calculated with arbitrary precision (up to machine precision). Exact computation is possible only for simplified neuron models, mainly the leaky integrate-and-fire model. In a recent paper, Zheng, Tonnelier, and Martinez (2009) introduced an approximate event-driven strategy, named voltage stepping, that allows the generic simulation of nonlinear spiking neurons. Promising results were achieved in the simulation of single quadratic integrate-and-fire neurons. Here, we assess the performance of voltage stepping in network simulations by considering more complex neurons (quadratic integrate-and-fire neurons with adaptation) coupled with multiple synapses. To handle the discrete nature of synaptic interactions, we recast voltage stepping in a general framework, the discrete event system specification. The efficiency of the method is assessed through simulations and comparisons with a modified time-stepping scheme of the Runge-Kutta type. We demonstrated numerically that the original order of voltage stepping is preserved when simulating connected spiking neurons, independent of the network activity and connectivity.
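    The voltage-stepping strategy assessed above, discretizing the membrane potential by a fixed ΔV and accumulating the corresponding time increments dt ≈ ΔV/f(V), can be illustrated on a single quadratic integrate-and-fire neuron, whose spike time is also known in closed form for comparison. This is a minimal single-neuron sketch of the idea, not the authors' network implementation.

```python
import math

def qif_spike_time_voltage_stepping(v0, v_th, I, dv=1e-3):
    """Spike time of the QIF neuron dv/dt = v^2 + I (I > 0) by voltage
    stepping: advance v in fixed increments dv and accumulate the time
    dt = dv / f(v), evaluating f at the midpoint of each voltage step."""
    t, v = 0.0, v0
    while v < v_th:
        vm = v + 0.5 * dv
        t += dv / (vm * vm + I)
        v += dv
    return t

I = 1.0
t_num = qif_spike_time_voltage_stepping(v0=0.0, v_th=10.0, I=I)

# Closed-form check: t = (1/sqrt(I)) * [atan(v_th/sqrt(I)) - atan(v0/sqrt(I))]
t_exact = (math.atan(10.0 / math.sqrt(I)) - math.atan(0.0)) / math.sqrt(I)
```

    Unlike a fixed time step, the voltage step automatically concentrates temporal resolution where the dynamics are fast (large f(V)), which is the source of the method's efficiency for spiking neurons.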

  14. Policy-Making Theory as an Analytical Framework in Policy Analysis: Implications for Research Design and Professional Advocacy.

    PubMed

    Sheldon, Michael R

    2016-01-01

    Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation. © 2016 American Physical Therapy Association.

  15. Intelligent manipulation technique for multi-branch robotic systems

    NASA Technical Reports Server (NTRS)

    Chen, Alexander Y. K.; Chen, Eugene Y. S.

    1990-01-01

New analytical developments in kinematics planning are reported. The INtelligent KInematics Planner (INKIP) consists of the kinematics spline theory and the adaptive logic annealing process. A novel framework for a robot learning mechanism is also introduced. The FUzzy LOgic Self Organized Neural Networks (FULOSONN) architecture integrates fuzzy logic for commands, control, searching, and reasoning; an embedded expert system for nominal robotics knowledge implementation; and self-organized neural networks for the dynamic knowledge evolution process. Progress on the mechanical construction of the SRA Advanced Robotic System (SRAARS) and on the real-time robot vision system is also reported. A decision was made to incorporate Local Area Network (LAN) technology in the overall communication system.

  16. The Clifford Deformation of the Hermite Semigroup

    NASA Astrophysics Data System (ADS)

    De Bie, Hendrik; Örsted, Bent; Somberg, Petr; Souček, Vladimir

    2013-02-01

    This paper is a continuation of the paper [De Bie H., Örsted B., Somberg P., Souček V., Trans. Amer. Math. Soc. 364 (2012), 3875-3902], investigating a natural radial deformation of the Fourier transform in the setting of Clifford analysis. At the same time, it gives extensions of many results obtained in [Ben Saïd S., Kobayashi T., Örsted B., Compos. Math. 148 (2012), 1265-1336]. We establish the analogues of Bochner's formula and the Heisenberg uncertainty relation in the framework of the (holomorphic) Hermite semigroup, and also give a detailed analytic treatment of the series expansion of the associated integral transform.

  17. A closed expression for the UV-divergent parts of one-loop tensor integrals in dimensional regularization

    NASA Astrophysics Data System (ADS)

    Sulyok, G.

    2017-07-01

    Starting from the general definition of a one-loop tensor N-point function, we use its Feynman parametrization to calculate the ultraviolet (UV-)divergent part of an arbitrary tensor coefficient in the framework of dimensional regularization. In contrast to existing recursion schemes, we are able to present a general analytic result in closed form that enables direct determination of the UV-divergent part of any one-loop tensor N-point coefficient independent from UV-divergent parts of other one-loop tensor N-point coefficients. Simplified formulas and explicit expressions are presented for A-, B-, C-, D-, E-, and F-functions.

  18. Furthering the Understanding of Parent–Child Relationships: A Nursing Scholarship Review Series. Part 3: Interaction and the Parent–Child Relationship—Assessment and Intervention Studies

    PubMed Central

    Pridham, Karen A.; Lutz, Kristin F.; Anderson, Lori S.; Riesch, Susan K.; Becker, Patricia T.

    2010-01-01

    PURPOSE This integrative review concerns nursing research on parent–child interaction and relationships published from 1980 through 2008 and includes assessment and intervention studies in clinically important settings (e.g., feeding, teaching, play). CONCLUSIONS Directions for research include development of theoretical frameworks, valid observational systems, and multivariate and longitudinal data analytic strategies. PRACTICE IMPLICATIONS Observation of social–emotional as well as task-related interaction qualities in the context of assessing parent–child relationships could generate new questions for nursing research and for family-centered nursing practice. PMID:20074112

  19. Connecting mathematics learning through spatial reasoning

    NASA Astrophysics Data System (ADS)

    Mulligan, Joanne; Woolcott, Geoffrey; Mitchelmore, Michael; Davis, Brent

    2018-03-01

    Spatial reasoning, an emerging transdisciplinary area of interest to mathematics education research, is proving integral to all human learning. It is particularly critical to science, technology, engineering and mathematics (STEM) fields. This project will create an innovative knowledge framework based on spatial reasoning that identifies new pathways for mathematics learning, pedagogy and curriculum. Novel analytical tools will map the unknown complex systems linking spatial and mathematical concepts. It will involve the design, implementation and evaluation of a Spatial Reasoning Mathematics Program (SRMP) in Grades 3 to 5. Benefits will be seen through development of critical spatial skills for students, increased teacher capability and informed policy and curriculum across STEM education.

  20. Assessment of Children's Psychological Development and Data Analytic Framework in New York City Infant Day Care Study.

    ERIC Educational Resources Information Center

    Golden, Mark

    This report briefly describes the procedures for assessing children's psychological development and the data analytic framework used in the New York City Infant Day Care Study. This study is a 5-year, longitudinal investigation in which infants in group and family day care programs and infants reared at home are compared. Children in the study are…

  1. Value of Flexibility - Phase 1

    DTIC Science & Technology

    2010-09-25

    weaknesses of each approach. During this period, we also explored the development of an analytical framework based on sound mathematical constructs... mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory or guidance on best approaches to...research activities is in developing a coherent value based definition of flexibility that is based on an analytical framework that is mathematically

  2. A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions

    PubMed Central

    Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.

    2009-01-01

    Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
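    As a point of reference for the product-term procedure the authors critique: with a binary moderator, the coefficient of the predictor-by-moderator cross-product simply equals the difference between the within-group slopes. A small synthetic illustration (the data are invented for the example):

```python
def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Synthetic data: the predictor->outcome slope is 1.0 when the moderator m = 0
# and 3.0 when m = 1, i.e. m moderates the x-y relationship.
xs = [0, 1, 2, 3, 4]
y0 = [0.0, 1.0, 2.0, 3.0, 4.0]    # outcomes in the m = 0 group
y1 = [0.0, 3.0, 6.0, 9.0, 12.0]   # outcomes in the m = 1 group

b0, b1 = slope(xs, y0), slope(xs, y1)
interaction_effect = b1 - b0      # equals the x-by-m product-term coefficient
```

    The framework proposed in the paper broadens this view: moderation is any modification of the existing x-y relationship, of which a nonzero product coefficient is only one special case.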

  3. An analytical procedure to assist decision-making in a government research organization

    Treesearch

    H. Dean Claxton; Giuseppe Rensi

    1972-01-01

An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...

  4. TU-H-CAMPUS-IeP1-05: A Framework for the Analytic Calculation of Patient-Specific Dose Distribution Due to CBCT Scan for IGRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youn, H; Jeon, H; Nam, J

Purpose: To investigate the feasibility of an analytic framework for estimating patients' absorbed dose distributions from daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. For the primary dose calculation, using source parameters such as voltage, current, and bowtie filtration, we simulated the forward projection from the source to each voxel of an imaging object that included inhomogeneous inserts. We then calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer's law. Next, for the scattered dose calculation, all voxels of the phantom were regarded as secondary sources radiating scattered photons. Details of the forward projection were identical to those of the previous step, and the secondary source intensities were given by scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicates that our framework can be an effective tool for monitoring a patient's exposure from cone-beam CT scans during image-guided radiation treatment. We therefore expect that patient over-exposure during IGRT might be prevented by our framework.
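    The primary-dose step in the Methods, attenuating a ray through voxels via Beer's law, can be sketched as follows. The attenuation coefficients are hypothetical round numbers, not values deduced from HU.

```python
import math

def primary_fluence(mu_voxels, dx, I0=1.0):
    """Attenuate a primary ray through a row of voxels via Beer's law,
    I_out = I_in * exp(-mu * dx); returns the fluence entering each voxel
    and the fluence transmitted past the last voxel."""
    entering, I = [], I0
    for mu in mu_voxels:
        entering.append(I)
        I *= math.exp(-mu * dx)
    return entering, I

# Hypothetical ray path (1 mm voxels): 20 mm soft tissue (mu ~ 0.02/mm),
# a 10 mm bone-like insert (0.05/mm), then 20 mm soft tissue again.
mus = [0.02] * 20 + [0.05] * 10 + [0.02] * 20
entering, transmitted = primary_fluence(mus, dx=1.0)

# The fraction absorbed in voxel i is entering[i] * (1 - exp(-mu_i * dx)),
# which is the per-voxel absorption probability the abstract refers to.
```

    For the toy path above the transmitted fraction is exp(-1.3) ≈ 0.27 of the incident fluence; a full implementation would trace such rays in 3D for every source-voxel pair.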

  5. Estimating Aquifer Properties Using Sinusoidal Pumping Tests

    NASA Astrophysics Data System (ADS)

    Rasmussen, T. C.; Haborak, K. G.; Young, M. H.

    2001-12-01

    We develop the theoretical and applied framework for using sinusoidal pumping tests to estimate aquifer properties for confined, leaky, and partially penetrating conditions. The framework 1) derives analytical solutions for three boundary conditions suitable for many practical applications, 2) validates the analytical solutions against a finite element model, 3) establishes a protocol for conducting sinusoidal pumping tests, and 4) estimates aquifer hydraulic parameters based on the analytical solutions. The analytical solutions to sinusoidal stimuli in radial coordinates are derived for boundary value problems that are analogous to the Theis (1935) confined aquifer solution, the Hantush and Jacob (1955) leaky aquifer solution, and the Hantush (1964) partially penetrated confined aquifer solution. The analytical solutions compare favorably to a finite-element solution of a simulated flow domain, except in the region immediately adjacent to the pumping well where the implicit assumption of zero borehole radius is violated. The procedure is demonstrated in one unconfined and two confined aquifer units near the General Separations Area at the Savannah River Site, a federal nuclear facility located in South Carolina. Aquifer hydraulic parameters estimated using this framework provide independent confirmation of parameters obtained from conventional aquifer tests. The sinusoidal approach also resulted in the elimination of investigation-derived wastes.
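    A practical step in analyzing such tests is recovering the amplitude and phase lag of the observed head response at the known pumping frequency; the analytical solutions then relate those two quantities to the aquifer parameters. A least-squares sinusoid fit over whole periods is a common way to do this. The sketch below uses synthetic data, not the Savannah River Site records.

```python
import math

def fit_sinusoid(times, heads, omega):
    """Least-squares fit of h(t) = A*sin(wt) + B*cos(wt) + C at a known
    frequency, sampled uniformly over whole periods (so the basis functions
    are orthogonal); returns the response amplitude and phase lag."""
    n = len(times)
    C = sum(heads) / n
    A = 2.0 / n * sum((h - C) * math.sin(omega * t) for t, h in zip(times, heads))
    B = 2.0 / n * sum((h - C) * math.cos(omega * t) for t, h in zip(times, heads))
    return math.hypot(A, B), math.atan2(-B, A)   # amplitude, phase lag

# Synthetic drawdown record: 0.5 m amplitude, lagging the pumping signal
# by 0.3 rad, around a 10 m static head; 1-hour pumping period, 2 periods.
omega = 2.0 * math.pi / 3600.0
ts = [i * 3600.0 / 200 for i in range(400)]
hs = [10.0 + 0.5 * math.sin(omega * t - 0.3) for t in ts]

amp, lag = fit_sinusoid(ts, hs, omega)
```

    With field data one would fit each observation well separately; the decay of amplitude and growth of phase lag with distance from the pumping well are what constrain transmissivity and storativity.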

  6. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    NASA Astrophysics Data System (ADS)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

A big data geospatial analytics platform: Physical Analytics Information Repository and Services (PAIRS). Fernando Marianno, Levente Klein, Siyuan Lu, Conrad Albrecht, Marcus Freitag, Nigel Hinds, Hendrik Hamann, IBM TJ Watson Research Center, Yorktown Heights, NY 10598. A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and to run these models in real time. A geospatial data platform called Physical Analytics Information Repository and Services (PAIRS) was developed on top of an open-source hardware and software stack to manage terabytes of data. A new data interpolation and re-gridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles for consecutive layers. Each pixel on the PAIRS grid has an index that combines location and time stamp. This indexing allows quick access to data sets that are part of a global data layer, retrieving only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to ingest, curate, and analyze the data sets while remaining robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized by breaking the original analytics task into smaller areas/volumes, analyzing them independently, and then reassembling the results for the original geographical area. The differentiating aspect of PAIRS is its ability to accelerate model development across large geographical regions at spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. 
Multi-layer queries enable PAIRS to filter different data layers on specific conditions (e.g., analyzing the flooding risk of a property based on topography, the soil's ability to hold water, and forecasted precipitation) or to retrieve information about locations that share similar weather and vegetation patterns during extreme weather events such as heat waves.
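The combined location-and-timestamp pixel index described in the abstract can be sketched as a Morton (Z-order) spatial code concatenated with an epoch timestamp, so that one patient of contiguous key ranges covers one grid cell over time. This is an illustrative reconstruction only; `pixel_key`, its bit widths, and the space-major layout are assumptions, not PAIRS's actual key format.

```python
from datetime import datetime, timezone

def interleave_bits(x: int, y: int, bits: int = 16) -> int:
    """Interleave the bits of x and y into a Morton (Z-order) code,
    so nearby grid cells receive numerically nearby keys."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)
        code |= ((y >> i) & 1) << (2 * i + 1)
    return code

def pixel_key(lon: float, lat: float, ts: datetime, level: int = 16) -> int:
    """Hypothetical PAIRS-style pixel index: a spatial Morton code at a
    given grid level, concatenated with a UNIX timestamp. Each successive
    level doubles the grid resolution, mirroring the nested layers in the
    abstract."""
    n = 1 << level                          # cells per axis at this level
    x = int((lon + 180.0) / 360.0 * n)      # column index
    y = int((lat + 90.0) / 180.0 * n)       # row index
    x, y = min(x, n - 1), min(y, n - 1)     # clamp the upper edge
    spatial = interleave_bits(x, y, level)
    epoch = int(ts.replace(tzinfo=timezone.utc).timestamp())
    return (spatial << 32) | epoch          # space-major: scans stay local

key = pixel_key(-73.9, 41.2, datetime(2015, 12, 1))
```

Because the spatial code occupies the high bits, all observations for one cell across time form a contiguous key range, which is what makes "retrieve only the data of interest" cheap in a sorted key-value store.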

  7. The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework

    NASA Astrophysics Data System (ADS)

    Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.

    2016-12-01

The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community over many years to develop nested gridding standards to tackle these issues. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard, developed under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed, areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principal aspect of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. 
During the 6th Session of the UN-GGIM in August 2016 the role of DGGS in the context of the GSGF was formally acknowledged. This paper highlights the synergies and role of DGGS in the Global Statistical Geospatial Framework and shows examples of using DGGS to combine geospatial statistics with traditional geoscientific data.
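The hierarchical cell indexing described above can be illustrated with a toy quadtree in which every cell's id carries its parent's id as a prefix, so aggregation to a coarser resolution is just truncation. Real DGGS conforming to the OGC standard use equal-area cells and richer index schemes; this sketch only demonstrates the prefix/aggregation property.

```python
def cell_id(lon: float, lat: float, resolution: int) -> str:
    """Toy DGGS-style hierarchical index: a quadtree digit string in which
    every cell id is prefixed by its parent's id. Illustrative only; real
    DGGS cells are equal-area and not simple lon/lat quadrants."""
    west, east, south, north = -180.0, 180.0, -90.0, 90.0
    digits = []
    for _ in range(resolution):
        mid_lon, mid_lat = (west + east) / 2, (south + north) / 2
        q = 0
        if lon >= mid_lon:
            q |= 1
            west = mid_lon
        else:
            east = mid_lon
        if lat >= mid_lat:
            q |= 2
            south = mid_lat
        else:
            north = mid_lat
        digits.append(str(q))
    return "".join(digits)

def parent(cid: str) -> str:
    """Aggregating to the next coarser resolution is string truncation."""
    return cid[:-1]
```

The prefix property is what makes "decomposition and aggregation" cheap: all cells under a parent share that parent's id as a key prefix, so a single range query retrieves an entire subtree.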

  8. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  9. An integrated study for mapping the moisture distribution in an ancient damaged wall painting.

    PubMed

    Capitani, Donatella; Proietti, Noemi; Gobbino, Marco; Soroldoni, Luigi; Casellato, Umberto; Valentini, Massimo; Rosina, Elisabetta

    2009-12-01

An integrated study combining microclimate monitoring, IR thermography (IRT), gravimetric tests and portable unilateral nuclear magnetic resonance (NMR) was applied in the framework of planning an emergency intervention on a severely deteriorated wall painting in San Rocco church, Cornaredo (Milan, Italy). The IRT investigation, supported by gravimetric tests, showed that the worst damage, due to water infiltration, was localized on the wall painting of the northern wall. Unilateral NMR, a new non-destructive technique that measures the hydrogen signal of moisture and can be applied directly to the wall, allowed a detailed map of the moisture distribution in the plaster underlying the wall painting to be obtained. With proper calibration of the integral of the recorded signal against suitable specimens, each area of the map corresponded to an accurate amount of moisture. IRT, gravimetric tests and unilateral NMR applied to the northern wall painting showed the presence of two wet areas separated by a dry area. The moisture found in the lower area was ascribed to rising damp at the bottom of the wall, due to the slope of the garden soil towards the northern exterior. The moisture found in the upper area was ascribed to condensation phenomena associated with the presence of a considerable amount of soluble, hygroscopic salts. In the framework of this integrated study, the IRT investigation and gravimetric methods validated portable unilateral NMR as a new analytical tool for measuring, in situ and without any sampling, the distribution and amount of moisture in wall paintings.
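The calibration step described above, relating the integral of the recorded NMR signal to moisture content via reference specimens, amounts to fitting a calibration curve. A minimal linear least-squares sketch with entirely hypothetical specimen data (the study's actual calibration data and functional form are not given in the abstract):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b, used here to calibrate the
    integrated NMR signal (x) against gravimetric moisture content (y)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical calibration specimens: (signal integral, moisture % by weight)
signal = [10.0, 25.0, 40.0, 55.0]
moisture = [1.1, 2.6, 4.0, 5.6]

a, b = fit_line(signal, moisture)

def moisture_from_signal(s: float) -> float:
    """Convert a signal integral measured on the wall into moisture %."""
    return a * s + b
```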

  10. Knowledge exchange processes in organizations and policy arenas: a narrative systematic review of the literature.

    PubMed

    Contandriopoulos, Damien; Lemire, Marc; Denis, Jean-Louis; Tremblay, Emile

    2010-12-01

    This article presents the main results from a large-scale analytical systematic review on knowledge exchange interventions at the organizational and policymaking levels. The review integrated two broad traditions, one roughly focused on the use of social science research results and the other focused on policymaking and lobbying processes. Data collection was done using systematic snowball sampling. First, we used prospective snowballing to identify all documents citing any of a set of thirty-three seminal papers. This process identified 4,102 documents, 102 of which were retained for in-depth analysis. The bibliographies of these 102 documents were merged and used to identify retrospectively all articles cited five times or more and all books cited seven times or more. All together, 205 documents were analyzed. To develop an integrated model, the data were synthesized using an analytical approach. This article developed integrated conceptualizations of the forms of collective knowledge exchange systems, the nature of the knowledge exchanged, and the definition of collective-level use. This literature synthesis is organized around three dimensions of context: level of polarization (politics), cost-sharing equilibrium (economics), and institutionalized structures of communication (social structuring). The model developed here suggests that research is unlikely to provide context-independent evidence for the intrinsic efficacy of knowledge exchange strategies. To design a knowledge exchange intervention to maximize knowledge use, a detailed analysis of the context could use the kind of framework developed here. © 2010 Milbank Memorial Fund. Published by Wiley Periodicals Inc.
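The retrospective snowballing step above (merge the bibliographies of the retained documents, then keep articles cited five or more times and books cited seven or more) is a simple counting procedure. A sketch with an assumed `(title, kind)` representation of references; the thresholds are the review's, the data model is hypothetical:

```python
from collections import Counter

def retrospective_snowball(bibliographies, article_min=5, book_min=7):
    """Merge reference lists and keep works cited often enough, mirroring
    the review's thresholds. Each reference is a (title, kind) tuple with
    kind in {'article', 'book'}."""
    counts = Counter(ref for bib in bibliographies for ref in bib)
    kept = []
    for (title, kind), n in counts.items():
        threshold = article_min if kind == "article" else book_min
        if n >= threshold:
            kept.append(title)
    return sorted(kept)
```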

  11. Knowledge Exchange Processes in Organizations and Policy Arenas: A Narrative Systematic Review of the Literature

    PubMed Central

    Contandriopoulos, Damien; Lemire, Marc; Denis, Jean-Louis; Tremblay, Émile

    2010-01-01

    Context: This article presents the main results from a large-scale analytical systematic review on knowledge exchange interventions at the organizational and policymaking levels. The review integrated two broad traditions, one roughly focused on the use of social science research results and the other focused on policymaking and lobbying processes. Methods: Data collection was done using systematic snowball sampling. First, we used prospective snowballing to identify all documents citing any of a set of thirty-three seminal papers. This process identified 4,102 documents, 102 of which were retained for in-depth analysis. The bibliographies of these 102 documents were merged and used to identify retrospectively all articles cited five times or more and all books cited seven times or more. All together, 205 documents were analyzed. To develop an integrated model, the data were synthesized using an analytical approach. Findings: This article developed integrated conceptualizations of the forms of collective knowledge exchange systems, the nature of the knowledge exchanged, and the definition of collective-level use. This literature synthesis is organized around three dimensions of context: level of polarization (politics), cost-sharing equilibrium (economics), and institutionalized structures of communication (social structuring). Conclusions: The model developed here suggests that research is unlikely to provide context-independent evidence for the intrinsic efficacy of knowledge exchange strategies. To design a knowledge exchange intervention to maximize knowledge use, a detailed analysis of the context could use the kind of framework developed here. PMID:21166865

  12. Integration of Administrative, Clinical, and Environmental Data to Support the Management of Type 2 Diabetes Mellitus: From Satellites to Clinical Care.

    PubMed

    Dagliati, Arianna; Marinoni, Andrea; Cerra, Carlo; Decata, Pasquale; Chiovato, Luca; Gamba, Paolo; Bellazzi, Riccardo

    2015-12-01

A very interesting perspective on "big data" in diabetes management lies in integrating environmental information with data gathered for clinical and administrative purposes, to increase the capability of understanding spatial and temporal patterns of disease. Within the MOSAIC project, funded by the European Union with the goal of designing new diabetes analytics, we jointly analyzed a clinical-administrative dataset of nearly 1,000 type 2 diabetes patients together with environmental information derived from air quality maps acquired from remote sensing (satellite) data. Within this context we adopted a general analysis framework able to deal with a large variety of temporal, geo-localized data. Through time series analysis and satellite image processing, we studied whether glycemic control showed seasonal variations and whether these have a spatiotemporal correlation with air pollution maps. We observed a link between the seasonal trends of glycated hemoglobin and air pollution in some of the considered geographic areas. Such findings will need future investigation for further confirmation. This work shows that it is possible to successfully deal with big data by implementing new analytics, and how their exploration may provide new scenarios to better understand clinical phenomena. © 2015 Diabetes Technology Society.
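At its simplest, the reported seasonal link can be probed with a correlation between two monthly series for one area. The series below are hypothetical illustrations (both peaking in winter), not MOSAIC data, and a plain Pearson correlation stands in for the study's fuller spatiotemporal analysis:

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical monthly means for one area: HbA1c (%) and PM10 (ug/m3).
# Both series peak in winter, so the correlation comes out positive.
hba1c = [7.4, 7.3, 7.1, 7.0, 6.9, 6.8, 6.8, 6.9, 7.0, 7.1, 7.3, 7.4]
pm10 = [48, 45, 38, 30, 24, 20, 19, 22, 28, 35, 42, 47]

r = pearson(hba1c, pm10)
```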

  13. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.

    PubMed

    Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J

    2017-10-30

An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code is hosted on the GitHub platform, with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data are stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and to send data to a JavaScript-based web client.
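A minimal sketch of the JSON-over-XML idea: a calculation serialized as JSON round-trips losslessly and is directly consumable by a JavaScript client. The schema below is illustrative only, not the platform's actual format:

```python
import json

# A hypothetical (not the platform's actual) schema for storing a
# computational chemistry result as JSON: atoms, the calculation that
# produced them, and a scalar property.
calculation = {
    "molecule": {
        "atoms": {
            "elements": ["O", "H", "H"],
            "coords_angstrom": [
                [0.000, 0.000, 0.117],
                [0.000, 0.757, -0.467],
                [0.000, -0.757, -0.467],
            ],
        },
    },
    "calculation": {"code": "NWChem", "method": "DFT", "basis": "6-31G*"},
    "properties": {"total_energy_hartree": -76.4089},
}

text = json.dumps(calculation)   # what would be stored or sent over REST
restored = json.loads(text)      # what a JavaScript client would parse
```

Because JSON maps directly onto the native data structures of Python, JavaScript, and most other languages, no per-language schema bindings are needed, which is the cross-language processing benefit the abstract describes.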

  14. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE PAGES

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    2017-10-30

An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  15. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  16. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services

    PubMed Central

Chrimes, Dillon; Zamani, Hamid

    2017-01-01

Big data analytics (BDA) is important for reducing healthcare costs, but it faces many challenges of data aggregation, maintenance, integration, translation, analysis, and security/privacy. The study objective, to establish an interactive BDA platform with simulated patient data using open-source software technologies, was achieved by constructing a platform framework on the Hadoop Distributed File System (HDFS) using HBase (a key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata for nine billion patient records. At optimized iteration, HDFS ingestion of HFiles into HBase store files showed sustained availability over hundreds of iterations; however, completing the MapReduce load into HBase required a week for 10 TB and a month for three billion indexed patient records (30 TB), respectively. Inconsistencies found in MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance, with high usability for technical support but poor usability for clinical services. Modelling the hospital system's patient-centric data in HBase was challenging, as not all data profiles were fully integrated with the complex patient-to-hospital relationships. Nevertheless, we recommend using HBase to secure patient data while querying entire hospital volumes in a simplified clinical event model across clinical services. PMID:29375652

  17. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services.

    PubMed

    Chrimes, Dillon; Zamani, Hamid

    2017-01-01

Big data analytics (BDA) is important for reducing healthcare costs, but it faces many challenges of data aggregation, maintenance, integration, translation, analysis, and security/privacy. The study objective, to establish an interactive BDA platform with simulated patient data using open-source software technologies, was achieved by constructing a platform framework on the Hadoop Distributed File System (HDFS) using HBase (a key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata for nine billion patient records. At optimized iteration, HDFS ingestion of HFiles into HBase store files showed sustained availability over hundreds of iterations; however, completing the MapReduce load into HBase required a week for 10 TB and a month for three billion indexed patient records (30 TB), respectively. Inconsistencies found in MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance, with high usability for technical support but poor usability for clinical services. Modelling the hospital system's patient-centric data in HBase was challenging, as not all data profiles were fully integrated with the complex patient-to-hospital relationships. Nevertheless, we recommend using HBase to secure patient data while querying entire hospital volumes in a simplified clinical event model across clinical services.
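The "simplified clinical event model" queried across entire hospital volumes suggests a patient-centric HBase row-key design, where a patient's whole history is one contiguous, time-ordered key range. The sketch below emulates an HBase prefix scan in plain Python; the key layout is an assumption for illustration, not the study's actual schema:

```python
from bisect import bisect_left

def row_key(patient_id: str, event_time: str, event_type: str) -> str:
    """Hypothetical HBase row key for a patient-centric clinical event
    model: patient id first, then timestamp, so one patient's history is
    a contiguous, time-ordered range in the sorted key space."""
    return f"{patient_id}#{event_time}#{event_type}"

def scan_prefix(sorted_keys, prefix):
    """Emulate an HBase prefix scan over lexicographically sorted keys."""
    i = bisect_left(sorted_keys, prefix)
    out = []
    while i < len(sorted_keys) and sorted_keys[i].startswith(prefix):
        out.append(sorted_keys[i])
        i += 1
    return out

keys = sorted([
    row_key("P0002", "2017-01-03T09:00", "lab"),
    row_key("P0001", "2017-01-01T08:30", "admit"),
    row_key("P0001", "2017-01-02T14:10", "lab"),
    row_key("P0001", "2017-01-05T11:00", "discharge"),
])
history = scan_prefix(keys, "P0001#")   # P0001's events, time-ordered
```

Leading with the patient id trades even write distribution for cheap per-patient scans; production designs often salt or hash the prefix to avoid region hot-spotting.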

  18. A Temporal Map of Coaching.

    PubMed

    Theeboom, Tim; Van Vianen, Annelies E M; Beersma, Bianca

    2017-01-01

Economic pressures on companies, technological developments, and less stable career paths pose potential threats to the well-being of employees (e.g., stress, burn-out) and require constant adaptation. In the light of these challenges, it is not surprising that employees often seek the support of a coach. The role of a coach is to foster change by facilitating a coachee's movement through a self-regulatory cycle with the ultimate aim of stimulating sustained well-being and functioning. While meta-analytic research indicates that coaching interventions can be effectively applied to assist employees in dealing with change, the current literature on coaching lacks solid theoretical frameworks that are needed to build a cumulative knowledge-base and to inspire evidence-based practice. In this conceptual analysis, we examine the coaching process through a temporal lens. By doing so, we provide an integrated theoretical framework: a temporal map of coaching. In this framework, we link seminal concepts in psychology to the coaching process, and describe which competencies of coachees are crucial in the different stages of change that coaching aims to bring about. During the preparatory contemplation stage, targeting coachees' awareness by enhancing their mindfulness and environmental receptiveness is important. During the contemplation stage, coachees' willingness and perceived ability to change are central competencies. We propose that coaches should therefore foster intrinsic goal orientation and self-efficacy during this stage. During the planning stage, coaches should focus on goal-setting and implementation intentions. Finally, during the maintenance/termination stage, stimulating coachees' reflection is especially important in order to help them to integrate their learning experiences. 
The framework delineated in this paper contributes to the understanding of coaching as a tool to assist employees in dealing with the challenges of an increasingly dynamic work-environment and yields concrete suggestions for future theory development and research on coaching.

  19. A Temporal Map of Coaching

    PubMed Central

    Theeboom, Tim; Van Vianen, Annelies E. M.; Beersma, Bianca

    2017-01-01

Economic pressures on companies, technological developments, and less stable career paths pose potential threats to the well-being of employees (e.g., stress, burn-out) and require constant adaptation. In the light of these challenges, it is not surprising that employees often seek the support of a coach. The role of a coach is to foster change by facilitating a coachee’s movement through a self-regulatory cycle with the ultimate aim of stimulating sustained well-being and functioning. While meta-analytic research indicates that coaching interventions can be effectively applied to assist employees in dealing with change, the current literature on coaching lacks solid theoretical frameworks that are needed to build a cumulative knowledge-base and to inspire evidence-based practice. In this conceptual analysis, we examine the coaching process through a temporal lens. By doing so, we provide an integrated theoretical framework: a temporal map of coaching. In this framework, we link seminal concepts in psychology to the coaching process, and describe which competencies of coachees are crucial in the different stages of change that coaching aims to bring about. During the preparatory contemplation stage, targeting coachees’ awareness by enhancing their mindfulness and environmental receptiveness is important. During the contemplation stage, coachees’ willingness and perceived ability to change are central competencies. We propose that coaches should therefore foster intrinsic goal orientation and self-efficacy during this stage. During the planning stage, coaches should focus on goal-setting and implementation intentions. Finally, during the maintenance/termination stage, stimulating coachees’ reflection is especially important in order to help them to integrate their learning experiences. 
The framework delineated in this paper contributes to the understanding of coaching as a tool to assist employees in dealing with the challenges of an increasingly dynamic work-environment and yields concrete suggestions for future theory development and research on coaching. PMID:28848470

  20. Integration of analytical measurements and wireless communications--current issues and future strategies.

    PubMed

    Diamond, Dermot; Lau, King Tong; Brady, Sarah; Cleary, John

    2008-05-15

    Rapid developments in wireless communications are opening up opportunities for new ways to perform many types of analytical measurements that up to now have been restricted in scope due to the need to have access to centralised facilities. This paper will address both the potential for new applications and the challenges that currently inhibit more widespread integration of wireless communications with autonomous sensors and analytical devices. Key issues are identified and strategies for closer integration of analytical information and wireless communications systems discussed.

  1. Developing biodiversity indicators on a stakeholders' opinions basis: the gypsum industry Key Performance Indicators framework.

    PubMed

    Pitz, Carline; Mahy, Grégory; Vermeulen, Cédric; Marlet, Christine; Séleck, Maxime

    2016-07-01

This study aims to establish a common Key Performance Indicator (KPI) framework for reporting on biodiversity in the gypsum industry at the European level. In order to integrate different opinions and to reach a consensus framework, an original participatory process was developed among different stakeholder groups: Eurogypsum, European and regional authorities, university scientists, consulting offices, European and regional associations for the conservation of nature, and the extractive industry. The strategy is built around four main steps: (1) assembling a maximum set of indicators to be submitted to stakeholders, based on the literature (Focus Group method); (2) evaluating consensus about the indicators through a policy Delphi survey aiming at the prioritization of indicator classes, using the Analytic Hierarchy Process (AHP) method, and of individual indicators; (3) testing acceptability and feasibility through analysis of Environmental Impact Assessments (EIAs) and visits to three European quarries; (4) Eurogypsum's final decision and communication. The resulting framework contains a set of 11 indicators considered the most suitable by all the stakeholders. The KPIs respond to European legislation and strategies for biodiversity. The framework aims at improving sustainability in quarries and at helping to manage biodiversity, as well as at allowing the creation of coherent reporting systems. The final goal is to define the actual biodiversity status of gypsum quarries and to allow for enhancing it. The framework is adaptable to the local context of each gypsum quarry.
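Step (2)'s Analytic Hierarchy Process derives priority weights as the normalized principal eigenvector of a pairwise-comparison matrix. A sketch via power iteration, with hypothetical indicator classes and comparison values (the study's actual matrices are not given in the abstract):

```python
def ahp_priorities(M, iters=50):
    """Approximate the AHP priority vector: the normalized principal
    eigenvector of a pairwise-comparison matrix, via power iteration."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical comparisons among three indicator classes (habitats,
# species, management) on Saaty's 1-9 scale: habitats moderately
# preferred over species (3) and strongly over management (5).
M = [
    [1.0, 3.0, 5.0],
    [1 / 3.0, 1.0, 3.0],
    [1 / 5.0, 1 / 3.0, 1.0],
]
weights = ahp_priorities(M)  # sums to 1; habitats gets the largest weight
```

A full AHP application would also compute the consistency ratio of each stakeholder's matrix before accepting its weights.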

  2. Building on the EGIPPS performance assessment: the multipolar framework as a heuristic to tackle the complexity of performance of public service oriented health care organisations

    PubMed Central

    2014-01-01

Background Performance of health care systems is a key concern of policy makers and health service managers all over the world. It is also a major challenge, given its multidimensional nature that easily leads to conceptual and methodological confusion. This is reflected by a scarcity of models that comprehensively analyse health system performance. Discussion In health, one of the most comprehensive performance frameworks was developed by the team of Leggat and Sicotte. Their framework integrates 4 key organisational functions (goal attainment, production, adaptation to the environment, and values and culture) and the tensions between these functions. We modified this framework to better fit the assessment of the performance of health organisations in the public service domain and propose an analytical strategy that takes into account the social complexity of health organisations. The resulting multipolar performance framework (MPF) is a meta-framework that facilitates the analysis of the relations and interactions between the multiple actors that influence the performance of health organisations. Summary Using the MPF in a dynamic reiterative mode helps managers to identify not only the bottlenecks that hamper performance, but also the unintended effects and feedback loops that emerge. Similarly, it helps policymakers and programme managers at central level to better anticipate the potential results and side effects of, and required conditions for, health policies and programmes, and to steer their implementation accordingly. PMID:24742181

  3. Analytic integration of real-virtual counterterms in NNLO jet cross sections II

    NASA Astrophysics Data System (ADS)

    Bolzoni, Paolo; Moch, Sven-Olaf; Somogyi, Gábor; Trócsányi, Zoltán

    2009-08-01

We present analytic expressions of all integrals required to complete the explicit evaluation of the real-virtual integrated counterterms needed to define a recently proposed subtraction scheme for jet cross sections at next-to-next-to-leading order in QCD. We use the Mellin-Barnes representation of these integrals in 4 - 2ε dimensions to obtain the coefficients of their Laurent expansions around ε = 0. These coefficients are given by linear combinations of multidimensional Mellin-Barnes integrals. We compute the coefficients of such expansions in ε both numerically and analytically by complex integration over the Mellin-Barnes contours.
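The computation rests on the standard Mellin-Barnes representation (a textbook identity, not specific to this paper), which trades a sum raised to a power for a contour integral over products of gamma functions:

```latex
\frac{1}{(A+B)^{\lambda}}
  = \frac{1}{2\pi i\,\Gamma(\lambda)}
    \int_{c-i\infty}^{c+i\infty} \mathrm{d}z\;
    \Gamma(-z)\,\Gamma(\lambda+z)\,\frac{B^{z}}{A^{\lambda+z}},
\qquad -\operatorname{Re}\lambda < c < 0,
```

where the contour separates the poles of Γ(-z) from those of Γ(λ+z). Applying this repeatedly factorizes the denominators of the counterterm integrals; expanding the resulting gamma functions around ε = 0 then produces the Laurent coefficients as the multidimensional Mellin-Barnes integrals mentioned in the abstract.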

  4. Completing the Link between Exposure Science and ...

    EPA Pesticide Factsheets

Driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. Aggregate exposure pathways offer an intuitive framework to organize exposure data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source-to-outcome continuum for more meaningful integration of exposure assessment and hazard identification. Together, the two frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decision making. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA's mission to protect human health and the environment. The HEASD research program supports G

  5. Integrated national energy planning and management: methodology and application to Sri Lanka. World Bank technical paper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munasinghe, M.; Meier, P.

    1988-01-01

Given the importance of energy in modern economies, the first part of the volume is devoted to examining some of the key conceptual and analytical tools available for energy-policy analysis and planning. Policy tools and institutional frameworks that will facilitate better energy management are also discussed. Energy-policy analysis is explained, while effective energy management techniques are discussed to achieve desirable national objectives, using a selected set of policies and policy instruments. In the second part of the volume, the actual application of the principles set out earlier is explained through a case study of Sri Lanka. The monograph integrates the many aspects of the short-term programs already begun with the options for the medium to long term, and ends with the outline of a long-term strategy for Sri Lanka.

  6. Integrating adaptive management and ecosystem services concepts to improve natural resource management: Challenges and opportunities

    USGS Publications Warehouse

    Epanchin-Niell, Rebecca S.; Boyd, James W.; Macauley, Molly K.; Scarlett, Lynn; Shapiro, Carl D.; Williams, Byron K.

    2018-05-07

    Executive Summary—OverviewNatural resource managers must make decisions that affect broad-scale ecosystem processes involving large spatial areas, complex biophysical interactions, numerous competing stakeholder interests, and highly uncertain outcomes. Natural and social science information and analyses are widely recognized as important for informing effective management. Chief among the systematic approaches for improving the integration of science into natural resource management are two emergent science concepts, adaptive management and ecosystem services. Adaptive management (also referred to as “adaptive decision making”) is a deliberate process of learning by doing that focuses on reducing uncertainties about management outcomes and system responses to improve management over time. Ecosystem services is a conceptual framework that refers to the attributes and outputs of ecosystems (and their components and functions) that have value for humans.This report explores how ecosystem services can be moved from concept into practice through connection to a decision framework—adaptive management—that accounts for inherent uncertainties. Simultaneously, the report examines the value of incorporating ecosystem services framing and concepts into adaptive management efforts.Adaptive management and ecosystem services analyses have not typically been used jointly in decision making. However, as frameworks, they have a natural—but to date underexplored—affinity. Both are policy and decision oriented in that they attempt to represent the consequences of resource management choices on outcomes of interest to stakeholders. Both adaptive management and ecosystem services analysis take an empirical approach to the analysis of ecological systems. This systems orientation is a byproduct of the fact that natural resource actions affect ecosystems—and corresponding societal outcomes—often across large geographic scales. 
Moreover, because both frameworks focus on resource systems, both must confront the analytical challenges of systems modeling—in terms of complexity, dynamics, and uncertainty. Given this affinity, the integration of ecosystem services analysis and adaptive management poses few conceptual hurdles. In this report, we synthesize discussions from two workshops that considered ways in which adaptive management approaches and ecosystem service concepts may be complementary, such that integrating them into a common framework may lead to improved natural resource management outcomes. Although the literature on adaptive management and ecosystem services is vast and growing, the report focuses specifically on the integration of these two concepts rather than aiming to provide new definitions or an in-depth review or primer of the concepts individually. Key issues considered include the bidirectional links between adaptive decision making and ecosystem services, as well as the potential benefits and inevitable challenges arising in the development and use of an integrated framework. Specifically, the workshops addressed the following questions: How can application of ecosystem service analysis within an adaptive decision process improve the outcomes of management and advance understanding of ecosystem service identification, production, and valuation? How can these concepts be integrated in concept and practice? What are the constraints and challenges to integrating adaptive management and ecosystem services? And, should the integration of these concepts be moved forward to wider application—and if so, how?

  7. Dynamic Digital Maps as Vehicles for Distributing Digital Geologic Maps and Embedded Analytical Data and Multimedia

    NASA Astrophysics Data System (ADS)

    Condit, C. D.; Mninch, M.

    2012-12-01

    The Dynamic Digital Map (DDM) is an ideal vehicle for the professional geologist to use to describe the geologic setting of key sites to the public in a format that integrates and presents maps and associated analytical data and multimedia without the need for an ArcGIS interface. Maps with field trip guide stops that include photographs, movies, figures and animations showing, for example, how the features seen in the field formed, or how data might be best visualized in "time-frame" sequences, are ideally included in DDMs. DDMs distribute geologic maps, images, movies, analytical data, and text such as field guides, in an integrated cross-platform, web-enabled format that is intuitive to use, easily and quickly searchable, and requires no additional proprietary software to operate. Maps, photos, movies and animations are stored outside the program, which acts as an organizational framework and index to present these data. Once created, the DDM can be downloaded from the web site hosting it in the flavor matching the user's operating system (e.g. Linux, Windows and Macintosh) as zip, dmg or tar files (and soon as iOS and Android tablet apps). When decompressed, the DDM can then access its associated data directly from that site with no browser needed. Alternatively, the entire package can be distributed and used from CD, DVD, or flash-memory storage. The intent of this presentation is to introduce the variety of geology that can be accessed from the over 25 DDMs created to date, concentrating on the DDM of the Springerville Volcanic Field. We will highlight selected features of some of them, introduce a simplified interface to the original DDM (that we renamed DDMC for Classic) and give a brief look at the recently (2010-2011) completed geologic maps of the Springerville Volcanic Field to see examples of each of the features discussed above, and a display of the integrated analytical data set.
We will also highlight the differences between the classic or DDMCs and the new Dynamic Digital Map Extended (DDME) designed from the ground up to take advantage of the expanded connectedness this redesigned program will accommodate.

  8. Model for CO2 leakage including multiple geological layers and multiple leaky wells.

    PubMed

    Nordbotten, Jan M; Kavetski, Dmitri; Celia, Michael A; Bachu, Stefan

    2009-02-01

    Geological storage of carbon dioxide (CO2) is likely to be an integral component of any realistic plan to reduce anthropogenic greenhouse gas emissions. In conjunction with large-scale deployment of carbon storage as a technology, there is an urgent need for tools which provide reliable and quick assessments of aquifer storage performance. Previously, abandoned wells from over a century of oil and gas exploration and production have been identified as critical potential leakage paths. The practical importance of abandoned wells is emphasized by the correlation of heavy CO2 emitters (typically associated with industrialized areas) to oil and gas producing regions in North America. Herein, we describe a novel framework for predicting the leakage from large numbers of abandoned wells, forming leakage paths connecting multiple subsurface permeable formations. The framework is designed to exploit analytical solutions to various components of the problem and, ultimately, leads to a grid-free approximation to CO2 and brine leakage rates, as well as fluid distributions. We apply our model in a comparison to an established numerical solver for the underlying governing equations. Thereafter, we demonstrate the capabilities of the model on typical field data taken from the vicinity of Edmonton, Alberta. This data set consists of over 500 wells and 7 permeable formations. Results show the flexibility and utility of the solution methods, and highlight the role that analytical and semianalytical solutions can play in this important problem.
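The full grid-free framework is beyond the scope of this record, but its basic building block, leakage through a single well driven by a pressure difference, can be illustrated with a plain Darcy-flow estimate. This is a hypothetical sketch, not the authors' semi-analytical model; all parameter values are invented.

```python
import math

def well_leakage_rate(k_well, r_well, delta_p, mu, length):
    """Steady Darcy flow up a cylindrical well plug of permeability
    k_well (m^2) and radius r_well (m), driven by overpressure delta_p (Pa)
    across a plug of the given length (m), for fluid viscosity mu (Pa.s):
    Q = k * A * dP / (mu * L), returned in m^3/s."""
    area = math.pi * r_well ** 2
    return k_well * area * delta_p / (mu * length)

# Invented example values: 1e-14 m^2 cement-plug permeability, 0.1 m well
# radius, 1 MPa overpressure, ~1e-3 Pa.s brine viscosity, 100 m plug length.
q = well_leakage_rate(1e-14, 0.1, 1e6, 1e-3, 100.0)  # ~3.1e-9 m^3/s
```

A field-scale model along the lines of the paper would sum such contributions over hundreds of wells and couple the pressure fields of multiple formations, which is where the analytical solutions become essential.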

  9. Ethics and Justice in Learning Analytics

    ERIC Educational Resources Information Center

    Johnson, Jeffrey Alan

    2017-01-01

    The many complex challenges posed by learning analytics can best be understood within a framework of structural justice, which focuses on the ways in which the informational, operational, and organizational structures of learning analytics influence students' capacities for self-development and self-determination. This places primary…

  10. Reading Multimodal Texts: Perceptual, Structural and Ideological Perspectives

    ERIC Educational Resources Information Center

    Serafini, Frank

    2010-01-01

    This article presents a tripartite framework for analyzing multimodal texts. The three analytical perspectives presented include: (1) perceptual, (2) structural, and (3) ideological analytical processes. Using Anthony Browne's picturebook "Piggybook" as an example, assertions are made regarding what each analytical perspective brings to the…

  11. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    PubMed Central

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-01-01

    Iterative reconstruction algorithms are routinely used for clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of TOF scanners, which are less sensitive to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we have developed a novel iterative reconstruction approach (Direct Image Reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias vs. variance performance to iterative TOF reconstruction with a matched resolution model. PMID:27032968

  12. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework

    NASA Astrophysics Data System (ADS)

    Matej, Samuel; Daube-Witherspoon, Margaret E.; Karp, Joel S.

    2016-05-01

    Iterative reconstruction algorithms are routinely used for clinical practice; however, analytic algorithms are relevant candidates for quantitative research studies due to their linear behavior. While iterative algorithms also benefit from the inclusion of accurate data and noise models, the widespread use of time-of-flight (TOF) scanners, which are less sensitive to noise and data imperfections, makes analytic algorithms even more promising. In our previous work we have developed a novel iterative reconstruction approach (DIRECT: direct image reconstruction for TOF) providing a convenient TOF data partitioning framework and leading to very efficient reconstructions. In this work we have expanded DIRECT to include an analytic TOF algorithm with confidence weighting incorporating models of both TOF and spatial resolution kernels. Feasibility studies using simulated and measured data demonstrate that analytic-DIRECT with appropriate resolution and regularization filters is able to provide matched bias versus variance performance to iterative TOF reconstruction with a matched resolution model.
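The TOF confidence weighting named in both versions of this record rests on a simple idea: an event's measured timing difference localizes it along the line of response with a Gaussian kernel whose width follows from the scanner's timing resolution. A minimal sketch, assuming a generic Gaussian TOF kernel rather than the actual DIRECT implementation:

```python
import math

C_CM_PER_S = 3e10  # speed of light, cm/s

def tof_weights(positions_cm, dt_ps, timing_fwhm_ps):
    """Normalized Gaussian confidence weights over sample positions along a
    line of response. A timing difference dt maps to a most likely position
    x0 = c*dt/2; the spatial kernel FWHM is c*FWHM_t/2."""
    x0 = C_CM_PER_S * (dt_ps * 1e-12) / 2.0
    fwhm_cm = C_CM_PER_S * (timing_fwhm_ps * 1e-12) / 2.0
    sigma = fwhm_cm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    w = [math.exp(-0.5 * ((x - x0) / sigma) ** 2) for x in positions_cm]
    total = sum(w)
    return [wi / total for wi in w]

# 600 ps timing resolution -> roughly 9 cm spatial FWHM along the LOR;
# dt = 0 centers the kernel at the midpoint of the line of response.
weights = tof_weights([-10.0, -5.0, 0.0, 5.0, 10.0], 0.0, 600.0)
```

The narrower this kernel, the less each event contributes to distant voxels, which is why better timing resolution reduces sensitivity to noise and data imperfections.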

  13. Convergence in full motion video processing, exploitation, and dissemination and activity based intelligence

    NASA Astrophysics Data System (ADS)

    Phipps, Marja; Lewis, Gina

    2012-06-01

    Over the last decade, intelligence capabilities within the Department of Defense/Intelligence Community (DoD/IC) have evolved from ad hoc, single source, just-in-time, analog processing; to multi source, digitally integrated, real-time analytics; to multi-INT, predictive Processing, Exploitation and Dissemination (PED). Full Motion Video (FMV) technology and motion imagery tradecraft advancements have greatly contributed to Intelligence, Surveillance and Reconnaissance (ISR) capabilities during this timeframe. Imagery analysts have exploited events, missions and high value targets, generating and disseminating critical intelligence reports within seconds of occurrence across operationally significant PED cells. Now, we go beyond FMV, enabling All-Source Analysts to effectively deliver ISR information in a multi-INT sensor rich environment. In this paper, we explore the operational benefits and technical challenges of an Activity Based Intelligence (ABI) approach to FMV PED. Existing and emerging ABI features within FMV PED frameworks are discussed, to include refined motion imagery tools, additional intelligence sources, activity relevant content management techniques and automated analytics.

  14. Broadband impedance boundary conditions for the simulation of sound propagation in the time domain.

    PubMed

    Bin, Jonghoon; Yousuff Hussaini, M; Lee, Soogab

    2009-02-01

    An accurate and practical surface impedance boundary condition in the time domain has been developed for application to broadband-frequency simulation in aeroacoustic problems. To show the capability of this method, two kinds of numerical simulations are performed and compared with the analytical/experimental results: one is acoustic wave reflection by a monopole source over an impedance surface and the other is acoustic wave propagation in a duct with a finite impedance wall. Both single-frequency and broadband-frequency simulations are performed within the framework of linearized Euler equations. A high-order dispersion-relation-preserving finite-difference method and a low-dissipation, low-dispersion Runge-Kutta method are used for spatial discretization and time integration, respectively. The results show excellent agreement with the analytical/experimental results at various frequencies. The method accurately predicts both the amplitude and the phase of acoustic pressure and ensures the well-posedness of the broadband time-domain impedance boundary condition.

  15. Density functional theory for molecular and periodic systems using density fitting and continuous fast multipole method: Analytical gradients.

    PubMed

    Łazarski, Roman; Burow, Asbjörn Manfred; Grajciar, Lukáš; Sierka, Marek

    2016-10-30

    A full implementation of analytical energy gradients for molecular and periodic systems is reported in the TURBOMOLE program package within the framework of Kohn-Sham density functional theory using Gaussian-type orbitals as basis functions. Its key component is a combination of the density fitting (DF) approximation and the continuous fast multipole method (CFMM) that allows for an efficient calculation of the Coulomb energy gradient. For the exchange-correlation part, the hierarchical numerical integration scheme (Burow and Sierka, Journal of Chemical Theory and Computation 2011, 7, 3097) is extended to energy gradients. Computational efficiency and asymptotic O(N) scaling behavior of the implementation are demonstrated for various molecular and periodic model systems, with the largest unit cell of hematite containing 640 atoms and 19,072 basis functions. The overall computational effort of the energy gradient is comparable to that of the Kohn-Sham matrix formation. © 2016 Wiley Periodicals, Inc.

  16. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Dalton, Angela C.; Dale, Crystal

    2014-06-01

    Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
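The cross-domain coupling described here, carrying maturity and performance uncertainty into profitability projections, can be illustrated with a toy Monte Carlo comparison. This sketch does not reproduce the report's model set; the distributions, factors, and numbers are invented for illustration only.

```python
import random

def expected_profit(maturity_mean, perf_mean, n=10000, seed=0):
    """Toy cross-domain coupling: profitability is sampled as the product of
    a technology-readiness factor and a capture-performance factor, each
    carrying (invented) Gaussian uncertainty, then averaged over n draws."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        maturity = rng.gauss(maturity_mean, 0.1)
        performance = rng.gauss(perf_mean, 0.1)
        total += maturity * performance
    return total / n

# Two hypothetical technologies with different maturity/performance trade-offs
solid_sorbent = expected_profit(0.8, 0.7)
liquid_solvent = expected_profit(0.9, 0.6)
```

In a real analysis each domain would have its own calibrated model and the sampled parameters would be shared between models, as the record describes; the point here is only the mechanics of propagating uncertainty across domains.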

  17. Elementary Integrated Curriculum Framework

    ERIC Educational Resources Information Center

    Montgomery County Public Schools, 2010

    2010-01-01

    The Elementary Integrated Curriculum (EIC) Framework is the guiding curriculum document for the Elementary Integrated Curriculum and represents the elementary portion of the Montgomery County (Maryland) Public Schools (MCPS) Pre-K-12 Curriculum Frameworks. The EIC Framework contains the detailed indicators and objectives that describe what…

  18. The Strategic Management of Accountability in Nonprofit Organizations: An Analytical Framework.

    ERIC Educational Resources Information Center

    Kearns, Kevin P.

    1994-01-01

    Offers a framework stressing the strategic and tactical choices facing nonprofit organizations and discusses policy and management implications. Claims framework is a useful tool for conducting accountability audits and conceptual foundation for discussions of public policy. (Author/JOW)

  19. RBioCloud: A Light-Weight Framework for Bioconductor and R-based Jobs on the Cloud.

    PubMed

    Varghese, Blesson; Patel, Ishan; Barker, Adam

    2015-01-01

    Large-scale ad hoc analytics of genomic data is popular using the R programming language, supported by over 700 software packages provided by Bioconductor. More recently, analytical jobs are benefitting from on-demand computing and storage, their scalability and their low maintenance cost, all of which are offered by the cloud. While biologists and bioinformaticists can take an analytical job and execute it on their personal workstations, it remains challenging to seamlessly execute the job on the cloud infrastructure without extensive knowledge of the cloud dashboard. This paper explores how analytical jobs can be executed on the cloud with minimum effort, and how both the resources and the data required by a job can be managed. An open-source light-weight framework for executing R-scripts using Bioconductor packages, referred to as `RBioCloud', is designed and developed. RBioCloud offers a set of simple command-line tools for managing the cloud resources, the data and the execution of the job. Three biological test cases validate the feasibility of RBioCloud. The framework is available from http://www.rbiocloud.com.

  20. Analytical and numerical analysis of frictional damage in quasi brittle materials

    NASA Astrophysics Data System (ADS)

    Zhu, Q. Z.; Zhao, L. Y.; Shao, J. F.

    2016-07-01

    Frictional sliding and crack growth are two main dissipation processes in quasi brittle materials. The frictional sliding along closed cracks is the origin of macroscopic plastic deformation, while the crack growth induces material damage. The main difficulty of modeling is to consider the inherent coupling between these two processes. Various models and associated numerical algorithms have been proposed, but there are so far no analytical solutions, even for simple loading paths, for the validation of such algorithms. In this paper, we first present a micro-mechanical model taking into account the damage-friction coupling for a large class of quasi brittle materials. The model is formulated by combining a linear homogenization procedure with the Mori-Tanaka scheme and the irreversible thermodynamics framework. As an original contribution, a series of analytical solutions of stress-strain relations are developed for various loading paths. Based on the micro-mechanical model, two numerical integration algorithms are implemented. The first one involves a coupled friction/damage correction scheme, which is consistent with the coupling nature of the constitutive model. The second one contains a friction/damage decoupling scheme with two consecutive steps: the friction correction followed by the damage correction. With the analytical solutions as reference results, the two algorithms are assessed through a series of numerical tests. It is found that the decoupling correction scheme efficiently guarantees systematic numerical convergence.
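The decoupled correction scheme (a friction correction followed by a damage correction) can be sketched in one dimension. This is a deliberately simplified scalar model with invented material constants; the paper's Mori-Tanaka homogenization and full tensorial formulation are omitted.

```python
def decoupled_step(strain_inc, state, E=1.0, yield_stress=0.5,
                   hardening=0.1, damage_rate=0.05):
    """One strain increment of a toy damage-friction model:
    elastic trial -> friction (sliding) correction -> damage correction.
    state holds total strain 'eps', sliding strain 'eps_p', damage 'd'."""
    eps_p, d = state["eps_p"], state["d"]
    state["eps"] = state.get("eps", 0.0) + strain_inc
    # elastic trial stress with the current (frozen) damage
    sigma_trial = (1.0 - d) * E * (state["eps"] - eps_p)
    # step 1: friction correction (return mapping on a hardening threshold)
    flow = abs(sigma_trial) - (yield_stress + hardening * abs(eps_p))
    if flow > 0.0:
        d_eps_p = flow / ((1.0 - d) * E + hardening)
        eps_p += d_eps_p if sigma_trial > 0 else -d_eps_p
    # step 2: damage correction driven by accumulated sliding
    d = min(0.99, d + damage_rate * abs(eps_p))
    state["eps_p"], state["d"] = eps_p, d
    return (1.0 - d) * E * (state["eps"] - eps_p)

# drive the model through ten equal strain increments
state = {"eps_p": 0.0, "d": 0.0}
stresses = [decoupled_step(0.1, state) for _ in range(10)]
```

The coupled scheme of the paper would instead solve the friction and damage updates simultaneously within each increment; the decoupled variant trades some consistency for robustness, which is the convergence behavior the abstract reports.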

  1. Curriculum Innovation for Marketing Analytics

    ERIC Educational Resources Information Center

    Wilson, Elizabeth J.; McCabe, Catherine; Smith, Robert S.

    2018-01-01

    College graduates need better preparation for and experience in data analytics for higher-quality problem solving. Using the curriculum innovation framework of Borin, Metcalf, and Tietje (2007) and case study research methods, we offer rich insights about one higher education institution's work to address the marketing analytics skills gap.…

  2. The RISE Framework: Using Learning Analytics to Automatically Identify Open Educational Resources for Continuous Improvement

    ERIC Educational Resources Information Center

    Bodily, Robert; Nyland, Rob; Wiley, David

    2017-01-01

    The RISE (Resource Inspection, Selection, and Enhancement) Framework is a framework supporting the continuous improvement of open educational resources (OER). The framework is an automated process that identifies learning resources that should be evaluated and either eliminated or improved. This is particularly useful in OER contexts where the…

  3. Hidden Symmetries in String Theory

    NASA Astrophysics Data System (ADS)

    Chervonyi, Iurii

    In this thesis we study hidden symmetries within the framework of string theory. Symmetries play a very important role in physics: they lead to drastic simplifications, which allow one to compute various physical quantities without relying on perturbative techniques. There are two kinds of hidden symmetries investigated in this work: the first type is associated with dynamics of quantum fields and the second type is related to integrability of strings on various backgrounds. Integrability is a remarkable property of some theories that allows one to determine all dynamical properties of the system using purely analytical methods. The goals of this thesis are twofold: extension of hidden symmetries known in General Relativity to stringy backgrounds in higher dimensions and construction of new integrable string theories. In the context of the first goal we study hidden symmetries of stringy backgrounds, with and without supersymmetry. For supersymmetric geometries produced by D-branes we identify the backgrounds with solvable equations for geodesics, which can potentially give rise to integrable string theories. Relaxing the requirement of supersymmetry, we also study charged black holes in higher dimensions and identify their hidden symmetries encoded in so-called Killing(-Yano) tensors. We construct the explicit form of the Killing(-Yano) tensors for the charged rotating black hole in arbitrary number of dimensions, study behavior of such tensors under string dualities, and use the analysis of hidden symmetries to explain why exact solutions for black rings (black holes with non-spherical event horizons) in more than five dimensions remain elusive. As a byproduct we identify the standard parameterization of AdSp x Sq backgrounds with elliptic coordinates on a flat base. The second goal of this work is construction of new integrable string theories by applying continuous deformations of known examples. 
We use the recent developments called (generalized) lambda-deformation to construct new integrable backgrounds depending on several continuous parameters and study analytical properties of such deformations.

  4. A Cameron-Storvick Theorem for Analytic Feynman Integrals on Product Abstract Wiener Space and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Jae Gil, E-mail: jgchoi@dankook.ac.kr; Chang, Seung Jun, E-mail: sejchang@dankook.ac.kr

    In this paper we derive a Cameron-Storvick theorem for the analytic Feynman integral of functionals on product abstract Wiener space B². We then apply our result to obtain an evaluation formula for the analytic Feynman integral of unbounded functionals on B². We also present meaningful examples involving functionals which arise naturally in quantum mechanics.

  5. Variational and perturbative formulations of quantum mechanical/molecular mechanical free energy with mean-field embedding and its analytical gradients.

    PubMed

    Yamamoto, Takeshi

    2008-12-28

    Conventional quantum chemical solvation theories are based on the mean-field embedding approximation. That is, the electronic wavefunction is calculated in the presence of the mean field of the environment. In this paper a direct quantum mechanical/molecular mechanical (QM/MM) analog of such a mean-field theory is formulated based on variational and perturbative frameworks. In the variational framework, an appropriate QM/MM free energy functional is defined and is minimized in terms of the trial wavefunction that best approximates the true QM wavefunction in a statistically averaged sense. An analytical free energy gradient is obtained, which takes the form of the gradient of the effective QM energy calculated in the averaged MM potential. In the perturbative framework, the above variational procedure is shown to be equivalent to the first-order expansion of the QM energy (in the exact free energy expression) about the self-consistent reference field. This helps understand the relation between the variational procedure and the exact QM/MM free energy as well as existing QM/MM theories. Based on this, several ways are discussed for evaluating non-mean-field effects (i.e., statistical fluctuations of the QM wavefunction) that are neglected in the mean-field calculation. As an illustration, the method is applied to an SN2 Menshutkin reaction in water, NH₃ + CH₃Cl → NH₃CH₃⁺ + Cl⁻, for which free energy profiles are obtained at the Hartree-Fock, MP2, B3LYP, and BHHLYP levels by integrating the free energy gradient. Non-mean-field effects are evaluated to be <0.5 kcal/mol using a Gaussian fluctuation model for the environment, which suggests that those effects are rather small for the present reaction in water.
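The last step mentioned in this record, obtaining a free energy profile by integrating the free energy gradient along a reaction coordinate, amounts to a cumulative quadrature. A minimal sketch with synthetic gradient values (not QM/MM data):

```python
def integrate_profile(xs, grads):
    """Cumulative trapezoidal integration of gradient samples dA/dx along
    the coordinate values xs, returning the profile A(x) with A(xs[0]) = 0."""
    profile = [0.0]
    for i in range(1, len(xs)):
        step = (xs[i] - xs[i - 1]) * (grads[i] + grads[i - 1]) / 2.0
        profile.append(profile[-1] + step)
    return profile

# Synthetic check: dA/dx = 2x integrates to A = x^2
profile = integrate_profile([0.0, 1.0, 2.0], [0.0, 2.0, 4.0])  # [0.0, 1.0, 4.0]
```

In practice the gradient samples would come from the mean-field QM/MM calculations at points along the reaction coordinate, and a finer grid or higher-order quadrature would be used.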

  6. Toward an in-situ analytics and diagnostics framework for earth system models

    NASA Astrophysics Data System (ADS)

    Anantharaj, Valentine; Wolf, Matthew; Rasch, Philip; Klasky, Scott; Williams, Dean; Jacob, Rob; Ma, Po-Lun; Kuo, Kwo-Sen

    2017-04-01

    The development roadmaps for many earth system models (ESM) aim for a globally cloud-resolving model targeting the pre-exascale and exascale systems of the future. The ESMs will also incorporate more complex physics, chemistry and biology - thereby vastly increasing the fidelity of the information content simulated by the model. We will then be faced with an unprecedented volume of simulation output that would need to be processed and analyzed concurrently in order to derive the valuable scientific results. We are already at this threshold with our current generation of ESMs at higher resolution simulations. Currently, the nominal I/O throughput in the Community Earth System Model (CESM) via the Parallel IO (PIO) library is around 100 MB/s. If we look at the high frequency I/O requirements, it would require an additional 1 GB / simulated hour, translating to roughly 4 mins wallclock / simulated-day => 24.33 wallclock hours / simulated-model-year => 1,752,000 core-hours of charge per simulated-model-year on the Titan supercomputer at the Oak Ridge Leadership Computing Facility. There is also a pending need for 3X more volume of simulation output. Meanwhile, many ESMs use instrument simulators to run forward models to compare model simulations against satellite and ground-based instruments, such as radars and radiometers. The CFMIP Observation Simulator Package (COSP) is used in CESM as well as the Accelerated Climate Model for Energy (ACME), one of the ESMs specifically targeting current and emerging leadership-class computing platforms. These simulators can be computationally expensive, accounting for as much as 30% of the computational cost. Hence the data are often written to output files that are then used for offline calculations. Again, the I/O bottleneck becomes a limitation.
Detection and attribution studies also use large volumes of data for pattern recognition and feature extraction to analyze weather and climate phenomena such as tropical cyclones, atmospheric rivers, blizzards, etc. It is evident that ESMs need an in-situ framework to decouple the diagnostics and analytics from the prognostics and physics computations of the models so that the diagnostic computations could be performed concurrently without limiting model throughput. We are designing a science-driven online analytics framework for earth system models. Our approach is to adopt several data workflow technologies, such as the Adaptable IO System (ADIOS), being developed under the U.S. Exascale Computing Project (ECP) and integrate these to allow for extreme performance IO, in situ workflow integration, science-driven analytics and visualization, all in an easy-to-use computational framework. This will allow science teams to write data 100-1000 times faster and seamlessly move from post processing the output for validation and verification purposes to performing these calculations in situ. We can readily envision a near-term future where earth system models like ACME and CESM will have to address not only the challenges of the volume of data but also the velocity of the data. Earth system models of the future in the exascale era, as they incorporate more complex physics at higher resolutions, will be able to analyze more simulation content without having to compromise targeted model throughput.
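The throughput figures quoted in this record can be sanity-checked with simple arithmetic, assuming the stated 100 MB/s PIO throughput and 1 GB of high-frequency output per simulated hour:

```python
# Assumed figures from the record: 100 MB/s I/O throughput,
# 1 GB of high-frequency output per simulated hour.
throughput_gb_per_s = 0.1
output_gb_per_sim_hour = 1.0

secs_per_sim_day = 24 * output_gb_per_sim_hour / throughput_gb_per_s  # 240 s
mins_per_sim_day = secs_per_sim_day / 60.0                            # ~4 min
hours_per_sim_year = mins_per_sim_day * 365 / 60.0                    # ~24.33 h
```

This reproduces the roughly 4 minutes of I/O wait per simulated day and 24.33 wall-clock hours per simulated model year quoted above, which is the overhead an in-situ framework aims to hide behind concurrent computation.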

  7. The Public Health Exposome: A Population-Based, Exposure Science Approach to Health Disparities Research

    PubMed Central

    Juarez, Paul D.; Matthews-Juarez, Patricia; Hood, Darryl B.; Im, Wansoo; Levine, Robert S.; Kilbourne, Barbara J.; Langston, Michael A.; Al-Hamdan, Mohammad Z.; Crosson, William L.; Estes, Maurice G.; Estes, Sue M.; Agboto, Vincent K.; Robinson, Paul; Wilson, Sacoby; Lichtveld, Maureen Y.

    2014-01-01

    The lack of progress in reducing health disparities suggests that new approaches are needed if we are to achieve meaningful, equitable, and lasting reductions. Current scientific paradigms do not adequately capture the complexity of the relationships between environment, personal health and population level disparities. The public health exposome is presented as a universal exposure tracking framework for integrating complex relationships between exogenous and endogenous exposures across the lifespan from conception to death. It uses a social-ecological framework that builds on the exposome paradigm for conceptualizing how exogenous exposures “get under the skin”. The public health exposome approach has led our team to develop a taxonomy and bioinformatics infrastructure to integrate health outcomes data with thousands of sources of exogenous exposure, organized in four broad domains: natural, built, social, and policy environments. With the input of a transdisciplinary team, we have borrowed and applied the methods, tools and terms from various disciplines to measure the effects of environmental exposures on personal and population health outcomes and disparities, many of which may not manifest until many years later. As is customary with a paradigm shift, this approach has far reaching implications for research methods and design, analytics, community engagement strategies, and research training. PMID:25514145

  8. Programming chemistry in DNA-addressable bioreactors

    PubMed Central

    Fellermann, Harold; Cardelli, Luca

    2014-01-01

    We present a formal calculus, termed the chemtainer calculus, able to capture the complexity of compartmentalized reaction systems such as populations of possibly nested vesicular compartments. Compartments contain molecular cargo as well as surface markers in the form of DNA single strands. These markers serve as compartment addresses and allow for their targeted transport and fusion, thereby enabling reactions of previously separated chemicals. The overall system organization allows for the set-up of programmable chemistry in microfluidic or other automated environments. We introduce a simple sequential programming language whose instructions are motivated by state-of-the-art microfluidic technology. Our approach integrates electronic control, chemical computing and material production in a unified formal framework that is able to mimic the integrated computational and constructive capabilities of the subcellular matrix. We provide a non-deterministic semantics of our programming language that enables us to analytically derive the computational and constructive power of our machinery. This semantics is used to derive the sets of all constructable chemicals and supermolecular structures that emerge from different underlying instruction sets. Because our proofs are constructive, they can be used to automatically infer control programs for the construction of target structures from a limited set of resource molecules. Finally, we present an example of our framework from the area of oligosaccharide synthesis. PMID:25121647

  9. Theory of precipitation effects on dead cylindrical fuels

    Treesearch

    Michael A. Fosberg

    1972-01-01

    Numerical and analytical solutions of the Fickian diffusion equation were used to determine the effects of precipitation on dead cylindrical forest fuels. The analytical solution provided a physical framework. The numerical solutions were then used to refine the analytical solution through a similarity argument. The theoretical solutions predicted realistic rates of...
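As a rough illustration of the kind of calculation the abstract describes, the sketch below solves the Fickian diffusion equation in cylindrical coordinates with a simple explicit finite-difference scheme. The grid, diffusivity, and moisture values are hypothetical placeholders for illustration, not Fosberg's actual formulation or parameters:

```python
import numpy as np

def cylinder_diffusion(D=1e-3, R=1.0, m0=0.10, m_surface=0.35,
                       nr=41, t_end=200.0):
    """Explicit finite-difference solution of the Fickian diffusion
    equation in a cylinder, dm/dt = D * (1/r) * d/dr(r * dm/dr),
    for moisture m(r, t) soaking inward from a wetted surface."""
    r = np.linspace(0.0, R, nr)
    dr = r[1] - r[0]
    dt = 0.2 * dr**2 / D              # comfortably inside the stability limit
    m = np.full(nr, m0)
    m[-1] = m_surface                 # surface held at the wet value
    for _ in range(int(t_end / dt)):
        new = m.copy()
        # interior nodes: cylindrical radial Laplacian
        new[1:-1] = m[1:-1] + D * dt * (
            (m[2:] - 2.0 * m[1:-1] + m[:-2]) / dr**2
            + (m[2:] - m[:-2]) / (2.0 * dr * r[1:-1])
        )
        # symmetry condition dm/dr = 0 at the axis (r = 0)
        new[0] = m[0] + 4.0 * D * dt * (m[1] - m[0]) / dr**2
        m = new
    return r, m
```

Under these placeholder parameters the moisture profile rises monotonically from the axis to the wetted surface, as expected physically.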

  10. One- and two-center ETF-integrals of first order in relativistic calculation of NMR parameters

    NASA Astrophysics Data System (ADS)

    Slevinsky, R. M.; Temga, T.; Mouattamid, M.; Safouhi, H.

    2010-06-01

The present work focuses on the analytical and numerical development of first-order integrals involved in the relativistic calculation of the shielding tensor using exponential-type functions as a basis set of atomic orbitals. For the analytical development, we use the Fourier integral transformation, practical properties of spherical harmonics, and the Rayleigh expansion of the plane wave functions. The Fourier transforms of the operators, derived in previous work, are used in this development. In both the one- and two-center integrals, Cauchy's residue theorem is used in the final derivation of the analytical expressions, which are shown to be accurate to machine precision.

  11. Moving beyond normative philosophies and policy concerns: a sociological account of place-based solidarities in diversity.

    PubMed

    Oosterlynck, Stijn

    2018-01-01

In this commentary, I think with and beyond the normative philosophies and policy-oriented frameworks on how to deal with diversity in contemporary societies formulated by Zapata-Barrero and Modood. I propose to integrate elements of both perspectives in an empirically grounded sociological account of how place-based solidarities in diversity are nurtured in everyday life. Although there is much to recommend in the arguments of Modood and Zapata-Barrero, I argue that what is needed is an analytical framework that does not a priori privilege specific sources of solidarity on normative-philosophical or policy grounds. We need to focus instead on how people mobilise different sources of solidarity in their attempts to take shared responsibility for the concrete places where they live, work, learn and play together in superdiversity. This micro-level focus does not mean that one ignores macro-level processes. Also, more attention should be paid to the transformative nature of solidarities in diversity.

  12. Atmospheric Environment Vulnerability Cause Analysis for the Beijing-Tianjin-Hebei Metropolitan Region

    PubMed Central

    Zhang, Yang; Shen, Jing; Li, Yu

    2018-01-01

Assessing and quantifying atmospheric vulnerability is a key issue in urban environmental protection and management. This paper integrated the analytic hierarchy process (AHP), fuzzy synthesis evaluation, and Geographic Information System (GIS) spatial analysis into an Exposure-Sensitivity-Adaptive capacity (ESA) framework to quantitatively assess atmospheric environment vulnerability in the Beijing-Tianjin-Hebei (BTH) region with spatial and temporal comparisons. The elaboration of the relationships between atmospheric environment vulnerability and indices of exposure, sensitivity, and adaptive capacity enables analysis of the causes of atmospheric environment vulnerability. Our findings indicate that the atmospheric environment vulnerability of 13 cities in the BTH region exhibits obvious spatial heterogeneity, which is caused by regional diversity in exposure, sensitivity, and adaptive capacity indices. The results of the atmospheric environment vulnerability assessment and the cause analysis can provide guidance for picking out key control regions and recognizing vulnerable indicators for study sites. The framework developed in this paper can also be replicated at different spatial and temporal scales using context-specific datasets to support environmental management. PMID:29342852

  13. Atmospheric Environment Vulnerability Cause Analysis for the Beijing-Tianjin-Hebei Metropolitan Region.

    PubMed

    Zhang, Yang; Shen, Jing; Li, Yu

    2018-01-13

Assessing and quantifying atmospheric vulnerability is a key issue in urban environmental protection and management. This paper integrated the analytic hierarchy process (AHP), fuzzy synthesis evaluation, and Geographic Information System (GIS) spatial analysis into an Exposure-Sensitivity-Adaptive capacity (ESA) framework to quantitatively assess atmospheric environment vulnerability in the Beijing-Tianjin-Hebei (BTH) region with spatial and temporal comparisons. The elaboration of the relationships between atmospheric environment vulnerability and indices of exposure, sensitivity, and adaptive capacity enables analysis of the causes of atmospheric environment vulnerability. Our findings indicate that the atmospheric environment vulnerability of 13 cities in the BTH region exhibits obvious spatial heterogeneity, which is caused by regional diversity in exposure, sensitivity, and adaptive capacity indices. The results of the atmospheric environment vulnerability assessment and the cause analysis can provide guidance for picking out key control regions and recognizing vulnerable indicators for study sites. The framework developed in this paper can also be replicated at different spatial and temporal scales using context-specific datasets to support environmental management.

  14. Integrated sudomotor axon reflex sweat stimulation for continuous sweat analyte analysis with individuals at rest.

    PubMed

    Sonner, Zachary; Wilder, Eliza; Gaillard, Trudy; Kasting, Gerald; Heikenfeld, Jason

    2017-07-25

Eccrine sweat has rapidly emerged as a non-invasive, ergonomic, and rich source of chemical analytes, with numerous technological demonstrations now showing the ability for continuous electrochemical sensing. However, beyond active perspirers (athletes, workers, etc.), the lack of continuous sweat access in individuals at rest has hindered the advancement of both sweat sensing science and technology. Reported here is the integration of sudomotor axon reflex sweat stimulation for continuous wearable sweat analyte analysis, including the ability for side-by-side integration of chemical stimulants and sensors without cross-contamination. This integration approach is uniquely compatible with sensors which consume the analyte (enzymatic) or sensors which equilibrate with analyte concentrations. In vivo validation is performed using iontophoretic delivery of carbachol with ion-selective and impedance sensors for sweat analysis. Carbachol has shown prolonged sweat stimulation in directly stimulated regions for five hours or longer. This work represents a significant leap forward in sweat sensing technology and may be of broader interest for on-skin sensing integrated with drug delivery.

  15. Responsible research and innovation indicators for science education assessment: how to measure the impact?

    NASA Astrophysics Data System (ADS)

    Heras, Maria; Ruiz-Mallén, Isabel

    2017-12-01

The emerging paradigm of responsible research and innovation (RRI) in the European Commission policy discourse identifies science education as a key agenda for better equipping students with skills and knowledge to tackle complex societal challenges and foster active citizenship in democratic societies. The operationalisation of this broad approach in science education demands, however, the identification of assessment frameworks able to grasp the complexity of RRI process requirements and learning outcomes within science education practice. This article aims to shed light on the application of the RRI approach in science education by proposing an RRI-based analytical framework for science education assessment. We use this framework to review a sample of empirical studies of science education assessments and critically analyse them through the lens of RRI criteria. As a result, we identify a set of 86 key RRI assessment indicators in science education related to RRI values, transversal competences, and experiential and cognitive aspects of learning. We argue that looking at science education through the lens of RRI can potentially contribute to the integration of metacognitive skills, emotional aspects and procedural dimensions within impact assessments so as to address the complexity of learning.

  16. Engaging policy makers in road safety research in Malaysia: a theoretical and contextual analysis.

    PubMed

    Tran, Nhan T; Hyder, Adnan A; Kulanthayan, Subramaniam; Singh, Suret; Umar, R S Radin

    2009-04-01

    Road traffic injuries (RTIs) are a growing public health problem that must be addressed through evidence-based interventions including policy-level changes such as the enactment of legislation to mandate specific behaviors and practices. Policy makers need to be engaged in road safety research to ensure that road safety policies are grounded in scientific evidence. This paper examines the strategies used to engage policy makers and other stakeholder groups and discusses the challenges that result from a multi-disciplinary, inter-sectoral collaboration. A framework for engaging policy makers in research was developed and applied to describe an example of collective road safety research in Malaysia. Key components of this framework include readiness, assessment, planning, implementation/evaluation, and policy development/sustainability. The case study of a collaborative intervention trial for the prevention of motorcycle crashes and deaths in Malaysia serves as a model for policy engagement by road safety and injury researchers. The analytic description of this research process in Malaysia demonstrates that the framework, through its five stages, can be used as a tool to guide the integration of needed research evidence into policy for road safety and injury prevention.

  17. High Z neoclassical transport: Application and limitation of analytical formulae for modelling JET experimental parameters

    NASA Astrophysics Data System (ADS)

    Breton, S.; Casson, F. J.; Bourdelle, C.; Angioni, C.; Belli, E.; Camenen, Y.; Citrin, J.; Garbet, X.; Sarazin, Y.; Sertoli, M.; JET Contributors

    2018-01-01

Heavy impurities, such as tungsten (W), can exhibit strongly poloidally asymmetric density profiles in rotating or radio frequency heated plasmas. In the metallic environment of JET, the poloidal asymmetry of tungsten enhances its neoclassical transport up to an order of magnitude, so that neoclassical convection dominates over turbulent transport in the core. Accounting for asymmetries in neoclassical transport is hence necessary in the integrated modeling framework. The neoclassical drift kinetic code NEO [E. Belli and J. Candy, Plasma Phys. Controlled Fusion 50, 095010 (2008)] includes the impact of poloidal asymmetries on W transport. However, the computational cost required to run NEO significantly slows down integrated modeling. A previous analytical formulation to describe heavy impurity neoclassical transport in the presence of poloidal asymmetries in specific collisional regimes [C. Angioni and P. Helander, Plasma Phys. Controlled Fusion 56, 124001 (2014)] is compared in this work to numerical results from NEO. Within the domain of validity of the formula, the factor for reducing the temperature screening due to poloidal asymmetries had to be empirically adjusted. After adjustment, the modified formula can reproduce NEO results outside of its domain of definition, with some limitations: when main ions are in the banana regime, the formula reproduces NEO results whatever the collisionality regime of impurities, provided that the poloidal asymmetry is not too large. However, for very strong poloidal asymmetries, agreement requires impurities in the Pfirsch-Schlüter regime. Within the JETTO integrated transport code, the analytical formula combined with the poloidally symmetric neoclassical code NCLASS [W. A. Houlberg et al., Phys. Plasmas 4, 3230 (1997)] predicts the same tungsten profile as NEO in certain cases, while saving a factor of one thousand in computer time, which can be useful in scoping studies. The parametric dependencies of the temperature screening reduction due to poloidal asymmetries would need to be better characterised for this faster model to be extended to more general applicability.

  18. Benefits of coastal recreation in Europe: identifying trade-offs and priority regions for sustainable management.

    PubMed

    Ghermandi, Andrea

    2015-04-01

    This paper examines the welfare dimension of the recreational services of coastal ecosystems through the application of a meta-analytical value transfer framework, which integrates Geographic Information Systems (GIS) for the characterization of climate, biodiversity, accessibility, and anthropogenic pressure in each of 368 regions of the European coastal zone. The relative contribution of international, domestic, and local recreationists to aggregated regional values is examined. The implications of the analysis for prioritization of conservation areas and identification of good management practices are highlighted through the comparative assessment of estimated recreation values, current environmental pressures, and existing network of protected sites. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Grid Stability Awareness System (GSAS) Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feuerborn, Scott; Ma, Jian; Black, Clifton

The project team developed a software suite named Grid Stability Awareness System (GSAS) for power system near real-time stability monitoring and analysis based on synchrophasor measurements. The software suite consists of five analytical tools: an oscillation monitoring tool, a voltage stability monitoring tool, a transient instability monitoring tool, an angle difference monitoring tool, and an event detection tool. These tools have been integrated into one framework to provide power grid operators with both real-time or near real-time stability status of a power grid and historical information about system stability status. These tools are being considered for real-time use in the operation environment.

  20. Nonperturbative quark-gluon thermodynamics at finite density

    NASA Astrophysics Data System (ADS)

    Andreichikov, M. A.; Lukashov, M. S.; Simonov, Yu. A.

    2018-03-01

Thermodynamics of the quark-gluon plasma at finite density is studied in the framework of the Field Correlator Method, where thermodynamical effects of Polyakov loops and color magnetic confinement are taken into account. Having found good agreement with numerical lattice data at zero density, we calculate the pressure P(T,μ) for 0 < μ < 400 MeV and 150 < T < 1000 MeV. For the first time, an explicit integral form is found in this region, demonstrating the analytic structure in the complex μ plane. The resulting multiple complex branch points are found at the Roberge-Weiss values of Im μ, with Re μ defined by the values of the Polyakov lines and color magnetic confinement.

  1. On integrating Jungian and other theories.

    PubMed

    Sedgwick, David

    2015-09-01

    This paper consists of reflections on some of the processes, subtleties, and 'eros' involved in attempting to integrate Jungian and other analytic perspectives. Assimilation of other theoretical viewpoints has a long history in analytical psychology, beginning when Jung met Freud. Since its inception, the Journal of Analytical Psychology has provided a forum for theoretical syntheses and comparative psychoanalysis. Such attempts at synthesizing other theories represent analytical psychology itself trying to individuate. © 2015, The Society of Analytical Psychology.

  2. Open-source Framework for Storing and Manipulation of Plasma Chemical Reaction Data

    NASA Astrophysics Data System (ADS)

    Jenkins, T. G.; Averkin, S. N.; Cary, J. R.; Kruger, S. E.

    2017-10-01

We present a new open-source framework for storage and manipulation of plasma chemical reaction data that has emerged from our in-house project MUNCHKIN. This framework consists of Python scripts and C++ programs. It stores data in an SQL database for fast retrieval and manipulation. For example, it is possible to fit cross-section data to widely used analytical expressions, calculate reaction rates for Maxwellian distribution functions of colliding particles, and fit these rates to different analytical expressions. Another important feature of this framework is the ability to calculate transport properties based on the cross-section data and supplied distribution functions. In addition, this framework allows the export of chemical reaction descriptions in LaTeX format for ease of inclusion in scientific papers. With the help of this framework it is possible to generate corresponding input blocks for VSim (a Particle-In-Cell simulation code) and USim (an unstructured multi-fluid code) with the appropriate cross-sections.
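The rate-coefficient step described above is a standard thermal average of the cross section over a Maxwellian. The sketch below shows that computation in generic form; the function name and numerical choices are illustrative assumptions, not MUNCHKIN's actual API:

```python
import numpy as np

KB = 1.380649e-23  # Boltzmann constant, J/K

def rate_coefficient(sigma, T, mass, n=20000):
    """Maxwellian rate coefficient k(T) = <sigma * v> for a cross
    section sigma(E) [m^2] given as a function of collision energy
    E [J], for colliding particles of reduced mass `mass` [kg]:
      k = sqrt(8 / (pi * mass)) * (kB*T)**(-3/2)
          * integral( sigma(E) * E * exp(-E / (kB*T)) dE )"""
    E = np.linspace(1e-3 * KB * T, 40.0 * KB * T, n)
    f = sigma(E) * E * np.exp(-E / (KB * T))
    integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(E))  # trapezoid rule
    return np.sqrt(8.0 / (np.pi * mass)) * (KB * T) ** -1.5 * integral
```

For an energy-independent cross section this reduces to sigma times the mean thermal speed, which makes a convenient sanity check.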

  3. Electrocardiographic interpretation skills of cardiology residents: are they competent?

    PubMed

    Sibbald, Matthew; Davies, Edward G; Dorian, Paul; Yu, Eric H C

    2014-12-01

    Achieving competency at electrocardiogram (ECG) interpretation among cardiology subspecialty residents has traditionally focused on interpreting a target number of ECGs during training. However, there is little evidence to support this approach. Further, there are no data documenting the competency of ECG interpretation skills among cardiology residents, who become de facto the gold standard in their practice communities. We tested 29 Cardiology residents from all 3 years in a large training program using a set of 20 ECGs collected from a community cardiology practice over a 1-month period. Residents interpreted half of the ECGs using a standard analytic framework, and half using their own approach. Residents were scored on the number of correct and incorrect diagnoses listed. Overall diagnostic accuracy was 58%. Of 6 potentially life-threatening diagnoses, residents missed 36% (123 of 348) including hyperkalemia (81%), long QT (52%), complete heart block (35%), and ventricular tachycardia (19%). Residents provided additional inappropriate diagnoses on 238 ECGs (41%). Diagnostic accuracy was similar between ECGs interpreted using an analytic framework vs ECGs interpreted without an analytic framework (59% vs 58%; F(1,1333) = 0.26; P = 0.61). Cardiology resident proficiency at ECG interpretation is suboptimal. Despite the use of an analytic framework, there remain significant deficiencies in ECG interpretation among Cardiology residents. A more systematic method of addressing these important learning gaps is urgently needed. Copyright © 2014 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.

  4. The general 2-D moments via integral transform method for acoustic radiation and scattering

    NASA Astrophysics Data System (ADS)

    Smith, Jerry R.; Mirotznik, Mark S.

    2004-05-01

The moments via integral transform method (MITM) is a technique to analytically reduce the 2-D method of moments (MoM) impedance double integrals into single integrals. By using a special integral representation of the Green's function, the impedance integral can be analytically simplified to a single integral in terms of transformed shape and weight functions. The reduced expression requires fewer computations and reduces the fill times of the MoM impedance matrix. Furthermore, the resulting integral is analytic for nearly arbitrary shape and weight function sets. The MITM technique is developed for mixed boundary conditions and predictions with basic shape and weight function sets are presented. Comparisons of accuracy and speed between MITM and brute force are presented. [Work sponsored by ONR and NSWCCD ILIR Board.]

  5. Complete characterization of fourth-order symplectic integrators with extended-linear coefficients.

    PubMed

    Chin, Siu A

    2006-02-01

    The structure of symplectic integrators up to fourth order can be completely and analytically understood when the factorization (split) coefficients are related linearly but with a uniform nonlinear proportional factor. The analytic form of these extended-linear symplectic integrators greatly simplified proofs of their general properties and allowed easy construction of both forward and nonforward fourth-order algorithms with an arbitrary number of operators. Most fourth-order forward integrators can now be derived analytically from this extended-linear formulation without the use of symbolic algebra.
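As a concrete fourth-order member of the class of splittings the abstract discusses, the classic (non-forward) Forest-Ruth integrator can be sketched as follows. This is the standard textbook scheme, shown only to illustrate the splitting structure; it is not Chin's extended-linear construction itself:

```python
import math

def forest_ruth_step(x, v, accel, dt):
    """One step of the classic fourth-order Forest-Ruth symplectic
    integrator: a drift-kick splitting whose middle substep has a
    negative coefficient (i.e., a non-forward scheme)."""
    theta = 1.0 / (2.0 - 2.0 ** (1.0 / 3.0))
    c = (theta / 2.0, (1.0 - theta) / 2.0, (1.0 - theta) / 2.0, theta / 2.0)
    d = (theta, 1.0 - 2.0 * theta, theta)
    for i in range(3):
        x = x + c[i] * dt * v          # drift
        v = v + d[i] * dt * accel(x)   # kick
    x = x + c[3] * dt * v              # final drift
    return x, v
```

On a harmonic oscillator the energy error of this scheme stays bounded and scales as dt to the fourth power, the hallmark of a fourth-order symplectic method.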

  6. The NIH analytical methods and reference materials program for dietary supplements.

    PubMed

    Betz, Joseph M; Fisher, Kenneth D; Saldanha, Leila G; Coates, Paul M

    2007-09-01

    Quality of botanical products is a great uncertainty that consumers, clinicians, regulators, and researchers face. Definitions of quality abound, and include specifications for sanitation, adventitious agents (pesticides, metals, weeds), and content of natural chemicals. Because dietary supplements (DS) are often complex mixtures, they pose analytical challenges and method validation may be difficult. In response to product quality concerns and the need for validated and publicly available methods for DS analysis, the US Congress directed the Office of Dietary Supplements (ODS) at the National Institutes of Health (NIH) to accelerate an ongoing methods validation process, and the Dietary Supplements Methods and Reference Materials Program was created. The program was constructed from stakeholder input and incorporates several federal procurement and granting mechanisms in a coordinated and interlocking framework. The framework facilitates validation of analytical methods, analytical standards, and reference materials.

  7. Combined quality function deployment and logical framework analysis to improve quality of emergency care in Malta.

    PubMed

    Buttigieg, Sandra Catherine; Dey, Prasanta Kumar; Cassar, Mary Rose

    2016-01-01

The purpose of this paper is to develop an integrated patient-focused analytical framework to improve quality of care in the accident and emergency (A & E) unit of a Maltese hospital. The study adopts a case study approach. First, a thorough literature review has been undertaken to study the various methods of healthcare quality management. Second, a healthcare quality management framework is developed using the combined quality function deployment (QFD) and logical framework approach (LFA). Third, the proposed framework is applied to a Maltese hospital to demonstrate its effectiveness. The proposed framework has six steps, commencing with identifying patients' requirements and concluding with implementing improvement projects. All the steps have been undertaken with the involvement of the concerned stakeholders in the A & E unit of the hospital. The major and related problems being faced by the hospital under study were overcrowding at A & E and a shortage of beds, respectively. The combined framework ensures better A & E services and patient flow. QFD identifies and analyses the issues and challenges of A & E, and LFA helps develop project plans for healthcare quality improvement. The important outcomes of implementing the proposed quality improvement programme are fewer hospital admissions, faster patient flow, expert triage and shorter waiting times at the A & E unit. Increased emergency consultant cover and a faster first significant medical encounter were required to start addressing the problems effectively. Overall, the combined QFD and LFA method is effective in addressing quality of care in the A & E unit. Practical implications: the proposed framework can be easily integrated within any healthcare unit, as well as within entire healthcare systems, due to its flexible and user-friendly approach. It could be part of Six Sigma and other quality initiatives. Although QFD has been extensively deployed in healthcare settings to improve quality of care, very little research has combined QFD and LFA to identify issues, prioritise them, derive improvement measures and implement improvement projects. Additionally, there is no research on QFD application in A & E. This paper bridges these gaps. Moreover, very little has been written on the Maltese healthcare system; this study therefore also contributes a demonstration of emergency care quality in Malta.

  8. A parameter optimization approach to controller partitioning for integrated flight/propulsion control application

    NASA Technical Reports Server (NTRS)

    Schmidt, Phillip; Garg, Sanjay; Holowecky, Brian

    1992-01-01

    A parameter optimization framework is presented to solve the problem of partitioning a centralized controller into a decentralized hierarchical structure suitable for integrated flight/propulsion control implementation. The controller partitioning problem is briefly discussed and a cost function to be minimized is formulated, such that the resulting 'optimal' partitioned subsystem controllers will closely match the performance (including robustness) properties of the closed-loop system with the centralized controller while maintaining the desired controller partitioning structure. The cost function is written in terms of parameters in a state-space representation of the partitioned sub-controllers. Analytical expressions are obtained for the gradient of this cost function with respect to parameters, and an optimization algorithm is developed using modern computer-aided control design and analysis software. The capabilities of the algorithm are demonstrated by application to partitioned integrated flight/propulsion control design for a modern fighter aircraft in the short approach to landing task. The partitioning optimization is shown to lead to reduced-order subcontrollers that match the closed-loop command tracking and decoupling performance achieved by a high-order centralized controller.

  9. A parameter optimization approach to controller partitioning for integrated flight/propulsion control application

    NASA Technical Reports Server (NTRS)

    Schmidt, Phillip H.; Garg, Sanjay; Holowecky, Brian R.

    1993-01-01

    A parameter optimization framework is presented to solve the problem of partitioning a centralized controller into a decentralized hierarchical structure suitable for integrated flight/propulsion control implementation. The controller partitioning problem is briefly discussed and a cost function to be minimized is formulated, such that the resulting 'optimal' partitioned subsystem controllers will closely match the performance (including robustness) properties of the closed-loop system with the centralized controller while maintaining the desired controller partitioning structure. The cost function is written in terms of parameters in a state-space representation of the partitioned sub-controllers. Analytical expressions are obtained for the gradient of this cost function with respect to parameters, and an optimization algorithm is developed using modern computer-aided control design and analysis software. The capabilities of the algorithm are demonstrated by application to partitioned integrated flight/propulsion control design for a modern fighter aircraft in the short approach to landing task. The partitioning optimization is shown to lead to reduced-order subcontrollers that match the closed-loop command tracking and decoupling performance achieved by a high-order centralized controller.

  10. Visual Analytics of integrated Data Systems for Space Weather Purposes

    NASA Astrophysics Data System (ADS)

    Rosa, Reinaldo; Veronese, Thalita; Giovani, Paulo

Analysis of information from multiple data sources obtained through high-resolution instrumental measurements has become a fundamental task in all scientific areas. The development of expert methods able to treat such multi-source data systems, with both large variability and measurement extension, is key to studying complex scientific phenomena, especially those related to systemic analysis in space and environmental sciences. In this talk, we present a time series generalization introducing the concept of a generalized numerical lattice, which represents a discrete sequence of temporal measures for a given variable. In this novel representation approach, each generalized numerical lattice brings post-analytical data information. We define a generalized numerical lattice as a set of three parameters representing the following data properties: dimensionality, size and post-analytical measure (e.g., the autocorrelation, Hurst exponent, etc.) [1]. From this representation generalization, any multi-source database can be reduced to a closed set of classified time series in spatiotemporal generalized dimensions. As a case study, we show a preliminary application in space science data, highlighting the possibility of a real-time analysis expert system. In this particular application, we have selected and analyzed, using detrended fluctuation analysis (DFA), several decimetric solar bursts associated with X-class flares. The association with geomagnetic activity is also reported. The DFA method is performed in the framework of a radio burst automatic monitoring system. Our results may characterize the evolution of the variability pattern by computing the DFA scaling exponent over a short sliding window preceding the extreme event [2]. For the first time, the application of systematic fluctuation analysis for space weather purposes is presented.
The prototype for visual analytics is implemented in the Compute Unified Device Architecture (CUDA), using Nvidia K20 graphics processing units (GPUs) to reduce the integrated analysis runtime. [1] Veronese et al., doi: 10.6062/jcis.2009.01.02.0021, 2010. [2] Veronese et al., doi: 10.1016/j.jastp.2010.09.030, 2011.
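The DFA scaling-exponent step this record relies on can be sketched in a few lines. This is a generic DFA-1 implementation for illustration only, not the authors' CUDA monitoring code:

```python
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """Detrended fluctuation analysis (DFA-1): returns the scaling
    exponent alpha from a log-log fit of the RMS fluctuation F(s)
    of the integrated, piecewise-linearly-detrended profile."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for s in scales:
        nwin = len(y) // s
        segments = y[:nwin * s].reshape(nwin, s)
        t = np.arange(s)
        var = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)     # local linear trend
            var.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(var)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]
```

White noise yields an exponent near 0.5 and a random walk near 1.5, so the exponent separates the variability regimes that a windowed scan of the kind described above would track before an extreme event.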

  11. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
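The core AHP computation this record builds on, deriving criteria weights from a pairwise comparison matrix, can be sketched as follows. The eigenvector method and Saaty's random consistency indices are standard AHP machinery; the example matrix in the test is hypothetical, not the paper's school-inspection hierarchy:

```python
import numpy as np

# Saaty's random consistency indices for matrix sizes 1..9
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(A):
    """Priority weights and consistency ratio for an AHP pairwise
    comparison matrix A, where A[i, j] is the judged importance of
    criterion i relative to criterion j (eigenvector method)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = int(np.argmax(vals.real))
    w = np.abs(vecs[:, k].real)
    w = w / w.sum()                       # normalized priority vector
    lam_max = vals[k].real
    ci = (lam_max - n) / (n - 1)          # consistency index
    cr = ci / RI[n] if RI.get(n, 0.0) > 0.0 else 0.0
    return w, cr
```

A consistency ratio below about 0.1 is conventionally taken to mean the pairwise judgements are acceptably consistent; a perfectly consistent matrix recovers the underlying weights exactly.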

  12. LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox

    ERIC Educational Resources Information Center

    Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich

    2016-01-01

    To find a balance between learning analytics research and individual privacy, learning analytics initiatives need to appropriately address ethical, privacy, and data protection issues. A range of general guidelines, model codes, and principles for handling ethical issues and for appropriate data and privacy protection are available, which may…

  13. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analyzing exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
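To make the idea of atomic graph operators concrete, here is a toy sketch of selection and aggregation on a small attributed graph. The representation (attribute dicts plus an edge set) and the operator semantics are illustrative assumptions, not the paper's actual algebra:

```python
def select(nodes, edges, pred):
    """Selection operator: keep the nodes whose attribute dict
    satisfies pred, together with the induced edges."""
    kept = {n: a for n, a in nodes.items() if pred(a)}
    return kept, {(u, v) for (u, v) in edges if u in kept and v in kept}

def aggregate(nodes, edges, key):
    """Aggregation operator: merge nodes that share key(attrs) into
    supernodes and collapse edges between distinct groups."""
    group = {n: key(a) for n, a in nodes.items()}
    supernodes = {g: {"members": sorted(n for n in nodes if group[n] == g)}
                  for g in set(group.values())}
    superedges = {(group[u], group[v])
                  for (u, v) in edges if group[u] != group[v]}
    return supernodes, superedges
```

Because each operator returns a graph in the same form it consumes, operators compose, which is the property a formal algebra of this kind depends on.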

  14. A framework for in vitro systems toxicology assessment of e-liquids

    PubMed Central

    Iskandar, Anita R.; Gonzalez-Suarez, Ignacio; Majeed, Shoaib; Marescotti, Diego; Sewer, Alain; Xiang, Yang; Leroy, Patrice; Guedj, Emmanuel; Mathis, Carole; Schaller, Jean-Pierre; Vanscheeuwijck, Patrick; Frentzel, Stefan; Martin, Florian; Ivanov, Nikolai V.; Peitsch, Manuel C.; Hoeng, Julia

    2016-01-01

Various electronic nicotine delivery systems (ENDS), of which electronic cigarettes (e-cigs) are the most recognized prototype, have been quickly gaining ground on conventional cigarettes because they are perceived as less harmful. Research assessing the potential effects of ENDS exposure in humans is currently limited and inconclusive. New products are emerging with numerous variations in designs and performance parameters within and across brands. Acknowledging these challenges, we present here a proposed framework for an in vitro systems toxicology assessment of e-liquids and their aerosols, intended to complement the battery of assays for standard toxicity assessments. The proposed framework utilizes high-throughput toxicity assessments of e-liquids and their aerosols, in which device-to-device variability is minimized, and a systems-level investigation of the cellular mechanisms of toxicity is an integral part. An analytical chemistry investigation is also included as part of the framework to provide accurate and reliable chemistry data solidifying the toxicological assessment. In its simplest form, the framework comprises three main layers: (1) high-throughput toxicity screening of e-liquids using primary human cell culture systems; (2) toxicity-related mechanistic assessment of selected e-liquids; and (3) toxicity-related mechanistic assessment of their aerosols using organotypic air–liquid interface airway culture systems. A systems toxicology assessment approach is leveraged to enable in-depth analyses of the toxicity-related cellular mechanisms of e-liquids and their aerosols. We present example use cases to demonstrate the suitability of the framework for a robust in vitro assessment of e-liquids and their aerosols. PMID:27117495

  15. A framework for in vitro systems toxicology assessment of e-liquids.

    PubMed

    Iskandar, Anita R; Gonzalez-Suarez, Ignacio; Majeed, Shoaib; Marescotti, Diego; Sewer, Alain; Xiang, Yang; Leroy, Patrice; Guedj, Emmanuel; Mathis, Carole; Schaller, Jean-Pierre; Vanscheeuwijck, Patrick; Frentzel, Stefan; Martin, Florian; Ivanov, Nikolai V; Peitsch, Manuel C; Hoeng, Julia

    2016-07-01

Various electronic nicotine delivery systems (ENDS), of which electronic cigarettes (e-cigs) are the most recognized prototype, have been quickly gaining ground on conventional cigarettes because they are perceived as less harmful. Research assessing the potential effects of ENDS exposure in humans is currently limited and inconclusive. New products are emerging with numerous variations in designs and performance parameters within and across brands. Acknowledging these challenges, we present here a proposed framework for an in vitro systems toxicology assessment of e-liquids and their aerosols, intended to complement the battery of assays for standard toxicity assessments. The proposed framework utilizes high-throughput toxicity assessments of e-liquids and their aerosols, in which device-to-device variability is minimized, and a systems-level investigation of the cellular mechanisms of toxicity is an integral part. An analytical chemistry investigation is also included as part of the framework to provide accurate and reliable chemistry data solidifying the toxicological assessment. In its simplest form, the framework comprises three main layers: (1) high-throughput toxicity screening of e-liquids using primary human cell culture systems; (2) toxicity-related mechanistic assessment of selected e-liquids; and (3) toxicity-related mechanistic assessment of their aerosols using organotypic air-liquid interface airway culture systems. A systems toxicology assessment approach is leveraged to enable in-depth analyses of the toxicity-related cellular mechanisms of e-liquids and their aerosols. We present example use cases to demonstrate the suitability of the framework for a robust in vitro assessment of e-liquids and their aerosols.

  16. Analytical close-form solutions to the elastic fields of solids with dislocations and surface stress

    NASA Astrophysics Data System (ADS)

    Ye, Wei; Paliwal, Bhasker; Ougazzaden, Abdallah; Cherkaoui, Mohammed

    2013-07-01

The concept of eigenstrain is adopted to derive a general analytical framework for solving the elastic field of 3D anisotropic solids with general defects while accounting for surface stress. The formulation shows that the elastic constants and geometrical features of the surface play an important role in determining the elastic fields of the solid. As an application, analytical closed-form solutions for the stress fields of an infinite isotropic circular nanowire are obtained. The stress fields are compared with the classical solutions and with those of the complex variable method. The stress fields from this work demonstrate the impact of surface stress as the size of the nanowire shrinks, an impact that becomes negligible at the macroscopic scale. Compared with the power series solutions of the complex variable method, the analytical solutions in this work provide a better platform and are more flexible in various applications. More importantly, the proposed analytical framework substantially advances the study of general 3D anisotropic materials with surface effects.

  17. SOCR data dashboard: an integrated big data archive mashing medicare, labor, census and econometric information.

    PubMed

    Husain, Syed S; Kalinin, Alexandr; Truong, Anh; Dinov, Ivo D

Intuitive formulation of informative and computationally efficient queries on big and complex datasets presents a number of challenges. As data collection becomes increasingly streamlined and ubiquitous, data exploration, discovery and analytics get considerably harder. Exploratory querying of heterogeneous and multi-source information is both difficult and necessary to advance our knowledge about the world around us. We developed a mechanism to integrate dispersed multi-source data and serve the mashed information via human and machine interfaces in a secure, scalable manner. This process facilitates the exploration of subtle associations between variables, population strata, or clusters of data elements, which may be opaque to standard independent inspection of the individual sources. This new platform includes a device-agnostic tool (Dashboard webapp, http://socr.umich.edu/HTML5/Dashboard/) for graphically querying, navigating and exploring the multivariate associations in complex heterogeneous datasets. The paper illustrates this core functionality and service-oriented infrastructure using healthcare data (e.g., US data from the 2010 Census, Demographic and Economic surveys, Bureau of Labor Statistics, and Center for Medicare Services) as well as Parkinson's disease neuroimaging data. Both the back-end data archive and the front-end dashboard interfaces are continuously expanded to include additional data elements and new ways to customize the human and machine interactions. A client-side data import utility allows for easy and intuitive integration of user-supplied datasets. This completely open-science framework may be used for exploratory analytics, confirmatory analyses, meta-analyses, and education and training purposes in a wide variety of fields.

  18. Caring for nanotechnology? Being an integrated social scientist.

    PubMed

    Viseu, Ana

    2015-10-01

One of the most significant shifts in science policy of the past three decades is a concern with extending scientific practice to include a role for 'society'. Recently, this has led to legislative calls for the integration of the social sciences and humanities in publicly funded research and development initiatives. In nanotechnology, integration's primary field site, this policy has institutionalized the practice of hiring social scientists in technical facilities. Increasingly mainstream, the workings and results of this integration mechanism remain understudied. In this article, I build upon my three-year experience as the in-house social scientist at the Cornell NanoScale Facility and the United States' National Nanotechnology Infrastructure Network to engage empirically and conceptually with this mode of governance in nanotechnology. From the vantage point of the integrated social scientist, I argue that in its current enactment, integration emerges as a particular kind of care work, with social scientists being fashioned as the main caretakers. Examining integration as a type of care practice and as a 'matter of care' allows me to highlight the often invisible, existential, epistemic, and affective costs of care as governance. Illuminating a framework where social scientists are called upon to observe but not disturb, to reify boundaries rather than blur them, this article serves as a word of caution against integration as a novel mode of governance that seemingly privileges situatedness, care, and entanglement, moving us toward an analytically skeptical (but not dismissive) perspective on integration.

  19. A Graphics Design Framework to Visualize Multi-Dimensional Economic Datasets

    ERIC Educational Resources Information Center

    Chandramouli, Magesh; Narayanan, Badri; Bertoline, Gary R.

    2013-01-01

    This study implements a prototype graphics visualization framework to visualize multidimensional data. This graphics design framework serves as a "visual analytical database" for visualization and simulation of economic models. One of the primary goals of any kind of visualization is to extract useful information from colossal volumes of…

  20. Relationships among Classical Test Theory and Item Response Theory Frameworks via Factor Analytic Models

    ERIC Educational Resources Information Center

    Kohli, Nidhi; Koran, Jennifer; Henn, Lisa

    2015-01-01

    There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…

  1. Applied immuno-epidemiological research: an approach for integrating existing knowledge into the statistical analysis of multiple immune markers.

    PubMed

    Genser, Bernd; Fischer, Joachim E; Figueiredo, Camila A; Alcântara-Neves, Neuza; Barreto, Mauricio L; Cooper, Philip J; Amorim, Leila D; Saemann, Marcus D; Weichhart, Thomas; Rodrigues, Laura C

    2016-05-20

    Immunologists often measure several correlated immunological markers, such as concentrations of different cytokines produced by different immune cells and/or measured under different conditions, to draw insights from complex immunological mechanisms. Although there have been recent methodological efforts to improve the statistical analysis of immunological data, a framework is still needed for the simultaneous analysis of multiple, often correlated, immune markers. This framework would allow the immunologists' hypotheses about the underlying biological mechanisms to be integrated. We present an analytical approach for statistical analysis of correlated immune markers, such as those commonly collected in modern immuno-epidemiological studies. We demonstrate i) how to deal with interdependencies among multiple measurements of the same immune marker, ii) how to analyse association patterns among different markers, iii) how to aggregate different measures and/or markers to immunological summary scores, iv) how to model the inter-relationships among these scores, and v) how to use these scores in epidemiological association analyses. We illustrate the application of our approach to multiple cytokine measurements from 818 children enrolled in a large immuno-epidemiological study (SCAALA Salvador), which aimed to quantify the major immunological mechanisms underlying atopic diseases or asthma. We demonstrate how to aggregate systematically the information captured in multiple cytokine measurements to immunological summary scores aimed at reflecting the presumed underlying immunological mechanisms (Th1/Th2 balance and immune regulatory network). We show how these aggregated immune scores can be used as predictors in regression models with outcomes of immunological studies (e.g. specific IgE) and compare the results to those obtained by a traditional multivariate regression approach. 
The proposed analytical approach may be especially useful to quantify complex immune responses in immuno-epidemiological studies, where investigators examine the relationship among epidemiological patterns, immune response, and disease outcomes.
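The aggregation idea described above can be sketched in a few lines. This is a hedged illustration, not the authors' code: the cytokine names, groupings, and values are hypothetical; the sketch only shows the pattern of z-standardizing markers and averaging the ones presumed to reflect a single mechanism into a summary score.

```python
# Sketch: standardize each marker across subjects, then combine markers
# presumed to reflect one immunological mechanism into a summary score.
import statistics

def zscores(values):
    """Z-standardize a list of measurements across subjects."""
    m, s = statistics.mean(values), statistics.stdev(values)
    return [(v - m) / s for v in values]

def summary_score(subjects, markers):
    """Per-subject mean of the z-scores of the given markers."""
    z = {mk: zscores([s[mk] for s in subjects]) for mk in markers}
    return [statistics.mean(z[mk][i] for mk in markers)
            for i in range(len(subjects))]

# Hypothetical cytokine data for three children (each value assumed to be
# the average over repeated measurements of that marker)
subjects = [{"IFNg": 2.0, "IL12": 1.5, "IL4": 0.2},
            {"IFNg": 1.0, "IL12": 0.5, "IL4": 0.9},
            {"IFNg": 3.0, "IL12": 2.5, "IL4": 0.1}]
th1_score = summary_score(subjects, ["IFNg", "IL12"])  # Th1-type markers
```

Scores built this way can then enter regression models as predectors in place of the raw correlated markers, which is the comparison the abstract describes.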

  2. A new and integrated hydro-economic accounting and analytical framework for water resources: a case study for North China.

    PubMed

    Guan, Dabo; Hubacek, Klaus

    2008-09-01

Water is a critical issue in China for a variety of reasons. China is poor in water resources, with per capita availability of 2,300 m(3), less than 1/3 of the world average. This is exacerbated by regional differences: North China's per capita availability is only about 271 m(3), roughly 1/25 of the world average. Furthermore, pollution contributes to water scarcity and is a major source of disease, particularly for the poor. The Ministry of Hydrology [1997. China's Regional Water Bulletins. Water Resource and Hydro-power Publishing House, Beijing, China] reports that about 65-80% of rivers in North China no longer support any economic activities. Previous studies have emphasized the amount of water withdrawn but rarely take water quality into consideration. The quality of the return flows usually changes, the water quality being lower than that of the flows that entered the production process initially. It is especially important to measure the impacts of wastewater on the hydro-ecosystem. Thus, water consumption should account not only for the amount of water input but also for the amount of water contaminated in the hydro-ecosystem by the discharged wastewater. In this paper we present a new accounting and analytical approach based on economic input-output modelling combined with a mass-balanced hydrological model that links interactions in the economic system with interactions in the hydrological system. We thus follow the tradition of integrated economic-ecologic input-output modelling. Our hydro-economic accounting framework and analysis tool allows tracking of water consumption on the input side, water pollution leaving the economic system, and water flows passing through the hydrological system, enabling us to deal with water resources of different qualities.
Following this method, the results illustrate that North China requires 96% of its annually available water, including both water inputs for the economy and contaminated water that is rendered ineligible for any use.
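The environmentally extended input-output accounting this kind of framework builds on can be sketched compactly. The two-sector numbers below are hypothetical, not the paper's data: total water requirements follow w = d (I - A)^-1 y, where A holds inter-industry technical coefficients, y is final demand, and d is direct water use per unit of sectoral output.

```python
# Leontief-style water accounting sketch (hypothetical two-sector economy).
import numpy as np

A = np.array([[0.2, 0.3],      # inter-industry technical coefficients
              [0.1, 0.4]])
y = np.array([100.0, 50.0])    # final demand by sector
d = np.array([5.0, 2.0])       # direct water use (m^3) per unit of output

# Total output needed to satisfy final demand: x = (I - A)^-1 y
x = np.linalg.solve(np.eye(2) - A, y)
# Economy-wide water requirement, direct plus indirect
water_total = d @ x
```

Extending d to include wastewater discharge coefficients is what lets such a framework count contaminated return flows alongside water inputs, as the abstract argues it should.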

  3. Detection of nanomaterials in food and consumer products: bridging the gap from legislation to enforcement.

    PubMed

    Stamm, H; Gibson, N; Anklam, E

    2012-08-01

    This paper describes the requirements and resulting challenges for the implementation of current and upcoming European Union legislation referring to the use of nanomaterials in food, cosmetics and other consumer products. The European Commission has recently adopted a recommendation for the definition of nanomaterials. There is now an urgent need for appropriate and fit-for-purpose analytical methods in order to identify nanomaterials properly according to this definition and to assess whether or not a product contains nanomaterials. Considering the lack of such methods to date, this paper elaborates on the challenges of the legislative framework and the type of methods needed, not only to facilitate implementation of labelling requirements, but also to ensure the safety of products coming to the market. Considering the many challenges in the analytical process itself, such as interaction of nanoparticles with matrix constituents, potential agglomeration and aggregation due to matrix environment, broad variety of matrices, etc., there is a need for integrated analytical approaches, not only for sample preparation (e.g. separation from matrix), but also for the actual characterisation. Furthermore, there is an urgent need for quality assurance tools such as validated methods and (certified) reference materials, including materials containing nanoparticles in a realistic matrix (food products, cosmetics, etc.).

  4. Multidisciplinary design and analytic approaches to advance prospective research on the multilevel determinants of child health.

    PubMed

    Johnson, Sara B; Little, Todd D; Masyn, Katherine; Mehta, Paras D; Ghazarian, Sharon R

    2017-06-01

Characterizing the determinants of child health and development over time, and identifying the mechanisms by which these determinants operate, is a research priority. The growth of precision medicine has increased awareness and refinement of conceptual frameworks, data management systems, and analytic methods for multilevel data. This article reviews key methodological challenges in cohort studies designed to investigate multilevel influences on child health and strategies to address them. We review and summarize methodological challenges that could undermine prospective studies of the multilevel determinants of child health and ways to address them, borrowing approaches from the social and behavioral sciences. Nested data, variation in intervals of data collection and assessment, missing data, construct measurement across development and reporters, and unobserved population heterogeneity pose challenges in prospective multilevel cohort studies with children. We discuss innovations in missing data, innovations in person-oriented analyses, and innovations in multilevel modeling to address these challenges. Study design and analytic approaches that facilitate the integration across multiple levels, and that account for changes in people and the multiple, dynamic, nested systems in which they participate over time, are crucial to fully realize the promise of precision medicine for children and adolescents.

  5. SmartAdP: Visual Analytics of Large-scale Taxi Trajectories for Selecting Billboard Locations.

    PubMed

    Liu, Dongyu; Weng, Di; Li, Yuhong; Bao, Jie; Zheng, Yu; Qu, Huamin; Wu, Yingcai

    2017-01-01

The problem of formulating solutions immediately and comparing them rapidly for billboard placements has long plagued advertising planners, owing to the lack of efficient tools for the in-depth analyses needed to make informed decisions. In this study, we employ visual analytics, combining state-of-the-art mining and visualization techniques, to tackle this problem using large-scale GPS trajectory data. In particular, we present SmartAdP, an interactive visual analytics system that addresses two major challenges: finding good solutions in a huge solution space and comparing those solutions in a visual, intuitive manner. An interactive framework that integrates a novel visualization-driven data mining model enables advertising planners to formulate good candidate solutions effectively and efficiently. In addition, we propose a set of coupled visualizations: a solution view with metaphor-based glyphs to visualize the correlation between different solutions; a location view to display billboard locations in a compact manner; and a ranking view to present multi-typed rankings of the solutions. The system has been demonstrated through case studies with a real-world dataset and through domain-expert interviews. Our approach can be adapted to other location selection problems, such as siting retail stores or restaurants using trajectory data.

  6. Potassium sodium chloride integrated microconduits in a potentiometric analytical system.

    PubMed

    Hongbo, C; Junyan, S

    1991-09-01

The preparation and application of a K(+), Na(+) and Cl(-) integrated microconduit potentiometric analytical system with tubular ion-selective electrodes (ISEs), a microvalve, a chemifold, and electrostatic and pulse inhibitors is described. Electrochemical characteristics of the tubular ISEs and the integrated microconduit FIA-ISEs were studied. The contents of K(+), Na(+) and Cl(-) in soil, water and serum were determined with the device. The analytical results agreed well with those obtained by flame photometric and silver nitrate volumetric methods.

  7. Environmental Stewardship: A Conceptual Review and Analytical Framework.

    PubMed

    Bennett, Nathan J; Whitty, Tara S; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H

    2018-04-01

    There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.

  8. Environmental Stewardship: A Conceptual Review and Analytical Framework

    NASA Astrophysics Data System (ADS)

    Bennett, Nathan J.; Whitty, Tara S.; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H.

    2018-04-01

    There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.

  9. Integrating Allergen Analysis Within a Risk Assessment Framework: Approaches to Development of Targeted Mass Spectrometry Methods for Allergen Detection and Quantification in the iFAAM Project.

    PubMed

    Nitride, Chiara; Lee, Victoria; Baricevic-Jones, Ivona; Adel-Patient, Karine; Baumgartner, Sabine; Mills, E N Clare

    2018-01-01

    Allergen analysis is central to implementing and monitoring food allergen risk assessment and management processes by the food industry, but current methods for the determination of allergens in foods give highly variable results. The European Union-funded "Integrated Approaches to Food Allergen and Allergy Risk Management" (iFAAM) project has been working to address gaps in knowledge regarding food allergen management and analysis, including the development of novel MS and immuno-based allergen determination methods. Common allergenic food ingredients (peanut, hazelnut, walnut, cow's milk [Bos domesticus], and hen's egg [Gallus domesticus]) and common food matrixes (chocolate dessert and cookie) have been used for both clinical studies and analytical method development to ensure that the new methods are clinically relevant. Allergen molecules have been used as analytical targets and allergenic ingredients incurred into matrixes at levels close to reference doses that may trigger the use of precautionary allergen labeling. An interlaboratory method comparison has been undertaken for the determination of peanut in chocolate dessert using MS and immuno-based methods. The iFAAM approach has highlighted the need for methods to report test results in allergenic protein. This will allow food business operators to use them in risk assessments that are founded on clinical study data in which protein has been used as a measure of allergenic potency.

  10. Propulsion System Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile

    2002-01-01

The Aerospace Systems Design Laboratory at the School of Aerospace Engineering in Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. This propulsion system design environment is created with commercially available software called iSIGHT, which is a generic computational framework, and with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and facilitate information transfer amongst the various codes. The resulting modeling and simulation (M&S) environment, in conjunction with the response surface method, provides the designer/decision-maker with an analytical means to examine the entire design space from a subsystem and/or system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.

  11. Capacity-Delay Trade-Off in Collaborative Hybrid Ad-Hoc Networks with Coverage Sensing.

    PubMed

    Chen, Lingyu; Luo, Wenbin; Liu, Chen; Hong, Xuemin; Shi, Jianghong

    2017-01-26

The integration of ad hoc device-to-device (D2D) communications and open-access small cells can result in a networking paradigm called the hybrid ad hoc network, which is particularly promising for delivering delay-tolerant data. The capacity-delay performance of hybrid ad hoc networks has been studied extensively under a popular framework called scaling law analysis. These studies, however, do not take into account interference accumulation and queueing delay and, therefore, may lead to over-optimistic results. Moreover, by focusing on average measures, existing works fail to give finer-grained insights into the distribution of delays. This paper proposes an alternative analytical framework based on queueing theoretic models and physical interference models. We apply this framework to study the capacity-delay performance of a collaborative cellular D2D network with coverage sensing and two-hop relay. The new framework allows us to fully characterize the delay distribution in the transform domain and pinpoint the impacts of coverage sensing, user and base station densities, transmit power, user mobility and packet size on the capacity-delay trade-off. We show that under the condition of queueing equilibrium, the maximum throughput capacity per device saturates to an upper bound of 0.7239 λb/λu bits/s/Hz, where λb and λu are the densities of base stations and mobile users, respectively.

  12. Capacity-Delay Trade-Off in Collaborative Hybrid Ad-Hoc Networks with Coverage Sensing

    PubMed Central

    Chen, Lingyu; Luo, Wenbin; Liu, Chen; Hong, Xuemin; Shi, Jianghong

    2017-01-01

The integration of ad hoc device-to-device (D2D) communications and open-access small cells can result in a networking paradigm called the hybrid ad hoc network, which is particularly promising for delivering delay-tolerant data. The capacity-delay performance of hybrid ad hoc networks has been studied extensively under a popular framework called scaling law analysis. These studies, however, do not take into account interference accumulation and queueing delay and, therefore, may lead to over-optimistic results. Moreover, by focusing on average measures, existing works fail to give finer-grained insights into the distribution of delays. This paper proposes an alternative analytical framework based on queueing theoretic models and physical interference models. We apply this framework to study the capacity-delay performance of a collaborative cellular D2D network with coverage sensing and two-hop relay. The new framework allows us to fully characterize the delay distribution in the transform domain and pinpoint the impacts of coverage sensing, user and base station densities, transmit power, user mobility and packet size on the capacity-delay trade-off. We show that under the condition of queueing equilibrium, the maximum throughput capacity per device saturates to an upper bound of 0.7239 λb/λu bits/s/Hz, where λb and λu are the densities of base stations and mobile users, respectively. PMID:28134769

  13. Visual Analytics for the Food-Water-Energy Nexus in the Phoenix Active Management Area

    NASA Astrophysics Data System (ADS)

    Maciejewski, R.; Mascaro, G.; White, D. D.; Ruddell, B. L.; Aggarwal, R.; Sarjoughian, H.

    2016-12-01

    The Phoenix Active Management Area (AMA) is an administrative region of 14,500 km2 identified by the Arizona Department of Water Resources with the aim of reaching and maintaining the safe yield (i.e. balance between annual amount of groundwater withdrawn and recharged) by 2025. The AMA includes the Phoenix metropolitan area, which has experienced dramatic population growth over recent decades with a progressive conversion of agricultural land into residential land. As a result of these changes, the water and energy demand as well as the food production in the region have significantly evolved over the last 30 years. Given the arid climate, the creation of a complex water supply system based on renewable and non-renewable resources, including the energy-intensive Central Arizona Project, has played a crucial role in supporting this growth. In this talk, we present a preliminary characterization of the evolution in time of the feedbacks between food, water, and energy in the Phoenix AMA by analyzing secondary data (available from water and energy providers, irrigation districts, and municipalities), as well as satellite imagery and primary data collected by the authors. A preliminary visual analytics framework is also discussed describing current design practices and ideas for exploring networked components and cascading impacts within the FEW Nexus. This analysis and framework represent the first steps towards the development of an integrated modeling, visualization, and decision support infrastructure for comprehensive FEW systems decision making at decision-relevant temporal and spatial scales.

  14. An integrative framework for sensor-based measurement of teamwork in healthcare

    PubMed Central

    Rosen, Michael A; Dietz, Aaron S; Yang, Ting; Priebe, Carey E; Pronovost, Peter J

    2015-01-01

    There is a strong link between teamwork and patient safety. Emerging evidence supports the efficacy of teamwork improvement interventions. However, the availability of reliable, valid, and practical measurement tools and strategies is commonly cited as a barrier to long-term sustainment and spread of these teamwork interventions. This article describes the potential value of sensor-based technology as a methodology to measure and evaluate teamwork in healthcare. The article summarizes the teamwork literature within healthcare, including team improvement interventions and measurement. Current applications of sensor-based measurement of teamwork are reviewed to assess the feasibility of employing this approach in healthcare. The article concludes with a discussion highlighting current application needs and gaps and relevant analytical techniques to overcome the challenges to implementation. Compelling studies exist documenting the feasibility of capturing a broad array of team input, process, and output variables with sensor-based methods. Implications of this research are summarized in a framework for development of multi-method team performance measurement systems. Sensor-based measurement within healthcare can unobtrusively capture information related to social networks, conversational patterns, physical activity, and an array of other meaningful information without having to directly observe or periodically survey clinicians. However, trust and privacy concerns present challenges that need to be overcome through engagement of end users in healthcare. Initial evidence exists to support the feasibility of sensor-based measurement to drive feedback and learning across individual, team, unit, and organizational levels. Future research is needed to refine methods, technologies, theory, and analytical strategies. PMID:25053579

  15. Analytical optimization of demand management strategies across all urban water use sectors

    NASA Astrophysics Data System (ADS)

    Friedman, Kenneth; Heaney, James P.; Morales, Miguel; Palenchar, John

    2014-07-01

    An effective urban water demand management program can greatly influence both peak and average demand and therefore long-term water supply and infrastructure planning. Although a theoretical framework for evaluating residential indoor demand management has been well established, little has been done to evaluate other water use sectors such as residential irrigation in a compatible manner for integrating these results into an overall solution. This paper presents a systematic procedure to evaluate the optimal blend of single family residential irrigation demand management strategies to achieve a specified goal based on performance functions derived from parcel level tax assessor's data linked to customer level monthly water billing data. This framework is then generalized to apply to any urban water sector, as exponential functions can be fit to all resulting cumulative water savings functions. Two alternative formulations are presented: maximize net benefits, or minimize total costs subject to satisfying a target water savings. Explicit analytical solutions are presented for both formulations based on appropriate exponential best fits of performance functions. A direct result of this solution is the dual variable which represents the marginal cost of water saved at a specified target water savings goal. A case study of 16,303 single family irrigators in Gainesville Regional Utilities utilizing high quality tax assessor and monthly billing data along with parcel level GIS data provide an illustrative example of these techniques. Spatial clustering of targeted homes can be easily performed in GIS to identify priority demand management areas.
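    The analytical solution pattern described above can be illustrated with a toy exponential performance function. A hedged sketch: the exponential form follows the abstract, but the coefficients and the savings target are invented for illustration:

```python
import math

# Toy exponential performance function for cumulative water savings:
#   S(c) = a * (1 - exp(-b * c)),  where c is the program cost.
a, b = 100.0, 0.002          # max achievable savings, cost-decay rate (assumed)
target = 60.0                # target water savings (same units as a, assumed)

# Cost required to reach the target, by inverting S(c):
cost = -math.log(1.0 - target / a) / b

# Dual variable: marginal cost of water saved at the target.
# Since dS/dc = b * (a - S), we have dc/dS = 1 / (b * (a - S)).
marginal_cost = 1.0 / (b * (a - target))

print(round(cost, 1), round(marginal_cost, 2))  # 458.1 12.5
```

The dual variable rises steeply as the target approaches the maximum achievable savings a, which is why the marginal cost of water saved is the natural signal for choosing among demand management strategies.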

  16. Information Tailoring Enhancements for Large Scale Social Data

    DTIC Science & Technology

    2016-03-15

    Work performed within this reporting period included the following tasks: implemented temporal analysis algorithms for advanced analytics in Scraawl, including a backend web service design for temporal analysis and a prototype GUI web service for the Scraawl analytics dashboard; and upgraded the Scraawl computational framework to increase…

  17. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  18. Energy-Water Nexus Knowledge Discovery Framework

    NASA Astrophysics Data System (ADS)

    Bhaduri, B. L.; Foster, I.; Chandola, V.; Chen, B.; Sanyal, J.; Allen, M.; McManamay, R.

    2017-12-01

    As demand for energy grows, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental and socioeconomic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. An integrated data-driven modeling, analysis, and visualization capability is needed to understand, design, and develop efficient local and regional practices for the energy-water infrastructure components that can be guided with strategic (federal) policy decisions to ensure national energy resilience. To meet this need of the energy-water nexus (EWN) community, an Energy-Water Knowledge Discovery Framework (EWN-KDF) is being proposed to accomplish two objectives: development of a robust data management and geovisual analytics platform that provides access to disparate and distributed physiographic, critical infrastructure, and socioeconomic data, along with emergent ad-hoc sensor data, to provide a powerful toolkit of analysis algorithms and compute resources that empower user-guided data analysis and inquiries; and demonstration of knowledge generation with selected illustrative use cases on the implications of climate variability for coupled land-water-energy systems through the application of state-of-the-art data integration, analysis, and synthesis. Oak Ridge National Laboratory (ORNL), in partnership with Argonne National Laboratory (ANL) and researchers affiliated with the Center for International Earth Science Information Partnership (CIESIN) at Columbia University and the State University of New York-Buffalo (SUNY), proposes to develop this Energy-Water Knowledge Discovery Framework to generate new, critical insights regarding the complex dynamics of the EWN and its interactions with climate variability and change.
An overarching objective of this project is to integrate impacts, adaptation, and vulnerability (IAV) science with emerging data science to meet the data analysis needs of the U.S. Department of Energy and partner federal agencies with respect to the EWN.

  19. Integrated primary care, the collaboration imperative inter-organizational cooperation in the integrated primary care field: a theoretical framework

    PubMed Central

    Valentijn, Pim P; Bruijnzeels, Marc A; de Leeuw, Rob J; Schrijvers, Guus J.P

    2012-01-01

    Purpose Capacity problems and political pressures have led to a rapid change in the organization of primary care from monodisciplinary small businesses to complex inter-organizational relationships. It is assumed that inter-organizational collaboration is the driving force to achieve integrated (primary) care. Despite the importance of collaboration and integration of services in primary care, there is no unambiguous definition for either concept. The purpose of this study is to examine and link the conceptualisation and validation of the terms inter-organizational collaboration and integrated primary care using a theoretical framework. Theory The theoretical framework is based on the complex collaboration process of negotiation among multiple stakeholder groups in primary care. Methods A literature review of health sciences and business databases, and targeted grey literature sources. Based on the literature review we operationalized the constructs of inter-organizational collaboration and integrated primary care in a theoretical framework. The framework is being validated in an explorative study of 80 primary care projects in the Netherlands. Results and conclusions Integrated primary care is considered a multidimensional construct based on a continuum of integration, extending from segregation to integration. The synthesis of current theories and concepts of inter-organizational collaboration is insufficient to deal with the complexity of collaborative issues in primary care. One coherent and integrated theoretical framework was found that could make the complex collaboration process in primary care transparent. The theoretical framework presented in this study is a first step toward understanding the patterns of successful collaboration and integration in primary care services. These patterns can give insights into the organizational forms needed to create a well-functioning integrated (primary) care system that fits the local needs of a population. 
Preliminary data of the patterns of collaboration and integration will be presented.

  20. Many-objective reservoir policy identification and refinement to reduce policy inertia and myopia in water management

    NASA Astrophysics Data System (ADS)

    Giuliani, M.; Herman, J. D.; Castelletti, A.; Reed, P.

    2014-04-01

    This study contributes a decision analytic framework to overcome policy inertia and myopia in complex river basin management contexts. The framework combines reservoir policy identification, many-objective optimization under uncertainty, and visual analytics to characterize current operations and discover key trade-offs between alternative policies for balancing competing demands and system uncertainties. The approach is demonstrated on the Conowingo Dam, located within the Lower Susquehanna River, USA. The Lower Susquehanna River is an interstate water body that has been subject to intensive water management efforts due to competing demands from urban water supply, atomic power plant cooling, hydropower production, and federally regulated environmental flows. We have identified a baseline operating policy for the Conowingo Dam that closely reproduces the dynamics of current releases and flows for the Lower Susquehanna and thus can be used to represent the preference structure guiding current operations. Starting from this baseline policy, our proposed decision analytic framework then combines evolutionary many-objective optimization with visual analytics to discover new operating policies that better balance the trade-offs within the Lower Susquehanna. Our results confirm that the baseline operating policy, which only considers deterministic historical inflows, significantly overestimates the system's reliability in meeting the reservoir's competing demands. Our proposed framework removes this bias by successfully identifying alternative reservoir policies that are more robust to hydroclimatic uncertainties while also better addressing the trade-offs across the Conowingo Dam's multisector services.

  1. The Climate Data Analytic Services (CDAS) Framework.

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2016-12-01

    Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high-performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a unix-like shell client, or a javascript-based web application. Client packages in Python, Scala, or JavaScript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.

  2. Obtaining Arbitrary Prescribed Mean Field Dynamics for Recurrently Coupled Networks of Type-I Spiking Neurons with Analytically Determined Weights

    PubMed Central

    Nicola, Wilten; Tripp, Bryan; Scott, Matthew

    2016-01-01

    A fundamental question in computational neuroscience is how to connect a network of spiking neurons to produce desired macroscopic or mean field dynamics. One possible approach is through the Neural Engineering Framework (NEF). The NEF approach requires quantities called decoders which are solved through an optimization problem requiring large matrix inversion. Here, we show how a decoder can be obtained analytically for type I and certain type II firing rates as a function of the heterogeneity of its associated neuron. These decoders generate approximants for functions that converge to the desired function in mean-squared error like 1/N, where N is the number of neurons in the network. We refer to these decoders as scale-invariant decoders due to their structure. These decoders generate weights for a network of neurons through the NEF formula for weights. These weights force the spiking network to have arbitrary and prescribed mean field dynamics. The weights generated with scale-invariant decoders all lie on low dimensional hypersurfaces asymptotically. We demonstrate the applicability of these scale-invariant decoders and weight surfaces by constructing networks of spiking theta neurons that replicate the dynamics of various well known dynamical systems such as the neural integrator, Van der Pol system and the Lorenz system. As these decoders are analytically determined and non-unique, the weights are also analytically determined and non-unique. We discuss the implications for measured weights of neuronal networks. PMID:26973503

  3. Obtaining Arbitrary Prescribed Mean Field Dynamics for Recurrently Coupled Networks of Type-I Spiking Neurons with Analytically Determined Weights.

    PubMed

    Nicola, Wilten; Tripp, Bryan; Scott, Matthew

    2016-01-01

    A fundamental question in computational neuroscience is how to connect a network of spiking neurons to produce desired macroscopic or mean field dynamics. One possible approach is through the Neural Engineering Framework (NEF). The NEF approach requires quantities called decoders which are solved through an optimization problem requiring large matrix inversion. Here, we show how a decoder can be obtained analytically for type I and certain type II firing rates as a function of the heterogeneity of its associated neuron. These decoders generate approximants for functions that converge to the desired function in mean-squared error like 1/N, where N is the number of neurons in the network. We refer to these decoders as scale-invariant decoders due to their structure. These decoders generate weights for a network of neurons through the NEF formula for weights. These weights force the spiking network to have arbitrary and prescribed mean field dynamics. The weights generated with scale-invariant decoders all lie on low dimensional hypersurfaces asymptotically. We demonstrate the applicability of these scale-invariant decoders and weight surfaces by constructing networks of spiking theta neurons that replicate the dynamics of various well known dynamical systems such as the neural integrator, Van der Pol system and the Lorenz system. As these decoders are analytically determined and non-unique, the weights are also analytically determined and non-unique. We discuss the implications for measured weights of neuronal networks.
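    For context, the standard least-squares decoder computation that the paper's analytical decoders replace can be sketched as follows. This is a hedged illustration: the rectified-linear tuning curves and the target function f(x) = x² are toy assumptions, not the paper's setup:

```python
import numpy as np

# Standard NEF-style decoder solve: given tuning curves A and a target
# function f, find decoders d minimizing ||A.T @ d - f||^2 (matrix inversion
# via least squares, the route the paper's closed-form decoders avoid).
rng = np.random.default_rng(0)
N = 200                                  # number of neurons
x = np.linspace(-1.0, 1.0, 500)          # represented variable

gains = rng.uniform(0.5, 2.0, N)         # heterogeneous gains (assumed range)
biases = rng.uniform(-1.0, 1.0, N)
encoders = rng.choice([-1.0, 1.0], N)

# Rectified-linear "type I"-like rates: a_i(x) = max(0, g_i * e_i * x + b_i)
A = np.maximum(0.0, gains[:, None] * encoders[:, None] * x[None, :]
               + biases[:, None])

f = x ** 2                               # toy target function to decode
d, *_ = np.linalg.lstsq(A.T, f, rcond=None)   # decoders d

rmse = np.sqrt(np.mean((A.T @ d - f) ** 2))
print(rmse < 0.05)                       # True
```

The mean-squared error of such least-squares approximants shrinks as the neuron count N grows, which is the convergence behavior (like 1/N) that the abstract's analytical decoders achieve without solving this matrix problem.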

  4. Quantitative Story Telling: Initial steps towards bridging perspectives and tools for a robust nexus assessment

    NASA Astrophysics Data System (ADS)

    Cabello, Violeta

    2017-04-01

    This communication will present the advancement of an innovative analytical framework for the analysis of the Water-Energy-Food-Climate Nexus termed Quantitative Story Telling (QST). The methodology is currently under development within the H2020 project MAGIC - Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (www.magic-nexus.eu). The key innovation of QST is that it bridges qualitative and quantitative analytical tools into an iterative research process in which each step is built and validated in interaction with stakeholders. The qualitative analysis focusses on the identification of the narratives behind the development of relevant WEFC-Nexus policies and innovations. The quantitative engine is the Multi-Scale Analysis of Societal and Ecosystem Metabolism (MuSIASEM), a resource accounting toolkit capable of integrating multiple analytical dimensions at different scales through relational analysis. Although QST may be labelled a story-driven rather than a data-driven approach, I will argue that improving models per se may not lead to an improved understanding of WEF-Nexus problems unless we are capable of generating more robust narratives to frame them. The communication will cover an introduction to the MAGIC project, the basic concepts of QST and a case study focussed on agricultural production in a semi-arid region in Southern Spain. Data requirements for this case study and the limitations to finding, accessing or estimating them will be presented alongside a reflection on the relation between analytical scales and data availability.

  5. Understanding integrated care: a comprehensive conceptual framework based on the integrative functions of primary care

    PubMed Central

    Valentijn, Pim P.; Schepman, Sanneke M.; Opheij, Wilfrid; Bruijnzeels, Marc A.

    2013-01-01

    Introduction Primary care has a central role in integrating care within a health system. However, conceptual ambiguity regarding integrated care hampers a systematic understanding. This paper proposes a conceptual framework that combines the concepts of primary care and integrated care, in order to understand the complexity of integrated care. Methods The search method involved a combination of electronic database searches, hand searches of reference lists (snowball method) and contacting researchers in the field. The process of synthesizing the literature was iterative, to relate the concepts of primary care and integrated care. First, we identified the general principles of primary care and integrated care. Second, we connected the dimensions of integrated care and the principles of primary care. Finally, to improve content validity we held several meetings with researchers in the field to develop and refine our conceptual framework. Results The conceptual framework combines the functions of primary care with the dimensions of integrated care. Person-focused and population-based care serve as guiding principles for achieving integration across the care continuum. Integration plays complementary roles on the micro (clinical integration), meso (professional and organisational integration) and macro (system integration) level. Functional and normative integration ensure connectivity between the levels. Discussion The presented conceptual framework is a first step to achieve a better understanding of the inter-relationships among the dimensions of integrated care from a primary care perspective. PMID:23687482

  6. Understanding integrated care: a comprehensive conceptual framework based on the integrative functions of primary care.

    PubMed

    Valentijn, Pim P; Schepman, Sanneke M; Opheij, Wilfrid; Bruijnzeels, Marc A

    2013-01-01

    Primary care has a central role in integrating care within a health system. However, conceptual ambiguity regarding integrated care hampers a systematic understanding. This paper proposes a conceptual framework that combines the concepts of primary care and integrated care, in order to understand the complexity of integrated care. The search method involved a combination of electronic database searches, hand searches of reference lists (snowball method) and contacting researchers in the field. The process of synthesizing the literature was iterative, to relate the concepts of primary care and integrated care. First, we identified the general principles of primary care and integrated care. Second, we connected the dimensions of integrated care and the principles of primary care. Finally, to improve content validity we held several meetings with researchers in the field to develop and refine our conceptual framework. The conceptual framework combines the functions of primary care with the dimensions of integrated care. Person-focused and population-based care serve as guiding principles for achieving integration across the care continuum. Integration plays complementary roles on the micro (clinical integration), meso (professional and organisational integration) and macro (system integration) level. Functional and normative integration ensure connectivity between the levels. The presented conceptual framework is a first step to achieve a better understanding of the inter-relationships among the dimensions of integrated care from a primary care perspective.

  7. a Web-Based Framework for Visualizing Industrial Spatiotemporal Distribution Using Standard Deviational Ellipse and Shifting Routes of Gravity Centers

    NASA Astrophysics Data System (ADS)

    Song, Y.; Gui, Z.; Wu, H.; Wei, Y.

    2017-09-01

    Analysing the spatiotemporal distribution patterns of different industries and their dynamics can help us learn the macro-level developing trends of those industries, and in turn provides references for industrial spatial planning. However, the analysis is a challenging task which requires an easy-to-understand information presentation mechanism and a powerful computational technology to support the visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to enable such visual analytics. The framework uses the standard deviational ellipse (SDE) and shifting routes of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset in Mainland China from year 1960 to 2015, which contains fine-grained location information (i.e., coordinates of each individual enterprise), to demonstrate the feasibility of this framework. The experiment result shows that the developed visual analytics method is helpful for understanding the multi-level patterns and developing trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with large data volume, such as crime and disease.
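    The two summary statistics the framework visualizes, gravity centers and standard deviational ellipses, can be computed with the classic Yuill-style SDE formulas. A minimal sketch; the point set is toy data, and no claim is made about the paper's exact parameterization:

```python
import math

def gravity_center(pts):
    """Unweighted mean (gravity) center of a point set."""
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def standard_deviational_ellipse(pts):
    """Center, rotation angle, and axis standard deviations of the SDE."""
    cx, cy = gravity_center(pts)
    dx = [x - cx for x, _ in pts]
    dy = [y - cy for _, y in pts]
    sxx = sum(d * d for d in dx)
    syy = sum(d * d for d in dy)
    sxy = sum(a * b for a, b in zip(dx, dy))
    # Rotation angle from the classic SDE tan(theta) expression.
    theta = math.atan2(sxx - syy + math.hypot(sxx - syy, 2 * sxy), 2 * sxy)
    n = len(pts)
    sig_x = math.sqrt(sum((a * math.cos(theta) - b * math.sin(theta)) ** 2
                          for a, b in zip(dx, dy)) / n)
    sig_y = math.sqrt(sum((a * math.sin(theta) + b * math.cos(theta)) ** 2
                          for a, b in zip(dx, dy)) / n)
    return (cx, cy), theta, sig_x, sig_y

pts = [(0, 0), (2, 1), (4, 2), (6, 3)]   # toy points along the line y = x/2
center, theta, sx, sy = standard_deviational_ellipse(pts)
print(center)   # (3.0, 1.5)
```

For collinear points like these the minor-axis deviation collapses to zero while the major axis aligns with the trend line, which is exactly the degenerate case the ellipse visualization makes obvious. Yearly shifting routes of gravity centers are then just this center recomputed per year and joined into a polyline.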

  8. An Analytical Framework for the Steady State Impact of Carbonate Compensation on Atmospheric CO2

    NASA Astrophysics Data System (ADS)

    Omta, Anne Willem; Ferrari, Raffaele; McGee, David

    2018-04-01

    The deep-ocean carbonate ion concentration impacts the fraction of the marine calcium carbonate production that is buried in sediments. This gives rise to the carbonate compensation feedback, which is thought to restore the deep-ocean carbonate ion concentration on multimillennial timescales. We formulate an analytical framework to investigate the impact of carbonate compensation under various changes in the carbon cycle relevant for anthropogenic change and glacial cycles. Using this framework, we show that carbonate compensation amplifies by 15-20% changes in atmospheric CO2 resulting from a redistribution of carbon between the atmosphere and ocean (e.g., due to changes in temperature, salinity, or nutrient utilization). A counterintuitive result emerges when the impact of organic matter burial in the ocean is examined. The organic matter burial first leads to a slight decrease in atmospheric CO2 and an increase in the deep-ocean carbonate ion concentration. Subsequently, enhanced calcium carbonate burial leads to outgassing of carbon from the ocean to the atmosphere, which is quantified by our framework. Results from simulations with a multibox model including the minor acids and bases important for the ocean-atmosphere exchange of carbon are consistent with our analytical predictions. We discuss the potential role of carbonate compensation in glacial-interglacial cycles as an example of how our theoretical framework may be applied.

  9. Closed-form solutions and scaling laws for Kerr frequency combs

    PubMed Central

    Renninger, William H.; Rakich, Peter T.

    2016-01-01

    A single closed-form analytical solution of the driven nonlinear Schrödinger equation is developed, reproducing a large class of the behaviors in Kerr-comb systems, including bright-solitons, dark-solitons, and a large class of periodic wavetrains. From this analytical framework, a Kerr-comb area theorem and a pump-detuning relation are developed, providing new insights into soliton- and wavetrain-based combs along with concrete design guidelines for both. This new area theorem reveals significant deviation from the conventional soliton area theorem, which is crucial to understanding cavity solitons in certain limits. Moreover, these closed-form solutions represent the first step towards an analytical framework for wavetrain formation, and reveal new parameter regimes for enhanced Kerr-comb performance. PMID:27108810

  10. On Connectivity of Wireless Sensor Networks with Directional Antennas

    PubMed Central

    Wang, Qiu; Dai, Hong-Ning; Zheng, Zibin; Imran, Muhammad; Vasilakos, Athanasios V.

    2017-01-01

    In this paper, we investigate the network connectivity of wireless sensor networks with directional antennas. In particular, we establish a general framework to analyze the network connectivity while considering various antenna models and the channel randomness. Since existing directional antenna models have their pros and cons in the accuracy of reflecting realistic antennas and the computational complexity, we propose a new analytical directional antenna model called the iris model to balance the accuracy against the complexity. We conduct extensive simulations to evaluate the analytical framework. Our results show that our proposed analytical model on the network connectivity is accurate, and our iris antenna model can provide a better approximation to realistic directional antennas than other existing antenna models. PMID:28085081
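    The kind of connectivity question analyzed here can also be explored numerically. A hedged Monte Carlo sketch using a generic sector antenna model, not the paper's iris model, with all parameters chosen for illustration:

```python
import math
import random

# Monte Carlo check of full network connectivity for nodes with sector
# (directional) antennas: a link exists only if each endpoint lies inside
# the other's beam and within range. Sector model and parameters are
# illustrative assumptions, not the paper's iris model.

def connected(n=60, rng_range=0.35, beamwidth=math.pi / 2, seed=1):
    rnd = random.Random(seed)
    pos = [(rnd.random(), rnd.random()) for _ in range(n)]        # unit square
    heading = [rnd.uniform(0.0, 2.0 * math.pi) for _ in range(n)] # boresights

    def in_beam(i, j):
        """Is node j inside node i's sector beam?"""
        dx, dy = pos[j][0] - pos[i][0], pos[j][1] - pos[i][1]
        off = (math.atan2(dy, dx) - heading[i] + math.pi) % (2 * math.pi) - math.pi
        return abs(off) <= beamwidth / 2

    def linked(i, j):
        return (math.dist(pos[i], pos[j]) <= rng_range
                and in_beam(i, j) and in_beam(j, i))

    # BFS over the undirected link graph from node 0.
    seen, stack = {0}, [0]
    while stack:
        i = stack.pop()
        for j in range(n):
            if j not in seen and linked(i, j):
                seen.add(j)
                stack.append(j)
    return len(seen) == n

# Estimate the connectivity probability over random topologies.
trials = 50
p = sum(connected(seed=s) for s in range(trials)) / trials
print(0.0 <= p <= 1.0)  # True
```

Sweeping the beamwidth or range in this simulation reproduces the qualitative trade-off the paper studies analytically: narrower beams extend reach per link but make mutual alignment, and hence connectivity, rarer.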

  11. Developing an Analytical Framework for Argumentation on Energy Consumption Issues

    ERIC Educational Resources Information Center

    Jin, Hui; Mehl, Cathy E.; Lan, Deborah H.

    2015-01-01

    In this study, we aimed to develop a framework for analyzing the argumentation practice of high school students and high school graduates. We developed the framework in a specific context--how energy consumption activities such as changing diet, converting forests into farmlands, and choosing transportation modes affect the carbon cycle. The…

  12. A Theoretical Framework of the Relation between Socioeconomic Status and Academic Achievement of Students

    ERIC Educational Resources Information Center

    Lam, Gigi

    2014-01-01

    A socio-psychological analytical framework will be adopted to illuminate the relation between socioeconomic status and academic achievement. The framework emphasizes incorporating micro-level familial factors into the macro-level factor of the tracking system. Initially, children of poor families always lack a major prerequisite: diminution of cognitive…

  13. European Qualifications Framework: Weighing Some Pros and Cons out of a French Perspective

    ERIC Educational Resources Information Center

    Bouder, Annie

    2008-01-01

    Purpose: The purpose of this paper is to question the appropriateness of a proposal for a new European Qualifications Framework. The framework has three perspectives: historical; analytical; and national. Design/methodology/approach: The approaches are diverse since the first insists on the institutional and decision-making processes at European…

  14. [The development of European Union common research and development policy and programs with special regard to life sciences].

    PubMed

    Pörzse, Gábor

    2009-08-09

    Research and development (R&D) has been playing a leading role in the European Community's history since the very beginning of European integration. Its importance has grown in recent years, after the launch of the Lisbon strategy. Framework programs have always played a considerable part in community research. The aim of their introduction was to fine-tune national R&D activities, and to successfully divide research tasks between the Community and the member states. The Community, from the very outset, has acknowledged the importance of life sciences. It is no coincidence that life sciences have become the second biggest priority in the last two framework programs. This study provides a historical, and at the same time analytical and evaluative, review of community R&D policy and activity from the starting point of its development until the present day. It examines in detail how the changes in the structure, system of conditions, regulations and priorities of the framework programs have followed the formation of social and economic needs. The paper puts special emphasis on the analysis of the development of life science research, presenting how it has met the challenges of the age and how it has been built into the framework programs. Another research area of the present study is to elaborate how successfully Hungarian researchers have joined community research, especially the framework programs in the field of life sciences. To answer these questions, it was essential to survey, process and analyze the data available in the national and European public and closed databases. Contrary to previous documents, this analysis does not concentrate on the political and scientific background. It outlines which role community research has played in sustainable social and economic development and competitiveness, how it has supported common policies and how the processes of integration have been deepening. 
Besides, the present paper offers a complete review of the given field, from its foundation up until the present day, by elaborating the newest initiatives and ideas for the future. This work is also novel from the point of view of the given professional field, the life sciences in the framework programs, and in its processing and evaluation of data on Hungarian participation in the 5th and 6th framework programs in the field of life sciences.

  15. [The socio-hygienic monitoring as an integral system for health risk assessment and risk management at the regional level].

    PubMed

    Kuzmin, S V; Gurvich, V B; Dikonskaya, O V; Malykh, O L; Yarushin, S V; Romanov, S V; Kornilkov, A S

    2013-01-01

    The information and analytical framework for introducing health risk assessment and risk management methodologies in the Sverdlovsk Region is the system of socio-hygienic monitoring. Risk management techniques have been developed and proposed that take into account the choice of the most cost-effective and efficient actions for improving the sanitary and epidemiologic situation at the level of the region, the municipality, or a business entity of the Russian Federation. To assess the efficiency of planning and of activities for health risk management, common methodological approaches and the economic methods of cost-effectiveness and cost-benefit analysis, provided in methodological recommendations and introduced in the Russian Federation, are applied.

  16. The Earth Microbiome Project and Global Systems Biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Jack A.; Jansson, Janet K.; Knight, Rob

    Recently, we published the first large-scale analysis of data from the Earth Microbiome Project (1, 2), a truly multidisciplinary research program involving more than 500 scientists and 27,751 samples acquired from 43 countries. These samples represent myriad specimen types and span a wide range of biotic and abiotic factors, geographic locations, and physicochemical properties. The database (https://qiita.ucsd.edu/emp/) is still growing, with over 90,000 amplicon datasets, >500 metagenomic runs, and metabolomics datasets from a similar number of samples. Importantly, the techniques, data and analytical tools are all standardized and publicly accessible, providing a framework to support research at a scale of integration that just 7 years ago seemed impossible.

  17. Geometric quantification of features in large flow fields.

    PubMed

    Kendall, Wesley; Huang, Jian; Peterka, Tom

    2012-01-01

    Interactive exploration of flow features in large-scale 3D unsteady-flow data is one of the most challenging visualization problems today. To comprehensively explore the complex feature spaces in these datasets, a proposed system employs a scalable framework for investigating a multitude of characteristics from traced field lines. This capability supports the examination of various neighborhood-based geometric attributes in concert with other scalar quantities. Such an analysis wasn't previously possible because of the large computational overhead and I/O requirements. The system integrates visual analytics methods by letting users procedurally and interactively describe and extract high-level flow features. An exploration of various phenomena in a large global ocean-modeling simulation demonstrates the approach's generality and expressiveness as well as its efficacy.

  18. A multivariate distance-based analytic framework for microbial interdependence association test in longitudinal study.

    PubMed

    Zhang, Yilong; Han, Sung Won; Cox, Laura M; Li, Huilin

    2017-12-01

    The human microbiome is the collection of microbes living in and on the various parts of our body. These microbes do not live alone; they act as an integrated microbial community, competing and cooperating extensively, and contribute to human health in important ways. Most current analyses focus on examining microbial differences at a single time point, which does not adequately capture the dynamic nature of microbiome data. With the advent of high-throughput sequencing and analytical tools, we are able to probe the interdependent relationships among microbial species through longitudinal studies. Here, we propose a multivariate distance-based test to evaluate the association between key phenotypic variables and microbial interdependence utilizing repeatedly measured microbiome data. Extensive simulations were performed to evaluate the validity and efficiency of the proposed method. We also demonstrate the utility of the proposed test using a well-designed longitudinal murine experiment and a longitudinal human study. The proposed methodology has been implemented in a freely distributed open-source R package and Python code. © 2017 WILEY PERIODICALS, INC.
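    The general idea behind a distance-based association test can be sketched with a Mantel-style permutation test between two subject-by-subject distance matrices. This is an illustrative stand-in, not the authors' actual statistic; all data and the `mantel_test` name are hypothetical.

    ```python
    import numpy as np

    def mantel_test(d1, d2, n_perm=999, rng=None):
        """Mantel-style permutation test for association between two
        symmetric distance matrices over the same subjects (an
        illustrative stand-in for a distance-based association test)."""
        rng = np.random.default_rng(rng)
        n = d1.shape[0]
        iu = np.triu_indices(n, k=1)            # upper-triangle entries only

        def corr(a, b):
            return np.corrcoef(a[iu], b[iu])[0, 1]

        observed = corr(d1, d2)
        count = 0
        for _ in range(n_perm):
            p = rng.permutation(n)              # permute subject labels jointly
            if corr(d1, d2[np.ix_(p, p)]) >= observed:
                count += 1
        return observed, (count + 1) / (n_perm + 1)
    ```

    Permuting subject labels of one matrix breaks any real association while preserving each matrix's internal structure, so the permutation p-value needs no distributional assumptions.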

  19. The Hico Image Processing System: A Web-Accessible Hyperspectral Remote Sensing Toolbox

    NASA Astrophysics Data System (ADS)

    Harris, A. T., III; Goodman, J.; Justice, B.

    2014-12-01

    As the quantity of Earth-observation data increases, the use-case for hosting analytical tools in geospatial data centers becomes increasingly attractive. To address this need, HySpeed Computing and Exelis VIS have developed the HICO Image Processing System, a prototype cloud computing system that provides online, on-demand, scalable remote sensing image processing capabilities. The system provides a mechanism for delivering sophisticated image processing analytics and data visualization tools into the hands of a global user community, who will only need a browser and internet connection to perform analysis. Functionality of the HICO Image Processing System is demonstrated using imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), an imaging spectrometer located on the International Space Station (ISS) that is optimized for acquisition of aquatic targets. Example applications include a collection of coastal remote sensing algorithms that are directed at deriving critical information on water and habitat characteristics of our vulnerable coastal environment. The project leverages the ENVI Services Engine as the framework for all image processing tasks, and can readily accommodate the rapid integration of new algorithms, datasets and processing tools.

  20. Integrating multi-criteria evaluation techniques with geographic information systems for landfill site selection: a case study using ordered weighted average.

    PubMed

    Gorsevski, Pece V; Donevska, Katerina R; Mitrovski, Cvetko D; Frizado, Joseph P

    2012-02-01

    This paper presents a GIS-based multi-criteria decision analysis approach for evaluating the suitability for landfill site selection in the Polog Region, Macedonia. The multi-criteria decision framework considers environmental and economic factors which are standardized by fuzzy membership functions and combined by integration of analytical hierarchy process (AHP) and ordered weighted average (OWA) techniques. The AHP is used for the elicitation of attribute weights while the OWA operator function is used to generate a wide range of decision alternatives for addressing uncertainty associated with interaction between multiple criteria. The usefulness of the approach is illustrated by different OWA scenarios that report landfill suitability on a scale between 0 and 1. The OWA scenarios are intended to quantify the level of risk taking (i.e., optimistic, pessimistic, and neutral) and to facilitate a better understanding of patterns that emerge from decision alternatives involved in the decision making process. Copyright © 2011 Elsevier Ltd. All rights reserved.
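    The OWA scenarios described above can be sketched with Yager's ordered weighted average, where the order weights encode the decision maker's risk attitude. This is a minimal illustration, not the paper's exact GIS implementation; the criterion scores and weight vectors below are made up for the example.

    ```python
    import numpy as np

    def owa(scores, order_weights):
        """Yager's ordered weighted average: sort the criterion scores in
        descending order, then apply the order weights. Weight on the top
        of the ordering is optimistic; weight on the bottom is pessimistic."""
        v = np.sort(np.asarray(scores, dtype=float))[::-1]
        return float(np.dot(order_weights, v))

    # Three illustrative risk scenarios for one hypothetical candidate site:
    scores = [0.8, 0.4, 0.6]                      # standardized criterion scores
    optimistic = owa(scores, [1.0, 0.0, 0.0])     # reduces to max -> 0.8
    neutral = owa(scores, [1/3, 1/3, 1/3])        # reduces to mean -> ~0.6
    pessimistic = owa(scores, [0.0, 0.0, 1.0])    # reduces to min -> 0.4
    ```

    Intermediate order-weight vectors trace out the continuum of decision strategies between the fully optimistic (max) and fully pessimistic (min) extremes, which is how the scenarios quantify risk taking.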

  1. blend4php: a PHP API for galaxy

    PubMed Central

    Wytko, Connor; Soto, Brian; Ficklin, Stephen P.

    2017-01-01

    Galaxy is a popular framework for execution of complex analytical pipelines, typically for large data sets, and is commonly used for (but not limited to) genomic, genetic, and related biological analyses. It provides a web front-end and integrates with high performance computing resources. Here we report the development of the blend4php library that wraps Galaxy’s RESTful API into a PHP-based library. PHP-based web applications can use blend4php to automate execution, monitoring and management of a remote Galaxy server, including its users, workflows, jobs and more. The blend4php library was specifically developed for the integration of Galaxy with Tripal, the open-source toolkit for the creation of online genomic and genetic web sites. However, it was designed as an independent library for use by any application, and is freely available under version 3 of the GNU Lesser General Public License (LGPL v3.0) at https://github.com/galaxyproject/blend4php. Database URL: https://github.com/galaxyproject/blend4php PMID:28077564

  2. A new lizard species of the Phymaturus patagonicus group (Squamata: Liolaemini) from northern Patagonia, Neuquén, Argentina.

    PubMed

    Marín, Andrea González; Pérez, Cristian Hernán Fulvio; Minoli, Ignacio; Morando, Mariana; Avila, Luciano Javier

    2016-06-10

    The integrative taxonomy framework allows the development of robust hypotheses of species limits based on the integration of results from different data sets and analytical methods. In this work, we test a candidate species hypothesis previously suggested by molecular data, using geometric and traditional morphometric analyses (multivariate and univariate). This new lizard species is part of the Phymaturus patagonicus group (payuniae clade) that is distributed in Neuquén and Mendoza provinces (Argentina). Our results showed that Phymaturus rahuensis sp. nov. differs from the other species of the payuniae clade by a higher number of midbody scales, and fewer supralabial scales, finger lamellae and toe lamellae. Also, its multidimensional spaces, based both on continuous linear variables and on geometric morphometric (shape) characters, do not overlap with those of the other species in this clade. The results of the morphometric and geometric morphometric analyses presented here, coupled with previously published molecular data, represent three independent lines of evidence that support the diagnosis of this new taxon.

  3. The steady aerodynamics of aerofoils with porosity gradients.

    PubMed

    Hajian, Rozhin; Jaworski, Justin W

    2017-09-01

    This theoretical study determines the aerodynamic loads on an aerofoil with a prescribed porosity distribution in a steady incompressible flow. A Darcy porosity condition on the aerofoil surface furnishes a Fredholm integral equation for the pressure distribution, which is solved exactly and generally as a Riemann-Hilbert problem provided that the porosity distribution is Hölder-continuous. The Hölder condition includes as a subset any continuously differentiable porosity distributions that may be of practical interest. This formal restriction on the analysis is examined by a class of differentiable porosity distributions that approach a piecewise, discontinuous function in a certain parametric limit. The Hölder-continuous solution is verified in this limit against analytical results for partially porous aerofoils in the literature. Finally, a comparison is made between the new theoretical predictions and experimental measurements of SD7003 aerofoils presented in the literature. Results from this analysis may be integrated into a theoretical framework to optimize turbulence noise suppression with minimal impact to aerodynamic performance.

  4. The steady aerodynamics of aerofoils with porosity gradients

    NASA Astrophysics Data System (ADS)

    Hajian, Rozhin; Jaworski, Justin W.

    2017-09-01

    This theoretical study determines the aerodynamic loads on an aerofoil with a prescribed porosity distribution in a steady incompressible flow. A Darcy porosity condition on the aerofoil surface furnishes a Fredholm integral equation for the pressure distribution, which is solved exactly and generally as a Riemann-Hilbert problem provided that the porosity distribution is Hölder-continuous. The Hölder condition includes as a subset any continuously differentiable porosity distributions that may be of practical interest. This formal restriction on the analysis is examined by a class of differentiable porosity distributions that approach a piecewise, discontinuous function in a certain parametric limit. The Hölder-continuous solution is verified in this limit against analytical results for partially porous aerofoils in the literature. Finally, a comparison is made between the new theoretical predictions and experimental measurements of SD7003 aerofoils presented in the literature. Results from this analysis may be integrated into a theoretical framework to optimize turbulence noise suppression with minimal impact to aerodynamic performance.

  5. Systematic Characterization and Analysis of the Taxonomic Drivers of Functional Shifts in the Human Microbiome.

    PubMed

    Manor, Ohad; Borenstein, Elhanan

    2017-02-08

    Comparative analyses of the human microbiome have identified both taxonomic and functional shifts that are associated with numerous diseases. To date, however, microbiome taxonomy and function have mostly been studied independently and the taxonomic drivers of functional imbalances have not been systematically identified. Here, we present FishTaco, an analytical and computational framework that integrates taxonomic and functional comparative analyses to accurately quantify taxon-level contributions to disease-associated functional shifts. Applying FishTaco to several large-scale metagenomic cohorts, we show that shifts in the microbiome's functional capacity can be traced back to specific taxa. Furthermore, the set of taxa driving functional shifts and their contribution levels vary markedly between functions. We additionally find that similar functional imbalances in different diseases are driven by both disease-specific and shared taxa. Such integrated analysis of microbiome ecological and functional dynamics can inform future microbiome-based therapy, pinpointing putative intervention targets for manipulating the microbiome's functional capacity. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Urban Partnership Agreement and Congestion Reduction Demonstration : National Evaluation Framework

    DOT National Transportation Integrated Search

    2008-11-21

    This report provides an analytical framework for evaluating six deployments under the United States Department of Transportation (U.S. DOT) Urban Partnership Agreement (UPA) and Congestion Reduction Demonstration (CRD) Programs. The six UPA/CRD sites...

  7. HIV/AIDS, growth and poverty in KwaZulu-Natal and South Africa: an integrated survey, demographic and economy-wide analysis

    PubMed Central

    2009-01-01

    Background This paper estimates the economic impact of HIV/AIDS on the KwaZulu-Natal province and the rest of South Africa. Methods We extended previous studies by employing: an integrated analytical framework that combined firm surveys of workers' HIV prevalence by sector and occupation; a demographic model that produced both population and workforce projections; and a regionalized economy-wide model linked to a survey-based micro-simulation module. This framework permits a full macro-microeconomic assessment. Results Results indicate that HIV/AIDS greatly reduces annual economic growth, mainly by lowering the long-run rate of technical change. However, impacts on income poverty are small, and inequality is reduced by HIV/AIDS. This is because high unemployment among low-income households minimises the economic costs of increased mortality. By contrast, slower economic growth hurts higher income households despite lower HIV prevalence. Conclusion We conclude that the increase in economic growth that results from addressing HIV/AIDS is sufficient to offset the population pressure placed on income poverty. Moreover, incentives to mitigate HIV/AIDS lie not only with poorer infected households, but also with uninfected higher income households. Our findings reveal the substantial burden that HIV/AIDS places on future economic development in KwaZulu-Natal and South Africa, and confirm the need for policies to curb the economic costs of the pandemic. PMID:19758444

  8. Systems analysis - a new paradigm and decision support tools for the water framework directive

    NASA Astrophysics Data System (ADS)

    Bruen, M.

    2007-06-01

    In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this direction. This paper places the new paradigm in the context of the classical one and discusses some of the new approaches which can be used in the implementation of the WFD. These include multi-criteria decision support methods suitable for environmental problems, adaptive management, cognitive mapping, social learning and cooperative design and group decision-making. Concordance methods (such as ELECTRE) and the Analytical Hierarchy Process (AHP) are identified as multi-criteria methods that can be readily integrated into Decision Support Systems (DSS) that deal with complex environmental issues with very many criteria, some of which are qualitative. The expanding use of the new paradigm provides an opportunity to observe and learn from the interaction of stakeholders with the new technology and to assess its effectiveness. This is best done by trained sociologists fully integrated into the processes. The WINCOMS research project is an example applied to the implementation of the WFD in Ireland.

  9. Finite element based N-Port model for preliminary design of multibody systems

    NASA Astrophysics Data System (ADS)

    Sanfedino, Francesco; Alazard, Daniel; Pommier-Budinger, Valérie; Falcoz, Alexandre; Boquet, Fabrice

    2018-02-01

    This article presents and validates a general framework to build a linear dynamic Finite Element-based model of large flexible structures for integrated Control/Structure design. An extension of the Two-Input Two-Output Port (TITOP) approach is here developed. The authors had already proposed such a framework for simple beam-like structures: each beam was considered as a TITOP sub-system that could be interconnected to another beam through the ports. The present work studies bodies with multiple attachment points by allowing complex interconnections among several sub-structures in a tree-like assembly. The TITOP approach is extended to generate NINOP (N-Input N-Output Port) models. A Matlab toolbox is developed integrating beam and bending plate elements. In particular, a NINOP formulation of bending plates is proposed to solve analytic two-dimensional problems. The computation of NINOP models using the outputs of a MSC/Nastran modal analysis is also investigated in order to directly use the results provided by commercial finite element software. The main advantage of this tool is to provide a model of a multibody system in the form of a block diagram with a minimal number of states. This model is easy to operate for preliminary design and control. An illustrative example highlights the potential of the proposed approach: the synthesis of the dynamical model of a spacecraft with two deployable and flexible solar arrays.

  10. Social Network Analysis and Nutritional Behavior: An Integrated Modeling Approach

    PubMed Central

    Senior, Alistair M.; Lihoreau, Mathieu; Buhl, Jerome; Raubenheimer, David; Simpson, Stephen J.

    2016-01-01

    Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent research combining state-space models of nutritional geometry with agent-based models (ABMs) shows how nutrient-targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First we show how nutritionally explicit ABMs that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly, the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition). Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interactions in many species, and yet a theoretical framework for exploring these effects is currently lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments. PMID:26858671

  11. Programming chemistry in DNA-addressable bioreactors.

    PubMed

    Fellermann, Harold; Cardelli, Luca

    2014-10-06

    We present a formal calculus, termed the chemtainer calculus, able to capture the complexity of compartmentalized reaction systems such as populations of possibly nested vesicular compartments. Compartments contain molecular cargo as well as surface markers in the form of DNA single strands. These markers serve as compartment addresses and allow for their targeted transport and fusion, thereby enabling reactions of previously separated chemicals. The overall system organization allows for the set-up of programmable chemistry in microfluidic or other automated environments. We introduce a simple sequential programming language whose instructions are motivated by state-of-the-art microfluidic technology. Our approach integrates electronic control, chemical computing and material production in a unified formal framework that is able to mimic the integrated computational and constructive capabilities of the subcellular matrix. We provide a non-deterministic semantics of our programming language that enables us to analytically derive the computational and constructive power of our machinery. This semantics is used to derive the sets of all constructable chemicals and supermolecular structures that emerge from different underlying instruction sets. Because our proofs are constructive, they can be used to automatically infer control programs for the construction of target structures from a limited set of resource molecules. Finally, we present an example of our framework from the area of oligosaccharide synthesis. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  12. Evaluating integrated watershed management using multiple criteria analysis--a case study at Chittagong Hill Tracts in Bangladesh.

    PubMed

    Biswas, Shampa; Vacik, Harald; Swanson, Mark E; Haque, S M Sirajul

    2012-05-01

    Criteria and indicators assessment is one way to evaluate management strategies for mountain watersheds. One such framework, Integrated Watershed Management (IWM), was employed in the Chittagong Hill Tracts region of Bangladesh using a multi-criteria analysis approach. The IWM framework, consisting of the design and application of principles, criteria, indicators, and verifiers (PCIV), facilitates active participation by diverse professionals, experts, and interest groups in watershed management, explicitly addressing demands and problems and capturing their complexity in a transparent and understandable way. Management alternatives were developed to fulfill every key component of IWM, considering the developed PCIV set and the current situation of the study area. Different management strategies, each focusing on a different approach (biodiversity conservation, flood control, soil and water quality conservation, indigenous knowledge conservation, income generation, watershed conservation, and landscape conservation), were assessed qualitatively on their potential to improve the current situation according to each verifier of the criteria and indicator set. The Analytic Hierarchy Process (AHP), including sensitivity analysis, was employed to identify an appropriate management strategy according to the overall priorities (i.e., different weights for each principle) of key informants. The AHP process indicated that a strategy focused on conservation of biodiversity provided the best option to address watershed-related challenges in the Chittagong Hill Tracts, Bangladesh.
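    The AHP step used in studies like this can be sketched as computing priority weights from a pairwise comparison matrix via its principal eigenvector, plus Saaty's consistency ratio as a sanity check on the judgments. This is a minimal sketch under textbook AHP assumptions, not the study's actual matrices; the comparison values are hypothetical.

    ```python
    import numpy as np

    def ahp_priorities(pairwise):
        """Priority weights and consistency ratio from an AHP
        pairwise-comparison matrix (principal-eigenvector method)."""
        a = np.asarray(pairwise, dtype=float)
        n = a.shape[0]
        vals, vecs = np.linalg.eig(a)
        k = np.argmax(vals.real)                 # principal eigenvalue
        w = np.abs(vecs[:, k].real)
        w /= w.sum()                             # normalize weights to sum to 1
        ci = (vals[k].real - n) / (n - 1)        # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # excerpt of Saaty's random index
        return w, ci / ri                        # weights, consistency ratio
    ```

    A consistency ratio below about 0.1 is conventionally taken to mean the pairwise judgments are acceptably consistent; otherwise the comparisons are revisited before the weights are used.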

  13. Life-Course Accumulation of Neighborhood Disadvantage and Allostatic Load: Empirical Integration of Three Social Determinants of Health Frameworks

    PubMed Central

    Gustafsson, Per E.; San Sebastian, Miguel; Janlert, Urban; Theorell, Töres; Westerlund, Hugo; Hammarström, Anne

    2014-01-01

    Objectives. We examined if the accumulation of neighborhood disadvantages from adolescence to mid-adulthood were related to allostatic load, a measure of cumulative biological risk, in mid-adulthood, and explored whether this association was similar in women and men. Methods. Data were from the participants in the Northern Swedish Cohort (analytical n = 818) at ages 16, 21, 30, and 43 years in 1981, 1986, 1995, and 2008. Personal living conditions were self-reported at each wave. At age 43 years, 12 biological markers were measured to operationalize allostatic load. Registered data for all residents in the cohort participants’ neighborhoods at each wave were used to construct a cumulative measure of neighborhood disadvantage. Associations were examined in ordinary least-squares regression models. Results. We found that cumulative neighborhood disadvantage between ages 16 and 43 years was related to higher allostatic load at age 43 years after adjusting for personal living conditions in the total sample (B = 0.11; P = .004) and in men (B = 0.16; P = .004), but not in women (B = 0.07; P = .248). Conclusions. Our findings suggested that neighborhood disadvantage acted cumulatively over the life course on biological wear and tear, and exemplified the gains of integrating social determinants of health frameworks. PMID:24625161

  14. Scaling Student Success with Predictive Analytics: Reflections after Four Years in the Data Trenches

    ERIC Educational Resources Information Center

    Wagner, Ellen; Longanecker, David

    2016-01-01

    The metrics used in the US to track students do not include adults and part-time students. This has led to the development of a massive data initiative--the Predictive Analytics Reporting (PAR) framework--that uses predictive analytics to trace the progress of all types of students in the system. This development has allowed actionable,…

  15. Development of a conceptual framework toward an integrated transportation system : final report, April 10, 2009.

    DOT National Transportation Integrated Search

    2009-04-10

    This report documents research on the conceptual framework of an integrated transportation system with a prototype application under the framework. Three levels of control are involved in this framework: at the global level (an entire transportation ...

  16. Assessment of Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.

    2014-01-01

    National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…

  17. Branch length estimation and divergence dating: estimates of error in Bayesian and maximum likelihood frameworks.

    PubMed

    Schwartz, Rachel S; Mueller, Rachel L

    2010-01-11

    Estimates of divergence dates between species improve our understanding of processes ranging from nucleotide substitution to speciation. Such estimates are frequently based on molecular genetic differences between species; therefore, they rely on accurate estimates of the number of such differences (i.e. substitutions per site, measured as branch length on phylogenies). We used simulations to determine the effects of dataset size, branch length heterogeneity, branch depth, and analytical framework on branch length estimation across a range of branch lengths. We then reanalyzed an empirical dataset for plethodontid salamanders to determine how inaccurate branch length estimation can affect estimates of divergence dates. The accuracy of branch length estimation varied with branch length, dataset size (both number of taxa and sites), branch length heterogeneity, branch depth, dataset complexity, and analytical framework. For simple phylogenies analyzed in a Bayesian framework, branches were increasingly underestimated as branch length increased; in a maximum likelihood framework, longer branch lengths were somewhat overestimated. Longer datasets improved estimates in both frameworks; however, when the number of taxa was increased, estimation accuracy for deeper branches was less than for tip branches. Increasing the complexity of the dataset produced more misestimated branches in a Bayesian framework; however, in an ML framework, more branches were estimated more accurately. Using ML branch length estimates to re-estimate plethodontid salamander divergence dates generally resulted in an increase in the estimated age of older nodes and a decrease in the estimated age of younger nodes. Branch lengths are misestimated in both statistical frameworks for simulations of simple datasets. However, for complex datasets, length estimates are quite accurate in ML (even for short datasets), whereas few branches are estimated accurately in a Bayesian framework. 
Our reanalysis of empirical data demonstrates the magnitude of effects of Bayesian branch length misestimation on divergence date estimates. Because the length of branches for empirical datasets can be estimated most reliably in an ML framework when branches are <1 substitution/site and datasets are ≥1 kb, we suggest that divergence date estimates using datasets, branch lengths, and/or analytical techniques that fall outside of these parameters should be interpreted with caution.

  18. Integrative Mixed Methods Data Analytic Strategies in Research on School Success in Challenging Circumstances

    ERIC Educational Resources Information Center

    Jang, Eunice E.; McDougall, Douglas E.; Pollon, Dawn; Herbert, Monique; Russell, Pia

    2008-01-01

    There are both conceptual and practical challenges in dealing with data from mixed methods research studies. There is a need for discussion about various integrative strategies for mixed methods data analyses. This article illustrates integrative analytic strategies for a mixed methods study focusing on improving urban schools facing challenging…

  19. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase one, volume 4 : use of knowledge integrated visual analytics system in supporting bridge management.

    DOT National Transportation Integrated Search

    2009-12-01

    The goals of integration should be: supporting domain-oriented data analysis through the use of a knowledge-augmented visual analytics system. In this project, we focus on providing interactive data exploration for bridge management. ...

  20. Some classes of analytic functions involving Noor integral operator

    NASA Astrophysics Data System (ADS)

    Patel, J.; Cho, N. E.

    2005-12-01

    The object of the present paper is to investigate some inclusion properties of certain subclasses of analytic functions defined by using the Noor integral operator. The integral preserving properties in connection with the operator are also considered. Relevant connections of the results presented here with those obtained in earlier works are pointed out.

  1. Integrated Assessment of Prevention and Restoration Actions to Combat Desertification

    NASA Astrophysics Data System (ADS)

    Bautista, S.; Orr, B. J.; Vallejo, R.

    2009-12-01

    Recent advances in desertification and land degradation research have provided valuable conceptual and analytical frameworks, degradation indicators, assessment tools and surveillance systems with respect to desertification drivers, processes, and impacts. These findings, together with stakeholders’ perceptions and local/regional knowledge, have helped to define and propose measures and strategies to combat land degradation. However, integrated and comprehensive assessment and evaluation of prevention and restoration strategies and techniques to combat desertification is still lacking, and knowledge on the feasibility and cost-effectiveness of the proposed strategies over a wide range of environmental and socio-economic conditions is very scarce. To address this challenge, we have launched a multinational project (PRACTICE - Prevention and Restoration Actions to Combat Desertification. An Integrated Assessment), funded by the European Commission, in order to link S & T advances and traditional knowledge on prevention and restoration practices to combat desertification with sound implementation, learning and adaptive management, knowledge sharing, and dissemination of best practices. The key activities for pursuing this goal are (1) to establish a platform and information system of long-term monitoring sites for assessing sustainable management and actions to combat desertification, (2) to define an integrated protocol for the assessment of these actions, and (3) to link project assessment and evaluation with training and education, adaptive management, and knowledge sharing and dissemination through a participatory approach involving scientists, managers, technicians, financial officers, and members of the public who are/were impacted by the desertification control projects. 
Monitoring sites are distributed across Mediterranean Europe (Greece, Italy, Spain, and Portugal), Africa (Morocco, Namibia, South Africa), the Middle East (Israel), China, and South and North America (Chile, Mexico, and USA). The PRACTICE integrated assessment protocol (IAPro) assumes mutual human-environment interactions in land-use/cover change at multiple scales, and therefore adopts an integrated approach that simultaneously considers both biophysical and socio-economic attributes when assessing actions to combat desertification. IAPro relies mostly on critical slow variables and particularly exploits long-term monitoring data. Integration of biophysical and socio-economic assessment indicators and stakeholder preferences is based on a participatory multi-criteria decision-making process. The process is iterative and provides a framework for knowledge exchange and a path to consensus building.

  2. Mayday - integrative analytics for expression data

    PubMed Central

    2010-01-01

    Background DNA microarrays have become the standard method for large-scale analyses of gene expression and epigenomics. The increasing complexity and inherent noisiness of the generated data make visual data exploration ever more important. Fast deployment of new methods, as well as a combination of predefined, easy-to-apply methods with programmer's access to the data, are important requirements for any analysis framework. Mayday is an open source platform with emphasis on visual data exploration and analysis. Many built-in methods for clustering, machine learning and classification are provided for dissecting complex datasets. Plugins can easily be written to extend Mayday's functionality in a large number of ways. As a Java program, Mayday is platform-independent and can be used as a Java WebStart application without any installation. Mayday can import data from several file formats; database connectivity is included for efficient data organization. Numerous interactive visualization tools, including box plots, profile plots, principal component plots and a heatmap, are available; they can be enhanced with metadata and exported as publication-quality vector files. Results We have rewritten large parts of Mayday's core to make it more efficient and ready for future developments. Among the large number of new plugins are an automated processing framework, dynamic filtering, new and efficient clustering methods, a machine learning module and database connectivity. Extensive manual data analysis can be done using an inbuilt R terminal and an integrated SQL querying interface. Our visualization framework has become more powerful, new plot types have been added and existing plots improved. Conclusions We present a major extension of Mayday, a very versatile open-source framework for efficient microarray data analysis designed for biologists and bioinformaticians. Most everyday tasks are already covered. 
The large number of available plugins, as well as the extension possibilities using compiled plugins and ad hoc scripting, allows for the rapid adaptation of Mayday to very specialized data exploration tasks. Mayday is available at http://microarray-analysis.org. PMID:20214778

  3. Integrating the fundamentals of care framework in baccalaureate nursing education: An example from a nursing school in Denmark.

    PubMed

    Voldbjerg, Siri Lygum; Laugesen, Britt; Bahnsen, Iben Bøgh; Jørgensen, Lone; Sørensen, Ingrid Maria; Grønkjaer, Mette; Sørensen, Erik Elgaard

    2018-06-01

    To describe and discuss the process of integrating the Fundamentals of Care framework into baccalaureate nursing education at a School of Nursing in Denmark. Nursing education plays an essential role in preparing nurses to work within healthcare systems in which a demanding workload results in fundamental nursing care being left undone. Newly graduated nurses often lack the knowledge and skills to meet the challenges of delivering fundamental care in clinical practice. To develop nursing students' understanding of fundamental nursing, the conceptual Fundamentals of Care framework has been integrated into nursing education at a School of Nursing in Denmark. This discursive paper uses an adjusted descriptive case study design to describe and discuss the process of integrating the conceptual Fundamentals of Care framework in nursing education. The process is illuminated through a description of the context in which it occurs, including the faculty members, lectures, case-based work and the simulation laboratory. Based on this description, opportunities, such as supporting a holistic approach to evidence-based integrative patient care, and challenges, such as scepticism among faculty members, are discussed. It is suggested that integrating the Fundamentals of Care framework into lectures, case-based work and the simulation laboratory can make fundamental nursing care more explicit in nursing education, support critical thinking and underline the relevance of evidence-based practice. The process relies on a supportive context, a well-informed and engaged faculty, and continuous reflection on how the conceptual framework can be integrated. Integrating the Fundamentals of Care framework can support nursing students' critical thinking and reflection on what fundamental nursing care is and requires, and can ultimately educate nurses to provide evidence-based fundamental nursing care. 
© 2018 John Wiley & Sons Ltd.

  4. Evaluation of Copper-1,3,5-benzenetricarboxylate Metal-organic Framework (Cu-MOF) as a Selective Sorbent for Lewis-base Analytes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, Scott D.; Eckberg, Alison D.; Thallapally, Praveen K.

    2011-09-01

    The metal-organic framework Cu-BTC was evaluated for its ability to selectively interact with Lewis-base analytes, including explosives, by examining retention on GC columns packed with Chromosorb W HP that contained 3.0% SE-30 along with various loadings of Cu-BTC. SEM images of the support material showed the characteristic Cu-BTC crystals embedded in the SE-30 coating on the diatomaceous support. Results indicated that the Cu-BTC-containing stationary phase had limited thermal stability (220°C) and strong general retention for analytes. Kováts index calculations showed selective retention (amounting to about 300 Kováts units) relative to n-alkanes for many small Lewis-base analytes on a column that contained 0.75% Cu-BTC compared to an SE-30 control. Short columns that contained lower loadings of Cu-BTC (0.10%) were necessary to elute explosives and related analytes; however, selectivity was not observed for aromatic compounds (including nitroaromatics) or nitroalkanes. Observed retention characteristics are discussed.
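The Kováts index comparison above (roughly 300 units of selective retention) boils down to interpolating an analyte's retention time on a logarithmic scale between the bracketing n-alkanes. A minimal sketch of that arithmetic, using made-up retention times rather than values from the study:

```python
import math

def kovats_index(t_x, t_n, t_n1, n):
    """Isothermal Kovats retention index: locate the analyte's adjusted
    retention time t_x on a log scale between the n-alkane eluting before
    it (carbon number n, time t_n) and after it (n+1, time t_n1)."""
    return 100.0 * (n + (math.log(t_x) - math.log(t_n))
                    / (math.log(t_n1) - math.log(t_n)))

# Hypothetical retention times (minutes), for illustration only:
i_control = kovats_index(t_x=6.0, t_n=5.0, t_n1=10.0, n=10)  # SE-30 control
i_cu_btc = kovats_index(t_x=9.0, t_n=5.0, t_n1=10.0, n=10)   # with Cu-BTC
print(f"selective retention shift: {i_cu_btc - i_control:.0f} Kovats units")
```

A Lewis-base analyte that is selectively retained elutes later relative to the same alkane bracket, so its index rises; the study's comparison of indices on Cu-BTC-loaded versus control columns quantifies exactly this shift.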

  5. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis.

    PubMed

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions.

  6. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis

    PubMed Central

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions. PMID:26600156

  7. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016
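The publish-then-chain idea behind the framework can be caricatured in a few lines. All service names and model functions below are hypothetical illustrations, not part of the authors' system; real interoperation would go through actual Web service interfaces rather than an in-process registry:

```python
# Each "published" model is a callable registered under a service name
# (standing in for a black-box Web service); a workflow is an ordered
# chain of service names whose output feeds the next service's input.
SERVICE_REGISTRY = {}

def publish(name):
    """Register a model function as a named 'service'."""
    def decorator(func):
        SERVICE_REGISTRY[name] = func
        return func
    return decorator

@publish("rs.ndvi")
def ndvi(bands):
    # Toy RS model: normalized difference vegetation index from two bands.
    nir, red = bands["nir"], bands["red"]
    return (nir - red) / (nir + red)

@publish("gis.classify")
def classify(ndvi_value):
    # Toy GIS model consuming the RS model's output.
    return "vegetated" if ndvi_value > 0.3 else "non-vegetated"

def run_workflow(service_names, payload):
    """Chain services: each service's output becomes the next input."""
    for name in service_names:
        payload = SERVICE_REGISTRY[name](payload)
    return payload

result = run_workflow(["rs.ndvi", "gis.classify"], {"nir": 0.8, "red": 0.1})
print(result)  # -> vegetated (NDVI = 0.7/0.9 > 0.3)
```

The paper's geospatial workflow plays the role of `run_workflow` here, with semantic matching deciding which published services can legally be chained.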

  8. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    PubMed

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.

  9. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Platform (FPP) is described. The FPP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, the Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  10. Motivation and engagement in mathematics: a qualitative framework for teacher-student interactions

    NASA Astrophysics Data System (ADS)

    Durksen, Tracy L.; Way, Jennifer; Bobis, Janette; Anderson, Judy; Skilling, Karen; Martin, Andrew J.

    2017-02-01

    We started with a classic research question (How do teachers motivate and engage middle year students in mathematics?) that is solidly underpinned and guided by an integration of two theoretical and multidimensional models. In particular, the current study illustrates how theory is important for guiding qualitative analytical approaches to motivation and engagement in mathematics. With little research on how teachers of mathematics are able to maintain high levels of student motivation and engagement, we focused on developing a qualitative framework that highlights the influence of teacher-student interactions. Participants were six teachers (upper primary and secondary) that taught students with higher-than-average levels of motivation and engagement in mathematics. Data sources included one video-recorded lesson and associated transcripts from pre- and post-lesson interviews with each teacher. Overall, effective classroom organisation stood out as a priority when promoting motivation and engagement in mathematics. Results on classroom organisation revealed four key indicators within teacher-student interactions deemed important for motivation and engagement in mathematics—confidence, climate, contact, and connection. Since much of the effect of teachers on student learning relies on interactions, and given the universal trend of declining mathematical performance during the middle years of schooling, future research and intervention studies might be assisted by our qualitative framework.

  11. Analyzing Electronic Question/Answer Services: Framework and Evaluations of Selected Services.

    ERIC Educational Resources Information Center

    White, Marilyn Domas, Ed.

    This report develops an analytical framework based on systems analysis for evaluating electronic question/answer or AskA services operated by a wide range of types of organizations, including libraries. Version 1.0 of this framework was applied in June 1999 to a selective sample of 11 electronic question/answer services, which cover a range of…

  12. Rainbow: A Framework for Analysing Computer-Mediated Pedagogical Debates

    ERIC Educational Resources Information Center

    Baker, Michael; Andriessen, Jerry; Lund, Kristine; van Amelsvoort, Marie; Quignard, Matthieu

    2007-01-01

    In this paper we present a framework for analysing when and how students engage in a specific form of interactive knowledge elaboration in CSCL environments: broadening and deepening understanding of a space of debate. The framework is termed "Rainbow," as it comprises seven principal analytical categories, to each of which a colour is assigned,…

  13. Analysis of Naval NETWAR FORCEnet Enterprise: Implications for Capabilities Based Budgeting

    DTIC Science & Technology

    2006-12-01

    Based on this background information, and projecting how ADNS is likely to succeed in the NNFE framework, two fundamental research questions were addressed. …

  14. Teacher Identity and Numeracy: Developing an Analytic Lens for Understanding Numeracy Teacher Identity

    ERIC Educational Resources Information Center

    Bennison, Anne; Goos, Merrilyn

    2013-01-01

    This paper reviews recent literature on teacher identity in order to propose an operational framework that can be used to investigate the formation and development of numeracy teacher identities. The proposed framework is based on Van Zoest and Bohl's (2005) framework for mathematics teacher identity with a focus on those characteristics thought…

  15. Green Framework and Its Role in Sustainable City Development (by Example of Yekaterinburg)

    NASA Astrophysics Data System (ADS)

    Maltseva, A.

    2017-11-01

    The article focuses on the destruction of the city green framework in Yekaterinburg. A strategy for its recovery by means of a bioactive core, represented by a botanic garden, is proposed. An analytical framework for tracking changes in the proportion of green territories within the total city area is described.

  16. Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework.

    PubMed

    Khazaei, Hamzeh; McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil

    2015-11-18

    Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realize health analytics as services in critical care units in particular. To design, implement, evaluate, and deploy an extendable big-data compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios by utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method, through the Artemis project, that is, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). We demonstrated the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids' NICU has 36 beds and can classify the patients generally into 5 different types, including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. The mean number of medical monitoring algorithms per patient is 9, which renders 311 live algorithms for the whole NICU running on the framework. 
The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. On average, there will be 34.9 patients in the SickKids NICU. Currently, 46% of patients cannot get admitted to the SickKids NICU due to a lack of resources. By increasing the capacity to 90 beds, all patients can be accommodated. For such provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decision support for tertiary care use. We demonstrate how to size the equipment needed in the cloud for that architecture based on a very realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and furthermore that it can be replicated for any critical care setting within a tertiary institution.
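The census and workload figures in this abstract are roughly what Little's law (L = λW) predicts from the stated arrival rate and length of stay. The back-of-envelope numbers below differ slightly from the reported 34.9 patients and 311 algorithms, which come from the authors' fuller analytical model:

```python
def littles_law_census(arrival_rate_per_day, avg_stay_days):
    """Little's law: average number in system L = lambda * W."""
    return arrival_rate_per_day * avg_stay_days

# Figures taken from the abstract (SickKids NICU):
offered = littles_law_census(4.5, 16)                # if nobody were turned away
admitted = littles_law_census(4.5 * (1 - 0.46), 16)  # ~46% cannot be admitted
algorithms = 34.9 * 9  # reported average census x mean algorithms per patient

print(f"offered load: {offered:.1f} patients, admitted census: {admitted:.1f}")
print(f"~{algorithms:.0f} live algorithms (abstract reports 311)")
```

The same two-line calculation is what makes the sizing transferable: another hospital only needs its own arrival rate, length of stay, and algorithms-per-patient figures.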

  17. Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework

    PubMed Central

    McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil

    2015-01-01

    Background Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realize health analytics as services in critical care units in particular. Objective To design, implement, evaluate, and deploy an extendable big-data compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. Methods We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios by utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method, through the Artemis project, that is, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). Results We demonstrated the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids’ NICU has 36 beds and can classify the patients generally into 5 different types, including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. The mean number of medical monitoring algorithms per patient is 9, which renders 311 live algorithms for the whole NICU running on the framework. 
The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. On average, there will be 34.9 patients in the SickKids NICU. Currently, 46% of patients cannot get admitted to the SickKids NICU due to a lack of resources. By increasing the capacity to 90 beds, all patients can be accommodated. For such provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Conclusions Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decision support for tertiary care use. We demonstrate how to size the equipment needed in the cloud for that architecture based on a very realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and furthermore that it can be replicated for any critical care setting within a tertiary institution. PMID:26582268

  18. Efficient and precise calculation of the b-matrix elements in diffusion-weighted imaging pulse sequences.

    PubMed

    Zubkov, Mikhail; Stait-Gardner, Timothy; Price, William S

    2014-06-01

    Precise NMR diffusion measurements require detailed knowledge of the cumulative dephasing effect caused by the numerous gradient pulses present in most NMR pulse sequences. This effect, which ultimately manifests itself as the diffusion-related NMR signal attenuation, is usually described by the b-value or the b-matrix in the case of multidirectional diffusion weighting, the latter being common in diffusion-weighted NMR imaging. Neglecting some of the gradient pulses introduces an error in the calculated diffusion coefficient reaching in some cases 100% of the expected value. Therefore, ensuring the b-matrix calculation includes all the known gradient pulses leads to significant error reduction. Calculation of the b-matrix for simple gradient waveforms is rather straightforward, yet it grows cumbersome when complexly shaped and/or numerous gradient pulses are introduced. Making three broad assumptions about the gradient pulse arrangement in a sequence results in an efficient framework for calculation of b-matrices, as well as providing some insight into optimal gradient pulse placement. The framework allows accounting for the diffusion-sensitising effect of complexly shaped gradient waveforms with modest computational time and power. This is achieved by using the b-matrix elements of the simple unmodified pulse sequence and minimising the integration of the complexly shaped gradient waveform in the modified sequence. Such re-evaluation of the b-matrix elements retains all the analytical relevance of the straightforward approach, yet at least halves the amount of symbolic integration required. The application of the framework is demonstrated with the evaluation of the expression describing the diffusion-sensitising effect caused by different bipolar gradient pulse modules. Copyright © 2014 Elsevier Inc. All rights reserved.
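For the simplest case that such frameworks generalize, a single rectangular pulsed-gradient pair in a spin echo, the b-value has the closed-form Stejskal-Tanner expression b = γ²g²δ²(Δ − δ/3). A sketch with illustrative pulse parameters (not taken from the paper):

```python
import math

GAMMA_H1 = 2.675e8  # 1H gyromagnetic ratio, rad s^-1 T^-1 (approximate)

def stejskal_tanner_b(g, delta, Delta, gamma=GAMMA_H1):
    """b-value (s/m^2) for a rectangular PGSE gradient pair:
    b = gamma^2 * g^2 * delta^2 * (Delta - delta/3),
    with g in T/m, pulse width delta and separation Delta in s."""
    return (gamma * g * delta) ** 2 * (Delta - delta / 3.0)

# Example: 100 mT/m pulses of 2 ms, separated by 20 ms.
b = stejskal_tanner_b(g=0.1, delta=2e-3, Delta=20e-3)
print(f"b = {b:.3e} s/m^2 = {b * 1e-6:.1f} s/mm^2")

# Diffusion attenuation for free diffusion: S/S0 = exp(-b * D).
D_water = 2.3e-9  # m^2/s, water at ~25 C (approximate)
print(f"signal attenuation S/S0: {math.exp(-b * D_water):.3f}")
```

Neglecting any additional gradient pulse changes b, and hence the apparent diffusion coefficient extracted from exp(-bD), which is the error source the paper's full b-matrix bookkeeping eliminates.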

  19. Multidisciplinary perspectives: Application of the Consolidated Framework for Implementation Research to evaluate a health coaching initiative.

    PubMed

    Brook, Judy; McGraw, Caroline

    2018-05-01

    Long-term conditions are a leading cause of mortality and morbidity. Their management is founded on a combination of approaches involving government policy, better integration between health and care systems, and individual responsibility for self-care. Health coaching has emerged as an approach to encouraging individual responsibility and enhancing the self-management of long-term conditions. This paper focuses on the evaluation of a workforce initiative in a diverse and socially deprived community. The initiative sought both to improve integration between health and care services for people with long-term conditions, and equip practitioners with health coaching skills. The aim of the study was to contribute an empirical understanding of what practitioners perceive to be the contextual factors that impact on the adoption of health coaching in community settings. These factors were conceptualised using the Consolidated Framework for Implementation Research (CFIR). A stratified purposive sample of 22 health and care practitioners took part in semi-structured telephone interviews. Data were analysed using the CFIR as an analytical framework. The perceptions of trainees mapped onto the major domains of the CFIR: characteristics of the intervention, outer setting, inner setting, characteristics of individuals involved and process of implementation. Individual patient expectations, comorbidities and social context were central to the extent to which practitioners and patients engaged with health coaching. Structural constraints within provider services and the wider NHS were also reported as discouraging initiatives that focused on long-term rewards rather than short-term wins. The authors recommend further research is undertaken both to understand the role of health coaching in disadvantaged communities and ensure the service user voice is heard. © 2018 John Wiley & Sons Ltd.

  20. Web-Based Geospatial Visualization of GPM Data with CesiumJS

    NASA Technical Reports Server (NTRS)

    Lammers, Matt

    2018-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology have made online visualization of large geospatial datasets such as those coming from precipitation satellites viable. These data benefit from being visualized on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS (http://cesiumjs.org), developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. This presentation will describe how CesiumJS has been used in three-dimensional visualization products developed as part of the NASA Precipitation Processing System (PPS) STORM data-order website. Existing methods of interacting with Global Precipitation Measurement (GPM) Mission data primarily focus on two-dimensional static images, whether displaying vertical slices or horizontal surface/height-level maps. These methods limit interactivity with the robust three-dimensional data coming from the GPM core satellite. Integrating the data with CesiumJS in a web-based user interface has allowed us to create the following products. We have linked an on-the-fly visualization tool for any GPM/partner satellite orbit with the data-order interface. A version of this tool also focuses on high-impact weather events. It enables viewing of combined radar and microwave-derived precipitation data on mobile devices and in a way that can be embedded into other websites. We also have used CesiumJS to visualize a method of integrating gridded precipitation data with modeled wind speeds that animates over time. Emphasis in the presentation will be placed on how a variety of technical methods were used to create these tools, and how the flexibility of the CesiumJS framework facilitates creative approaches to interact with the data.

  1. Distinctive aspects of the evolution of galactic magnetic fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yar-Mukhamedov, D., E-mail: danial.su@gmail.com

    2016-11-15

    We perform an in-depth analysis of the evolution of galactic magnetic fields within a semi-analytic galaxy formation and evolution framework, determine various distinctive aspects of the evolution process, and obtain analytic solutions for a wide range of possible evolution scenarios.

  2. Strategic, Analytic and Operational Domains of Information Management.

    ERIC Educational Resources Information Center

    Diener, Richard AV

    1992-01-01

    Discussion of information management focuses on three main areas of activities and their interrelationship: (1) strategic, including establishing frameworks and principles of operations; (2) analytic, or research elements, including user needs assessment, data gathering, and data analysis; and (3) operational activities, including reference…

  3. Climate change and nutrition: creating a climate for nutrition security.

    PubMed

    Tirado, M C; Crahay, P; Mahy, L; Zanev, C; Neira, M; Msangi, S; Brown, R; Scaramella, C; Costa Coitinho, D; Müller, A

    2013-12-01

    Climate change further exacerbates the enormous existing burden of undernutrition. It affects food and nutrition security and undermines current efforts to reduce hunger and promote nutrition. Undernutrition in turn undermines climate resilience and the coping strategies of vulnerable populations. The objectives of this paper are to identify and undertake a cross-sectoral analysis of the impacts of climate change on nutrition security and the existing mechanisms, strategies, and policies to address them. A cross-sectoral analysis of the impacts of climate change on nutrition security and the mechanisms and policies to address them was guided by an analytical framework focused on the three 'underlying causes' of undernutrition: 1) household food access, 2) maternal and child care and feeding practices, and 3) environmental health and health access. The analytical framework includes the interactions of the three underlying causes of undernutrition with climate change, vulnerability, adaptation and mitigation. Within broad efforts on climate change mitigation and adaptation and climate-resilient development, a combination of nutrition-sensitive adaptation and mitigation measures, climate-resilient and nutrition-sensitive agricultural development, social protection, improved maternal and child care and health, nutrition-sensitive risk reduction and management, community development measures, nutrition-smart investments, increased policy coherence, and institutional and cross-sectoral collaboration are proposed as a means to address the impacts of climate change on food and nutrition security. This paper proposes policy directions to address nutrition in the climate change agenda and recommendations for consideration by the UN Framework Convention on Climate Change (UNFCCC).
Nutrition and health stakeholders need to be engaged in key climate change adaptation and mitigation initiatives, including science-based assessment by the Intergovernmental Panel on Climate Change (IPCC), and policies and actions formulated by the UN Framework Convention on Climate Change (UNFCCC). Improved multi-sectoral coordination and political will is required to integrate nutrition-sensitive actions into climate-resilient sustainable development efforts in the UNFCCC work and in the post 2015 development agenda. Placing human rights at the center of strategies to mitigate and adapt to the impacts of climate change and international solidarity is essential to advance sustainable development and to create a climate for nutrition security.

  4. Non-linear analytic and coanalytic problems ( L_p-theory, Clifford analysis, examples)

    NASA Astrophysics Data System (ADS)

    Dubinskii, Yu A.; Osipenko, A. S.

    2000-02-01

    Two kinds of new mathematical model of variational type are put forward: non-linear analytic and coanalytic problems. The formulation of these non-linear boundary-value problems is based on a decomposition of the complete scale of Sobolev spaces into the "orthogonal" sum of analytic and coanalytic subspaces. A similar decomposition is considered in the framework of Clifford analysis. Explicit examples are presented.

  5. Integrating landscape system and meta-ecosystem frameworks to advance the understanding of ecosystem function in heterogeneous landscapes: An analysis on the carbon fluxes in the Northern Highlands Lake District (NHLD) of Wisconsin and Michigan.

    PubMed

    Yang, Haile; Chen, Jiakuan

    2018-01-01

    The successful integration of ecosystem ecology with landscape ecology would be conducive to understanding how landscapes function. There have been several attempts at this, with two main approaches: (1) an ecosystem-based approach, such as the meta-ecosystem framework and (2) a landscape-based approach, such as the landscape system framework. These two frameworks are currently disconnected. To integrate them, we introduce a protocol and then demonstrate its application using a case study. The protocol includes four steps: 1) delineating landscape systems; 2) classifying landscape systems; 3) adjusting landscape systems to meta-ecosystems; and 4) integrating landscape system and meta-ecosystem frameworks through meta-ecosystems. The case study applies this protocol to analyze the carbon fluxes in the Northern Highlands Lake District (NHLD) of Wisconsin and Michigan. The application revealed that one can follow the protocol to construct a meta-ecosystem and analyze it using the integrated landscape system and meta-ecosystem framework. That is, one can (1) appropriately describe and analyze the spatial heterogeneity of the meta-ecosystem; (2) understand the emergent properties arising from spatial coupling of local ecosystems in the meta-ecosystem. In conclusion, this protocol is a useful approach for integrating the meta-ecosystem framework and the landscape system framework, advancing the description and analysis of the spatial heterogeneity and ecosystem function of interconnected ecosystems.

  6. Integrating landscape system and meta-ecosystem frameworks to advance the understanding of ecosystem function in heterogeneous landscapes: An analysis on the carbon fluxes in the Northern Highlands Lake District (NHLD) of Wisconsin and Michigan

    PubMed Central

    Chen, Jiakuan

    2018-01-01

    The successful integration of ecosystem ecology with landscape ecology would be conducive to understanding how landscapes function. There have been several attempts at this, with two main approaches: (1) an ecosystem-based approach, such as the meta-ecosystem framework and (2) a landscape-based approach, such as the landscape system framework. These two frameworks are currently disconnected. To integrate them, we introduce a protocol and then demonstrate its application using a case study. The protocol includes four steps: 1) delineating landscape systems; 2) classifying landscape systems; 3) adjusting landscape systems to meta-ecosystems; and 4) integrating landscape system and meta-ecosystem frameworks through meta-ecosystems. The case study applies this protocol to analyze the carbon fluxes in the Northern Highlands Lake District (NHLD) of Wisconsin and Michigan. The application revealed that one can follow the protocol to construct a meta-ecosystem and analyze it using the integrated landscape system and meta-ecosystem framework. That is, one can (1) appropriately describe and analyze the spatial heterogeneity of the meta-ecosystem; (2) understand the emergent properties arising from spatial coupling of local ecosystems in the meta-ecosystem. In conclusion, this protocol is a useful approach for integrating the meta-ecosystem framework and the landscape system framework, advancing the description and analysis of the spatial heterogeneity and ecosystem function of interconnected ecosystems. PMID:29415066

  7. Comparison of Physics Frameworks for WebGL-Based Game Engine

    NASA Astrophysics Data System (ADS)

    Yogya, Resa; Kosala, Raymond

    2014-03-01

    Recently, a new technology called WebGL has shown a lot of potential for developing games. However, since this technology is still new, much of its potential in the game development area remains unexplored. This paper tries to uncover the potential of integrating physics frameworks with WebGL technology in a game engine for developing 2D or 3D games. Specifically, we integrated three open source physics frameworks: Bullet, Cannon, and JigLib into a WebGL-based game engine. Through experiments, we assessed these frameworks in terms of their correctness or accuracy, performance, completeness and compatibility. The results show that it is possible to integrate open source physics frameworks into a WebGL-based game engine, and that Bullet is the best physics framework to be integrated into the WebGL-based game engine.

  8. Architectural Strategies for Enabling Data-Driven Science at Scale

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.

    2017-12-01

    The analysis of large data collections from NASA or other agencies is often executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Alternatively, data are hauled to large computational environments that provide centralized data analysis via traditional High Performance Computing (HPC). Scientific data archives, however, are not only growing massive, but are also becoming highly distributed. Neither traditional approach provides a good solution for optimizing analysis into the future. Assumptions across the NASA mission and science data lifecycle, which historically assume that all data can be collected, transmitted, processed, and archived, will not scale as more capable instruments stress legacy-based systems. New paradigms are needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural and analytical choices are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections, from point of collection (e.g., onboard) to analysis and decision support. The most effective approach to analyzing a distributed set of massive data may involve some exploration and iteration, putting a premium on the flexibility afforded by the architectural framework. The framework should enable scientist users to assemble workflows efficiently, manage the uncertainties related to data analysis and inference, and optimize deep-dive analytics to enhance scalability. In many cases, this "data ecosystem" needs to be able to integrate multiple observing assets, ground environments, archives, and analytics, evolving from stewardship of measurements of data to using computational methodologies to better derive insight from the data that may be fused with other sets of data. 
This presentation will discuss architectural strategies, including a 2015-2016 NASA AIST Study on Big Data, for evolving scientific research towards massively distributed data-driven discovery. It will include example use cases across earth science, planetary science, and other disciplines.

  9. A joint ERS/ATS policy statement: what constitutes an adverse health effect of air pollution? An analytical framework.

    PubMed

    Thurston, George D; Kipen, Howard; Annesi-Maesano, Isabella; Balmes, John; Brook, Robert D; Cromar, Kevin; De Matteis, Sara; Forastiere, Francesco; Forsberg, Bertil; Frampton, Mark W; Grigg, Jonathan; Heederik, Dick; Kelly, Frank J; Kuenzli, Nino; Laumbach, Robert; Peters, Annette; Rajagopalan, Sanjay T; Rich, David; Ritz, Beate; Samet, Jonathan M; Sandstrom, Thomas; Sigsgaard, Torben; Sunyer, Jordi; Brunekreef, Bert

    2017-01-01

    The American Thoracic Society has previously published statements on what constitutes an adverse effect on health of air pollution in 1985 and 2000. We set out to update and broaden these past statements that focused primarily on effects on the respiratory system. Since then, many studies have documented effects of air pollution on other organ systems, such as on the cardiovascular and central nervous systems. In addition, many new biomarkers of effects have been developed and applied in air pollution studies.This current report seeks to integrate the latest science into a general framework for interpreting the adversity of the human health effects of air pollution. Rather than trying to provide a catalogue of what is and what is not an adverse effect of air pollution, we propose a set of considerations that can be applied in forming judgments of the adversity of not only currently documented, but also emerging and future effects of air pollution on human health. These considerations are illustrated by the inclusion of examples for different types of health effects of air pollution. Copyright ©ERS 2017.

  10. A joint ERS/ATS policy statement: what constitutes an adverse health effect of air pollution? An analytical framework

    PubMed Central

    Thurston, George D.; Kipen, Howard; Annesi-Maesano, Isabella; Balmes, John; Brook, Robert D.; Cromar, Kevin; De Matteis, Sara; Forastiere, Francesco; Forsberg, Bertil; Frampton, Mark W.; Grigg, Jonathan; Heederik, Dick; Kelly, Frank J.; Kuenzli, Nino; Laumbach, Robert; Peters, Annette; Rajagopalan, Sanjay T.; Rich, David; Ritz, Beate; Samet, Jonathan M.; Sandstrom, Thomas; Sigsgaard, Torben; Sunyer, Jordi; Brunekreef, Bert

    2017-01-01

    The American Thoracic Society has previously published statements on what constitutes an adverse effect on health of air pollution in 1985 and 2000. We set out to update and broaden these past statements that focused primarily on effects on the respiratory system. Since then, many studies have documented effects of air pollution on other organ systems, such as on the cardiovascular and central nervous systems. In addition, many new biomarkers of effects have been developed and applied in air pollution studies. This current report seeks to integrate the latest science into a general framework for interpreting the adversity of the human health effects of air pollution. Rather than trying to provide a catalogue of what is and what is not an adverse effect of air pollution, we propose a set of considerations that can be applied in forming judgments of the adversity of not only currently documented, but also emerging and future effects of air pollution on human health. These considerations are illustrated by the inclusion of examples for different types of health effects of air pollution. PMID:28077473

  11. Multi-Regge kinematics and the moduli space of Riemann spheres with marked points

    DOE PAGES

    Del Duca, Vittorio; Druc, Stefan; Drummond, James; ...

    2016-08-25

    We show that scattering amplitudes in planar N = 4 Super Yang-Mills in multi-Regge kinematics can naturally be expressed in terms of single-valued iterated integrals on the moduli space of Riemann spheres with marked points. As a consequence, scattering amplitudes in this limit can be expressed as convolutions that can easily be computed using Stokes’ theorem. We apply this framework to MHV amplitudes to leading-logarithmic accuracy (LLA), and we prove that at L loops all MHV amplitudes are determined by amplitudes with up to L + 4 external legs. We also investigate non-MHV amplitudes, and we show that they can be obtained by convoluting the MHV results with a certain helicity flip kernel. We classify all leading singularities that appear at LLA in the Regge limit for arbitrary helicity configurations and any number of external legs. In conclusion, we use our new framework to obtain explicit analytic results at LLA for all MHV amplitudes up to five loops and all non-MHV amplitudes with up to eight external legs and four loops.

  12. A Visual Analytics Framework for Identifying Topic Drivers in Media Events.

    PubMed

    Lu, Yafeng; Wang, Hong; Landis, Steven; Maciejewski, Ross

    2017-09-14

    Media data have been the subject of large-scale analysis, with applications of text mining being used to provide overviews of media themes and information flows. Such information extracted from media articles has also shown contextual value when integrated with other data, such as criminal records and stock market pricing. In this work, we explore linking textual media data with curated secondary textual data sources through user-guided semantic lexical matching for identifying relationships and data links. In this manner, critical information can be identified and used to annotate media timelines in order to provide a more detailed overview of events that may be driving media topics and frames. These linked events are further analyzed through an application of causality modeling to model temporal drivers between the data series. Such causal links are then annotated through automatic entity extraction, which enables the analyst to explore persons, locations, and organizations that may be pertinent to the media topic of interest. To demonstrate the proposed framework, two media datasets and an armed conflict event dataset are explored.
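    The abstract does not specify which causality model is used; as a hedged illustration of one common choice for "modeling temporal drivers between data series", here is a minimal Granger-style lagged-regression F-test in numpy. The series, lag order, and coupling strength are all invented for the example, not taken from the paper:

```python
import numpy as np

def lagmat(s, lag):
    """Column j holds s lagged by j+1 steps; rows align with s[lag:]."""
    n = len(s)
    return np.column_stack([s[lag - j:n - j] for j in range(1, lag + 1)])

def granger_f(x, y, lag=2):
    """F-statistic testing whether past values of x help predict y
    beyond what y's own past already explains (Granger-style test)."""
    target = y[lag:]
    ones = np.ones((len(target), 1))
    X_restricted = np.hstack([ones, lagmat(y, lag)])    # y's own past only
    X_full = np.hstack([X_restricted, lagmat(x, lag)])  # ... plus x's past

    def rss(X):
        beta = np.linalg.lstsq(X, target, rcond=None)[0]
        return float(np.sum((target - X @ beta) ** 2))

    rss_r, rss_f = rss(X_restricted), rss(X_full)
    dof = len(target) - X_full.shape[1]
    return ((rss_r - rss_f) / lag) / (rss_f / dof)

# Synthetic example: series y is driven by the previous value of series x.
rng = np.random.default_rng(1)
x = rng.standard_normal(400)
y = np.zeros(400)
for t in range(1, 400):
    y[t] = 0.8 * x[t - 1] + 0.2 * rng.standard_normal()

f_xy = granger_f(x, y)   # should be large: x's past helps predict y
f_yx = granger_f(y, x)   # should be small: y's past does not predict x
```

    A large F in one direction and a small F in the other is what annotating a media timeline with "topic drivers" would rely on; a production system would also compute a p-value from the F-distribution.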

  13. A Compressive Sensing Approach for Glioma Margin Delineation Using Mass Spectrometry

    PubMed Central

    Gholami, Behnood; Agar, Nathalie Y. R.; Jolesz, Ferenc A.; Haddad, Wassim M.; Tannenbaum, Allen R.

    2013-01-01

    Surgery, and specifically, tumor resection, is the primary treatment for most patients suffering from brain tumors. Medical imaging techniques, in particular magnetic resonance imaging, are currently used in diagnosis as well as image-guided surgery procedures. However, studies show that computed tomography and magnetic resonance imaging fail to accurately identify the full extent of malignant brain tumors and their microscopic infiltration. Mass spectrometry is a well-known analytical technique used to identify molecules in a given sample based on their mass. A recent study proposed using mass spectrometry as an intraoperative tool for discriminating tumor and non-tumor tissue. Integration of mass spectrometry with the resection module allows for tumor resection and immediate molecular analysis. In this paper, we propose a framework for tumor margin delineation using compressive sensing. Specifically, we show that the spatial distribution of tumor cell concentration can be efficiently reconstructed and updated using mass spectrometry information from the resected tissue. In addition, our proposed framework is model-free, and hence, requires no prior information of the spatial distribution of the tumor cell concentration. PMID:22255629
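    The abstract does not spell out the reconstruction algorithm, so as a hedged illustration of the general compressive-sensing idea (recovering a sparse spatial concentration profile from few linear measurements), here is a minimal Orthogonal Matching Pursuit sketch in numpy. The sensing matrix, dimensions, and "concentration" vector are invented for the example:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x from y = A @ x."""
    residual = y.copy()
    support = []
    x_hat = np.zeros(A.shape[1])
    for _ in range(k):
        # pick the dictionary column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares refit of the coefficients on the selected support
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x_hat[:] = 0.0
        x_hat[support] = coef
        residual = y - A @ x_hat
    return x_hat

rng = np.random.default_rng(0)
n_meas, n_grid, sparsity = 40, 100, 3                         # few measurements, fine grid
A = rng.standard_normal((n_meas, n_grid)) / np.sqrt(n_meas)   # random sensing matrix
x_true = np.zeros(n_grid)
x_true[[5, 42, 77]] = [1.5, -2.0, 0.8]                        # invented sparse profile
y = A @ x_true                                                # compressive measurements
x_rec = omp(A, y, sparsity)
```

    With enough random measurements relative to the sparsity, the greedy recovery is exact in the noiseless case; the paper's intraoperative setting would instead update the estimate as new mass-spectrometry samples arrive.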

  14. Stress-induced chemical detection using flexible metal-organic frameworks.

    PubMed

    Allendorf, Mark D; Houk, Ronald J T; Andruszkiewicz, Leanne; Talin, A Alec; Pikarsky, Joel; Choudhury, Arnab; Gall, Kenneth A; Hesketh, Peter J

    2008-11-05

    In this work we demonstrate the concept of stress-induced chemical detection using metal-organic frameworks (MOFs) by integrating a thin film of the MOF HKUST-1 with a microcantilever surface. The results show that the energy of molecular adsorption, which causes slight distortions in the MOF crystal structure, can be converted to mechanical energy to create a highly responsive, reversible, and selective sensor. This sensor responds to water, methanol, and ethanol vapors, but yields no response to either N2 or O2. The magnitude of the signal, which is measured by a built-in piezoresistor, is correlated with the concentration and can be fitted to a Langmuir isotherm. Furthermore, we show that the hydration state of the MOF layer can be used to impart selectivity to CO2. Finally, we report the first use of surface-enhanced Raman spectroscopy to characterize the structure of a MOF film. We conclude that the synthetic versatility of these nanoporous materials holds great promise for creating recognition chemistries to enable selective detection of a wide range of analytes.
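    The Langmuir-isotherm fit mentioned above can be sketched generically with scipy's curve_fit. The concentration/signal data and the saturation and affinity parameters below are hypothetical, not values from the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, s_max, K):
    """Langmuir isotherm: sensor signal saturating with analyte concentration c."""
    return s_max * K * c / (1.0 + K * c)

# Invented concentration/signal pairs (arbitrary units), noiseless for clarity.
c = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0])
signal = langmuir(c, 3.2, 0.8)                 # hypothetical s_max and K

popt, _ = curve_fit(langmuir, c, signal, p0=[1.0, 1.0])
s_max_fit, K_fit = popt
```

    The fitted s_max and K characterize the saturation response and binding affinity; with real piezoresistor readings one would fit noisy data and inspect the covariance returned by curve_fit.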

  15. High Possibility Classrooms as a Pedagogical Framework for Technology Integration in Classrooms: An Inquiry in Two Australian Secondary Schools

    ERIC Educational Resources Information Center

    Hunter, Jane

    2017-01-01

    Understanding how well teachers integrate digital technology in learning is the subject of considerable debate in education. High Possibility Classrooms (HPC) is a pedagogical framework drawn from research on exemplary teachers' knowledge of technology integration in Australian school classrooms. The framework is being used to support teachers who…

  16. Experimental and analytical investigations of the piles and abutments of integral bridges.

    DOT National Transportation Integrated Search

    2002-01-01

    This research investigated, through experimental and analytical studies, the complex interactions that take place between the structural components of an integral bridge and the adjoining soil. The ability of piles and abutments to withstand thermall...

  17. Pilot testing of SHRP 2 reliability data and analytical products: Florida. [supporting datasets

    DOT National Transportation Integrated Search

    2014-01-01

    SHRP 2 initiated the L38 project to pilot test products from five of the program's completed projects. The products support reliability estimation and use based on data analyses, analytical techniques, and a decision-making framework. The L38 project...

  18. Joseph v. Brady: Synthesis Reunites What Analysis Has Divided

    ERIC Educational Resources Information Center

    Thompson, Travis

    2012-01-01

    Joseph V. Brady (1922-2011) created behavior-analytic neuroscience and the analytic framework for understanding how the external and internal neurobiological environments and mechanisms interact. Brady's approach offered synthesis as well as analysis. He embraced Findley's approach to constructing multioperant behavioral repertoires that found…

  19. Inverse scattering transform analysis of rogue waves using local periodization procedure

    NASA Astrophysics Data System (ADS)

    Randoux, Stéphane; Suret, Pierre; El, Gennady

    2016-07-01

    The nonlinear Schrödinger equation (NLSE) stands out as the dispersive nonlinear partial differential equation that plays a prominent role in the modeling and understanding of the wave phenomena relevant to many fields of nonlinear physics. The question of random input problems in the one-dimensional and integrable NLSE enters within the framework of integrable turbulence, and the specific question of the formation of rogue waves (RWs) has been recently extensively studied in this context. The determination of exact analytic solutions of the focusing 1D-NLSE prototyping RW events of statistical relevance is now considered as the problem of central importance. Here we address this question from the perspective of the inverse scattering transform (IST) method that relies on the integrable nature of the wave equation. We develop a conceptually new approach to the RW classification in which appropriate, locally coherent structures are specifically isolated from a globally incoherent wave train to be subsequently analyzed by implementing a numerical IST procedure relying on a spatial periodization of the object under consideration. Using this approach we extend the existing classifications of the prototypes of RWs from standard breathers and their collisions to more general nonlinear modes characterized by their nonlinear spectra.

  20. Poisson-Lie duals of the η deformed symmetric space sigma model

    NASA Astrophysics Data System (ADS)

    Hoare, Ben; Seibold, Fiona K.

    2017-11-01

    Poisson-Lie dualising the η deformation of the G/H symmetric space sigma model with respect to the simple Lie group G is conjectured to give an analytic continuation of the associated λ deformed model. In this paper we investigate when the η deformed model can be dualised with respect to a subgroup G0 of G. Starting from the first-order action on the complexified group and integrating out the degrees of freedom associated to different subalgebras, we find it is possible to dualise when G0 is associated to a sub-Dynkin diagram. Additional U(1) factors built from the remaining Cartan generators can also be included. The resulting construction unifies both the Poisson-Lie dual with respect to G and the complete abelian dual of the η deformation in a single framework, with the integrated algebras unimodular in both cases. We speculate that extending these results to the path integral formalism may provide an explanation for why the η deformed AdS5 × S5 superstring is not one-loop Weyl invariant, that is, the couplings do not solve the equations of type IIB supergravity, yet its complete abelian dual and the λ deformed model are.

  1. Inverse scattering transform analysis of rogue waves using local periodization procedure

    PubMed Central

    Randoux, Stéphane; Suret, Pierre; El, Gennady

    2016-01-01

    The nonlinear Schrödinger equation (NLSE) stands out as the dispersive nonlinear partial differential equation that plays a prominent role in the modeling and understanding of the wave phenomena relevant to many fields of nonlinear physics. The question of random input problems in the one-dimensional and integrable NLSE enters within the framework of integrable turbulence, and the specific question of the formation of rogue waves (RWs) has been recently extensively studied in this context. The determination of exact analytic solutions of the focusing 1D-NLSE prototyping RW events of statistical relevance is now considered as the problem of central importance. Here we address this question from the perspective of the inverse scattering transform (IST) method that relies on the integrable nature of the wave equation. We develop a conceptually new approach to the RW classification in which appropriate, locally coherent structures are specifically isolated from a globally incoherent wave train to be subsequently analyzed by implementing a numerical IST procedure relying on a spatial periodization of the object under consideration. Using this approach we extend the existing classifications of the prototypes of RWs from standard breathers and their collisions to more general nonlinear modes characterized by their nonlinear spectra. PMID:27385164

  2. Integrated performance and reliability specification for digital avionics systems

    NASA Technical Reports Server (NTRS)

    Brehm, Eric W.; Goettge, Robert T.

    1995-01-01

    This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via the exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS, which will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.

  3. Quantifying complexity in translational research: an integrated approach.

    PubMed

    Munoz, David A; Nembhard, Harriet Black; Kraschnewski, Jennifer L

    2014-01-01

    The purpose of this paper is to quantify complexity in translational research. The impact of major operational steps and technical requirements is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. A three-phase integrated quality function deployment (QFD) and analytic hierarchy process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to demonstrate usability. Generally, the evidence generated was valuable for understanding various components in translational research. Particularly, the authors found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Current conceptual models in translational research provide little or no clue to assess complexity. The proposed method aimed to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research.
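    The AHP step mentioned above derives criterion weights from a pairwise-comparison matrix and checks them with a consistency ratio; a minimal numpy sketch follows. The 3x3 matrix and the criteria it compares are invented for illustration, and Saaty's standard random indices are hard-coded for small matrix sizes:

```python
import numpy as np

# Saaty's random consistency indices for small matrix sizes (standard values).
RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12}

def ahp_priorities(M):
    """Priority weights and consistency ratio for a pairwise-comparison matrix."""
    n = M.shape[0]
    eigvals, eigvecs = np.linalg.eig(M)
    i = int(np.argmax(eigvals.real))               # principal eigenvalue
    w = np.abs(eigvecs[:, i].real)
    w = w / w.sum()                                # normalized priority vector
    lam_max = eigvals[i].real
    ci = (lam_max - n) / (n - 1)                   # consistency index
    return w, ci / RANDOM_INDEX[n]                 # weights, consistency ratio

# Invented 3x3 comparison of criteria (e.g. collaboration networks vs. team
# capacity vs. community engagement) on Saaty's 1-9 scale with reciprocals.
M = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_priorities(M)    # cr < 0.1 is the usual threshold for acceptable consistency
```

    The consistency ratio is exactly the "guide to subjectivity" the abstract refers to: judgments with cr above about 0.1 are usually revisited before the weights are used.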

  4. Connected health and integrated care: Toward new models for chronic disease management.

    PubMed

    Chouvarda, Ioanna G; Goulis, Dimitrios G; Lambrinoudaki, Irene; Maglaveras, Nicos

    2015-09-01

    The increasingly aging population in Europe and worldwide brings up the need for the restructuring of healthcare. Technological advancements in electronic health can be a driving force for new health management models, especially in chronic care. In a patient-centered e-health management model, communication and coordination between patient, healthcare professionals in primary care and hospitals can be facilitated, and medical decisions can be made timely and easily communicated. Bringing the right information to the right person at the right time is what connected health aims at, and this may set the basis for the investigation and deployment of the integrated care models. In this framework, an overview of the main technological axes and challenges around connected health technologies in chronic disease management are presented and discussed. A central concept is personal health system for the patient/citizen and three main application areas are identified. The connected health ecosystem is making progress, already shows benefits in (a) new biosensors, (b) data management, (c) data analytics, integration and feedback. Examples are illustrated in each case, while open issues and challenges for further research and development are pinpointed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. In itinere strategic environmental assessment of an integrated provincial waste system.

    PubMed

    Federico, Giovanna; Rizzo, Gianfranco; Traverso, Marzia

    2009-06-01

    In this paper, the practical problem of analysing in an integrated way the performance of provincial waste systems is approached, in the framework of the Strategic Environmental Assessment (SEA). In particular, the in itinere phase of SEA is analysed herein. After identifying a suitable group of ambits on which the waste system is expected to have relevant impacts, pertinent sets of single indicators are proposed. Through the adoption of such indicators the time trend of the system is investigated, and the suitability of each indicator is critically revised. The structure of the evaluation scheme, which is essentially based on the use of ambit issues and analytical indicators, calls for the application of the method of the Dashboard of Sustainability for the integrated evaluation of the whole system. The suitability of this method is shown through the paper, together with the possibility of a comparative analysis of different scenarios of interventions. Of course, the reliability of the proposed method depends strongly on the availability of a detailed set of territorial data. The method appears to represent a useful tool for public administration in the process of optimizing the policy actions aimed at minimizing the increasing problem represented by waste production in urban areas.

  6. A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.

    PubMed

    Morag, Ido; Luria, Gil

    2013-01-01

    Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis, that all job holders should participate in analysing their own workplace, and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.

  7. The importance of an integrating framework for achieving the Sustainable Development Goals: the example of health and well-being.

    PubMed

    Nunes, Ana Raquel; Lee, Kelley; O'Riordan, Tim

    2016-01-01

    The 2030 Agenda for Sustainable Development came into force in January 2016 as the central United Nations (UN) platform for achieving 'integrated and indivisible' goals and targets across the three characteristic dimensions of sustainable development: the social, environmental and economic. We argue that, despite the UN adoption of the Sustainable Development Goals (SDGs), a framework for operationalising them in an integrated fashion is lacking. This article puts forth a framework for integrating health and well-being across the SDGs as both preconditions and outcomes of sustainable development. We present a rationale for this approach, and identify the challenges and opportunities for implementing and monitoring such a framework through a series of examples. We encourage other sectors to develop similar integrating frameworks for supporting a more coordinated approach for operationalising the 2030 Agenda for Sustainable Development.

  8. The importance of an integrating framework for achieving the Sustainable Development Goals: the example of health and well-being

    PubMed Central

    Lee, Kelley; O'Riordan, Tim

    2016-01-01

    The 2030 Agenda for Sustainable Development came into force in January 2016 as the central United Nations (UN) platform for achieving ‘integrated and indivisible’ goals and targets across the three characteristic dimensions of sustainable development: the social, environmental and economic. We argue that, despite the UN adoption of the Sustainable Development Goals (SDGs), a framework for operationalising them in an integrated fashion is lacking. This article puts forth a framework for integrating health and well-being across the SDGs as both preconditions and outcomes of sustainable development. We present a rationale for this approach, and identify the challenges and opportunities for implementing and monitoring such a framework through a series of examples. We encourage other sectors to develop similar integrating frameworks for supporting a more coordinated approach for operationalising the 2030 Agenda for Sustainable Development. PMID:28588955

  9. Risk Assessment and Risk Governance of Liquefied Natural Gas Development in Gladstone, Australia.

    PubMed

    van der Vegt, R G

    2018-02-26

    This article is a retrospective analysis of liquefied natural gas (LNG) development in Gladstone, Australia, using the structure of the risk governance framework developed by the International Risk Governance Council (IRGC). Since 2010, the port of Gladstone has undergone extensive expansion to facilitate increasing coal exports as well as the development of three recently completed LNG facilities. Significant environmental and socio-economic impacts and concerns have resulted from these developments. The overall aim of the article, therefore, is to identify the risk governance deficits that arose and to formulate processes capable of improving similar decision-making problems in the future. The structure of the IRGC framework is followed because it represents a broad analytical approach for considering risk assessment and risk governance in Gladstone in ways that include, but also go beyond, the risk approach of the ISO 31000:2009 standard that was employed at the time. The IRGC risk framework is argued to be a consistent and comprehensive risk governance framework that integrates scientific, economic, social, and cultural aspects and advocates the notion of inclusive risk governance through stakeholder communication and involvement. Key aspects related to risk preassessment, risk appraisal, risk tolerability and acceptability, risk management, and stakeholder communication and involvement are considered. The results indicate that the risk governance deficits include aspects related to (i) the risk matrix methodology, (ii) reflecting uncertainties, (iii) cumulative risks, (iv) the regulatory process, and (v) stakeholder communication and involvement. © 2018 Society for Risk Analysis.
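
    For context, a risk matrix of the kind the article identifies as a deficit reduces each hazard to a single likelihood-consequence cell. A minimal sketch follows; the bands and labels are illustrative, not the ones used in the Gladstone assessments.

```python
# Minimal sketch of a likelihood x consequence risk matrix of the kind
# the article critiques; bands and labels are illustrative, not taken
# from the Gladstone assessments.

LIKELIHOOD = ("rare", "unlikely", "possible", "likely", "almost certain")   # 1..5
CONSEQUENCE = ("negligible", "minor", "moderate", "major", "severe")        # 1..5

def rate(likelihood, consequence):
    """Map a (likelihood, consequence) pair, each on a 1-5 scale, to a band."""
    score = likelihood * consequence
    if score >= 15:
        return "extreme"
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# A hazard is reduced to a single cell, so any uncertainty about where it
# sits on either scale disappears from the final rating.
band = rate(4, 4)
```

    Because every hazard collapses to one cell, two hazards with very different uncertainty ranges can receive the same rating, which connects to the deficits around reflecting uncertainties and cumulative risks noted above.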

  10. Medication information leaflets for patients: the further validation of an analytic linguistic framework.

    PubMed

    Clerehan, Rosemary; Hirsh, Di; Buchbinder, Rachelle

    2009-01-01

    While clinicians may routinely use patient information leaflets about drug therapy, a poorly conceived leaflet has the potential to do harm. We previously developed a novel approach to analysing leaflets about a rheumatoid arthritis drug, using an analytic approach based on systemic functional linguistics. The aim of the present study was to verify the validity of the linguistic framework by applying it to two further arthritis drug leaflets. The findings confirmed the applicability of the framework and were used to refine it. A new stage or 'move' in the genre was identified. While the function of many of the moves appeared to be 'to instruct' the patient, the instruction was often unclear. The role relationships expressed in the text were critical to the meaning. As with our previous study, judged on their lexical density, the leaflets resembled academic text. The framework can provide specific tools to assess and produce medication information leaflets to support readers in taking medication. Future work could utilize the framework to evaluate information on other treatments and procedures or on healthcare information more widely.
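
    Lexical density, one measure the study judges the leaflets on, is the proportion of content (lexical) words in a text; denser text reads more like academic prose. The sketch below is a crude stopword-based proxy, not the authors' systemic functional analysis, and its word list and example sentence are illustrative only.

```python
# Crude lexical-density proxy: share of content (non-grammatical) words.
# The stopword list below stands in for a proper grammatical-word
# inventory and is illustrative only.

GRAMMATICAL = {
    "the", "a", "an", "of", "to", "in", "and", "or", "is", "are", "be",
    "you", "your", "it", "this", "that", "with", "for", "may", "can",
    "not", "do", "if", "as", "on", "by", "have", "has", "will",
}

def lexical_density(text):
    """Fraction of tokens that are content words (higher = denser text)."""
    words = [w.strip(".,;:()").lower() for w in text.split()]
    words = [w for w in words if w]
    content = [w for w in words if w not in GRAMMATICAL]
    return len(content) / len(words)

leaflet = ("Take this medicine with food. Do not stop taking it "
           "without asking your doctor.")
density = lexical_density(leaflet)
```

    Academic prose typically scores noticeably higher on such a measure than conversational text, which is the sense in which the leaflets were found to resemble academic text.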

  11. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  12. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance

    PubMed Central

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; Kiang, Richard; Morse, Andrew P.; Pancerella, Carmen M.; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models. PMID:26820405

  13. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance.

    PubMed

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y; Fairchild, Geoffrey; Hyman, James M; Kiang, Richard; Morse, Andrew P; Pancerella, Carmen M; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  14. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE PAGES

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban; ...

    2016-01-28

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  15. Big data analytics in healthcare: promise and potential.

    PubMed

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however, challenges remain to be overcome.

  16. Micromechanics Analysis Code (MAC) User Guide: Version 1.0

    NASA Technical Reports Server (NTRS)

    Wilt, T. E.; Arnold, S. M.

    1994-01-01

    The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Described here is the user guide for the recently developed, computationally efficient and comprehensive micromechanics analysis code MAC, whose predictive capability rests entirely upon the fully analytical generalized method of cells (GMC) micromechanics model. MAC is a versatile form of research software that 'drives' the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC enhances the basic capabilities of GMC by providing a modular framework wherein (1) various thermal, mechanical (stress or strain control), and thermomechanical load histories can be imposed; (2) different integration algorithms may be selected; (3) a variety of constituent constitutive models may be utilized and/or implemented; and (4) a variety of fiber architectures may be easily accessed through their corresponding representative volume elements.

  17. Micromechanics Analysis Code (MAC). User Guide: Version 2.0

    NASA Technical Reports Server (NTRS)

    Wilt, T. E.; Arnold, S. M.

    1996-01-01

    The ability to accurately predict the thermomechanical deformation response of advanced composite materials continues to play an important role in the development of these strategic materials. Analytical models that predict the effective behavior of composites are used not only by engineers performing structural analysis of large-scale composite components but also by material scientists in developing new material systems. For an analytical model to fulfill these two distinct functions it must be based on a micromechanics approach which utilizes physically based deformation and life constitutive models and allows one to generate the average (macro) response of a composite material given the properties of the individual constituents and their geometric arrangement. Described here is the user guide for the recently developed, computationally efficient and comprehensive micromechanics analysis code (MAC), whose predictive capability rests entirely upon the fully analytical generalized method of cells (GMC) micromechanics model. MAC is a versatile form of research software that 'drives' the doubly or triply periodic micromechanics constitutive models based upon GMC. MAC enhances the basic capabilities of GMC by providing a modular framework wherein (1) various thermal, mechanical (stress or strain control) and thermomechanical load histories can be imposed, (2) different integration algorithms may be selected, (3) a variety of constituent constitutive models may be utilized and/or implemented, and (4) a variety of fiber and laminate architectures may be easily accessed through their corresponding representative volume elements.
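
    GMC itself solves equilibrium and continuity conditions over the subcells of a repeating unit cell, which is well beyond a short sketch. As a far simpler illustration of the same idea, generating a macro property from constituent properties and their proportions, the classical Voigt and Reuss estimates bracket the effective modulus of a two-phase composite. The constituent values below are representative, not data from the MAC user guide.

```python
# GMC solves subcell equilibrium over a repeating unit cell; as a much
# simpler stand-in for "macro response from constituent properties", the
# classical Voigt and Reuss estimates bracket the effective modulus of a
# two-phase composite. Moduli below are representative values in GPa,
# not data from the MAC user guide.

def voigt_modulus(e_fiber, e_matrix, vf):
    """Iso-strain (rule of mixtures) upper bound: E = Vf*Ef + (1 - Vf)*Em."""
    return vf * e_fiber + (1.0 - vf) * e_matrix

def reuss_modulus(e_fiber, e_matrix, vf):
    """Iso-stress lower bound: 1/E = Vf/Ef + (1 - Vf)/Em."""
    return 1.0 / (vf / e_fiber + (1.0 - vf) / e_matrix)

ef, em, vf = 400.0, 110.0, 0.35      # fiber modulus, matrix modulus, fiber volume fraction
upper = voigt_modulus(ef, em, vf)    # ~ longitudinal (fiber-direction) estimate
lower = reuss_modulus(ef, em, vf)    # ~ transverse estimate
```

    A micromechanics model such as GMC refines these crude bounds by resolving the stress and strain fields within the representative volume element rather than assuming uniform strain or stress throughout.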

  18. PlanetSense: A Real-time Streaming and Spatio-temporal Analytics Platform for Gathering Geo-spatial Intelligence from Open Source Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O

    Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections rely only on such datasets. Although this approach has proven effective in several cases, such as land use identification and route mapping, it has severely restricted the ability of researchers to incorporate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and adjust in ever-changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to that the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: (i) GeoData Cloud, a data architecture for storing and managing disparate datasets; (ii) a mechanism to harvest real-time streaming data; (iii) a data analytics framework; and (iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underpin the necessity of our platform in modeling ambient population and building occupancy at scale.
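
    As an illustrative sketch (not PlanetSense code) of the kind of aggregation such a data analytics framework performs, the following bins a stream of geo-tagged records into a coarse spatio-temporal grid, the raw material for an ambient-population estimate. Cell size, window length, and the sample records are arbitrary.

```python
# Illustrative sketch (not PlanetSense code): bin a stream of geo-tagged
# records into a coarse spatio-temporal grid. Cell size, window length,
# and the sample records are arbitrary.

import math
from collections import Counter

def cell(lat, lon, timestamp, cell_deg=0.1, window_s=3600):
    """Key a record by grid cell (cell_deg degrees) and time window (seconds)."""
    return (math.floor(lat / cell_deg),
            math.floor(lon / cell_deg),
            timestamp // window_s)

def bin_stream(records):
    """Count records per (lat cell, lon cell, time window)."""
    counts = Counter()
    for lat, lon, ts in records:
        counts[cell(lat, lon, ts)] += 1
    return counts

stream = [
    (35.65, -84.28, 7200),   # two records in the same cell and hour...
    (35.66, -84.21, 7300),
    (35.65, -84.28, 11000),  # ...and one in a later window
]
counts = bin_stream(stream)
```

    In a streaming setting the same keying step would run per record as it arrives, with counts held in a store that downstream analytics and visualization components query.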

  19. Using the Analytic Hierarchy Process (AHP) to understand the most important factors to design and evaluate a telehealth system for Parkinson's disease.

    PubMed

    Cancela, Jorge; Fico, Giuseppe; Arredondo Waldmeyer, Maria T

    2015-01-01

    The assessment of a new health technology is a multidisciplinary and multidimensional process, which requires a complex analysis and the convergence of different stakeholders into a common decision. This task is even more delicate when the assessment is carried out at an early stage of the development process, when the maturity of the technology prevents conducting large-scale trials to evaluate cost effectiveness through classic health economics methods. This lack of information may limit future development and deployment in clinical practice. This work aims to 1) identify the most relevant user needs of a new medical technology for managing and monitoring Parkinson's Disease (PD) patients and 2) use these user needs for a preliminary assessment of a specific system called PERFORM, as a case study. The Analytic Hierarchy Process (AHP) was used to design a hierarchy of 17 needs, grouped into 5 categories. A total of 16 experts, 6 of them with a clinical background and the remaining 10 with a technical background, were asked to rank these needs and categories. On/Off fluctuations detection, Increase wearability acceptance, and Increase self-management support were identified as the most relevant user needs. No significant differences were found between the clinician and technical groups. These results have been used to evaluate the PERFORM system and to identify future areas of improvement. First of all, the AHP contributed to the elaboration of a unified hierarchy, integrating the needs of a variety of stakeholders and promoting discussion and agreement within a common evaluation framework. Moreover, the AHP effectively supported user need elicitation as well as the assignment of different weights and priorities to each need and, consequently, helped to define a framework for the assessment of telehealth systems for PD management and monitoring. This framework can be used to support the decision-making process for the adoption of new technologies in PD.
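
    The AHP weighting step can be illustrated with a toy example. The pairwise-comparison matrix below covers only three criteria, labelled after the study's top-ranked needs, and its judgement values are invented; the row geometric-mean method used here is a standard approximation to Saaty's principal-eigenvector weights.

```python
# Toy AHP sketch: derive priority weights from a pairwise-comparison
# matrix via the row geometric-mean approximation to the principal
# eigenvector. Three criteria only (labelled after the study's top-ranked
# needs) with invented judgement values, not the study's 17-need hierarchy.

import math

def ahp_weights(matrix):
    """Row geometric means, normalized to sum to 1."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

criteria = ["On/Off fluctuations detection",
            "Increase wearability acceptance",
            "Increase self-management support"]

# Saaty's 1-9 scale: pairwise[i][j] says how strongly criterion i
# outranks criterion j; the matrix is reciprocal by construction.
pairwise = [
    [1.0,  3.0, 5.0],
    [1/3,  1.0, 2.0],
    [1/5,  1/2, 1.0],
]
weights = ahp_weights(pairwise)
```

    In the study proper, 16 experts each supplied such judgements over 17 needs grouped into 5 categories, and the resulting weights were aggregated across experts before being used to assess the PERFORM system.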

  20. Using the Analytic Hierarchy Process (AHP) to understand the most important factors to design and evaluate a telehealth system for Parkinson's disease

    PubMed Central

    2015-01-01

    Background The assessment of a new health technology is a multidisciplinary and multidimensional process, which requires a complex analysis and the convergence of different stakeholders into a common decision. This task is even more delicate when the assessment is carried out at an early stage of the development process, when the maturity of the technology prevents conducting large-scale trials to evaluate cost effectiveness through classic health economics methods. This lack of information may limit future development and deployment in clinical practice. This work aims to 1) identify the most relevant user needs of a new medical technology for managing and monitoring Parkinson's Disease (PD) patients and 2) use these user needs for a preliminary assessment of a specific system called PERFORM, as a case study. Methods The Analytic Hierarchy Process (AHP) was used to design a hierarchy of 17 needs, grouped into 5 categories. A total of 16 experts, 6 of them with a clinical background and the remaining 10 with a technical background, were asked to rank these needs and categories. Results On/Off fluctuations detection, Increase wearability acceptance, and Increase self-management support were identified as the most relevant user needs. No significant differences were found between the clinician and technical groups. These results have been used to evaluate the PERFORM system and to identify future areas of improvement. Conclusions First of all, the AHP contributed to the elaboration of a unified hierarchy, integrating the needs of a variety of stakeholders and promoting discussion and agreement within a common evaluation framework. Moreover, the AHP effectively supported user need elicitation as well as the assignment of different weights and priorities to each need and, consequently, helped to define a framework for the assessment of telehealth systems for PD management and monitoring. This framework can be used to support the decision-making process for the adoption of new technologies in PD. PMID:26391847
