Voldbjerg, Siri Lygum; Laugesen, Britt; Bahnsen, Iben Bøgh; Jørgensen, Lone; Sørensen, Ingrid Maria; Grønkjaer, Mette; Sørensen, Erik Elgaard
2018-06-01
To describe and discuss the process of integrating the Fundamentals of Care framework into a baccalaureate nursing education at a School of Nursing in Denmark. Nursing education plays an essential role in educating nurses to work within healthcare systems in which a demanding workload on nurses results in fundamental nursing care being left undone. Newly graduated nurses often lack the knowledge and skills to meet the challenges of delivering fundamental care in clinical practice. To develop nursing students' understanding of fundamental nursing, the conceptual Fundamentals of Care framework has been integrated into nursing education at a School of Nursing in Denmark. This discursive paper uses an adjusted descriptive case study design to describe and discuss the process of integrating the conceptual Fundamentals of Care framework in nursing education. The process is illuminated through a description of the context in which it occurs, including the faculty members, lectures, case-based work and simulation laboratory in nursing education. Based on this description, opportunities, such as supporting a holistic approach to evidence-based integrative patient care, and challenges, such as scepticism among the faculty, are discussed. It is suggested how integrating the Fundamentals of Care framework in lectures, case-based work and the simulation laboratory can make fundamental nursing care more explicit in nursing education, support critical thinking and underline the relevance of evidence-based practice. The process relies on a supportive context, a well-informed and engaged faculty, and continuous reflection on how the conceptual framework can be integrated. Integrating the Fundamentals of Care framework can support nursing students' critical thinking and reflection on what fundamental nursing care is and requires, and eventually educate nurses in providing evidence-based fundamental nursing care.
© 2018 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les
1991-01-01
The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FPP) is described. The FPP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, the Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated.
Ergonomics action research II: a framework for integrating HF into work system design.
Neumann, W P; Village, J
2012-01-01
This paper presents a conceptual framework that can support efforts to integrate human factors (HF) into the work system design process, where improved and cost-effective application of HF is possible. The framework advocates strategies of broad stakeholder participation, linking of performance and health goals, and process focussed change tools that can help practitioners engage in improvements to embed HF into a firm's work system design process. Recommended tools include business process mapping of the design process, implementing design criteria, using cognitive mapping to connect to managers' strategic goals, tactical use of training and adopting virtual HF (VHF) tools to support the integration effort. Consistent with organisational change research, the framework provides guidance but does not suggest a strict set of steps. This allows more adaptability for the practitioner who must navigate within a particular organisational context to secure support for embedding HF into the design process for improved operator wellbeing and system performance. There has been little scientific literature about how a practitioner might integrate HF into a company's work system design process. This paper proposes a framework for this effort by presenting a coherent conceptual framework, process tools, design tools and procedural advice that can be adapted for a target organisation.
Thinking graphically: Connecting vision and cognition during graph comprehension.
Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A
2008-03-01
Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described. PsycINFO Database Record (c) 2008 APA, all rights reserved
A science of integration: frameworks, processes, and products in a place-based, integrative study
Kliskey, Andrew; Alessa, Lilian; Wandersee, Sarah; Williams, Paula; Trammell, Jamie; Powell, Jim; Grunblatt, Jess; Wipfli, Mark S.
2017-01-01
Integrative research is increasingly a priority within the scientific community and is a central goal for the evolving field of sustainability science. While it is conceptually attractive, its successful implementation has been challenging and recent work suggests that the move towards interdisciplinarity and transdisciplinarity in sustainability science is being only partially realized. To address this from the perspective of social-ecological systems (SES) research, we examine the process of conducting a science of integration within the Southcentral Alaska Test Case (SCTC) of Alaska-EPSCoR as a test-bed for this approach. The SCTC is part of a large, 5 year, interdisciplinary study investigating changing environments and adaptations to those changes in Alaska. In this paper, we review progress toward a science of integration and present our efforts to confront the practical issues of applying proposed integration frameworks. We: (1) define our integration framework; (2) describe the collaborative processes, including the co-development of science through stakeholder engagement and partnerships; and (3) illustrate potential products of integrative, social-ecological systems research. The approaches we use can also be applied outside of this particular framework. We highlight challenges and propose improvements for integration in sustainability science by addressing the need for common frameworks and improved contextual understanding. These insights may be useful for capacity-building for interdisciplinary projects that address complex real-world social and environmental problems.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.
1992-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into development of advanced technologies for Computer Aided Software Engineering (CASE).
A Prototype for the Support of Integrated Software Process Development and Improvement
NASA Astrophysics Data System (ADS)
Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian
An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper therefore proposes a software process maintenance framework consisting of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.
Managing clinical integration in integrated delivery systems: a framework for action.
Young, D W; Barrett, D
1997-01-01
An integrated delivery system (IDS) in healthcare must coordinate patient care across multiple functions, activities, and operating units. To achieve this clinical integration, senior management confronts many challenges. This paper uses a cross-functional-process (CFP) framework to discuss these challenges. There are ten CFPs that fall into three categories: planning processes (strategy formulation, program adaptation, budget formulation), organizational processes (authority and influence, client management, conflict resolution, motivation, and cultural maintenance), and measurement and reporting processes (financial and programmatic). Each process typically spans several functional units. Senior management must consider how to improve both the functioning of each CFP, as well as its "fit" with the other nine. The result can be greater clinical integration, improved cost management, and more coordinated care for enrollees.
Team table: a framework and tool for continuous factory planning
NASA Astrophysics Data System (ADS)
Sihn, Wilfried; Bischoff, Juergen; von Briel, Ralf; Josten, Marcus
2000-10-01
Growing market turbulence and shorter product life cycles require continuous adaptation of factory structures, resulting in a continuous factory planning process. A new framework has therefore been developed which focuses on integrating the configuration and data management processes. This enables online evaluation of system performance based on the continuous availability of current data. The framework is especially helpful, and yields large cost and time savings, when used in the early planning stages, known as the concept or rough planning phase. It is supported by a planning round table as a tool for team-based configuration processes, integrating the knowledge of all persons involved in the planning process. A case study conducted at a German company shows the advantages that can be achieved by implementing the new framework and methods.
An overview of the model integration process: From pre ...
Integration of models requires linking models which may have been developed using different tools, methodologies, and assumptions. We performed a literature review with the aim of improving our understanding of the model integration process and of presenting better strategies for building integrated modeling systems. We identified five phases that characterize the integration process: pre-integration assessment, preparation of models for integration, orchestration of models during simulation, data interoperability, and testing. Commonly, there is little reuse of existing frameworks beyond the development teams and not much sharing of science components across frameworks. We believe this must change to enable researchers and assessors to form complex workflows that leverage the current environmental science available. In this paper, we characterize the model integration process and compare the integration practices of different groups. We highlight key strategies, features, standards, and practices that developers can employ to increase the reuse and interoperability of science software components and systems. The paper provides a review of the literature regarding techniques and methods employed by various modeling system developers to facilitate science software interoperability. The intent of the paper is to illustrate the wide variation in methods and the limiting effect this variation has on inter-framework reuse and interoperability. A series of recommendations
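As a hedged illustration of two of the phases named above (orchestration of models during simulation, and data interoperability), the toy sketch below couples two hypothetical model components through a unit-conversion step. The model names, units, and runoff coefficient are illustrative assumptions, not components from any reviewed framework:

```python
# Minimal sketch of orchestrating two loosely coupled models during simulation.
# Both models, their units, and the conversion step are illustrative assumptions.

def rainfall_model(day):
    """Toy upstream 'science component' producing rainfall in millimetres."""
    return 10.0 if day % 3 == 0 else 2.0

def runoff_model(rainfall_m):
    """Toy downstream model expecting rainfall in metres."""
    return 0.6 * rainfall_m  # fixed, illustrative runoff coefficient

def mm_to_m(mm):
    """Data-interoperability step: reconcile units between the two components."""
    return mm / 1000.0

def run_workflow(days):
    """Orchestration: step each model in order, converting data between them."""
    results = []
    for day in range(days):
        rain_mm = rainfall_model(day)
        runoff = runoff_model(mm_to_m(rain_mm))
        results.append((day, rain_mm, runoff))
    return results

if __name__ == "__main__":
    for day, rain, runoff in run_workflow(3):
        print(day, rain, runoff)
```

Even in this small form, the sketch shows why the review's phases matter: the unit mismatch would silently corrupt results without an explicit interoperability step, which is exactly what pre-integration assessment and testing are meant to catch.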
Valentijn, Pim P; Biermann, Claus; Bruijnzeels, Marc A
2016-08-02
Integrated care services are considered a vital strategy for improving the Triple Aim values for people with chronic kidney disease. However, a solid scholarly explanation of how to develop, implement and evaluate such value-based integrated renal care services is limited. The aim of this study was to develop a framework to identify the strategies and outcomes for the implementation of value-based integrated renal care. First, the theoretical foundations of the Rainbow Model of Integrated Care and the Triple Aim were united into one overarching framework through an iterative process of key-informant consultations. Second, a rapid review approach was conducted to identify the published research on integrated renal care; the Cochrane Library, Medline, Scopus, and Business Source Premier databases were searched for pertinent articles published between 2000 and 2015. Based on the framework, a coding schema was developed to synthesize the included articles. The overarching framework distinguishes the integrated care domains of 1) type of integration and 2) enablers of integration, and the interrelated outcome domains of 3) experience of care, 4) population health and 5) costs. The literature synthesis indicated that integrated renal care implementation strategies have focused particularly on micro clinical processes and physical outcomes, while little emphasis has been placed on meso organisational and macro system integration processes. In addition, evidence regarding patients' perceived outcomes and economic outcomes has been weak. These results underscore that the future challenge for researchers is to explore which integrated care implementation strategies achieve better health and an improved experience of care at a lower cost within a specific context. For this purpose, this study's framework and evidence synthesis have set a developmental agenda for both integrated renal care practice and research.
Accordingly, we plan further work to develop an implementation model for value-based integrated renal services.
Valentijn, Pim P; Bruijnzeels, Marc A; de Leeuw, Rob J; Schrijvers, Guus J.P
2012-01-01
Purpose: Capacity problems and political pressures have led to a rapid change in the organization of primary care, from monodisciplinary small businesses to complex inter-organizational relationships. It is assumed that inter-organizational collaboration is the driving force for achieving integrated (primary) care. Despite the importance of collaboration and integration of services in primary care, there is no unambiguous definition of either concept. The purpose of this study is to examine and link the conceptualisation and validation of the terms inter-organizational collaboration and integrated primary care using a theoretical framework. Theory: The theoretical framework is based on the complex collaboration process of negotiation among multiple stakeholder groups in primary care. Methods: A literature review of health sciences and business databases, and targeted grey literature sources. Based on the literature review, we operationalized the constructs of inter-organizational collaboration and integrated primary care in a theoretical framework. The framework is being validated in an explorative study of 80 primary care projects in the Netherlands. Results and conclusions: Integrated primary care is considered a multidimensional construct based on a continuum of integration, extending from segregation to integration. The synthesis of current theories and concepts of inter-organizational collaboration is insufficient to deal with the complexity of collaborative issues in primary care. One coherent and integrated theoretical framework was found that could make the complex collaboration process in primary care transparent. The theoretical framework presented in this study is a first step toward understanding the patterns of successful collaboration and integration in primary care services. These patterns can give insight into the organizational forms needed to create a well-functioning integrated (primary) care system that fits the local needs of a population.
Preliminary data of the patterns of collaboration and integration will be presented.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.
NASA Astrophysics Data System (ADS)
Chen, Ruey-Shun; Tsai, Yung-Shun; Tu, Arthur
In this study we propose a manufacturing control framework based on radio-frequency identification (RFID) technology and a distributed information system to construct a mass-customization production process in a loosely coupled shop-floor control environment. On the basis of this framework, we developed RFID middleware and an integrated information system for tracking and controlling the manufacturing process flow. A bicycle manufacturer was used to demonstrate the prototype system. The findings of this study were that the proposed framework can improve the visibility and traceability of the manufacturing process as well as enhance process quality control and real-time production pedigree access. Using this framework, an enterprise can easily integrate an RFID-based system into its manufacturing environment to facilitate mass customization and a just-in-time production model.
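The tracking-and-tracing role of the RFID middleware described above can be sketched, in heavily simplified form, as an event log keyed by tag ID. The station names, tag format, and event shape below are illustrative assumptions, not details from the cited prototype:

```python
# Minimal sketch of RFID-style process tracking: each read event records which
# tag (work item) passed which station, giving a traceable production pedigree.
# Station names, the tag format, and the event shape are illustrative assumptions.
from collections import defaultdict

class TrackingMiddleware:
    def __init__(self):
        # tag id -> ordered history of stations where the tag was read
        self.pedigree = defaultdict(list)

    def on_read(self, tag_id, station):
        """Called by a reader whenever a tag is detected at a station."""
        self.pedigree[tag_id].append(station)

    def trace(self, tag_id):
        """Real-time pedigree access: where has this item been so far?"""
        return list(self.pedigree[tag_id])

mw = TrackingMiddleware()
for station in ["frame-welding", "painting", "assembly"]:
    mw.on_read("BIKE-0001", station)
print(mw.trace("BIKE-0001"))
```

A real deployment would layer reader hardware integration, duplicate-read filtering, and distributed storage on top of this idea; the sketch only shows why per-tag read events are enough to reconstruct a process flow.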
Integrating financial and strategic planning.
Pivnicny, V C
1989-09-01
As hospitals face mounting profitability and liquidity concerns, the need to integrate strategic and financial planning also will continue to grow. This article describes a process for integrating these planning functions and the ideal organizational framework to facilitate the process. Obstacles to the integration of these planning processes also are discussed.
Advanced process control framework initiative
NASA Astrophysics Data System (ADS)
Hill, Tom; Nettles, Steve
1997-01-01
The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher density processes, radically new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of Manufacturing Execution Systems, process control tools, and wafer fabrication equipment to provide the necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking the knowledge of key process settings to desired product characteristics that reside in models created with commercial model development tools. The unique framework-based approach facilitates integration of commercial tools and reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project.
Each has complementary goals and expertise to contribute; Honeywell represents the supplier viewpoint, AMD represents the user with 'real customer requirements', and SEMATECH provides a consensus-building organization that widely disseminates technology to suppliers and users in the semiconductor industry that face similar equipment and factory control systems challenges.
Design of a framework for modeling, integration and simulation of physiological models.
Erson, E Zeynep; Cavuşoğlu, M Cenk
2012-09-01
Multiscale modeling and integration of physiological models carry challenges due to the complex nature of physiological processes. High coupling within and among scales presents a significant challenge in constructing and integrating multiscale physiological models. In order to deal with such challenges in a systematic way, there is a significant need for an information technology framework, together with related analytical and computational tools, that will facilitate the integration of models and simulations of complex biological systems. The Physiological Model Simulation, Integration and Modeling Framework (Phy-SIM) is an information technology framework providing the tools to facilitate the development, integration and simulation of integrated models of human physiology. Phy-SIM brings software-level solutions to the challenges raised by the complex nature of physiological systems. The aim of Phy-SIM, and of this paper, is to lay a foundation with new approaches such as information flow and modular representation of physiological models. The ultimate goal is to enhance the development of both the models and the integration approaches of multiscale physiological processes, and this paper therefore focuses on the design approaches that would achieve such a goal. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Phan, Huy Phuong
2010-01-01
The main aim of this study is to test a conceptualised framework that involved the integration of achievement goals, self-efficacy and self-esteem beliefs, and study-processing strategies. Two hundred and ninety (178 females, 112 males) first-year university students were administered a number of Likert-scale inventories in tutorial classes. Data…
Yang, Yong
2016-09-01
Recently, research on utilitarian walking has gained momentum due to its benefits on both health and the environment. However, our overall understanding of how built and social environments affect travel mode choice (walking or not) is still limited, and most existing frameworks on travel mode choice lack dynamic processes. After a review of several mainstream theories and a number of frameworks, we propose an integrated framework. The basic constructs in the travel mode choice function are utilities, constraints, attitudes, and habits. With a hierarchical structure and heuristic rules, the travel mode choice function is modified by individual characteristics and travel characteristics. The framework explicitly presents several dynamic processes, including the perception process on the environment, attitude formation process, habit formation process, interactions among an individual's own behaviors, interactions among travelers, feedback from travel to the built and social environments, and feedback from other behaviors to the built and social environments. For utilitarian walking, the framework may contribute to the study design, data collection, adoption of new research methods, and provide indications for policy interventions.
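The "travel mode choice function" built from utilities, constraints, attitudes, and habits can be illustrated with a deliberately tiny sketch. All attribute names, weights, and the habit bonus below are invented for illustration; the paper's framework is conceptual and does not supply these numbers:

```python
# Toy sketch of a utility-based travel mode choice function: each mode gets a
# utility from illustrative weights, a constraint filters infeasible modes, and
# a habit term nudges the previously chosen mode. All numbers are assumptions.

def choose_mode(distance_km, owns_car, last_mode=None, habit_bonus=0.5):
    utilities = {
        "walk": 3.0 - 1.2 * distance_km,  # walking attractive only for short trips
        "car": 2.0 - 0.1 * distance_km,   # car has fixed overhead, scales slowly
    }
    if not owns_car:                      # constraint: car mode unavailable
        del utilities["car"]
    if last_mode in utilities:            # habit formation: bias toward past choice
        utilities[last_mode] += habit_bonus
    return max(utilities, key=utilities.get)

print(choose_mode(0.5, owns_car=True))   # short trip
print(choose_mode(5.0, owns_car=True))   # long trip
```

The habit term makes the function path-dependent, which is one simple way to capture the dynamic processes the framework emphasizes: near the indifference point between modes, the previous choice tips the outcome.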
A Web-Based Monitoring System for Multidisciplinary Design Projects
NASA Technical Reports Server (NTRS)
Rogers, James L.; Salas, Andrea O.; Weston, Robert P.
1998-01-01
In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary computational environments, is defined as a hardware and software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, integrated with an existing framework, can improve these areas of weakness. This paper describes a Web-based system that optimizes and controls the execution sequence of design processes; and monitors the project status and results. The three-stage evolution of the system with increasingly complex problems demonstrates the feasibility of this approach.
Toward a More Flexible Web-Based Framework for Multidisciplinary Design
NASA Technical Reports Server (NTRS)
Rogers, J. L.; Salas, A. O.
1999-01-01
In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary design, is defined as a hardware-software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, monitoring, controlling, and displaying the design process. The objective of this research is to explore how Web technology can improve these areas of weakness and lead toward a more flexible framework. This article describes a Web-based system that optimizes and controls the execution sequence of design processes in addition to monitoring the project status and displaying the design results.
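Neither abstract specifies how the framework "optimizes and controls the execution sequence" of disciplinary processes; a dependency-driven ordering such as a topological sort is one plausible core of such a capability. The disciplinary processes and their dependencies below are hypothetical:

```python
# Sketch: ordering disciplinary analysis codes by their data dependencies so
# each process runs only after the processes it consumes data from.
# The process names and dependency edges are invented for illustration.
from graphlib import TopologicalSorter

dependencies = {
    "aerodynamics": set(),
    "structures": {"aerodynamics"},              # needs aerodynamic loads
    "weights": {"structures"},                   # needs structural sizing
    "performance": {"aerodynamics", "weights"},  # needs both upstream results
}

order = list(TopologicalSorter(dependencies).static_order())
print(order)  # every process appears after all of its inputs
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, which in a design framework would flag coupled disciplines that need an iterative (fixed-point) scheme rather than a simple sequence.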
Moral judgment as information processing: an integrative review.
Guglielmo, Steve
2015-01-01
How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.
Examination of the consumer decision process for residential energy use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dinan, T.M.
1987-01-01
Numerous studies have examined the factors that influence consumers' energy-using behavior. A comprehensive review of these studies was conducted in which articles from different research disciplines (economics, sociology, psychology, and marketing) were examined. This paper discusses a subset of these studies and, based on the findings of the review, offers recommendations for future research. The literature review revealed a need to develop an integrated framework for examining consumers' energy-using behavior. This integrated framework should simultaneously consider both price and nonprice factors that underlie energy use decisions. It should also examine the process by which decisions are made, as well as the factors that affect the decision outcome. This paper provides a suggested integrated framework for future research and discusses the data required to support this framework. 23 references, 3 figures.
Corvin, Jaime A; DeBate, Rita; Wolfe-Quintero, Kate; Petersen, Donna J
2017-01-01
In the twenty-first century, the dynamics of health and health care are changing, necessitating a commitment to revising traditional public health curricula to better meet present-day challenges. This article describes how the College of Public Health at the University of South Florida utilized the Intervention Mapping framework to translate revised core competencies into an integrated, theory-driven core curriculum to meet the training needs of the twenty-first century public health scholar and practitioner. This process resulted in the development of four sequenced courses: History and Systems of Public Health and Population Assessment I delivered in the first semester and Population Assessment II and Translation to Practice delivered in the second semester. While the transformation process, moving from traditional public health core content to an integrated and innovative curriculum, is a challenging and daunting task, Intervention Mapping provides the ideal framework for guiding this process. Intervention Mapping walks the curriculum developers from the broad goals and objectives to the finite details of a lesson plan. Throughout this process, critical lessons were learned, including the importance of being open to new ideologies and frameworks and the critical need to involve key stakeholders in every step of the decision-making process to ensure the sustainability of the resulting integrated and theory-based curriculum. Ultimately, as a stronger curriculum emerged, the developers and instructors themselves were changed, fostering a stronger public health workforce from within.
Towards Integrated Health Technology Assessment for Improving Decision Making in Selected Countries.
Oortwijn, Wija; Determann, Domino; Schiffers, Krijn; Tan, Siok Swan; van der Tuin, Jeroen
2017-09-01
To assess the level of comprehensiveness of health technology assessment (HTA) practices around the globe and to formulate recommendations for enhancing legitimacy and fairness of related decision-making processes. To identify best practices, we developed an evaluation framework consisting of 13 criteria on the basis of the INTEGRATE-HTA model (integrative perspective on assessing health technologies) and the Accountability for Reasonableness framework (deliberative appraisal process). We examined different HTA systems in middle-income countries (Argentina, Brazil, and Thailand) and high-income countries (Australia, Canada, England, France, Germany, Scotland, and South Korea). For this purpose, desk research and structured interviews with relevant key stakeholders (N = 32) in the selected countries were conducted. HTA systems in Canada, England, and Scotland appear relatively well aligned with our framework, followed by Australia, Germany, and France. Argentina and South Korea are at an early stage, whereas Brazil and Thailand are at an intermediate level. Both desk research and interviews revealed that scoping is often not part of the HTA process. In contrast, providing evidence reports for assessment is well established. Indirect and unintended outcomes are increasingly considered, but there is room for improvement. Monitoring and evaluation of the HTA process is not well established across countries. Finally, adopting transparent and robust processes, including stakeholder consultation, takes time. This study presents a framework for assessing the level of comprehensiveness of the HTA process in a country. On the basis of applying the framework, we formulate recommendations on how the HTA community can move toward a more integrated decision-making process using HTA.
A cognitive perspective on health systems integration: results of a Canadian Delphi study.
Evans, Jenna M; Baker, G Ross; Berta, Whitney; Barnsley, Jan
2014-05-19
Ongoing challenges to healthcare integration point toward the need to move beyond structural and process issues. While we know what needs to be done to achieve integrated care, there is little that informs us as to how. We need to understand how diverse organizations and professionals develop shared knowledge and beliefs - that is, we need to generate knowledge about normative integration. We present a cognitive perspective on integration, based on shared mental model theory, that may enhance our understanding and ability to measure and influence normative integration. The aim of this paper is to validate and improve the Mental Models of Integrated Care (MMIC) Framework, which outlines important knowledge and beliefs whose convergence or divergence across stakeholder groups may influence inter-professional and inter-organizational relations. We used a two-stage web-based modified Delphi process to test the MMIC Framework against expert opinion using a random sample of participants from Canada's National Symposium on Integrated Care. Respondents were asked to rate the framework's clarity, comprehensiveness, usefulness, and importance using seven-point ordinal scales. Spaces for open comments were provided. Descriptive statistics were used to describe the structured responses, while open comments were coded and categorized using thematic analysis. The Kruskal-Wallis test was used to examine cross-group agreement by level of integration experience, current workplace, and current role. In the first round, 90 individuals responded (52% response rate), representing a wide range of professional roles and organization types from across the continuum of care. In the second round, 68 individuals responded (75.6% response rate). The quantitative and qualitative feedback from experts was used to revise the framework. The renamed "Integration Mindsets Framework" consists of a Strategy Mental Model and a Relationships Mental Model, comprising a total of nineteen content areas.
The Integration Mindsets Framework draws the attention of researchers and practitioners to how various stakeholders think about and conceptualize integration. A cognitive approach to understanding and measuring normative integration complements dominant cultural approaches and allows for more fine-grained analyses. The framework can be used by managers and leaders to facilitate the interpretation, planning, implementation, management and evaluation of integration initiatives.
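The Kruskal-Wallis test mentioned above compares ordinal ratings across independent groups without assuming normality. A minimal sketch with SciPy, using invented seven-point ratings for three hypothetical respondent groups (the study's actual data are not reproduced here):

```python
from scipy.stats import kruskal

# Hypothetical seven-point ordinal ratings from three respondent groups.
managers    = [6, 7, 5, 6, 7]
clinicians  = [4, 5, 5, 6, 4]
researchers = [5, 6, 6, 5, 7]

# Kruskal-Wallis H-test: are the rating distributions detectably different?
stat, p = kruskal(managers, clinicians, researchers)
# A large p-value suggests no detectable difference in ratings across groups.
```

The same call generalizes to any number of groups, which fits the study's comparison by experience level, workplace, and role.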
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulvatunyou, Boonserm; Wysk, Richard A.; Cho, Hyunbo
2004-06-01
In today's global manufacturing environment, manufacturing functions are distributed as never before. Design, engineering, fabrication, and assembly of new products are done routinely in many different enterprises scattered around the world. Successful business transactions require the sharing of design and engineering data on an unprecedented scale. This paper describes a framework that facilitates the collaboration of engineering tasks, particularly process planning and analysis, to support such globalized manufacturing activities. The information models of data and the software components that integrate those information models are described. The integration framework uses an Integrated Product and Process Data (IPPD) representation called a Resource Independent Operation Summary (RIOS) to facilitate the communication of business and manufacturing requirements. Hierarchical process modeling, process planning decomposition, and an augmented AND/OR directed graph are used in this representation. The Resource Specific Process Planning (RSPP) module assigns required equipment and tools, selects process parameters, and determines manufacturing costs based on two-level hierarchical RIOS data. The shop floor knowledge (resource and process knowledge) and a hybrid approach (heuristic and linear programming) to linearize the AND/OR graph provide the basis for the planning. Finally, a prototype system is developed and demonstrated with an exemplary part. Java and XML (Extensible Markup Language) are used to ensure software and information portability.
A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine
Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun
2017-01-01
In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184
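The motion synthesis step above regresses from independent variables to a motion via Gaussian process regression. A hypothetical sketch with scikit-learn; the paper's actual inputs, training data, and model configuration are not specified here, so the variable names and values below are illustrative only:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical training data: an independent variable (e.g. squat depth)
# mapped to a joint-angle trajectory sampled at three time points.
X_train = np.array([[0.4], [0.6], [0.8], [1.0]])           # squat depth
y_train = np.array([[10.0, 35.0, 10.0],
                    [15.0, 55.0, 15.0],
                    [20.0, 75.0, 20.0],
                    [25.0, 95.0, 25.0]])                   # knee angle over time

# Fit a GP with an RBF kernel; multi-output targets are supported directly.
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
gpr.fit(X_train, y_train)

# Synthesize a trajectory for an unseen depth value.
trajectory = gpr.predict(np.array([[0.7]]))
```

The appeal of a GP here is that a single fitted model interpolates smoothly between measured motions, which is what lets the framework generate squat motion for parameter values never physically recorded.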
Nkhata, Bimo Abraham; Breen, Charles
2010-02-01
This article discusses how the concept of integrated learning systems provides a useful means of exploring the functional linkages between the governance and management of public protected areas. It presents a conceptual framework of an integrated learning system that explicitly incorporates learning processes in governance and management subsystems. The framework is premised on the assumption that an understanding of an integrated learning system is essential if we are to successfully promote learning across multiple scales as a fundamental component of adaptability in the governance and management of protected areas. The framework is used to illustrate real-world situations that reflect the nature and substance of the linkages between governance and management. Drawing on lessons from North America and Africa, the article demonstrates that the establishment and maintenance of an integrated learning system take place in a complex context which links elements of governance learning and management learning subsystems. The degree to which the two subsystems are coupled influences the performance of an integrated learning system and ultimately adaptability. Such performance is largely determined by how integrated learning processes allow for the systematic testing of societal assumptions (beliefs, values, and public interest) to enable society and protected area agencies to adapt and learn in the face of social and ecological change. It is argued that an integrated perspective provides a potentially useful framework for explaining and improving shared understanding around which the concept of adaptability is structured and implemented.
Kononowicz, Andrzej A; Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil
2014-01-23
Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients' interactivity by enriching them with computational models of physiological and pathological processes. The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot a first investigation of how changing framework variables alters perceptions of integration. The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element includes three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; and advanced, including dynamic solution of the model. 
The third element describes four integration strategies, and the fourth consists of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher levels of integration more interesting, but at the same time they were more concerned with the validity of the results. The observed differences were not statistically significant. This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome.
The extraction and integration framework: a two-process account of statistical learning.
Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G
2013-07-01
The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other.
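The transitional probabilities at the heart of the framework above are simple conditional frequencies over adjacent items. A minimal sketch with a toy syllable stream (the syllables are invented for illustration, not taken from the paper):

```python
from collections import Counter

def transitional_probabilities(sequence):
    """P(B | A) = count(A followed by B) / count(A), over adjacent pairs."""
    pair_counts = Counter(zip(sequence, sequence[1:]))
    first_counts = Counter(sequence[:-1])
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

# Toy stream: "ba" is always followed by "by" (a word-internal transition),
# while "by" is followed by different syllables (a word boundary).
stream = ["ba", "by", "go", "ba", "by", "tu", "ba", "by"]
tp = transitional_probabilities(stream)
```

High within-unit probabilities (here P(by|ba) = 1.0) against low boundary probabilities are exactly the conditional statistics the extraction process is proposed to exploit.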
A Comparison of Product Realization Frameworks
1993-10-01
Includes BOLD for on-line documentation delivery, printer/plotter support, and network licensing support. Documentation tools include an on-line information system (BOLD), text editing (Notepad), and word processing (integrated FrameMaker) within an application. FrameMaker is fully integrated with the Falcon Framework to provide consistent documentation capabilities within engineering.
Radin Umar, Radin Zaid; Sommerich, Carolyn M; Lavender, Steve A; Sanders, Elizabeth; Evans, Kevin D
2018-05-14
Sound workplace ergonomics and safety-related interventions may be resisted by employees, and this may be detrimental to multiple stakeholders. Understanding fundamental aspects of decision making, behavioral change, and learning cycles may provide insights into pathways influencing employees' acceptance of interventions. This manuscript reviews published literature on thinking processes and other topics relevant to decision making and incorporates the findings into two new conceptual frameworks of the workplace change adoption process. Such frameworks are useful for thinking about adoption in different ways and testing changes to traditional intervention implementation processes. Moving forward, it is recommended that future research focus on systematic exploration of implementation process activities that integrate principles from the research literature on sensemaking, decision making, and learning processes. Such exploration may provide the groundwork for development of specific implementation strategies that are theoretically grounded and provide a revised understanding of how successful intervention adoption processes work.
An automated and integrated framework for dust storm detection based on ogc web processing services
NASA Astrophysics Data System (ADS)
Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.
2014-11-01
Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advances in scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines geo-processing frameworks, scientific models, and EO data to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) standard initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detection and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detection and tracking component combines three Earth science models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters), and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented in KML/XML and displayed in Google Earth. A severe dust storm that occurred over East Asia from 26 to 28 April 2012 is used to test the applicability of the proposed WPS framework. 
Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and scientific workflow approach together. The experimental result shows that this newly automated and integrated framework can be used to give advance near real-time warning of dust storms, for both environmental authorities and the public. The methods presented in this paper might also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.
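The service chain described (retrieve EO data, derive AOT, simulate transport) is a linear pipeline in which each stage consumes the previous stage's output. A plain-Python sketch of that orchestration pattern; the real framework chains OGC WPS services via BPEL4WS, and the function names and numbers below are stand-ins, not the actual models:

```python
# Hypothetical stand-ins for the chained services; each function mimics
# one stage of the SBDART -> WRF/HYSPLIT chain described in the abstract.

def retrieve_eo_data(region, date):
    # Stand-in for the OPeNDAP-based EO data retrieval component.
    return {"region": region, "date": date, "radiance": [0.31, 0.44, 0.52]}

def compute_aot(eo_data):
    # Stand-in for the SBDART step: derive aerosol optical depth (AOT).
    return {"aot": [r * 2.0 for r in eo_data["radiance"]]}

def simulate_transport(aot_field):
    # Stand-in for the WRF/HYSPLIT steps: flag a dust event if AOT is high.
    return {"dust_event": max(aot_field["aot"]) > 0.8}

def run_chain(region, date):
    # The orchestration engine's job: wire stage outputs to stage inputs.
    return simulate_transport(compute_aot(retrieve_eo_data(region, date)))

result = run_chain("East Asia", "2012-04-26")
```

In the actual framework each of these calls would be a WPS Execute request, with the BPEL engine handling sequencing, fault handling, and data hand-off between services.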
Climbing the ladder: capability maturity model integration level 3
NASA Astrophysics Data System (ADS)
Day, Bryce; Lutteroth, Christof
2011-02-01
This article details the attempt to form a complete workflow model for an information and communication technologies (ICT) company in order to achieve Capability Maturity Model Integration (CMMI) maturity level 3. During this project, business processes across the company's core and auxiliary sectors were documented and extended using modern enterprise modelling tools and The Open Group Architecture Framework (TOGAF) methodology. Different challenges were encountered with regard to process customisation and tool support for enterprise modelling. In particular, there were problems with the reuse of process models, the integration of different project management methodologies and the integration of the Rational Unified Process development process framework that had to be solved. We report on these challenges and the perceived effects of the project on the company. Finally, we point out research directions that could help to improve the situation in the future.
Thinking Graphically: Connecting Vision and Cognition during Graph Comprehension
ERIC Educational Resources Information Center
Ratwani, Raj M.; Trafton, J. Gregory; Boehm-Davis, Deborah A.
2008-01-01
Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive…
Language repetition and short-term memory: an integrative framework.
Majerus, Steve
2013-01-01
Short-term maintenance of verbal information is a core factor of language repetition, especially when reproducing multiple or unfamiliar stimuli. Many models of language processing locate the verbal short-term maintenance function in the left posterior superior temporo-parietal area and its connections with the inferior frontal gyrus. However, research in the field of short-term memory has implicated bilateral fronto-parietal networks, involved in attention and serial order processing, as being critical for the maintenance and reproduction of verbal sequences. We present here an integrative framework aimed at bridging research in the language processing and short-term memory fields. This framework considers verbal short-term maintenance as an emergent function resulting from synchronized and integrated activation in dorsal and ventral language processing networks as well as fronto-parietal attention and serial order processing networks. To-be-maintained item representations are temporarily activated in the dorsal and ventral language processing networks, novel phoneme and word serial order information is proposed to be maintained via a right fronto-parietal serial order processing network, and activation in these different networks is proposed to be coordinated and maintained via a left fronto-parietal attention processing network. This framework provides new perspectives for our understanding of information maintenance at the non-word-, word- and sentence-level as well as of verbal maintenance deficits in case of brain injury.
Distributed software framework and continuous integration in hydroinformatics systems
NASA Astrophysics Data System (ADS)
Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao
2017-08-01
When encountering multiple and complicated models, multisource structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of group lakes in Wuhan, China is established.
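The master-node/server-cluster arrangement described above can be sketched with a plain producer-consumer queue. This is a minimal illustration of the dispatch pattern, not the paper's actual system; the job fields and `run_model` stand-in are invented for the example.

```python
# Minimal sketch of a master node dispatching model runs to a cluster of
# model servers (simulated here by worker threads) and gathering results.
import queue
import threading

def run_model(job):
    # Stand-in for a hydrological model run executed on a cluster node.
    return {"job": job, "result": job["inflow"] * 2}

def worker(jobs, results):
    while True:
        job = jobs.get()
        if job is None:          # poison pill: shut this worker down
            jobs.task_done()
            break
        results.put(run_model(job))
        jobs.task_done()

def master(job_list, n_workers=3):
    jobs, results = queue.Queue(), queue.Queue()
    threads = [threading.Thread(target=worker, args=(jobs, results))
               for _ in range(n_workers)]
    for t in threads:
        t.start()
    for job in job_list:
        jobs.put(job)
    for _ in threads:            # one poison pill per worker
        jobs.put(None)
    for t in threads:
        t.join()
    return [results.get() for _ in range(len(job_list))]

outputs = master([{"inflow": i} for i in range(5)])
```

In the real framework the workers would be separate server processes behind a network protocol rather than threads, but the dispatch/collect contract is the same.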
Valentijn, Pim P.; Schepman, Sanneke M.; Opheij, Wilfrid; Bruijnzeels, Marc A.
2013-01-01
Introduction: Primary care has a central role in integrating care within a health system. However, conceptual ambiguity regarding integrated care hampers a systematic understanding. This paper proposes a conceptual framework that combines the concepts of primary care and integrated care, in order to understand the complexity of integrated care. Methods: The search method involved a combination of electronic database searches, hand searches of reference lists (snowball method) and contacting researchers in the field. The process of synthesizing the literature was iterative, to relate the concepts of primary care and integrated care. First, we identified the general principles of primary care and integrated care. Second, we connected the dimensions of integrated care and the principles of primary care. Finally, to improve content validity we held several meetings with researchers in the field to develop and refine our conceptual framework. Results: The conceptual framework combines the functions of primary care with the dimensions of integrated care. Person-focused and population-based care serve as guiding principles for achieving integration across the care continuum. Integration plays complementary roles on the micro (clinical integration), meso (professional and organisational integration) and macro (system integration) levels. Functional and normative integration ensure connectivity between the levels. Discussion: The presented conceptual framework is a first step to achieve a better understanding of the inter-relationships among the dimensions of integrated care from a primary care perspective. PMID:23687482
Smolensky, Paul; Goldrick, Matthew; Mathis, Donald
2014-08-01
Mental representations have continuous as well as discrete, combinatorial properties. For example, while predominantly discrete, phonological representations also vary continuously; this is reflected by gradient effects in instrumental studies of speech production. Can an integrated theoretical framework address both aspects of structure? The framework we introduce here, Gradient Symbol Processing, characterizes the emergence of grammatical macrostructure from the Parallel Distributed Processing microstructure (McClelland, Rumelhart, & The PDP Research Group, 1986) of language processing. The mental representations that emerge, Distributed Symbol Systems, have both combinatorial and gradient structure. They are processed through Subsymbolic Optimization-Quantization, in which an optimization process favoring representations that satisfy well-formedness constraints operates in parallel with a distributed quantization process favoring discrete symbolic structures. We apply a particular instantiation of this framework, λ-Diffusion Theory, to phonological production. Simulations of the resulting model suggest that Gradient Symbol Processing offers a way to unify accounts of grammatical competence with both discrete and continuous patterns in language performance. Copyright © 2013 Cognitive Science Society, Inc.
Competence-Based Approach in Value Chain Processes
NASA Astrophysics Data System (ADS)
Azevedo, Rodrigo Cambiaghi; D'Amours, Sophie; Rönnqvist, Mikael
There is a gap between competence theory and value chain processes frameworks. While individually considered as core elements in contemporary management thinking, the integration of the two concepts is still lacking. We claim that this integration would allow for the development of more robust business models by structuring value chain activities around aspects such as capabilities and skills, as well as individual and organizational knowledge. In this context, the objective of this article is to reduce this gap and consequently open a field for further improvements of value chain processes frameworks.
On the nature of cross-disciplinary integration: A philosophical framework.
O'Rourke, Michael; Crowley, Stephen; Gonnerman, Chad
2016-04-01
Meeting grand challenges requires responses that constructively combine multiple forms of expertise, both academic and non-academic; that is, it requires cross-disciplinary integration. But just what is cross-disciplinary integration? In this paper, we supply a preliminary answer by reviewing prominent accounts of cross-disciplinary integration from two literatures that are rarely brought together: cross-disciplinarity and philosophy of biology. Reflecting on similarities and differences in these accounts, we develop a framework that integrates their insights: integration as a generic combination process whose details are determined by the specific contexts in which particular integrations occur. One such context is cross-disciplinary research, which yields cross-disciplinary integration. We close by reflecting on the potential applicability of this framework to research efforts aimed at meeting grand challenges. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Liu, Shuai; Chen, Ge; Yao, Shifeng; Tian, Fenglin; Liu, Wei
2017-07-01
This paper presents a novel integrated marine visualization framework which focuses on processing and analyzing multi-dimensional spatiotemporal marine data in one workflow. Effective marine data visualization is needed for extracting useful patterns, recognizing changes, and understanding physical processes in oceanographic research. However, the multi-source, multi-format, multi-dimensional characteristics of marine data pose a challenge for interactive and timely marine data analysis and visualization in one workflow. A global multi-resolution virtual terrain environment is also needed to give oceanographers and the public a real geographic background reference and to help them identify the geographical variation of ocean phenomena. This paper introduces a data integration and processing method to efficiently visualize and analyze heterogeneous marine data. Based on the data we processed, several GPU-based visualization methods are explored to interactively demonstrate marine data. GPU-tessellated global terrain rendering using ETOPO1 data is realized, and video memory usage is controlled to ensure high efficiency. A modified ray-casting algorithm for the uneven multi-section Argo volume data is also presented, and the transfer function is designed to analyze the 3D structure of ocean phenomena. Based on the framework we designed, an integrated visualization system is realized, and its effectiveness and efficiency are demonstrated. This system is expected to make a significant contribution to the demonstration and understanding of marine physical processes in a virtual global environment.
Improved analyses using function datasets and statistical modeling
John S. Hogland; Nathaniel M. Anderson
2014-01-01
Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...
ERIC Educational Resources Information Center
Charman, Steve D.; Carlucci, Marianna; Vallano, Jon; Gregory, Amy Hyman
2010-01-01
The current manuscript proposes a theory of how witnesses assess their confidence following a lineup identification, called the selective cue integration framework (SCIF). Drawing from past research on the postidentification feedback effect, the SCIF details a three-stage process of confidence assessment that is based largely on a…
eClims: An Extensible and Dynamic Integration Framework for Biomedical Information Systems.
Savonnet, Marinette; Leclercq, Eric; Naubourg, Pierre
2016-11-01
Biomedical information systems (BIS) require consideration of three types of variability: data variability induced by new high throughput technologies, schema or model variability induced by large scale studies or new fields of research, and knowledge variability resulting from new discoveries. Beyond data heterogeneity, managing variabilities in the context of BIS requires an extensible and dynamic integration process. In this paper, we focus on data and schema variabilities and we propose an integration framework based on ontologies, master data, and semantic annotations. The framework addresses issues related to: 1) collaborative work through a dynamic integration process; 2) variability among studies using an annotation mechanism; and 3) quality control over data and semantic annotations. Our approach relies on two levels of knowledge: BIS-related knowledge is modeled using an application ontology coupled with UML models that allow controlling data completeness and consistency, and domain knowledge is described by a domain ontology, which ensures data coherence. A system built with the eClims framework has been implemented and evaluated in the context of a proteomic platform.
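The completeness and consistency controls mentioned above can be illustrated with a small schema-validation sketch. The field names and rules here are invented for the example; eClims' actual UML-driven models are far richer.

```python
# Schema-level completeness/consistency checking, in the spirit of the
# model-driven quality control described above (illustrative only).
SCHEMA = {
    "sample_id": str,
    "protein":   str,
    "abundance": float,
}

def validate(record, schema=SCHEMA):
    """Return a list of completeness/consistency violations for one record."""
    errors = []
    for field, expected_type in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")          # completeness
        elif not isinstance(record[field], expected_type):
            errors.append(f"bad type for {field}")            # consistency
    return errors

ok  = validate({"sample_id": "S1", "protein": "P53", "abundance": 1.5})
bad = validate({"sample_id": "S2", "protein": "P53"})
```

A production system would derive the schema from the application ontology rather than hard-coding it, which is what gives the framework its extensibility.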
Dictionary-based image reconstruction for superresolution in integrated circuit imaging.
Cilingiroglu, T Berkin; Uyar, Aydan; Tuysuzoglu, Ahmet; Karl, W Clem; Konrad, Janusz; Goldberg, Bennett B; Ünlü, M Selim
2015-06-01
Resolution improvement through signal processing techniques for integrated circuit imaging is becoming more crucial as the rapid decrease in integrated circuit dimensions continues. Although there is a significant effort to push the limits of optical resolution for backside fault analysis through the use of solid immersion lenses, higher order laser beams, and beam apodization, signal processing techniques are required for additional improvement. In this work, we propose a sparse image reconstruction framework which couples overcomplete dictionary-based representation with a physics-based forward model to improve resolution and localization accuracy in high numerical aperture confocal microscopy systems for backside optical integrated circuit analysis. The effectiveness of the framework is demonstrated on experimental data.
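Dictionary-based sparse representation of the kind named above can be sketched with greedy matching pursuit over a small overcomplete dictionary. This is a generic illustration of the technique, not the authors' reconstruction pipeline, which additionally couples the dictionary with a physics-based forward model of the confocal microscope.

```python
# Greedy matching pursuit: approximate a signal as a sparse combination
# of atoms drawn from an overcomplete dictionary (toy 4-sample signals).
from math import sqrt

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def normalize(v):
    n = sqrt(dot(v, v))
    return [a / n for a in v]

def matching_pursuit(signal, atoms, n_iter=2):
    """Pick the best-correlated atom each round and subtract its projection."""
    residual = list(signal)
    coeffs = {}
    for _ in range(n_iter):
        best = max(range(len(atoms)),
                   key=lambda i: abs(dot(residual, atoms[i])))
        c = dot(residual, atoms[best])
        coeffs[best] = coeffs.get(best, 0.0) + c
        residual = [r - c * a for r, a in zip(residual, atoms[best])]
    return coeffs, residual

atoms = [normalize([1, 1, 0, 0]),   # three atoms for 4-sample signals:
         normalize([0, 0, 1, 1]),   # an overcomplete dictionary would
         normalize([1, 0, 0, 1])]   # hold many more atoms than dimensions
coeffs, residual = matching_pursuit([2, 2, 0, 0], atoms, n_iter=1)
```

Because the test signal is proportional to the first atom, a single pursuit step explains it exactly and the residual vanishes.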
Strasser, T; Peters, T; Jagle, H; Zrenner, E; Wilke, R
2010-01-01
Electrophysiology of vision - especially the electroretinogram (ERG) - is used as a non-invasive way for functional testing of the visual system. The ERG is a combined electrical response generated by neural and non-neuronal cells in the retina in response to light stimulation. This response can be recorded and used for diagnosis of numerous disorders. For both clinical practice and clinical trials it is important to process those signals in an accurate and fast way and to provide the results as structured, consistent reports. Therefore, we developed a freely available and open-source framework in Java (http://www.eye.uni-tuebingen.de/project/idsI4sigproc). The framework is focused on easy integration with existing applications. By leveraging well-established software patterns like pipes-and-filters and fluent interfaces, as well as by designing the application programming interfaces (API) as an integrated domain specific language (DSL), the overall framework provides a smooth learning curve. Additionally, it already contains several processing methods and visualization features and can be extended easily by implementing the provided interfaces. In this way, not only can new processing methods be added but the framework can also be adapted for other areas of signal processing. This article describes in detail the structure and implementation of the framework and demonstrates its application through the software package used in clinical practice and clinical trials at the University Eye Hospital Tuebingen, one of the largest departments in the field of visual electrophysiology in Europe.
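The pipes-and-filters plus fluent-interface combination described above is sketched below in Python rather than the framework's Java; the filter names (`detrend`, `rectify`, `moving_average`) are invented for illustration, not the framework's actual API.

```python
# Pipes-and-filters with a fluent interface: each filter transforms the
# samples and returns self, so filters chain into a readable pipeline DSL.
class SignalPipeline:
    def __init__(self, samples):
        self.samples = list(samples)

    def detrend(self):
        mean = sum(self.samples) / len(self.samples)
        self.samples = [s - mean for s in self.samples]
        return self                      # returning self enables chaining

    def rectify(self):
        self.samples = [abs(s) for s in self.samples]
        return self

    def moving_average(self, width=3):
        out = []
        for i in range(len(self.samples)):
            window = self.samples[max(0, i - width + 1): i + 1]
            out.append(sum(window) / len(window))
        self.samples = out
        return self

    def result(self):
        return self.samples

# The fluent chain reads like a description of the processing pipeline:
smoothed = SignalPipeline([1, 5, 1, 5, 1, 5]).detrend().rectify().result()
```

New filters slot in by adding another method that returns `self`, which is exactly the extension-by-interface property the abstract emphasizes.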
Integration of hybrid wireless networks in cloud services oriented enterprise information systems
NASA Astrophysics Data System (ADS)
Li, Shancang; Xu, Lida; Wang, Xinheng; Wang, Jue
2012-05-01
This article presents a hybrid wireless network integration scheme for cloud services-based enterprise information systems (EISs). With the emerging hybrid wireless network and cloud computing technologies, it is necessary to develop a scheme that can seamlessly integrate these new technologies into existing EISs. By combining hybrid wireless networks and cloud computing in EIS, a new framework is proposed, which includes a frontend layer, a middle layer and backend layers connected to IP EISs. Based on a collaborative architecture, a cloud services management framework and process diagram are presented. As a key feature, the proposed approach integrates access control functionalities within the hybrid framework, providing users with filtered views on available cloud services based on cloud service access requirements and user security credentials. In future work, we will implement the proposed framework over the SwanMesh platform by integrating the UPnP standard into an enterprise information system.
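The credential-filtered service view described above reduces to a simple predicate over the service catalogue. The service names and the `required_level` field are assumptions made for this sketch, not fields from the paper.

```python
# Filtered views on available cloud services: a user sees only services
# whose access requirement is within their security clearance level.
SERVICES = [
    {"name": "asset-tracking",  "required_level": 1},
    {"name": "plant-telemetry", "required_level": 2},
    {"name": "finance-reports", "required_level": 3},
]

def visible_services(services, user_level):
    """Filter the catalogue down to what this user's credentials allow."""
    return [s["name"] for s in services if s["required_level"] <= user_level]

view = visible_services(SERVICES, user_level=2)
```

In a deployed system the clearance check would consult an access-control policy service rather than an integer level, but the filtering contract is the same.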
Issues on Building Kazakhstan Geospatial Portal to Implement E-Government
NASA Astrophysics Data System (ADS)
Sagadiyev, K.; Kang, H. K.; Li, K. J.
2016-06-01
A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework to integrate and organize them. In particular, it is very useful to integrate the process of land management in e-government with a geospatial information framework, since most land management tasks are related to geospatial properties. In this paper, we present a use case from the e-government project in Kazakhstan for land management. We develop a geoportal to connect many tasks and different users via a geospatial information framework. This geoportal is based on open-source geospatial software including GeoServer, PostGIS, and OpenLayers. With this geoportal, we expect three achievements. First, we establish a transparent governmental process, which is one of the main goals of e-government: every stakeholder can monitor what is happening in the land management process. Second, we can significantly reduce the time and effort spent in the government process. For example, a grant procedure for a building construction has taken more than one year with more than 50 steps; it is expected that this procedure would be reduced to 2 weeks by the geoportal framework. Third, we provide a collaborative environment between different governmental structures via the geoportal, whereas many conflicts and mismatches have been a critical issue in governmental administration processes.
Adaptive Sensing and Fusion of Multi-Sensor Data and Historical Information
2009-11-06
In this report we integrate MTL and semi-supervised learning into a single framework, thereby exploiting two forms of contextual information. A key new objective of the… …process [8], denoted as X ∼ BeP(B), where B is a measure on Ω. If B is continuous, X is a Poisson process with intensity B and can be constructed as X = N
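The Bernoulli-process fragment above is truncated. For background, and assuming the report follows the standard beta-Bernoulli construction (as formulated by Thibaux and Jordan for hierarchical beta processes), the definition reads as follows; this is context, not a reconstruction of the report's exact equation:

```latex
% For a discrete base measure B = \sum_k q_k \delta_{\omega_k},
% the Bernoulli process X \sim \mathrm{BeP}(B) is
X = \sum_k z_k\,\delta_{\omega_k}, \qquad z_k \sim \mathrm{Bernoulli}(q_k).
% For continuous B, X is a Poisson process with intensity B:
X = \sum_{i=1}^{N} \delta_{\omega_i}, \qquad
N \sim \mathrm{Poisson}\!\big(B(\Omega)\big), \qquad
\omega_i \overset{\mathrm{iid}}{\sim} B / B(\Omega).
```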
Self-Referent Constructs and Medical Sociology: In Search of an Integrative Framework*
Kaplan, Howard B.
2010-01-01
A theoretical framework centering on four classes of self-referent constructs is offered as a device for integrating the diverse areas constituting medical sociology. Guidance by this framework sensitizes the researcher to the occurrence of parallel processes in adjacent disciplines, facilitates recognition of the etiological significance of findings from other disciplines for explaining medical sociological phenomena, and encourages transactions between sociology and medical sociology whereby each informs and is informed by the other. PMID:17583268
Erraguntla, Madhav; Zapletal, Josef; Lawley, Mark
2017-12-01
The impact of infectious disease on human populations is a function of many factors including environmental conditions, vector dynamics, transmission mechanics, social and cultural behaviors, and public policy. A comprehensive framework for disease management must fully connect the complete disease lifecycle, including emergence from reservoir populations, zoonotic vector transmission, and impact on human societies. The Framework for Infectious Disease Analysis is a software environment and conceptual architecture for data integration, situational awareness, visualization, prediction, and intervention assessment. It automatically collects biosurveillance data using natural language processing, integrates structured and unstructured data from multiple sources, applies advanced machine learning, and uses multi-modeling for analyzing disease dynamics and testing interventions in complex, heterogeneous populations. In the illustrative case studies, natural language processing of social media, news feeds, and websites was used for information extraction, biosurveillance, and situational awareness. Classification machine learning algorithms (support vector machines, random forests, and boosting) were used for disease prediction.
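The ensemble classifiers named above (random forests, boosting) all rest on combining many weak rules. The toy sketch below shows the combination idea with decision stumps and majority voting; the features, thresholds, and labels are invented for illustration and are not from the Framework for Infectious Disease Analysis.

```python
# Combining weak classifiers by majority vote -- a stand-in for the
# ensemble methods (random forests, boosting) used for disease prediction.
def stump(threshold, feature_index):
    """A decision stump: predicts 1 when one feature exceeds a threshold."""
    return lambda x: 1 if x[feature_index] > threshold else 0

def majority_vote(classifiers, x):
    votes = sum(clf(x) for clf in classifiers)
    return 1 if votes > len(classifiers) / 2 else 0

# Three weak rules over hypothetical (fever_temp, vector_density) features.
ensemble = [stump(38.0, 0), stump(0.5, 1), stump(37.5, 0)]

high_risk = majority_vote(ensemble, (39.1, 0.8))   # all three rules fire
low_risk  = majority_vote(ensemble, (36.5, 0.1))   # no rule fires
```

Boosting additionally weights each stump by its accuracy and reweights misclassified examples between rounds; unweighted voting is the simplest form of the same principle.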
Symbolic and Interactional Perspectives on Leadership: An Integrative Framework.
1985-05-01
Texas A&M University, College Station, Department of Management. Office of Naval Research Technical Report Series. Symbolic and Interactional Perspectives on Leadership: An Integrative Framework. Principal Investigators: Richard Daft and Ricky Griffin.
Integration of a three-dimensional process-based hydrological model into the Object Modeling System
USDA-ARS?s Scientific Manuscript database
The integration of a spatial process model into an environmental modelling framework can enhance the model’s capabilities. We present the integration of the GEOtop model into the Object Modeling System (OMS) version 3.0 and illustrate its application in a small watershed. GEOtop is a physically base...
ERIC Educational Resources Information Center
Lee, Young S.
2014-01-01
The article focuses on a systematic approach to the instructional framework to incorporate three aspects of sustainable design. It also aims to provide an instruction model for sustainable design stressing a collective effort to advance knowledge creation as a community. It develops a framework conjoining the concept of integrated process in…
a Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images
NASA Astrophysics Data System (ADS)
Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.
2015-07-01
Various sensors on airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other applications. However, it is challenging to efficiently store, query and process such big data due to its data- and computing-intensive nature. In this paper, a Hadoop-based framework is proposed to manage and process big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be directly fetched from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide affluent image processing operations. With the integration of HDFS, the Orfeo toolbox and MapReduce, these remote sensing images can be processed directly in parallel in a scalable computing environment. The experimental results show that the proposed framework can efficiently manage and process such big remote sensing data.
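The MapReduce pattern the framework builds on can be shown in miniature: per-tile work in the map phase, aggregation in the reduce phase. This is a conceptual analog in plain Python, not Hadoop code; the tile values are made up.

```python
# Map/reduce over image tiles: map computes per-tile partial statistics
# (work Hadoop would distribute across nodes); reduce merges them.
from functools import reduce

tiles = [
    [[10, 12], [11, 13]],    # each "tile" is a tiny 2x2 pixel block
    [[50, 52], [51, 53]],
]

def map_phase(tile):
    """Per-tile work, independent of every other tile."""
    pixels = [p for row in tile for p in row]
    return {"count": len(pixels), "total": sum(pixels)}

def reduce_phase(a, b):
    """Merge two partial results into one; associative, so order-safe."""
    return {"count": a["count"] + b["count"], "total": a["total"] + b["total"]}

partials = [map_phase(t) for t in tiles]
stats = reduce(reduce_phase, partials)
mean_value = stats["total"] / stats["count"]
```

Because `reduce_phase` is associative, Hadoop is free to merge partial results in any order and on any node, which is what makes the pattern scale.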
Quantifying an Integral Ecology Framework: A Case Study of the Riverina, Australia
NASA Astrophysics Data System (ADS)
Wheeler, Sarah A.; Haensch, Juliane; Edwards, Jane; Schirmer, Jackie; Zuo, Alec
2018-02-01
Communities in Australia's Murray-Darling Basin face the challenge of trying to achieve social, economic, and environmental sustainability, but experience entrenched conflict about the best way to achieve a sustainable future, especially for small rural communities. Integral ecology is a philosophical concept that seeks to address community, economic, social, and environmental sustainability simultaneously. Its inclusive processes are designed to reduce stakeholder conflict. However, to date the application of the integral ecology concept has been largely qualitative in nature. This study developed a quantitative integral ecology framework, and applied this framework to a case study of the Riverina, in the Murray-Darling Basin. Seventy-seven community-focused initiatives were assessed, ranked, and quantified. The majority of the ranked community-focused initiatives did not exhibit all aspects of integral ecology. Initiatives typically prioritized either (1) economic and community development or (2) environmental health; rarely both together. The integral ecology framework developed here enables recommendations on future community initiatives and may provide a pathway for community leaders and other policy-makers to more readily apply integral ecology objectives. Further research refining the framework's operationalization, application and implementation at a wider scale may enhance communities' capacity to develop and grow sustainably.
Steps toward improving ethical evaluation in health technology assessment: a proposed framework.
Assasi, Nazila; Tarride, Jean-Eric; O'Reilly, Daria; Schwartz, Lisa
2016-06-06
While evaluation of ethical aspects in health technology assessment (HTA) has gained much attention during the past years, the integration of ethics in HTA practice still presents many challenges. In response to the increasing demand for expansion of HTA methodology to include ethical issues more systematically, this article reports on a multi-stage study that aimed at constructing a framework for improving the integration of ethics in HTA. The framework was developed through the following phases: 1) a systematic review and content analysis of guidance documents for ethics in HTA; 2) identification of factors influencing the integration of ethical considerations in HTA; 3) preparation of an action-oriented framework based on the key elements of the existing guidance documents and identified barriers to and facilitators of their implementation; and 4) expert consultation and revision of the framework. The proposed framework consists of three main components: an algorithmic flowchart, which exhibits the different steps of an ethical inquiry throughout the HTA process, including defining the objectives and scope of the evaluation, stakeholder analysis, assessing organizational capacity, framing ethical evaluation questions, ethical analysis, deliberation, and knowledge translation; a stepwise guide, which focuses on the task objectives and potential questions that are required to be addressed at each step; and a list of some commonly recommended or used tools to help facilitate the evaluation process. The proposed framework can be used to support and promote good practice in the integration of ethics into HTA. However, further validation of the framework through case studies and expert consultation is required to establish its utility for HTA practice.
Shared mental models of integrated care: aligning multiple stakeholder perspectives.
Evans, Jenna M; Baker, G Ross
2012-01-01
Health service organizations and professionals are under increasing pressure to work together to deliver integrated patient care. A common understanding of integration strategies may facilitate the delivery of integrated care across inter-organizational and inter-professional boundaries. This paper aims to build a framework for exploring and potentially aligning multiple stakeholder perspectives of systems integration. The authors draw from the literature on shared mental models, strategic management and change, framing, stakeholder management, and systems theory to develop a new construct, Mental Models of Integrated Care (MMIC), which consists of three types of mental models, i.e. integration-task, system-role, and integration-belief. The MMIC construct encompasses many of the known barriers and enablers to integrating care while also providing a comprehensive, theory-based framework of psychological factors that may influence inter-organizational and inter-professional relations. While the existing literature on integration focuses on optimizing structures and processes, the MMIC construct emphasizes the convergence and divergence of stakeholders' knowledge and beliefs, and how these underlying cognitions influence interactions (or lack thereof) across the continuum of care. MMIC may help to: explain what differentiates effective from ineffective integration initiatives; determine system readiness to integrate; diagnose integration problems; and develop interventions for enhancing integrative processes and ultimately the delivery of integrated care. Global interest and ongoing challenges in integrating care underline the need for research on the mental models that characterize the behaviors of actors within health systems; the proposed framework offers a starting point for applying a cognitive perspective to health systems integration.
A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification
Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Varnum, Susan M.; Brown, Joseph N.; Riensche, Roderick M.; Adkins, Joshua N.; Jacobs, Jon M.; Hoidal, John R.; Scholand, Mary Beth; Pounds, Joel G.; Blackburn, Michael R.; Rodland, Karin D.; McDermott, Jason E.
2013-01-01
Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches: a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification. PMID:24223463
ERIC Educational Resources Information Center
Mano, Quintino R.
2016-01-01
Accumulating evidence suggests that literacy acquisition involves developing sensitivity to the statistical regularities of the textual environment. To organize accumulating evidence and help guide future inquiry, this article integrates data from disparate fields of study and formalizes a new two-process framework for developing sensitivity to…
NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.
ERIC Educational Resources Information Center
Zhou, Lina; Zhang, Dongsong
2003-01-01
Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques.…
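NLPIR is a theoretical framework, but the "representation distance" it assumes between queries and documents can be made concrete with a toy bag-of-words similarity. The sketch below is an invented illustration, not NLPIR's model; it mainly shows why pure keyword matching is a crude proxy (synonyms score zero).

```python
# Toy bag-of-words cosine similarity between a query and a document,
# illustrating the keyword-based IR baseline that NLP aims to improve on.
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity of two whitespace-tokenized, lowercased texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0
```

Note that `cosine("car", "automobile")` is 0.0 even though the meanings match; closing exactly that representation gap between surface keywords and meaning is the motivation for integrating NLP into IR.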
Organizational Context and Capabilities for Integrating Care: A Framework for Improvement.
Evans, Jenna M; Grudniewicz, Agnes; Baker, G Ross; Wodchis, Walter P
2016-08-31
Interventions aimed at integrating care have become widespread in healthcare; however, there is significant variability in their success. Differences in organizational contexts and associated capabilities may be responsible for some of this variability. This study develops and validates a conceptual framework of organizational capabilities for integrating care, identifies which of these capabilities may be most important, and explores the mechanisms by which they influence integrated care efforts. The Context and Capabilities for Integrating Care (CCIC) Framework was developed through a literature review, and revised and validated through interviews with leaders and care providers engaged in integrated care networks in Ontario, Canada. Interviews involved open-ended questions and graphic elicitation. Quantitative content analysis was used to summarize the data. The CCIC Framework consists of eighteen organizational factors in three categories: Basic Structures, People and Values, and Key Processes. The three most important capabilities shaping the capacity of organizations to implement integrated care interventions include Leadership Approach, Clinician Engagement and Leadership, and Readiness for Change. The majority of hypothesized relationships among organizational capabilities involved Readiness for Change and Partnering, emphasizing the complexity, interrelatedness and importance of these two factors to integrated care efforts. Organizational leaders can use the framework to determine readiness to integrate care, develop targeted change management strategies, and select appropriate partners with overlapping or complementary profiles on key capabilities. Researchers may use the results to test and refine the proposed framework, with a focus on the hypothesized relationships among organizational capabilities and between organizational capabilities and performance outcomes.
NASA Astrophysics Data System (ADS)
Wi, S.; Freeman, S.; Brown, C.
2017-12-01
This study presents a general approach to developing computational models of human-hydrologic systems where human modification of hydrologic surface processes is significant or dominant. A river basin system is represented by a network of human-hydrologic response units (HHRUs) identified based on locations where river regulation happens (e.g., reservoir operation and diversions). Natural and human processes in HHRUs are simulated in a holistic framework that integrates component models representing rainfall-runoff, river routing, reservoir operation, flow diversion and water use processes. We illustrate the approach in a case study of the Cutzamala water system (CWS) in Mexico, a complex inter-basin water transfer system supplying the Mexico City Metropolitan Area (MCMA). The human-hydrologic system model for CWS (CUTZSIM) is evaluated in terms of streamflow and reservoir storages measured across the CWS and of water supplied to the MCMA. CUTZSIM improves the representation of hydrology and river-operation interaction and, in so doing, advances evaluation of system-wide water management consequences under altered climatic and demand regimes. The integrated modeling framework enables evaluation and simulation of model errors throughout the river basin, including errors in representation of the human component processes. Heretofore, model error evaluation, predictive error intervals and the resultant improved understanding have been limited to hydrologic processes. The general framework represents an initial step towards fuller understanding and prediction of the many and varied processes that determine the hydrologic fluxes and state variables in real river basins.
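The component-chaining idea behind the HHRU network (rainfall-runoff feeding a regulated reservoir) can be sketched as two tiny models passing a flow series between them. All coefficients, capacities, and function names below are invented toys, not CUTZSIM's formulation.

```python
# Minimal sketch of chained HHRU-style components: a rainfall-runoff unit
# produces inflow, and a mass-balance reservoir regulates it.

def rainfall_runoff(rain, coeff=0.5):
    """Crude rainfall-runoff: a fixed fraction of rainfall becomes streamflow."""
    return [r * coeff for r in rain]

def reservoir(inflow, capacity, release):
    """Mass-balance reservoir: accumulate inflow, release a fixed demand,
    and spill anything stored above capacity. Returns (outflow, storage)."""
    storage, outflow = 0.0, []
    for q in inflow:
        storage += q
        out = min(storage, release)
        storage -= out
        spill = max(0.0, storage - capacity)
        storage -= spill
        outflow.append(out + spill)
    return outflow, storage
```

Because each unit maps an inflow series to an outflow series, diversions or water-use components could be spliced in between units in the same way, which is the composability the HHRU network relies on.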
JACOB: an enterprise framework for computational chemistry.
Waller, Mark P; Dresselhaus, Thomas; Yang, Jack
2013-06-15
Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework improves developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.
Decision making and coping in healthcare: the Coping in Deliberation (CODE) framework.
Witt, Jana; Elwyn, Glyn; Wood, Fiona; Brain, Kate
2012-08-01
To develop a framework of decision making and coping in healthcare that describes the twin processes of appraisal and coping faced by patients making preference-sensitive healthcare decisions. We briefly review the literature for decision making theories and coping theories applicable to preference-sensitive decisions in healthcare settings. We describe first decision making, then coping and finally attempt to integrate these processes by building on current theory. Deliberation in healthcare may be described as a six step process, comprised of the presentation of a health threat, choice, options, preference construction, the decision itself and consolidation post-decision. Coping can be depicted in three stages, beginning with a threat, followed by primary and secondary appraisal and ultimately resulting in a coping effort. Drawing together concepts from prominent decision making theories and coping theories, we propose a multidimensional, interactive framework which integrates both processes and describes coping in deliberation. The proposed framework offers an insight into the complexity of decision making in preference-sensitive healthcare contexts from a patient perspective and may act as theoretical basis for decision support. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Van Dijk-de Vries, Anneke N; Duimel-Peeters, Inge G P; Muris, Jean W; Wesseling, Geertjan J; Beusmans, George H M I; Vrijhoef, Hubertus J M
2016-04-08
Teamwork between healthcare providers is conditional for the delivery of integrated care. This study aimed to assess the usefulness of the conceptual framework Integrated Team Effectiveness Model for developing and testing the Integrated Team Effectiveness Instrument. Focus groups with healthcare providers in an integrated care setting for people with chronic obstructive pulmonary disease (COPD) were conducted to examine the recognisability of the conceptual framework and to explore critical success factors for collaborative COPD practice out of this framework. The resulting items were transposed into a pilot instrument. This was reviewed by expert opinion and completed 153 times by healthcare providers. The underlying structure and internal consistency of the instrument were verified by factor analysis and Cronbach's alpha. The conceptual framework turned out to be comprehensible for discussing teamwork effectiveness. The pilot instrument measures 25 relevant aspects of teamwork in integrated COPD care. Factor analysis suggested three reliable components: teamwork effectiveness, team processes and team psychosocial traits (Cronbach's alpha between 0.76 and 0.81). The conceptual framework Integrated Team Effectiveness Model is relevant in developing a practical full-spectrum instrument to facilitate discussing teamwork effectiveness. The Integrated Team Effectiveness Instrument provides a well-founded basis to self-evaluate teamwork effectiveness in integrated COPD care by healthcare providers. Recommendations are provided for the improvement of the instrument.
NASA Astrophysics Data System (ADS)
Benkrid, K.; Belkacemi, S.; Sukhsawas, S.
2005-06-01
This paper proposes an integrated framework for the high level design of high performance signal processing algorithms' implementations on FPGAs. The framework emerged from a constant need to rapidly implement increasingly complicated algorithms on FPGAs while maintaining the high performance needed in many real time digital signal processing applications. This is particularly important for application developers who often rely on iterative and interactive development methodologies. The central idea behind the proposed framework is to dynamically integrate high performance structural hardware description languages with higher level hardware languages in order to help satisfy the dual requirement of high level design and high performance implementation. The paper illustrates this by integrating two environments: Celoxica's Handel-C language, and HIDE, a structural hardware environment developed at the Queen's University of Belfast. On the one hand, Handel-C has been proven to be very useful in the rapid design and prototyping of FPGA circuits, especially control intensive ones. On the other hand, HIDE has been used extensively, and successfully, in the generation of highly optimised parameterisable FPGA cores. In this paper, this is illustrated in the construction of a scalable and fully parameterisable core for image algebra's five core neighbourhood operations, where fully floorplanned efficient FPGA configurations, in the form of EDIF netlists, are generated automatically for instances of the core. In the proposed combined framework, highly optimised data paths are invoked dynamically from within Handel-C, and are synthesized using HIDE. Although the idea might seem simple prima facie, it could have serious implications on the design of future generations of hardware description languages.
Metrics and Mappings: A Framework for Understanding Real-World Quantitative Estimation.
ERIC Educational Resources Information Center
Brown, Norman R.; Siegler, Robert S.
1993-01-01
A metrics and mapping framework is proposed to account for how heuristics, domain-specific reasoning, and intuitive statistical induction processes are integrated to generate estimates. Results of 4 experiments involving 188 undergraduates illustrate framework usefulness and suggest when people use heuristics and when they emphasize…
MacNamara, Aine; Collins, Dave
2014-01-01
Gulbin and colleagues (Gulbin, J. P., Croser, M. J., Morley, E. J., & Weissensteiner, J. R. (2013). An integrated framework for the optimisation of sport and athlete development: A practitioner approach. Journal of Sports Sciences) present a new sport and athlete development framework that evolved from empirical observations from working with the Australian Institute of Sport. The FTEM (Foundations, Talent, Elite, Mastery) framework is proposed to integrate general and specialised phases of development for participants within the active lifestyle, sport participation and sport excellence pathways. A number of issues concerning the FTEM framework are presented. We also propose the need to move beyond prescriptive models of talent identification and development towards a consideration of features of best practice and process markers of development together with robust guidelines about the implementation of these in applied practice.
Six sigma tools in integrating internal operations of a retail pharmacy: a case study.
Kumar, Sameer; Kwong, Anthony M
2011-01-01
This study was initiated to integrate information and enterprise-wide healthcare delivery system issues specifically within an inpatient retail pharmacy operation in a U.S. community hospital. Six Sigma tools were used to examine the effects on an inpatient retail pharmacy service process. Some of the tools used include service blueprints, a cause-effect diagram, gap analysis derived from customer and employee surveys, and mistake proofing; these were applied in various business situations, and results were analyzed to identify and propose process improvements and integration. The research indicates that the Six Sigma tools in this discussion are very applicable and quite effective in helping to streamline and integrate the pharmacy process flow. Additionally, gap analysis derived from two different surveys was used to estimate the primary areas of focus to increase customer and employee satisfaction. The results of this analysis were useful in initiating discussions of how to effectively narrow these service gaps. This retail pharmaceutical service study serves as a framework for the process that should occur for successful process improvement tool evaluation and implementation. Pharmaceutical service operations in the U.S. that use this integration framework must tailor it to their individual situations to maximize their chances for success.
Marketing and Languages: An Integrative Model.
ERIC Educational Resources Information Center
McCall, Ian
1988-01-01
A framework is proposed for an integrated course in which knowledge of a language is consciously related to the processes of interpersonal communication and the cultural aspects of marketing and negotiation. (Editor)
NASA Astrophysics Data System (ADS)
Shen, Ji; Sung, Shannon; Zhang, Dongmei
2015-11-01
Students need to think and work across disciplinary boundaries in the twenty-first century. However, it is unclear what interdisciplinary thinking means and how to analyze interdisciplinary interactions in teamwork. In this paper, drawing on multiple theoretical perspectives and empirical analysis of discourse contents, we formulate a theoretical framework that helps analyze interdisciplinary reasoning and communication (IRC) processes in interdisciplinary collaboration. Specifically, we propose four interrelated IRC processes (integration, translation, transfer, and transformation) and develop a corresponding analytic framework. We apply the framework to analyze two meetings of a project that aims to develop interdisciplinary science assessment items. The results illustrate that the framework can help interpret the interdisciplinary meeting dynamics and patterns. Our coding process and results also suggest that these IRC processes can be further examined in terms of interconnected sub-processes. We also discuss the implications of using the framework in conceptualizing, practicing, and researching interdisciplinary learning and teaching in science education.
Using Learning Analytics for Preserving Academic Integrity
ERIC Educational Resources Information Center
Amigud, Alexander; Arnedo-Moreno, Joan; Daradoumis, Thanasis; Guerrero-Roldan, Ana-Elena
2017-01-01
This paper presents the results of integrating learning analytics into the assessment process to enhance academic integrity in the e-learning environment. The goal of this research is to evaluate the computational-based approach to academic integrity. The machine-learning based framework learns students' patterns of language use from data,…
Modelling Biogeochemistry Across Domains with The Modular System for Shelves and Coasts (MOSSCO)
NASA Astrophysics Data System (ADS)
Burchard, H.; Lemmen, C.; Hofmeister, R.; Knut, K.; Nasermoaddeli, M. H.; Kerimoglu, O.; Koesters, F.; Wirtz, K.
2016-02-01
Coastal biogeochemical processes extend from the atmosphere through the water column and the epibenthos into the ocean floor; laterally they are determined by freshwater inflows and open water exchange, and in situ they are mediated by physical, chemical and biological interactions. We use the new Modular System for Shelves and Coasts (MOSSCO, http://www.mossco.de) to obtain an integrated view of coastal biogeochemistry. MOSSCO is a coupling framework that builds on existing coupling technologies like the Earth System Modeling Framework (ESMF, for domain coupling) and the Framework for Aquatic Biogeochemistry (FABM, for process coupling). MOSSCO facilitates the communication about, and the integration of, existing and new process models into a three-dimensional regional coastal modelling context. In the MOSSCO concept, the integrating framework imposes very few restrictions on contributed data or models; in fact, no distinction is made between data and models. The few requirements are: (1) coupleability in principle, i.e. access to I/O and timing information in submodels, which has recently been referred to as the Basic Model Interface (BMI); (2) open source/open data access and licencing; and (3) communication of metadata, such as spatiotemporal information, naming conventions, and physical units. These requirements suffice to integrate different models and data sets into the MOSSCO infrastructure and subsequently build a modular integrated modelling tool that can span a diversity of processes and domains. Here, we demonstrate a MOSSCO application for the southern North Sea, where atmospheric deposition, biochemical processing in the water column and the ocean floor, lateral nutrient replenishment, and wave- and current-dependent remobilization from sediments are accounted for by modular components. A multi-annual simulation yields a realistic succession of the spatial gradients of dissolved nutrients, of chlorophyll variability and gross primary production rates, and of benthic denitrification rates for this intriguing coastal system.
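The Basic Model Interface (BMI) requirement mentioned above, access to I/O and timing in submodels, can be illustrated with a toy driver that couples two components through a shared initialize/update/get/set contract. The component classes and variable name here are invented stand-ins, not MOSSCO code.

```python
# Hedged sketch of BMI-style coupling: components expose a uniform
# initialize/update/get_value/set_value interface, so a generic driver can
# exchange a flux between them without knowing their internals.

class Runoff:
    """Toy source component producing a steadily growing flux."""
    def initialize(self): self.flow = 0.0
    def update(self, dt): self.flow += 1.0 * dt   # constant generation rate
    def get_value(self, name): return self.flow

class Receiver:
    """Toy sink component accumulating whatever flux it is handed."""
    def initialize(self): self.total = 0.0
    def set_value(self, name, value): self.total += value
    def update(self, dt): pass                    # no internal dynamics

def couple(src, dst, steps, dt=1.0):
    """Generic driver: step the source, pass its flux to the sink, step the sink."""
    src.initialize(); dst.initialize()
    for _ in range(steps):
        src.update(dt)
        dst.set_value("water_flux", src.get_value("water_flux"))
        dst.update(dt)
    return dst.total
```

Because the driver only touches the shared interface, either component could be swapped for a full hydrodynamic or biogeochemical model without changing `couple`, which is precisely the modularity the framework is after.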
A spiritual framework in incest survivors treatment.
Beveridge, Kelli; Cheung, Monit
2004-01-01
Through an examination of recent incest treatment development, this article emphasizes the theoretical concept of integration within the treatment process for female adult incest survivors. Spirituality as a therapeutic foundation is discussed with examples of therapeutic techniques. A case study illustrates the psycho-spiritual process of treating a 29-year-old female incest survivor and describes how self-integration has helped this client heal from trauma and change her worldview. Significant outcomes of treatment include the client's gaining of self-awareness and freeing herself from emotional blindness. The recommended practice framework includes a three-step healing process of building alliance with the client in a safe environment, disputing faulty religious assumptions in a learning process, and affirming the needs for reconnection and continuous spiritual support.
Performance measurement integrated information framework in e-Manufacturing
NASA Astrophysics Data System (ADS)
Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José
2014-11-01
The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied through decision support in an e-Manufacturing environment. Its application improves the necessary interoperability in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and a PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.
Brody, Janet L; Scherer, David G; Turner, Charles W; Annett, Robert D; Dalen, Jeanne
2017-06-07
Individual and group-based psychotherapeutic interventions increasingly incorporate mindfulness-based principles and practices. These practices include a versatile set of skills such as labeling and attending to present-moment experiences, acting with awareness, and avoiding automatic reactivity. A primary motivation for integrating mindfulness into these therapies is compelling evidence that it enhances emotion regulation. Research also demonstrates that family relationships have a profound influence on emotion regulation capacities, which are central to family functioning and prosocial behavior more broadly. Despite this evidence, no framework exists to describe how mindfulness might integrate into family therapy. This paper describes the benefits of mindfulness-based interventions, highlighting how and why informal mindfulness practices might enhance emotion regulation when integrated with family therapy. We provide a clinical framework for integrating mindfulness into family therapy, particularly as it applies to families with adolescents. A brief case example details sample methods showing how incorporating mindfulness practices into family therapy may enhance treatment outcomes. A range of assessment modalities from biological to behavioral demonstrates the breadth with which the benefits of a family-based mindfulness intervention might be evaluated. © 2017 The Authors. Family Process published by Wiley Periodicals, Inc. on behalf of Family Process Institute.
Working toward integrated models of alpine plant distribution.
Carlson, Bradley Z; Randin, Christophe F; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe
2013-10-01
Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial-temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution.
MAPI: a software framework for distributed biomedical applications
2013-01-01
Background The amount of web-based resources (databases, tools etc.) in biomedicine has increased, but the integrated usage of those resources is complex due to differences in access protocols and data formats. However, distributed data processing is becoming inevitable in several domains, in particular in biomedicine, where researchers face rapidly increasing data sizes. This big data is difficult to process locally because of the large processing, memory and storage capacity required. Results This manuscript describes a framework, called MAPI, which provides a uniform representation of resources available over the Internet, in particular for Web Services. The framework enhances their interoperability and collaborative use by enabling a uniform and remote access. The framework functionality is organized in modules that can be combined and configured in different ways to fulfil concrete development requirements. Conclusions The framework has been tested in the biomedical application domain where it has been a base for developing several clients that are able to integrate different web resources. The MAPI binaries and documentation are freely available at http://www.bitlab-es.com/mapi under the Creative Commons Attribution-No Derivative Works 2.5 Spain License. The MAPI source code is available by request (GPL v3 license). PMID:23311574
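The uniform-representation idea at the heart of MAPI, hiding differences in access protocols behind one interface, is essentially the adapter pattern. The sketch below is an invented illustration of that pattern, not MAPI's actual API.

```python
# Sketch of a uniform resource interface: heterogeneous web resources are
# wrapped so clients invoke them all the same way. The concrete resource
# classes are hypothetical stand-ins for real services and tools.

class Resource:
    """Common interface every wrapped resource must implement."""
    def call(self, **params):
        raise NotImplementedError

class JsonService(Resource):
    def call(self, **params):
        # A real adapter would issue an HTTP request and parse JSON here.
        return {"format": "json", "params": params}

class CsvTool(Resource):
    def call(self, **params):
        # A real adapter would run a tool and parse delimited output here.
        return {"format": "csv", "params": params}

def run_pipeline(resources, **params):
    """Invoke every registered resource through the same interface."""
    return [r.call(**params) for r in resources]
```

The payoff is the one MAPI's abstract claims: a client composed against `Resource` can integrate new databases or Web Services by adding adapters, without touching the pipeline logic.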
eXframe: reusable framework for storage, analysis and visualization of genomics experiments
2011-01-01
Background Genome-wide experiments are routinely conducted to measure gene expression, DNA-protein interactions and epigenetic status. Structured metadata for these experiments is imperative for a complete understanding of experimental conditions, to enable consistent data processing and to allow retrieval, comparison, and integration of experimental results. Even though several repositories have been developed for genomics data, only a few provide annotation of samples and assays using controlled vocabularies. Moreover, many of them are tailored for a single type of technology or measurement and do not support the integration of multiple data types. Results We have developed eXframe - a reusable web-based framework for genomics experiments that provides 1) the ability to publish structured data compliant with accepted standards 2) support for multiple data types including microarrays and next generation sequencing 3) query, analysis and visualization integration tools (enabled by consistent processing of the raw data and annotation of samples) and is available as open-source software. We present two case studies where this software is currently being used to build repositories of genomics experiments - one contains data from hematopoietic stem cells and another from Parkinson's disease patients. Conclusion The web-based framework eXframe offers structured annotation of experiments as well as uniform processing and storage of molecular data from microarray and next generation sequencing platforms. The framework allows users to query and integrate information across species, technologies, measurement types and experimental conditions. Our framework is reusable and freely modifiable - other groups or institutions can deploy their own custom web-based repositories based on this software. It is interoperable with the most important data formats in this domain. We hope that other groups will not only use eXframe, but also contribute their own useful modifications. 
PMID:22103807
Overview of NASA MSFC IEC Federated Engineering Collaboration Capability
NASA Technical Reports Server (NTRS)
Moushon, Brian; McDuffee, Patrick
2005-01-01
The MSFC IEC federated engineering framework is currently developing a single collaborative engineering framework across independent NASA centers. The federated approach allows NASA centers the ability to maintain diversity and uniqueness, while providing interoperability. These systems are integrated together in a federated framework without compromising individual center capabilities. MSFC IEC's Federation Framework will have a direct effect on how engineering data is managed across the Agency. The approach is a direct response to the Columbia Accident Investigation Board (CAIB) finding F7.4-11, which states that the Space Shuttle Program has a wealth of data tucked away in multiple databases without a convenient way to integrate and use the data for management, engineering, or safety decisions. IEC's federated capability is further supported by OneNASA recommendation 6, which identifies the need to enhance cross-Agency collaboration by putting in place common engineering and collaborative tools and databases, processes, and knowledge-sharing structures. MSFC's IEC Federated Framework is loosely connected to other engineering applications that can provide users with the integration needed to achieve an Agency view of the entire product definition and development process, while allowing work to be distributed across NASA Centers and contractors. The IEC DDMS federation framework eliminates the need to develop a single, enterprise-wide data model, where the goal of having a common data model shared between NASA centers and contractors is very difficult to achieve.
Modeling and Advanced Control for Sustainable Process Systems
This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-insp...
2001-12-01
conceptual framework proposed by Craik and Lockhart (1972) offers a useful approach for examining depth of processing in an information integration task... processing (Craik & Lockhart, 1972) for a particular unit (i.e., whether cueing decreased target sensitivity) (Yeh & Wickens, 2001). It was also predicted...F.I.M. & Lockhart, R.S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11, 671-684
2014-04-30
cost to acquire systems as design maturity could be verified incrementally as the system was developed vice waiting for specific large “big bang”...Framework (MBAF) be applied to simulate or optimize process variations on programs? LSI Roles and Responsibilities A review of the roles and...the model/process optimization process. It is the current intent that NAVAIR will use the model to run simulations on process changes in an attempt to
An Integrated Framework for Parameter-based Optimization of Scientific Workflows.
Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel
2009-01-01
Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.
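The paper's framing of optimization as a search over a multi-dimensional parameter space, where some parameters trade output quality for performance, can be sketched as an exhaustive search for the fastest configuration that still meets a quality threshold. The parameter names and the analytic cost model below are invented; the real framework evaluates actual workflow executions.

```python
# Sketch of parameter-space search for workflow optimization.
# The cost model is a stand-in invented for illustration.

import itertools

# One quality-trading parameter (resolution) and one performance-only
# parameter (data partitioning into chunks).
SPACE = {
    "resolution": [0.25, 0.5, 1.0],   # fraction of full resolution
    "chunks": [1, 2, 4],              # degree of data partitioning
}

def run_workflow(resolution, chunks):
    """Stand-in cost model: returns (runtime_seconds, output_quality)."""
    runtime = 100.0 * resolution / chunks + 5.0 * chunks  # per-chunk overhead
    quality = resolution                                   # quality tracks resolution
    return runtime, quality

def best_config(min_quality):
    """Fastest parameter setting whose output quality meets the threshold."""
    best = None
    for values in itertools.product(*SPACE.values()):
        config = dict(zip(SPACE.keys(), values))
        runtime, quality = run_workflow(**config)
        if quality >= min_quality and (best is None or runtime < best[0]):
            best = (runtime, config)
    return best

print(best_config(min_quality=0.5))  # (32.5, {'resolution': 0.5, 'chunks': 4})
```

A real search would replace exhaustive enumeration with the framework's smarter strategies, but the structure of the problem is the same: evaluate points in the parameter space and keep the best feasible one.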
2011-04-30
developed the Knowledge Value Added + Systems Dynamics + Integrated Risk Management (KVA+SD+IRM) valuation framework to address these issues. KVA+SD...SD+IRM framework is used to quantify process cost savings and the potential benefits of selecting collab-PLM+3D TLS technology in the ship SHIPMAIN...The first section of this paper explicates the KVA+SD+IRM framework. In section two, a description of the SHIPMAIN program is provided. The third
Integrated Learning What--Why--How. Instructional Services Curriculum Series, Number 1.
ERIC Educational Resources Information Center
North Carolina State Dept. of Public Instruction, Raleigh. Instructional Services.
Integrated learning refers to the interrelatedness of subject and skill areas within and across grades of a school program. A description is given of the framework for integrated learning programs developed by the state of North Carolina. This monograph addresses factors that influence efforts toward integrated learning as well as processes for…
FRIEND Engine Framework: a real time neurofeedback client-server system for neuroimaging studies
Basilio, Rodrigo; Garrido, Griselda J.; Sato, João R.; Hoefle, Sebastian; Melo, Bruno R. P.; Pamplona, Fabricio A.; Zahn, Roland; Moll, Jorge
2015-01-01
In this methods article, we present a new implementation of a recently reported FSL-integrated neurofeedback tool, the standalone version of “Functional Real-time Interactive Endogenous Neuromodulation and Decoding” (FRIEND). We will refer to this new implementation as the FRIEND Engine Framework. The framework comprises a client-server cross-platform solution for real time fMRI and fMRI/EEG neurofeedback studies, enabling flexible customization or integration of graphical interfaces, devices, and data processing. This implementation allows a fast setup of novel plug-ins and frontends, which can be shared with the user community at large. The FRIEND Engine Framework is freely distributed for non-commercial, research purposes. PMID:25688193
NASA Astrophysics Data System (ADS)
Linn, Marcia C.
1995-06-01
Designing effective curricula for complex topics and incorporating technological tools is an evolving process. One important way to foster effective design is to synthesize successful practices. This paper describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering. One course enhancement, the LISP Knowledge Integration Environment, improved learning and resulted in more gender-equitable outcomes. The second course enhancement, the spatial reasoning environment, addressed spatial reasoning in an introductory engineering course. This enhancement minimized the importance of prior knowledge of spatial reasoning and helped students develop a more comprehensive repertoire of spatial reasoning strategies. Taken together, the instructional research programs reinforce the value of the scaffolded knowledge integration framework and suggest directions for future curriculum reformers.
A cognitive perspective on health systems integration: results of a Canadian Delphi study
2014-01-01
Background Ongoing challenges to healthcare integration point toward the need to move beyond structural and process issues. While we know what needs to be done to achieve integrated care, there is little that informs us as to how. We need to understand how diverse organizations and professionals develop shared knowledge and beliefs – that is, we need to generate knowledge about normative integration. We present a cognitive perspective on integration, based on shared mental model theory, that may enhance our understanding and ability to measure and influence normative integration. The aim of this paper is to validate and improve the Mental Models of Integrated Care (MMIC) Framework, which outlines important knowledge and beliefs whose convergence or divergence across stakeholder groups may influence inter-professional and inter-organizational relations. Methods We used a two-stage web-based modified Delphi process to test the MMIC Framework against expert opinion using a random sample of participants from Canada’s National Symposium on Integrated Care. Respondents were asked to rate the framework’s clarity, comprehensiveness, usefulness, and importance using seven-point ordinal scales. Spaces for open comments were provided. Descriptive statistics were used to describe the structured responses, while open comments were coded and categorized using thematic analysis. The Kruskal-Wallis test was used to examine cross-group agreement by level of integration experience, current workplace, and current role. Results In the first round, 90 individuals responded (52% response rate), representing a wide range of professional roles and organization types from across the continuum of care. In the second round, 68 individuals responded (75.6% response rate). The quantitative and qualitative feedback from experts was used to revise the framework. 
The re-named “Integration Mindsets Framework” consists of a Strategy Mental Model and a Relationships Mental Model, comprising a total of nineteen content areas. Conclusions The Integration Mindsets Framework draws the attention of researchers and practitioners to how various stakeholders think about and conceptualize integration. A cognitive approach to understanding and measuring normative integration complements dominant cultural approaches and allows for more fine-grained analyses. The framework can be used by managers and leaders to facilitate the interpretation, planning, implementation, management and evaluation of integration initiatives. PMID:24885659
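For readers unfamiliar with the Kruskal-Wallis test used in the study to compare agreement across respondent groups, a minimal pure-Python version of the H statistic (on toy ratings, using mid-ranks for ties and omitting the tie correction) looks like this:

```python
# Pure-Python sketch of the Kruskal-Wallis H statistic (toy data;
# mid-ranks handle ties, no tie correction applied).

def kruskal_wallis_h(*groups):
    """H statistic for k independent samples of ordinal ratings."""
    pooled = sorted(x for g in groups for x in g)
    n = len(pooled)
    # Mid-rank for each distinct value (average rank of its occurrences).
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        i = j
    h = 0.0
    for g in groups:
        r_sum = sum(ranks[x] for x in g)
        h += r_sum ** 2 / len(g)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

print(kruskal_wallis_h([1, 2, 3], [4, 5, 6], [7, 8, 9]))  # 7.2
```

A large H (compared against a chi-squared distribution with k-1 degrees of freedom) indicates that at least one group's ratings differ systematically, which is how the study screened for cross-group disagreement.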
Integrated computational model of the bioenergetics of isolated lung mitochondria
Zhang, Xiao; Dash, Ranjan K.; Jacobs, Elizabeth R.; Camara, Amadou K. S.; Clough, Anne V.; Audi, Said H.
2018-01-01
Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published kinetic data for isolated enzymes and transporters. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics. 
In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from those of mitochondria from other organs. To the best of our knowledge, this model is the first for the bioenergetics of isolated lung mitochondria. PMID:29889855
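The two-stage estimation strategy described above (intrinsic constants fixed from isolated-enzyme data, extrinsic maximal velocities fitted to whole-mitochondria measurements) can be illustrated with a toy Michaelis-Menten example. The constants and data below are synthetic, not the paper's.

```python
# Sketch of the extrinsic-parameter estimation step: fix an "intrinsic"
# constant (here a Michaelis-Menten Km, as if taken from isolated-enzyme
# data) and fit the maximal velocity Vmax to whole-mitochondria style
# measurements by least squares. Data and constants are invented.

KM = 2.0  # intrinsic binding constant, fixed from prior kinetic data

def rate(s, vmax, km=KM):
    """Michaelis-Menten flux as a stand-in for one model reaction."""
    return vmax * s / (km + s)

# Synthetic "respirometry" observations: (substrate level, measured flux),
# generated from vmax = 10 so the fit has a known answer.
data = [(s, rate(s, 10.0)) for s in (0.5, 1.0, 2.0, 5.0, 10.0)]

def fit_vmax(data, lo=0.0, hi=20.0, steps=2000):
    """Grid-search least squares over candidate Vmax values."""
    best_v, best_sse = None, float("inf")
    for i in range(steps + 1):
        v = lo + (hi - lo) * i / steps
        sse = sum((rate(s, v) - y) ** 2 for s, y in data)
        if sse < best_sse:
            best_v, best_sse = v, sse
    return best_v

print(fit_vmax(data))  # 10.0
```

The real model fits many extrinsic velocities simultaneously with a proper optimizer, but the principle is the same: only the parameters not determined by prior isolated-component data are adjusted against whole-system measurements.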
The Bologna Process and Integration Theory: Convergence and Autonomy
ERIC Educational Resources Information Center
Barkholt, Kasper
2005-01-01
This paper focuses on two theoretical frameworks of integration (neo-functionalism and liberal inter-governmentalism), exploring their implications for current trends of integration in European higher education: the marketization of and trade in educational services, the involvement of supranational institutions, and the focus on quality…
Nordmark, Sofi; Zingmark, Karin; Lindberg, Inger
2016-04-27
Discharge planning is a care process that aims to secure the transfer of care for the patient at the transition from home to hospital and back home. Information exchange and collaboration between care providers are essential, but deficits are common. A wide range of initiatives to improve the discharge planning process (DPP) have been developed and implemented over the past three decades. However, there are still high rates of reported medical errors and adverse events related to failures in discharge planning. Theoretical frameworks such as Normalization Process Theory (NPT) can support evaluations of complex interventions and processes in healthcare. The aim of this study was to explore the embedding and integration of the DPP from the perspective of registered nurses, district nurses and homecare organizers. The study design was explorative, using NPT as a framework to explore the embedding and integration of the DPP. Data consisted of written documentation from workshops with staff, registered adverse events and system failures, a web-based survey, and individual interviews with staff. Using NPT as a framework to explore the embedding and integration of discharge planning after ten years in use showed that the staff had reached a consensus on what the process was (coherence) and on how they evaluated the process (reflexive monitoring). However, they had not reached a consensus on who performed the process (cognitive participation) or on how it was performed (collective action). This can be interpreted as the process not having become normalized in daily practice. The results show the necessity of examining the implementation of existing practices, to better understand the needs of new ones, before developing and implementing new practices or supportive tools in healthcare, in order to achieve sustainable implementation. 
The NPT offers a generalizable framework for analysis, which can explain and shape the implementation process of old practices, before further development of new practices or supportive tools.
NASA Astrophysics Data System (ADS)
Otsuka, Yuichi; Ohta, Kazuhide; Noguchi, Hiroshi
The 21st Century Center of Excellence (COE) program in the Department of Mechanical Engineering Science at Kyushu University constructed a training framework for learning "Integrating Techniques," built on research presentations by students from different majors and accident analyses of practical cases by Ph.D. course students. The training framework is composed of three processes: 1) peer review among Ph.D. course students of the presentations; 2) instruction by teachers to improve the quality of the presentations based on the peer-review results; 3) final evaluation of the improved presentations by teachers and students. This research elucidates the quantitative effectiveness of the framework through questionnaire-based evaluations of the presentations. Furthermore, the investigation of the course students observed a positive correlation between the perceived significance of integration techniques and enthusiasm for participating in the course, which supports the efficacy of the proposed learning framework.
Baltussen, Rob; Jansen, Maarten Paul Maria; Bijlmakers, Leon; Grutters, Janneke; Kluytmans, Anouck; Reuzel, Rob P; Tummers, Marcia; van der Wilt, Gert Jan
2017-02-01
Priority setting in health care has long been recognized as an intrinsically complex and value-laden process. Yet, health technology assessment (HTA) agencies presently employ value assessment frameworks that are ill fitted to capture the range and diversity of stakeholder values and thereby risk compromising the legitimacy of their recommendations. We propose "evidence-informed deliberative processes" as an alternative framework with the aim of enhancing this legitimacy. This framework integrates two increasingly popular and complementary frameworks for priority setting: multicriteria decision analysis and accountability for reasonableness. Evidence-informed deliberative processes are, on the one hand, based on early, continued stakeholder deliberation to learn about the importance of relevant social values. On the other hand, they are based on rational decision-making through evidence-informed evaluation of the identified values. The framework has important implications for how HTA agencies should ideally organize their processes. First, HTA agencies should take the responsibility of organizing stakeholder involvement. Second, agencies are advised to integrate their assessment and appraisal phases, allowing for the timely collection of evidence on values that are considered relevant. Third, HTA agencies should subject their decision-making criteria to public scrutiny. Fourth, agencies are advised to use a checklist of potentially relevant criteria and to provide argumentation for how each criterion affected the recommendation. Fifth, HTA agencies must publish their argumentation and install options for appeal. The framework should not be considered a blueprint for HTA agencies but rather an aspirational goal; agencies can take incremental steps toward achieving it. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
The brain, self and society: a social-neuroscience model of predictive processing.
Kelly, Michael P; Kriznik, Natasha M; Kinmonth, Ann Louise; Fletcher, Paul C
2018-05-10
This paper presents a hypothesis about how social interactions shape and influence predictive processing in the brain. The paper integrates concepts from neuroscience and sociology, between which a gulf presently exists in the ways each describes the same phenomenon: how thinking humans engage with the social world. We combine the concepts of predictive processing models (also called predictive coding models in the neuroscience literature) with ideal types, typifications and social practice, concepts from the sociological literature. This generates a unified hypothetical framework integrating the social world and hypothesised brain processes. The hypothesis combines aspects of neuroscience and psychology with social theory to show how social behaviors may be "mapped" onto brain processes. It outlines a conceptual framework that connects the two disciplines and that may enable creative dialogue and potential future research.
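As a concrete anchor for the predictive-processing idea the paper builds on, a textbook-style toy update rule (not the authors' model) looks like this: an internal estimate moves toward each observation in proportion to the precision assigned to the input.

```python
# Minimal numeric sketch of predictive processing: an internal estimate
# is corrected by the precision-weighted prediction error. The rule and
# numbers are a generic toy, not the paper's framework.

def update(belief, observation, precision):
    """One predictive-coding step: move the belief toward the observation
    by a fraction given by the relative precision (0..1) of the input."""
    error = observation - belief        # prediction error
    return belief + precision * error   # precision-weighted correction

belief = 0.0
for obs in [10.0, 10.0, 10.0, 10.0]:
    belief = update(belief, obs, precision=0.5)
print(round(belief, 4))  # 9.375
```

The sociological point of the paper can be read onto this toy: shared typifications would shape both the prior belief and the precision assigned to socially sourced observations.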
Combining Mechanistic Approaches for Studying Eco-Hydro-Geomorphic Coupling
NASA Astrophysics Data System (ADS)
Francipane, A.; Ivanov, V.; Akutina, Y.; Noto, V.; Istanbullouglu, E.
2008-12-01
Vegetation interacts with the hydrology and with the geomorphic form and processes of a river basin in profound ways. Despite recent advances in hydrological modeling, the dynamic coupling between these processes is yet to be adequately captured at the basin scale to elucidate key features of process interaction and their role in the organization of vegetation and landscape morphology. In this study, we present a blueprint for integrating a geomorphic component into the physically-based, spatially distributed ecohydrological model tRIBS-VEGGIE, which reproduces essential water and energy processes over the complex topography of a river basin and links them to the basic plant life regulatory processes. We present a preliminary design of the integrated modeling framework in which hillslope and channel erosion processes at the catchment scale will be coupled with vegetation-hydrology dynamics. We evaluate the developed framework by applying the integrated model to Lucky Hills basin, a sub-catchment of the Walnut Gulch Experimental Watershed (Arizona). The evaluation is carried out by comparing sediment yields at the basin outlet, following a detailed verification of the simulated land-surface energy partition, biomass dynamics, and soil moisture states.
An audience-channel-message-evaluation (ACME) framework for health communication campaigns.
Noar, Seth M
2012-07-01
Recent reviews of the literature have indicated that a number of health communication campaigns continue to fail to adhere to principles of effective campaign design. The lack of an integrated, organizing framework for the design, implementation, and evaluation of health communication campaigns may contribute to this state of affairs. The current article introduces an audience-channel-message-evaluation (ACME) framework that organizes the major principles of health campaign design, implementation, and evaluation. ACME also explicates the relationships and linkages between the varying principles. Insights from ACME include the following: The choice of audience segment(s) to focus on in a campaign affects all other campaign design choices, including message strategy and channel/component options. Although channel selection influences options for message design, choice of message design also influences channel options. Evaluation should not be thought of as a separate activity, but rather should be infused and integrated throughout the campaign design and implementation process, including formative, process, and outcome evaluation activities. Overall, health communication campaigns that adhere to this integrated set of principles of effective campaign design will have a greater chance of success than those using principles idiosyncratically. These design, implementation, and evaluation principles are embodied in the ACME framework.
Toward a New Paradigm: Governance in a Broader Framework.
ERIC Educational Resources Information Center
Deegan, William L.
1985-01-01
Argues that the issues and trends of the past decade make it necessary to reconsider governance processes and the way substantive issues are generated. Reviews major models for governance and proposes a broader, more integrated framework for analyzing governance issues. (DMM)
A template for integrated community sustainability planning.
Ling, Christopher; Hanna, Kevin; Dale, Ann
2009-08-01
This article describes a template for implementing an integrated community sustainability plan. The template emphasizes community engagement and outlines the components of a basic framework for integrating ecological, social and economic dynamics into a community plan. The framework is a series of steps that support a sustainable community development process. While it reflects the Canadian experience, the tools and techniques have applied value for a range of environmental planning contexts around the world. The research is case-study based and draws from a diverse range of communities representing many types of infrastructure, demographics, and ecological and geographical contexts. A critical path for moving local governments toward sustainable community development is the creation and implementation of integrated planning approaches. To be effective and actually implemented, the requisite shift to sustainability demands active community engagement processes, political will, and a commitment to political and administrative accountability and measurement.
Rajarathinam, Vetrickarthick; Chellappa, Swarnalatha; Nagarajan, Asha
2015-01-01
This study of a component framework reveals the importance of management-process and technology mapping in a business environment. We define ERP as a software tool that should provide a business solution, but not necessarily an integration of all departments. Any business process can be classified as a management process, an operational process or a supportive process. We examined the entire management process and identified the influencing components to be mapped to a technology for a business solution. Governance, strategic management and decision making are discussed thoroughly, and the need to map these components onto the ERP is explained. We also suggest that implementing this framework might reduce ERP failures and, in particular, rectify ERP misfit. PMID:25861688
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eccleston, C.H.
1997-09-05
The National Environmental Policy Act (NEPA) of 1969 was established by Congress more than a quarter of a century ago, yet there is a surprising lack of specific tools, techniques, and methodologies for effectively implementing these regulatory requirements. Lack of professionally accepted techniques is a principal factor responsible for many inefficiencies. Often, decision makers do not fully appreciate or capitalize on the true potential which NEPA provides as a platform for planning future actions. New approaches and modern management tools must be adopted to fully achieve NEPA's mandate. A new strategy, referred to as Total Federal Planning, is proposed for unifying large-scale federal planning efforts under a single, systematic, structured, and holistic process. Under this approach, the NEPA planning process provides a unifying framework for integrating all early environmental and nonenvironmental decision-making factors into a single comprehensive planning process. To promote effectiveness and efficiency, modern tools and principles from the disciplines of Value Engineering, Systems Engineering, and Total Quality Management are incorporated. Properly integrated and implemented, these planning tools provide the rigorous, structured, and disciplined framework essential to achieving effective planning. Ultimately, the goal of a Total Federal Planning strategy is to construct a unified and interdisciplinary framework that substantially improves decision-making, while reducing the time, cost, redundancy, and effort necessary to comply with environmental and other planning requirements. At a time when Congress is striving to re-engineer the governmental framework, apparatus, and process, a Total Federal Planning philosophy offers a systematic approach for uniting the disjointed and often convoluted planning processes currently used by most federal agencies. Potentially this approach has widespread implications for the way federal planning is approached.
Landscape pattern and ecological process in the Sierra Nevada
Dean L. Urban
2004-01-01
The Sierran Global Change Program in Sequoia-Kings Canyon and Yosemite National Parks includes a nearly decade-long integrated study of the interactions between climate, forest processes, and fire. This study is characterized by three recurring themes: (1) the use of systems-level models as a framework for integration and synthesis, (2) an effort to extrapolate an...
KC-135 Simulator Systems Engineering Case Study
2010-01-01
performance. The utilization and misutilization of SE principles are highlighted, with special emphasis on the conditions that foster and impede...process, from the identification of the need to the development and utilization of the product, must continuously integrate and optimize system and... utilizing the Friedman-Sage framework to organize the assessment of the application of the SE process. The framework and the derived matrix can
Emotion and the prefrontal cortex: An integrative review.
Dixon, Matthew L; Thiruchselvam, Ravi; Todd, Rebecca; Christoff, Kalina
2017-10-01
The prefrontal cortex (PFC) plays a critical role in the generation and regulation of emotion. However, we lack an integrative framework for understanding how different emotion-related functions are organized across the entire expanse of the PFC, as prior reviews have generally focused on specific emotional processes (e.g., decision making) or specific anatomical regions (e.g., orbitofrontal cortex). Additionally, psychological theories and neuroscientific investigations have proceeded largely independently because of the lack of a common framework. Here, we provide a comprehensive review of functional neuroimaging, electrophysiological, lesion, and structural connectivity studies on the emotion-related functions of 8 subregions spanning the entire PFC. We introduce the appraisal-by-content model, which provides a new framework for integrating the diverse range of empirical findings. Within this framework, appraisal serves as a unifying principle for understanding the PFC's role in emotion, while relative content-specialization serves as a differentiating principle for understanding the role of each subregion. A synthesis of data from affective, social, and cognitive neuroscience studies suggests that different PFC subregions are preferentially involved in assigning value to specific types of inputs: exteroceptive sensations, episodic memories and imagined future events, viscero-sensory signals, viscero-motor signals, actions, others' mental states (e.g., intentions), self-related information, and ongoing emotions. We discuss the implications of this integrative framework for understanding emotion regulation, value-based decision making, emotional salience, and refining theoretical models of emotion. This framework provides a unified understanding of how emotional processes are organized across PFC subregions and generates new hypotheses about the mechanisms underlying adaptive and maladaptive emotional functioning. 
NASA Astrophysics Data System (ADS)
Nomaguchi, Yutaka; Fujita, Kikuo
This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation, which integrates various design support tools and captures the design operations performed on them. The action level captures the sequence of design operations. The model-operation level captures the transition of design states, recording a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. Linking the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis-and-verification processes through design operations over design tools. In DRIFT, such linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used to represent dependencies among problems and alternatives, and a Truth Maintenance System (TMS) mechanism is used to manage multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem, and concludes with a discussion of future issues.
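The IBIS argumentation structure behind gIBIS (issues, alternative positions, supporting or objecting arguments) can be sketched as a small data model. The class and field names here are illustrative only, not DRIFT's implementation.

```python
# Hypothetical sketch of an IBIS-style structure: issues hold alternative
# positions, positions hold supporting or objecting arguments.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Argument:
    text: str
    supports: bool  # True = supports the position, False = objects to it

@dataclass
class Position:
    text: str
    arguments: List[Argument] = field(default_factory=list)

    def score(self) -> int:
        """Crude tally: supporting arguments minus objections."""
        return sum(1 if a.supports else -1 for a in self.arguments)

@dataclass
class Issue:
    question: str
    positions: List[Position] = field(default_factory=list)

    def preferred(self) -> Position:
        return max(self.positions, key=Position.score)

issue = Issue("How should the bracket be manufactured?")
cast = Position("Casting", [Argument("cheap at volume", True),
                            Argument("long lead time", False)])
mill = Position("Machining", [Argument("fast iteration", True),
                              Argument("good tolerances", True)])
issue.positions += [cast, mill]
print(issue.preferred().text)  # Machining
```

In a DRIFT-like system each Issue node would additionally be linked to the model-operation snapshots that motivated it, which is what lets the argumentation layer trace back to concrete design states.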
Working toward integrated models of alpine plant distribution
Carlson, Bradley Z.; Randin, Christophe F.; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe
2014-01-01
Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial–temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution. PMID:24790594
Model-based analysis of pattern motion processing in mouse primary visual cortex
Muir, Dylan R.; Roth, Morgane M.; Helmchen, Fritjof; Kampa, Björn M.
2015-01-01
Neurons in sensory areas of neocortex exhibit responses tuned to specific features of the environment. In visual cortex, information about features such as edges or textures with particular orientations must be integrated to recognize a visual scene or object. Connectivity studies in rodent cortex have revealed that neurons make specific connections within sub-networks sharing common input tuning. In principle, this sub-network architecture enables local cortical circuits to integrate sensory information. However, whether feature integration indeed occurs locally in rodent primary sensory areas has not been examined directly. We studied local integration of sensory features in primary visual cortex (V1) of the mouse by presenting drifting grating and plaid stimuli, while recording the activity of neuronal populations with two-photon calcium imaging. Using a Bayesian model-based analysis framework, we classified single-cell responses as being selective for either individual grating components or for moving plaid patterns. Rather than relying on trial-averaged responses, our model-based framework takes into account single-trial responses and can easily be extended to consider any number of arbitrary predictive models. Our analysis method was able to successfully classify significantly more responses than traditional partial correlation (PC) analysis, and provides a rigorous statistical framework to rank any number of models and reject poorly performing models. We also found a large proportion of cells that respond strongly to only one stimulus class. In addition, a quarter of selectively responding neurons had more complex responses that could not be explained by any simple integration model. Our results show that a broad range of pattern integration processes already take place at the level of V1. This diversity of integration is consistent with processing of visual inputs by local sub-networks within V1 that are tuned to combinations of sensory features. 
PMID:26300738
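The model-based classification above can be sketched as ranking candidate response models by the likelihood of single-trial data. The following is a minimal illustration of that idea under Gaussian noise, not the authors' actual statistical model; all responses and predicted tuning values are hypothetical:

```python
import math

def log_likelihood(trials, predicted, sigma=1.0):
    """Gaussian log-likelihood of single-trial responses given a model's
    predicted mean response (all values hypothetical)."""
    return sum(-0.5 * math.log(2 * math.pi * sigma**2)
               - (r - predicted)**2 / (2 * sigma**2) for r in trials)

def classify_cell(trials_by_stim, models, sigma=1.0):
    """Score each candidate model by its total log-likelihood across
    stimuli and return the model names best-first, mirroring the idea of
    ranking models on single-trial rather than trial-averaged responses."""
    scores = {name: sum(log_likelihood(trials, prediction[stim], sigma)
                        for stim, trials in trials_by_stim.items())
              for name, prediction in models.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical cell that responds to plaids but barely to lone gratings:
trials = {"grating": [0.1, 0.2, 0.0], "plaid": [1.1, 0.9, 1.0]}
models = {
    "component": {"grating": 1.0, "plaid": 0.5},  # predicts grating drive
    "pattern":   {"grating": 0.0, "plaid": 1.0},  # predicts plaid drive
}
ranking = classify_cell(trials, models)  # "pattern" ranks first here
```

Because every model receives a comparable score, poorly performing models can be rejected outright rather than merely correlated against, which is the advantage claimed over partial-correlation analysis.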
Automated UAV-based video exploitation using service oriented architecture framework
NASA Astrophysics Data System (ADS)
Se, Stephen; Nadeau, Christian; Wood, Scott
2011-05-01
Airborne surveillance and reconnaissance are essential for successful military missions. Such capabilities are critical for troop protection, situational awareness, mission planning, damage assessment, and others. Unmanned Aerial Vehicles (UAVs) gather huge amounts of video data but it is extremely labour-intensive for operators to analyze hours and hours of received data. At MDA, we have developed a suite of tools that can process the UAV video data automatically, including mosaicking, change detection and 3D reconstruction, which have been integrated within a standard GIS framework. In addition, the mosaicking and 3D reconstruction tools have also been integrated in a Service Oriented Architecture (SOA) framework. The Visualization and Exploitation Workstation (VIEW) integrates 2D and 3D visualization, processing, and analysis capabilities developed for UAV video exploitation. Visualization capabilities are supported through a thick-client Graphical User Interface (GUI), which allows visualization of 2D imagery, video, and 3D models. The GUI interacts with the VIEW server, which provides video mosaicking and 3D reconstruction exploitation services through the SOA framework. The SOA framework allows multiple users to perform video exploitation by running a GUI client on the operator's computer and invoking the video exploitation functionalities residing on the server. This allows the exploitation services to be upgraded easily and allows the intensive video processing to run on powerful workstations. MDA provides UAV services to the Canadian and Australian forces in Afghanistan with the Heron, a Medium Altitude Long Endurance (MALE) UAV system. On-going flight operations service provides important intelligence, surveillance, and reconnaissance information to commanders and front-line soldiers.
Liaw, Siaw-Teng; Pearce, Christopher; Liyanage, Harshana; Liaw, Gladys S S; de Lusignan, Simon
2014-01-01
Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly 'big-data' environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.
OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing
NASA Astrophysics Data System (ADS)
Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping
2017-02-01
The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework to support rapidly developing high-performance processing pipelines of astronomical big data. We first detail the OpenCluster design principles and implementations and present the APIs facilitated by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems for developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby provides an easily integrated distributed computing framework for quickly developing a high-performance data processing system of astronomical telescopes and for significantly reducing software development expenses.
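The fan-out of per-frame processing tasks to workers that OpenCluster performs across machines can be sketched locally with a thread pool. This is only an illustration of the scatter/gather pattern, not OpenCluster's API, and the frame-processing step is a toy stand-in:

```python
from concurrent.futures import ThreadPoolExecutor

def reduce_frame(frame):
    # Stand-in for a per-frame processing step in an astronomical
    # pipeline; here simply the mean value of a (toy) frame.
    return sum(frame) / len(frame)

def run_pipeline(frames, workers=4):
    # Fan frames out to a pool of workers and collect results in input
    # order. OpenCluster distributes such tasks across machines; a local
    # thread pool only illustrates the fan-out/collect idea.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(reduce_frame, frames))

means = run_pipeline([[1, 2, 3], [4, 5, 6]])  # [2.0, 5.0]
```

Scaling up the number of interacting entities then amounts to adding workers, which is the property the framework emphasizes.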
Fox, W.E.; McCollum, D.W.; Mitchell, J.E.; Swanson, L.E.; Kreuter, U.P.; Tanaka, J.A.; Evans, G.R.; Theodore, Heintz H.; Breckenridge, R.P.; Geissler, P.H.
2009-01-01
Currently, there is no standard method to assess the complex systems in rangeland ecosystems. Decision makers need baselines to create a common language of current rangeland conditions and standards for continued rangeland assessment. The Sustainable Rangeland Roundtable (SRR), a group of private and public organizations and agencies, has created a forum to discuss rangeland sustainability and assessment. The SRR has worked to integrate social, economic, and ecological disciplines related to rangelands and has identified a standard set of indicators that can be used to assess rangeland sustainability. As part of this process, SRR has developed a two-tiered conceptual framework from a systems perspective to study the validity of indicators and the relationships among them. The first tier categorizes rangeland characteristics into four states. The second tier defines processes affecting these states through time and space. The framework clearly shows that the processes affect and are affected by each other.
A Proactive and Top-Down Approach to Managing Risk at NASA
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon
2010-01-01
Our ultimate goal is to manage risk in a holistic and coherent fashion across the Agency: a) The RIDM process is intended to risk-inform direction-setting decisions. b) The CRM process is intended to manage risk associated with the implementation of baseline performance requirements. Currently we are working on: a) Enhancements to the CRM process. b) Better integration of the RIDM and CRM processes. c) Better integration of institutional risk considerations into the RM framework.
Principles for ecologically based invasive plant management
Jeremy J. James; Brenda S. Smith; Edward A. Vasquez; Roger L. Sheley
2010-01-01
Land managers have long identified a critical need for a practical and effective framework for designing restoration strategies, especially where invasive plants dominate. A holistic, ecologically based, invasive plant management (EBIPM) framework that integrates ecosystem health assessment, knowledge of ecological processes, and adaptive management into a successional...
Educational Communities of Inquiry: Theoretical Framework, Research and Practice
ERIC Educational Resources Information Center
Akyol, Zehra; Garrison, D. Randy
2013-01-01
Communications technologies have been continuously integrated into learning and training environments, which has revealed the need for a clear understanding of the process. The Community of Inquiry (COI) Theoretical Framework has a philosophical foundation which provides planned guidelines and principles to develop useful learning environments…
Bringing ecosystem services into integrated water resources management.
Liu, Shuang; Crossman, Neville D; Nolan, Martin; Ghirmay, Hiyoba
2013-11-15
In this paper we propose an ecosystem service framework to support integrated water resource management and apply it to the Murray-Darling Basin in Australia. Water resources in the Murray-Darling Basin have been over-allocated for irrigation use with the consequent degradation of freshwater ecosystems. In line with integrated water resource management principles, Australian Government reforms are reducing the amount of water diverted for irrigation to improve ecosystem health. However, limited understanding of the broader benefits and trade-offs associated with reducing irrigation diversions has hampered the planning process supporting this reform. Ecosystem services offer an integrative framework to identify the broader benefits associated with integrated water resource management in the Murray-Darling Basin, thereby providing support for the Government to reform decision-making. We conducted a multi-criteria decision analysis for ranking regional potentials to provide ecosystem services at river basin scale. We surveyed the wider public about their understanding of, and priorities for, managing ecosystem services and then integrated the results with spatially explicit indicators of ecosystem service provision. The preliminary results of this work identified the sub-catchments with the greatest potential synergies and trade-offs of ecosystem service provision under the integrated water resources management reform process. With future development, our framework could be used as a decision support tool by those grappling with the challenge of the sustainable allocation of water between irrigation and the environment.
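The multi-criteria ranking step can be sketched as a weighted sum: survey-derived weights express public priorities and each region carries an indicator score per ecosystem service. This is a generic illustration of the method class, not the study's actual model, and all weights, services and region names are hypothetical:

```python
def rank_regions(scores, weights):
    """Weighted-sum multi-criteria ranking: each region has an indicator
    score per ecosystem service; weights express stakeholder priorities.
    Returns region names best-first. All numbers are hypothetical."""
    totals = {region: sum(weights[s] * v for s, v in services.items())
              for region, services in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

weights = {"water_supply": 0.5, "habitat": 0.3, "recreation": 0.2}
scores = {
    "upper_basin": {"water_supply": 0.9, "habitat": 0.4, "recreation": 0.2},
    "lower_basin": {"water_supply": 0.3, "habitat": 0.8, "recreation": 0.7},
}
ranking = rank_regions(scores, weights)
```

Varying the weight vector over survey subgroups exposes exactly the synergies and trade-offs between sub-catchments that the abstract describes.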
NASA Technical Reports Server (NTRS)
Axdahl, Erik L.
2015-01-01
Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high fidelity numerical analysis tools into an automated framework and applying that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.
An Activity Theory Approach to Research of ICT Integration in Singapore Schools
ERIC Educational Resources Information Center
Lim, Cher Ping; Hang, David
2003-01-01
This paper explains how activity theory is used as a framework to study the information and communication technologies (ICT) integration processes in Singapore schools, both from the sociocultural and pedagogical perspectives. The research study addresses the pertinent question of "How has ICT been integrated in Singapore schools such that…
The Webinar Integration Tool: A Framework for Promoting Active Learning in Blended Environments
ERIC Educational Resources Information Center
Lieser, Ping; Taf, Steven D.; Murphy-Hagan, Anne
2018-01-01
This paper describes a three-stage process of developing a webinar integration tool to enhance the interaction of teaching and learning in blended environments. In the context of medical education, we emphasize three factors of effective webinar integration in blended learning: fostering better solutions for faculty and students to interact…
Integrating Evidence Within and Across Evidence Streams Using Qualitative Methods
There is high demand in environmental health for adoption of a structured process that evaluates and integrates evidence while making decisions transparent. The Grading of Recommendations Assessment, Development and Evaluation (GRADE) framework holds promise to address this deman...
A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.
2013-10-01
Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches: a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification.
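The combination of distance-based clustering with knowledge-driven selection can be sketched as follows: features are first grouped into clusters, then whole clusters are retained when an expert has flagged any member. This is a minimal stand-in for the idea, not ISIC's pipeline, and the feature names are hypothetical:

```python
def select_candidates(clusters, expert_flagged):
    """Keep every feature in any cluster that contains at least one
    expert-flagged feature, so data-driven grouping and expert knowledge
    jointly shape the candidate set. Returns a sorted list."""
    return sorted({feature for cluster in clusters
                   if expert_flagged & set(cluster)
                   for feature in cluster})

# Three toy clusters of protein features; an expert flags P2 and P5:
clusters = [["P1", "P2"], ["P3"], ["P4", "P5"]]
selected = select_candidates(clusters, {"P2", "P5"})
```

Because unflagged members of a flagged cluster survive, the selection pulls in features that co-vary with known biology, which is one way expert knowledge can add robustness to purely data-driven ranking.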
Genetic Programming for Automatic Hydrological Modelling
NASA Astrophysics Data System (ADS)
Chadalawada, Jayashree; Babovic, Vladan
2017-04-01
One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involve the integration of hydrologic, atmospheric and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given the limited understanding of underlying processes, the increasing volume of data and high levels of uncertainty. Existing hydrological models vary in conceptualization and process representation, and each is best suited to capture the environmental dynamics of a particular hydrological system. Data driven approaches can be used in the integration of alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of an integrated modelling framework that is influenced by prior understanding and data include: choice of the technique for the induction of knowledge from data; identification of alternative structural hypotheses; definition of rules and constraints for meaningful, intelligent combination of model component hypotheses; and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs based on a wide range of objective functions and evolves accurate and parsimonious models that capture dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using the modelling decisions inspired from the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow duration curve based performance metrics. The collaboration between data driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach for conceptual hydrological modeling: 1. Motivation and theoretical development, Water Resources Research, 47(11).
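One of the hydrological performance metrics such a framework can use to score evolved model structures is the Nash-Sutcliffe efficiency. The sketch below shows that metric alone (the abstract does not specify which statistics were used, so treat this as a representative choice); the streamflow values are hypothetical:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency for a simulated streamflow series:
    1 is a perfect fit, 0 means the model does no better than
    predicting the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - num / den

obs = [1.0, 2.0, 3.0]
print(nash_sutcliffe(obs, [1.0, 2.0, 3.0]))  # perfect model: 1.0
print(nash_sutcliffe(obs, [2.0, 2.0, 2.0]))  # mean-only model: 0.0
```

In a GP setting this score would serve as (part of) the fitness function that decides which candidate model structures survive to the next generation.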
Jones, Catherine M; Clavier, Carole; Potvin, Louise
2017-03-01
National policies on global health appear as one way that actors from health, development and foreign affairs sectors in a country coordinate state action on global health. Next to a burgeoning literature in which international relations and global governance theories are employed to understand global health policy and global health diplomacy at the international level, little is known about policy processes for global health at the national scale. We propose a framework of the policy process to understand how such policies are developed, and we identify challenges for public health researchers integrating conceptual tools from political science. We developed the framework using a two-step process: 1) reviewing literature to establish criteria for selecting a theoretical framework fit for this purpose, and 2) adapting Real-Dato's synthesis framework to integrate a cognitive approach to public policy within a constructivist perspective. Our framework identifies multiple contexts as part of the policy process, focuses on situations where actors work together to make national policy on global health, considers these interactive situations as spaces for observing external influences on policy change and proposes policy design as the output of the process. We suggest that this framework makes three contributions to the conceptualisation of national policy on global health as a research object. First, it emphasizes collective action over decisions of individual policy actors. Second, it conceptualises the policy process as organised interactive spaces for collaboration rather than as stages of a policy cycle. Third, national decision-making spaces are opportunities for transferring ideas and knowledge from different sectors and settings, and represent opportunities to identify international influences on a country's global health policy. We discuss two sets of challenges for public health researchers using interdisciplinary approaches in policy research. 
Energy Systems Integration: Data Call -- Become a Data Partner
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-01-01
This project aims to advance the understanding of costs associated with integrating PV onto the electric power distribution system while maintaining reliable grid operations. We have developed a bottom-up framework for calculating these costs as a function of PV penetration levels on specific feeders. This framework will be used to inform and improve utility planning decisions, increase the transparency and speed associated with the interconnection process, and provide policymakers with more information on the total cost of energy from PV.
Analytical framework and tool kit for SEA follow-up
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran
2009-04-15
Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.
An Integrated Conceptual Framework for the Development of Asian American Children and Youth.
Mistry, Jayanthi; Li, Jin; Yoshikawa, Hirokazu; Tseng, Vivian; Tirrell, Jonathan; Kiang, Lisa; Mistry, Rashmita; Wang, Yijie
2016-07-01
The diversity of circumstances and developmental outcomes among Asian American children and youth poses a challenge for scholars interested in Asian American child development. This article addresses the challenge by offering an integrated conceptual framework based on three broad questions: (a) What are theory-predicated specifications of contexts that are pertinent for the development of Asian American children? (b) What are the domains of development and socialization that are particularly relevant? (c) How can culture as meaning-making processes be integrated in conceptualizations of development? The heuristic value of the conceptual model is illustrated by research on Asian American children and youth that examines the interconnected nature of specific features of context, pertinent aspects of development, and interpretive processes.
Modeling formalisms in Systems Biology
2011-01-01
Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422
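The simplest of the formalisms listed above, Boolean networks, can be sketched in a few lines: each gene is on or off, and a synchronous step recomputes every node from the current state. The two-gene mutual-repression circuit below is a standard toy example, not taken from any specific model in the review:

```python
def step(state, rules):
    """One synchronous update of a Boolean network: every node
    recomputes its value from the *current* state simultaneously."""
    return {node: rule(state) for node, rule in rules.items()}

# Toy mutual-repression circuit: each gene is expressed only when the
# other is off (a minimal bistable switch).
rules = {
    "A": lambda s: not s["B"],
    "B": lambda s: not s["A"],
}
state = {"A": True, "B": False}
state = step(state, rules)  # this state is a fixed point: A on, B off
```

Attractors of such networks (fixed points and cycles) are commonly interpreted as cell fates, which is one reason the formalism remains popular despite its coarseness relative to differential-equation models.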
ERIC Educational Resources Information Center
Turan, Fikret Korhan; Cetinkaya, Saadet; Ustun, Ceyda
2016-01-01
Building sustainable universities calls for participative management and collaboration among stakeholders. Combining analytic hierarchy and network processes (AHP/ANP) with statistical analysis, this research proposes a framework that can be used in higher education institutions for integrating stakeholder preferences into strategic decisions. The…
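The AHP step of such a framework derives priority weights from a pairwise-comparison matrix. A common approximation uses the row geometric mean; the sketch below illustrates that computation only (the study's actual AHP/ANP procedure and criteria are not specified here, and the comparison values are hypothetical):

```python
import math

def ahp_priorities(matrix):
    """Approximate AHP priority weights from a square pairwise-comparison
    matrix via the row geometric mean, normalised to sum to 1. For a
    perfectly consistent matrix this reproduces the exact weights."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical comparison of three strategic criteria: the first is judged
# 3x as important as the second and 5x as important as the third.
m = [[1, 3, 5],
     [1/3, 1, 5/3],
     [1/5, 3/5, 1]]
w = ahp_priorities(m)  # approximately [15/23, 5/23, 3/23]
```

Aggregating such weight vectors across stakeholder groups (e.g. with a geometric mean of individual judgments) is the usual way AHP supports the participative decision making the abstract describes.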
Community College Management by Objectives: Process, Progress, Problems.
ERIC Educational Resources Information Center
Deegan, William L.; And Others
The objectives of this book are: (1) to present a theoretical framework for management by objectives in community colleges, (2) to present information about alternative methods for conducting needs assessment and implementing management by objectives, (3) to present a framework for integrating academic and fiscal planning through management by…
HCI∧2 framework: a software framework for multimodal human-computer interaction systems.
Shen, Jie; Pantic, Maja
2013-12-01
This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a shared-memory-based data transport protocol for message delivery and a TCP-based system management protocol. The latter ensures that the integrity of system structure is maintained at runtime. With the inclusion of bridging modules, the HCI∧2 Framework is interoperable with other software frameworks including Psyclone and ActiveMQ. In addition to the core communication middleware, we also present the integrated development environment (IDE) of the HCI∧2 Framework. It provides a complete graphical environment to support every step in a typical MHCI system development process, including module development, debugging, packaging, and management, as well as the whole system management and testing. The quantitative evaluation indicates that our framework outperforms other similar tools in terms of average message latency and maximum data throughput under a typical single PC scenario. To demonstrate HCI∧2 Framework's capabilities in integrating heterogeneous modules, we present several example modules working with a variety of hardware and software. We also present an example of a full system developed using the proposed HCI∧2 Framework, which is called the CamGame system and represents a computer game based on hand-held marker(s) and low-cost camera(s).
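The publish/subscribe pattern the HCI∧2 Framework builds on can be sketched with an in-process broker. This toy version uses plain Python callbacks; the real framework instead uses shared-memory transport between processes and a TCP-based management protocol, and the topic and message below are hypothetical:

```python
from collections import defaultdict

class Broker:
    """Minimal in-process publish/subscribe broker: modules subscribe a
    callback to a topic and receive every message published to it,
    without publishers and subscribers knowing about each other."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

received = []
broker = Broker()
broker.subscribe("gesture", received.append)       # e.g. a fusion module
broker.publish("gesture", {"hand": "left", "x": 0.4})  # e.g. a tracker
```

The decoupling is what lets heterogeneous modules (trackers, recognisers, renderers) be developed, debugged and swapped independently, which is the property the framework's IDE support relies on.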
Kukona, Anuenue; Cho, Pyeong Whan; Magnuson, James S.; Tabor, Whitney
2014-01-01
Psycholinguistic research spanning a number of decades has produced diverging results with regard to the nature of constraint integration in online sentence processing. For example, evidence that language users anticipatorily fixate likely upcoming referents in advance of evidence in the speech signal supports rapid context integration. By contrast, evidence that language users activate representations that conflict with contextual constraints, or only indirectly satisfy them, supports non-integration or late integration. Here, we report on a self-organizing neural network framework that addresses one aspect of constraint integration: the integration of incoming lexical information (i.e., an incoming word) with sentence context information (i.e., from preceding words in an unfolding utterance). In two simulations, we show that the framework predicts both classic results concerned with lexical ambiguity resolution (Swinney, 1979; Tanenhaus, Leiman, & Seidenberg, 1979), which suggest late context integration, and results demonstrating anticipatory eye movements (e.g., Altmann & Kamide, 1999), which support rapid context integration. We also report two experiments using the visual world paradigm that confirm a new prediction of the framework. Listeners heard sentences like “The boy will eat the white…,” while viewing visual displays with objects like a white cake (i.e., a predictable direct object of “eat”), white car (i.e., an object not predicted by “eat,” but consistent with “white”), and distractors. Consistent with our simulation predictions, we found that while listeners fixated white cake most, they also fixated white car more than unrelated distractors in this highly constraining sentence (and visual) context. PMID:24245535
Integrating Personalized and Community Services for Mobile Travel Planning and Management
NASA Astrophysics Data System (ADS)
Yu, Chien-Chih
Personalized and community services have been noted as keys to enhancing and facilitating e-tourism as well as mobile applications. This paper aims at proposing an integrated service framework for combining personalized and community functions to support mobile travel planning and management. Major mobile-tourism-related planning and decision support functions specified include personalized profile management, information search and notification, evaluation and recommendation, do-it-yourself planning and design, community and collaboration management, auction and negotiation, transaction and payment, as well as trip tracking and quality control. A system implementation process with an example prototype is also presented to illustrate the feasibility and effectiveness of the proposed system framework, process model, and development methodology.
Hendriks, Anna-Marie; Jansen, Maria W J; Gubbels, Jessica S; De Vries, Nanne K; Paulussen, Theo; Kremers, Stef P J
2013-04-18
Childhood obesity is a 'wicked' public health problem that is best tackled by an integrated approach, which is enabled by integrated public health policies. The development and implementation of such policies have in practice proven to be difficult, however, and studying why this is the case requires a tool that may assist local policy-makers and those assisting them. A comprehensive framework that can help to identify options for improvement and to systematically develop solutions may be used to support local policy-makers. We propose the 'Behavior Change Ball' as a tool to study the development and implementation of integrated public health policies within local government. Based on the tenets of the 'Behavior Change Wheel' by Michie and colleagues (2011), the proposed conceptual framework distinguishes organizational behaviors of local policy-makers at the strategic, tactical and operational levels, as well as the determinants (motivation, capability, opportunity) required for these behaviors, and interventions and policy categories that can influence them. To illustrate the difficulty of achieving sustained integrated approaches, we use the metaphor of a ball in our framework: the mountainous landscapes surrounding the ball reflect the system's resistance to change (by making it difficult for the ball to roll). We apply this framework to the problem of childhood obesity prevention. The added value provided by the framework lies in its comprehensiveness, theoretical basis, diagnostic and heuristic nature and face validity. Since integrated public health policies have not been widely developed and implemented in practice, organizational behaviors relevant to the development of these policies remain to be investigated. A conceptual framework that can assist in systematically studying the policy process may facilitate this. 
Our Behavior Change Ball adds significant value to existing public health policy frameworks by incorporating multiple theoretical perspectives, specifying a set of organizational behaviors and linking the analysis of these behaviors to interventions and policies. We would encourage examination by others of our framework as a tool to explain and guide the development of integrated policies for the prevention of wicked public health problems.
A new web-based framework development for fuzzy multi-criteria group decision-making.
Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik
2016-01-01
Fuzzy multi-criteria group decision making (FMCGDM) is typically used when a group of decision-makers faces imprecise data or linguistic variables in solving a problem. However, the process involves methods that require many time-consuming calculations, depending on the number of criteria, alternatives, and decision-makers, in order to reach the optimal solution. In this study, a web-based FMCGDM framework that offers decision-makers a fast and reliable response service is proposed. The proposed framework includes commonly used tools for multi-criteria decision-making problems, such as the fuzzy Delphi, fuzzy AHP, and fuzzy TOPSIS methods. Integrating these methods makes it possible to exploit the strengths of each method while compensating for its weaknesses. Finally, a case study of landfill site selection in Morocco is performed to demonstrate how this framework can facilitate the decision-making process. The results demonstrate that the proposed framework can successfully accomplish the goal of this study.
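For intuition, the TOPSIS step can be sketched with crisp numbers; the fuzzy variant used in the paper replaces crisp scores with fuzzy (e.g., triangular) numbers and adapts the distance computations accordingly. All scores and weights below are hypothetical.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution (crisp TOPSIS sketch)."""
    m = np.asarray(matrix, dtype=float)
    benefit = np.asarray(benefit)
    # Vector-normalize each criterion column, then apply the criterion weights.
    v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, dtype=float)
    # The ideal point takes the best value per criterion (max for benefit
    # criteria, min for cost criteria); the anti-ideal takes the worst.
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)  # closeness coefficient: higher is better

# Three hypothetical landfill sites; two benefit criteria, one cost criterion.
scores = topsis([[7, 9, 4], [8, 7, 6], [5, 6, 3]],
                weights=[0.5, 0.3, 0.2],
                benefit=[True, True, False])
best = int(scores.argmax())
```

In the web-based framework, fuzzy Delphi would screen the criteria and fuzzy AHP would supply the weights before a ranking step of this kind.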
Research environments that promote integrity.
Jeffers, Brenda Recchia; Whittemore, Robin
2005-01-01
The body of empirical knowledge about research integrity, and about the factors that promote it in nursing research environments, remains small. To propose an internal control model as an innovative framework for the design and structure of nursing research environments that promote integrity. An internal control model is adapted to illustrate its use for conceptualizing and designing research environments that promote integrity. The internal control model integrates both the organizational elements necessary to promote research integrity and the processes needed to assess research environments. The model provides five interrelated process components within which any number of research integrity variables and processes may be used and studied: internal control environment, risk assessment, internal control activities, monitoring, and information and communication. The components of the proposed research integrity internal control model comprise an integrated conceptualization of the processes that provide reasonable assurance that research integrity will be promoted within the nursing research environment. Schools of nursing can use the model to design, implement, and evaluate systems that promote research integrity. The model's process components need further exploration to substantiate the use of the model in nursing research environments.
The Policy Formation Process: A Conceptual Framework for Analysis. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Fuchs, E. F.
1972-01-01
A conceptual framework for analysis which is intended to assist both the policy analyst and the policy researcher in their empirical investigations into policy phenomena is developed. It is meant to facilitate understanding of the policy formation process by focusing attention on the basic forces shaping the main features of policy formation as a dynamic social-political-organizational process. The primary contribution of the framework lies in its capability to suggest useful ways of looking at policy formation reality. It provides the analyst and the researcher with a group of indicators which suggest where to look and what to look for when attempting to analyze and understand the mix of forces which energize, maintain, and direct the operation of strategic level policy systems. The framework also highlights interconnections, linkage, and relational patterns between and among important variables. The framework offers an integrated set of conceptual tools which facilitate understanding of and research on the complex and dynamic set of variables which interact in any major strategic level policy formation process.
Two Inseparable Facets of Technology Integration Programs: Technology and Theoretical Framework
ERIC Educational Resources Information Center
Demir, Servet
2011-01-01
This paper considers the process of program development aiming at technology integration for teachers. For this consideration, the paper focused on an integration program which was recently developed as part of a larger project. The participants of this program were 45 in-service teachers. The program continued for four weeks, and the conduct of the…
A Bayesian Framework of Uncertainties Integration in 3D Geological Model
NASA Astrophysics Data System (ADS)
Liang, D.; Liu, X.
2017-12-01
A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Although great progress has been made over the years, many studies decompose the uncertainties of a geological model and analyze each source separately, ignoring the comprehensive impact of multi-source uncertainties. To evaluate this synthetic uncertainty, we quantify uncertainty with probability distributions and propose a Bayesian framework for uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution that evaluates the synthetic uncertainty of the geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty simulates this propagation, with Bayesian inference performing the uncertainty updating at each step. The maximum entropy principle is used to estimate the prior probability distribution, ensuring that the prior satisfies the constraints supplied by the given information with minimum prejudice. The resulting posterior distribution represents the combined impact of all uncertain factors on the spatial structure of the geological model. The framework thus provides both a means of evaluating the overall impact of multi-source uncertainties on a geological model and an approach to studying the mechanism of uncertainty propagation in geological modeling.
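The sequential Bayesian updating described above can be sketched on a one-dimensional example. The snippet below is an illustrative reconstruction, not the authors' implementation: a maximum-entropy (uniform) prior over a hypothetical horizon depth is updated with two noisy observations, each posterior becoming the prior for the next update, so uncertainty from multiple sources accumulates into one distribution.

```python
import numpy as np

# Hypothetical horizon depth (m): the max-entropy prior given only bounds
# is uniform on those bounds.
depth = np.linspace(90.0, 110.0, 401)
prior = np.ones_like(depth)
prior /= prior.sum()

# Integrate one uncertainty source after another: each noisy reading
# (value, standard error) updates the distribution via Bayes' rule.
for obs, sigma in [(101.0, 3.0), (99.0, 2.0)]:
    like = np.exp(-0.5 * ((depth - obs) / sigma) ** 2)
    prior = prior * like
    prior /= prior.sum()          # the posterior becomes the next prior

posterior = prior
mean = float((depth * posterior).sum())
sd = float(np.sqrt(((depth - mean) ** 2 * posterior).sum()))
```

The posterior standard deviation is smaller than that of either observation alone, which is the sense in which gradual integration tracks how uncertainty propagates and is reduced through the modeling process.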
Guidelines for performing systematic reviews in the development of toxicity factors.
Schaefer, Heather R; Myers, Jessica L
2017-12-01
The Texas Commission on Environmental Quality (TCEQ) developed guidance on conducting systematic reviews during the development of chemical-specific toxicity factors. Using elements from publicly available frameworks, the TCEQ systematic review process was developed in order to supplement the existing TCEQ Guidelines for developing toxicity factors (TCEQ Regulatory Guidance 442). The TCEQ systematic review process includes six steps: 1) Problem Formulation; 2) Systematic Literature Review and Study Selection; 3) Data Extraction; 4) Study Quality and Risk of Bias Assessment; 5) Evidence Integration and Endpoint Determination; and 6) Confidence Rating. This document provides guidance on conducting a systematic literature review and integrating evidence from different data streams when developing chemical-specific reference values (ReVs) and unit risk factors (URFs). However, this process can also be modified or expanded to address other questions that would benefit from systematic review practices. The systematic review and evidence integration framework can improve regulatory decision-making processes, increase transparency, minimize bias, improve consistency between different risk assessments, and further improve confidence in toxicity factor development. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
A service-based framework for pharmacogenomics data integration
NASA Astrophysics Data System (ADS)
Wang, Kun; Bai, Xiaoying; Li, Jing; Ding, Cong
2010-08-01
Data are central to scientific research and practice. Advances in experimental methods and information retrieval technologies have led to explosive growth of scientific data and databases. However, due to heterogeneity in data formats, structures, and semantics, it is hard to integrate such rapidly growing, diversified data and analyse them comprehensively. As more and more public databases become accessible through standard protocols like programmable interfaces and Web portals, Web-based data integration has become a major trend for managing and synthesising data stored in distributed locations. Mashup, a Web 2.0 technique, presents a new way to compose content and software from multiple resources. The paper proposes a layered framework for integrating pharmacogenomics data in a service-oriented approach using mashup technology. The framework separates the integration concerns into three perspectives: data, process, and Web-based user interface. Each layer encapsulates the heterogeneity issues of one aspect. To facilitate the mapping and convergence of data, an ontology mechanism is introduced to provide consistent conceptual models across different databases and experiment platforms. To support user-interactive and iterative service orchestration, a context model is defined to capture information about users, tasks, and services, which can be used for service selection and recommendation during a dynamic service composition process. A prototype system is implemented, and case studies are presented to illustrate the promising capabilities of the proposed approach.
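The ontology mechanism for mapping heterogeneous fields onto a shared conceptual model can be sketched as a simple field-renaming table. The source names, field names, and records below are hypothetical, and a real ontology would also align semantics, not just labels.

```python
# Two hypothetical pharmacogenomics sources with different field names;
# the ontology maps each local field to a shared concept.
ONTOLOGY = {
    "src_a": {"gene_symbol": "gene", "drug_name": "drug"},
    "src_b": {"hgnc": "gene", "compound": "drug"},
}

def to_canonical(source, record):
    """Rewrite a source-specific record into the shared conceptual model."""
    mapping = ONTOLOGY[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

merged = [
    to_canonical("src_a", {"gene_symbol": "CYP2D6", "drug_name": "codeine"}),
    to_canonical("src_b", {"hgnc": "CYP2C19", "compound": "clopidogrel"}),
]
```

Once records from every source share one schema, the process and user-interface layers can compose them without knowing where each record came from.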
JAMS - a software platform for modular hydrological modelling
NASA Astrophysics Data System (ADS)
Kralisch, Sven; Fischer, Christian
2015-04-01
Current challenges of understanding and assessing the impacts of climate and land use changes on environmental systems demand an ever-increasing integration of data and process knowledge in the corresponding simulation models. Software frameworks that allow for the seamless creation of integrated models from less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an Open-Source software framework that has been especially designed to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach to representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models using domain-specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models have been implemented and successfully applied in recent years. We will present the JAMS core concepts and give an overview of models, simulation components, and support tools available for the framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.
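The declarative, XML-based description of a model as wired components can be sketched as follows. The element and attribute names are illustrative only, not the actual JAMS schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical declarative model description in the spirit of a
# domain-specific XML for component-based models.
MODEL_XML = """
<model name="demo-catchment">
  <component class="Precipitation" out="rain"/>
  <component class="SoilWater" in="rain" out="runoff"/>
  <component class="Routing" in="runoff" out="discharge"/>
</model>
"""

root = ET.fromstring(MODEL_XML)
# Wire components by matching each declared input to the component
# that produces it, yielding the execution dependencies.
producers = {c.get("out"): c.get("class") for c in root}
links = [(producers[c.get("in")], c.get("class"))
         for c in root if c.get("in")]
```

Keeping the wiring declarative is what allows a model to be restructured, or a process routine swapped out, without recompiling any simulation code.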
Explicit and Implicit Emotion Regulation: A Dual-Process Framework
Gyurak, Anett; Gross, James J.; Etkin, Amit
2012-01-01
It is widely acknowledged that emotions can be regulated in an astonishing variety of ways. Most research to date has focused on explicit (effortful) forms of emotion regulation. However, there is growing research interest in implicit (automatic) forms of emotion regulation. To organize emerging findings, we present a dual-process framework that integrates explicit and implicit forms of emotion regulation, and argue that both forms of regulation are necessary for well-being. In the first section of this review, we provide a broad overview of the construct of emotion regulation, with an emphasis on explicit and implicit processes. In the second section, we focus on explicit emotion regulation, considering both neural mechanisms that are associated with these processes and their experiential and physiological consequences. In the third section, we turn to several forms of implicit emotion regulation, and integrate the burgeoning literature in this area. We conclude by outlining open questions and areas for future research. PMID:21432682
Translational Scholarship and a Palliative Approach: Enlisting the Knowledge-As-Action Framework.
Reimer-Kirkham, Sheryl; Doane, Gweneth Hartrick; Antifeau, Elisabeth; Pesut, Barbara; Porterfield, Pat; Roberts, Della; Stajduhar, Kelli; Wikjord, Nicole
2015-01-01
Based on a retheorized epistemology for knowledge translation (KT) that problematizes the "know-do gap" and conceptualizes the knower, knowledge, and action as inseparable, this paper describes the application of the Knowledge-As-Action Framework. When applied as a heuristic device to support an inquiry process, the framework, with its metaphor of a kite, facilitates responsiveness to the complexities that characterize KT. Examples from a KT demonstration project on the integration of a palliative approach at 3 clinical sites illustrate the interrelatedness of 6 dimensions: the local context, processes, people, knowledge, fluctuating realities, and values.
Miller, Benjamin F; Mendenhall, Tai J; Malik, Alan D
2009-03-01
Integrating behavioral health services within the primary care setting drives higher levels of collaborative care and is proving to be an essential part of the solution for our struggling American healthcare system. However, justifying the implementation and sustainment of integrated and collaborative care has proven to be a formidable task. In an attempt to move beyond conflicting terminology found in the literature, we delineate terms and suggest a standardized nomenclature. Further, we maintain that addressing the three principal worlds of healthcare (clinical, operational, financial) is requisite to making sense of the spectrum of available implementations and ultimately transitioning collaborative care into the mainstream. Using a model that deconstructs process metrics into factors/barriers and generalizes behavioral health provider roles into major categories provides a framework to empirically discriminate between implementations across specific settings. This approach offers practical guidelines for care sites implementing integrated and collaborative care and defines a research framework to produce the evidence required for the aforementioned clinical, operational, and financial worlds of this important movement.
Borzacchiello, Maria Teresa; Torrieri, Vincenzo; Nijkamp, Peter
2009-11-01
This paper describes an integrated information system framework for the assessment of transportation planning and management. After an introductory exposition, the first part of the paper gives a broad overview of international experiences with information systems on transportation, focusing in particular on the relationship between the monitoring of a transportation system's performance and the decision-making process, and on the importance of this connection in the evaluation and planning process in Italian and European cases. Next, the methodological design of an information system to support efficient and sustainable transportation planning and management, aiming to integrate inputs from several different data sources, is presented. The resulting framework deploys modular and integrated databases which include data stemming from different national or regional data banks and which integrate information belonging to different transportation fields. For this reason, it allows public administrations to account for many strategic elements that influence their decisions regarding transportation, from both a systemic and an infrastructural point of view.
An integrative process model of leadership: examining loci, mechanisms, and event cycles.
Eberly, Marion B; Johnson, Michael D; Hernandez, Morela; Avolio, Bruce J
2013-09-01
Utilizing the locus (source) and mechanism (transmission) of leadership framework (Hernandez, Eberly, Avolio, & Johnson, 2011), we propose and examine the application of an integrative process model of leadership to help determine the psychological interactive processes that constitute leadership. In particular, we identify the various dynamics involved in generating leadership processes by modeling how the loci and mechanisms interact through a series of leadership event cycles. We discuss the major implications of this model for advancing an integrative understanding of what constitutes leadership and its current and future impact on the field of psychological theory, research, and practice. © 2013 APA, all rights reserved.
Machine listening intelligence
NASA Astrophysics Data System (ADS)
Cella, C. E.
2017-05-01
This manifesto paper will introduce machine listening intelligence, an integrated research framework for acoustic and musical signals modelling, based on signal processing, deep learning and computational musicology.
Berntsen, Gro; Høyem, Audhild; Lettrem, Idar; Ruland, Cornelia; Rumpsfeld, Markus; Gammon, Deede
2018-06-20
Person-Centered Integrated Care (PC-IC) is believed to improve outcomes and experience for persons with multiple long-term and complex conditions. No broad consensus exists regarding how to capture the patient-experienced quality of PC-IC. Most PC-IC evaluation tools focus on care events or care in general. Building on others' and our previous work, we outlined a 4-stage goal-oriented PC-IC process ideal: 1) personalized goal setting, 2) care planning aligned with goals, 3) care delivery according to plan, and 4) evaluation of goal attainment. We aimed to explore, apply, refine and operationalize this quality of care framework. This paper is a qualitative evaluative review of the individual Patient Pathway (iPP) experiences of 19 strategically chosen persons with multimorbidity in light of ideals for chronic care. The iPP includes all care events addressing the person's collected health issues, organized by time. We constructed iPPs based on the electronic health record (from general practice, nursing services, and hospital) together with patient follow-up interviews. The application of the framework and its refinement were parallel processes; both were based on analysis of salient themes in the empirical material in light of the PC-IC process ideal and progressively more informed applications of themes and questions. The informants consistently reviewed care quality by how care supported or threatened their long-term goals. Personal goals were either implicit or identified by asking "What matters to you?" Informants expected care to address their long-term goals and placed responsibility for care quality and delivery at the system level. The PC-IC process framework exposed system failure in identifying long-term goals, the provision of shared long-term multimorbidity care plans, the monitoring of care delivery, and goal evaluation. The PC-IC framework includes descriptions of ideal care, key questions and literature references for each stage of the PC-IC process.
This first version of a PC-IC process framework needs further validation in other settings. Gaps in care that are invisible with event-based quality of care frameworks become apparent when evaluated by a long-term goal-driven PC-IC process framework. The framework appears meaningful to persons with multimorbidity.
Integrating consumer engagement in health and medical research - an Australian framework.
Miller, Caroline L; Mott, Kathy; Cousins, Michael; Miller, Stephanie; Johnson, Anne; Lawson, Tony; Wesselingh, Steve
2017-02-10
Quality practice of consumer engagement is still in its infancy in many sectors of medical research. The South Australian Health and Medical Research Institute (SAHMRI) identified, early in its development, the opportunity to integrate evidence-driven consumer and community engagement into its operations. SAHMRI partnered with Health Consumers Alliance and consumers in evidence generation. A Partnership Steering Committee of researchers and consumers was formed for the project. An iterative mixed-method qualitative process was used to generate a framework for consumer engagement. This process included a literature review followed by semi-structured interviews with experts in consumer engagement and lead medical researchers, group discussions and a consensus workshop with the Partnership Steering Committee, facilitated by Health Consumer Alliance. The literature revealed a dearth of evidence about effective consumer engagement methodologies. Four organisational dimensions are reported to contribute to success, namely governance, infrastructure, capacity and advocacy. Key themes identified through the stakeholder interviews included sustained leadership, tangible benefits, engagement strategies should be varied, resourcing, a moral dimension, and challenges. The consensus workshop produced a framework and tangible strategies. Comprehensive examples of consumer participation in health and medical research are limited. There are few documented studies of what techniques are effective. This evidence-driven framework, developed in collaboration with consumers, is being integrated in a health and medical research institute with diverse programs of research. This framework is offered as a contribution to the evidence base around meaningful consumer engagement and as a template for other research institutions to utilise.
Word Knowledge in a Theory of Reading Comprehension
ERIC Educational Resources Information Center
Perfetti, Charles; Stafura, Joseph
2014-01-01
We reintroduce a wide-angle view of reading comprehension, the Reading Systems Framework, which places word knowledge in the center of the picture, taking into account the progress made in comprehension research and theory. Within this framework, word-to-text integration processes can serve as a model for the study of local comprehension…
21st Century Pedagogical Content Knowledge and Science Teaching and Learning
ERIC Educational Resources Information Center
Slough, Scott; Chamblee, Gregory
2017-01-01
Technological Pedagogical Content Knowledge (TPACK) is a theoretical framework that has enjoyed widespread applications as it applies to the integration of technology in the teaching and learning process. This paper reviews the background for TPACK, discusses some of its limitations, and reviews and introduces a new theoretical framework, 21st…
Moloczij, Natasha; Gough, Karla; Solomon, Benjamin; Ball, David; Mileshkin, Linda; Duffy, Mary; Krishnasamy, Mei
2018-01-11
Patient-reported outcome (PRO) data are central to the delivery of quality health care. Establishing sustainable, reliable and cost-efficient methods for the routine collection and integration of PRO data into health information systems is challenging. This protocol paper describes the design and structure of a study to develop and pilot test a PRO framework to systematically and longitudinally collect PRO data from a cohort of lung cancer patients at a comprehensive cancer centre in Australia. Best-practice guidelines for developing registries aimed at collecting PROs informed the development of this PRO framework. Framework components included: reaching consensus on the purpose of the framework, the PRO measures to be included, and the data collection time points and methods (electronic and paper); establishing processes to safeguard the quality of the data collected; and linking the PRO framework to an existing hospital-based lung cancer clinical registry. Lung cancer patients will be invited to give feedback on the PRO measures (PROMs) chosen and the data collection time points and methods. Implementation of the framework will be piloted for 12 months. A mixed-methods approach will then be used to explore patient and multidisciplinary perspectives on the feasibility of implementing the framework and linking it to the lung cancer clinical registry, its clinical utility, perceptions of data collection burden, and a preliminary assessment of the resource costs to integrate, implement and sustain the PRO framework. The PRO data set will include a quality of life questionnaire (EORTC QLQ-C30) and the EORTC lung cancer specific module (QLQ-LC13). These will be collected pre-treatment (baseline) and 2, 6 and 12 months post-baseline. Also, four social isolation questions (PROMIS) will be collected at baseline. Identifying and deciding on the overall purpose, the clinical utility of the data, and which PROs to collect from patients requires careful consideration.
Our study will explore how PRO data collection processes that link to a clinical data set can be developed and integrated, and how PRO systems that are easy for patients to complete and for professionals to use in practice can be achieved; it will also provide indicative costs of developing and integrating a longitudinal PRO framework into routine hospital data collection systems. This study is not a clinical trial and is therefore not registered in any trial registry. However, it has received human research ethics approval (LNR/16/PMCC/45).
Multicriteria framework for selecting a process modelling language
NASA Astrophysics Data System (ADS)
Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel
2016-01-01
The choice of process modelling language can affect business process management (BPM), since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate language for process modelling has become a difficult task because of the availability of a large number of modelling languages and the lack of guidelines for evaluating and comparing them so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. The framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach for selecting the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but rather demonstrates how two existing approaches can be combined to solve the problem of modelling language selection. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
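As a minimal illustration of the MCDA step, the snippet below ranks three candidate languages by simple additive weighting over SEQUAL-style quality criteria. The languages, scores, and weights are all hypothetical, and the paper's actual MCDA method may differ; the point is only that purpose-dependent weights turn qualitative evaluations into a ranking.

```python
# Hypothetical SEQUAL-style quality scores (0-10) for three candidate
# languages; the weights reflect the purposes of modelling.
criteria = ["domain appropriateness", "comprehensibility", "tool support"]
weights = [0.5, 0.3, 0.2]
scores = {
    "BPMN":       [8, 7, 9],
    "EPC":        [6, 8, 6],
    "Petri nets": [7, 5, 5],
}

# Simple additive weighting: one of many possible MCDA aggregation rules.
ranking = sorted(
    scores,
    key=lambda lang: sum(w * s for w, s in zip(weights, scores[lang])),
    reverse=True,
)
```

Changing the weights to match a different modelling purpose can reorder the ranking, which is exactly why the framework ties language selection to the purposes of modelling.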
A Dynamic Bayesian Observer Model Reveals Origins of Bias in Visual Path Integration.
Lakshminarasimhan, Kaushik J; Petsalis, Marina; Park, Hyeshin; DeAngelis, Gregory C; Pitkow, Xaq; Angelaki, Dora E
2018-06-20
Path integration is a strategy by which animals track their position by integrating their self-motion velocity. To identify the computational origins of bias in visual path integration, we asked human subjects to navigate in a virtual environment using optic flow and found that they generally traveled beyond the goal location. Such a behavior could stem from leaky integration of unbiased self-motion velocity estimates or from a prior expectation favoring slower speeds that causes velocity underestimation. Testing both alternatives using a probabilistic framework that maximizes expected reward, we found that subjects' biases were better explained by a slow-speed prior than imperfect integration. When subjects integrate paths over long periods, this framework intriguingly predicts a distance-dependent bias reversal due to buildup of uncertainty, which we also confirmed experimentally. These results suggest that visual path integration in noisy environments is limited largely by biases in processing optic flow rather than by leaky integration. Copyright © 2018 Elsevier Inc. All rights reserved.
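The slow-speed-prior account can be illustrated with a minimal Gaussian sketch. This is not the authors' full probabilistic model; the noise and prior widths below are arbitrary:

```python
# Minimal sketch (not the paper's model): Bayesian speed estimation with a
# zero-mean Gaussian "slow-speed" prior. With likelihood N(v, sigma_n^2) and
# prior N(0, sigma_p^2), the posterior mean shrinks the observed speed toward
# zero, so integrated distance is underestimated and the subject overshoots.

def posterior_speed(v_obs, sigma_n, sigma_p):
    """Posterior-mean speed estimate under a zero-mean Gaussian prior."""
    shrink = sigma_p**2 / (sigma_p**2 + sigma_n**2)  # in (0, 1)
    return shrink * v_obs

true_v = 2.0  # m/s, hypothetical self-motion speed
est_v = posterior_speed(true_v, sigma_n=1.0, sigma_p=2.0)  # 0.8 * 2.0 = 1.6

# Distance the observer *believes* it has travelled over 10 s:
believed = est_v * 10.0  # 16 m, versus 20 m actually travelled
print(believed)
```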
Modular Toolkit for Data Processing (MDP): A Python Data Processing Framework.
Zito, Tiziano; Wilbert, Niko; Wiskott, Laurenz; Berkes, Pietro
2008-01-01
Modular toolkit for Data Processing (MDP) is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework that can easily be expanded. The implementation of new algorithms is easy and intuitive, and newly implemented units are automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units also make it a useful educational tool.
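The node-and-flow design described above can be sketched in plain Python/NumPy. The class and method names below are illustrative stand-ins for the pattern, not the actual MDP API:

```python
# Sketch of MDP's core design idea: trainable processing "nodes" chained
# into a "flow". Classes here are hypothetical, not MDP's own.
import numpy as np

class CenterNode:
    """Learns the data mean during training; subtracts it on execution."""
    def train(self, x):
        self.mean = x.mean(axis=0)
    def execute(self, x):
        return x - self.mean

class ScaleNode:
    """Learns per-feature standard deviation; divides by it on execution."""
    def train(self, x):
        self.std = x.std(axis=0)
    def execute(self, x):
        return x / self.std

class Flow:
    """Trains each node in sequence on the output of the previous ones."""
    def __init__(self, nodes):
        self.nodes = nodes
    def train(self, x):
        for node in self.nodes:
            node.train(x)
            x = node.execute(x)
    def execute(self, x):
        for node in self.nodes:
            x = node.execute(x)
        return x

data = np.array([[1.0, 10.0], [3.0, 30.0], [5.0, 50.0]])
flow = Flow([CenterNode(), ScaleNode()])
flow.train(data)
out = flow.execute(data)  # zero-mean, unit-variance columns
```

A new algorithm slots into such a pipeline simply by implementing the same `train`/`execute` interface, which is the modularity the abstract emphasizes.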
Rater cognition: review and integration of research findings.
Gauthier, Geneviève; St-Onge, Christina; Tavares, Walter
2016-05-01
Given the complexity of competency frameworks, associated skills and abilities, and contexts in which they are to be assessed in competency-based education (CBE), there is an increased reliance on rater judgements when considering trainee performance. This increased dependence on rater-based assessment has led to the emergence of rater cognition as a field of research in health professions education. The topic, however, is often conceptualised and ultimately investigated using many different perspectives and theoretical frameworks. Critically analysing how researchers think about, study and discuss rater cognition or the judgement processes in assessment frameworks may provide meaningful and efficient directions in how the field continues to explore the topic. We conducted a critical and integrative review of the literature to explore common conceptualisations and unified terminology associated with rater cognition research. We identified 1045 articles on rater-based assessment in health professions education using Scopus, Medline and ERIC, and 78 articles were included in our review. We propose a three-phase framework of observation, processing and integration. We situate nine specific mechanisms and sub-mechanisms described across the literature within these phases: (i) generating automatic impressions about the person; (ii) formulating high-level inferences; (iii) focusing on different dimensions of competencies; (iv) categorising through well-developed schemata based on (a) a personal concept of competence, (b) comparison with various exemplars and (c) task and context specificity; (v) weighting and synthesising information differently; (vi) producing narrative judgements; and (vii) translating narrative judgements into scales. Our review has allowed us to identify common underlying conceptualisations of observed rater mechanisms and subsequently propose a comprehensive, although complex, framework for the dynamic and contextual nature of the rating process.
This framework could help bridge the gap between researchers adopting different perspectives when studying rater cognition and enable the interpretation of contradictory findings of raters' performance by determining which mechanism is enabled or disabled in any given context. © 2016 John Wiley & Sons Ltd.
A Framework for Integrating Environmental Justice in Regulatory Analysis
Nweke, Onyemaechi C.
2011-01-01
With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235
NASA Astrophysics Data System (ADS)
Johnston, J. M.
2013-12-01
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and land-use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis.
Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water quality Analysis and Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.
ERIC Educational Resources Information Center
Cheong, Choo Mui; Zhu, Xinhua; Liao, Xian
2018-01-01
In recent decades, integrated language competence has been highlighted in the language curricula taught in schools and institutions, and the relationship between test-takers' performance on integrated tasks and comprehension sources has been much studied. The current study employed the frameworks of reading and listening comprehension processes to…
Velderraín, José Dávila; Martínez-García, Juan Carlos; Álvarez-Buylla, Elena R
2017-01-01
Mathematical models based on dynamical systems theory are well-suited tools for the integration of available molecular experimental data into coherent frameworks in order to propose hypotheses about the cooperative regulatory mechanisms driving developmental processes. Computational analysis of the proposed models using well-established methods enables testing the hypotheses by contrasting predictions with observations. Within such framework, Boolean gene regulatory network dynamical models have been extensively used in modeling plant development. Boolean models are simple and intuitively appealing, ideal tools for collaborative efforts between theorists and experimentalists. In this chapter we present protocols used in our group for the study of diverse plant developmental processes. We focus on conceptual clarity and practical implementation, providing directions to the corresponding technical literature.
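A synchronous Boolean gene regulatory network of the kind described can be sketched in a few lines. The genes and update rules below are invented for illustration and are not from the chapter:

```python
# Minimal synchronous Boolean gene regulatory network sketch.
# Each gene's next state is a Boolean function of the current state vector;
# the three-gene circuit here is hypothetical.

RULES = {
    "A": lambda s: not s["C"],         # C represses A
    "B": lambda s: s["A"],             # A activates B
    "C": lambda s: s["A"] and s["B"],  # A and B jointly activate C
}

def step(state):
    """One synchronous update: all genes read the same current state."""
    return {gene: rule(state) for gene, rule in RULES.items()}

def find_attractor(state, max_steps=64):
    """Iterate updates until a previously seen state recurs; return the cycle."""
    seen = []
    for _ in range(max_steps):
        if state in seen:
            return seen[seen.index(state):]
        seen.append(state)
        state = step(state)
    return None

cycle = find_attractor({"A": True, "B": False, "C": False})
```

Attractors found this way (fixed points or cycles) are what such models typically compare against observed developmental cell states.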
A road map for integrating eco-evolutionary processes into biodiversity models.
Thuiller, Wilfried; Münkemüller, Tamara; Lavergne, Sébastien; Mouillot, David; Mouquet, Nicolas; Schiffers, Katja; Gravel, Dominique
2013-05-01
The demand for projections of the future distribution of biodiversity has triggered an upsurge in modelling at the crossroads between ecology and evolution. Despite the enthusiasm around these so-called biodiversity models, most approaches are still criticised for not integrating key processes known to shape species ranges and community structure. Developing an integrative modelling framework for biodiversity distribution promises to improve the reliability of predictions and to give a better understanding of the eco-evolutionary dynamics of species and communities under changing environments. In this article, we briefly review some eco-evolutionary processes and interplays among them, which are essential to provide reliable projections of species distributions and community structure. We identify gaps in theory, quantitative knowledge and data availability hampering the development of an integrated modelling framework. We argue that model development relying on a strong theoretical foundation is essential to inspire new models, manage complexity and maintain tractability. We support our argument with an example of a novel integrated model for species distribution modelling, derived from metapopulation theory, which accounts for abiotic constraints, dispersal, biotic interactions and evolution under changing environmental conditions. We hope such a perspective will motivate exciting and novel research, and challenge others to improve on our proposed approach. © 2013 John Wiley & Sons Ltd/CNRS.
Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations
NASA Astrophysics Data System (ADS)
Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.
2010-11-01
We present the design and implementation of a component-based simulation framework for the execution of coupled time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible lightweight component model that streamlines the integration of standalone codes into coupled simulations. Standalone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation; configuration, task, and data management; asynchronous event management; simulation monitoring; and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massively Parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model will be presented, along with an overview of several use cases.
Processing Solutions for Big Data in Astronomy
NASA Astrophysics Data System (ADS)
Fillatre, L.; Lepiller, D.
2016-09-01
This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, considered the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing scheme) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
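The MapReduce programming model mentioned above can be illustrated with a toy in-process word count. Hadoop distributes these phases across a cluster; this sketch shows only the programming model:

```python
# Toy MapReduce word count: a map phase emits (key, value) pairs, a shuffle
# groups them by key, and a reduce phase aggregates each group.
from collections import defaultdict

def map_phase(doc):
    """Emit (word, 1) for every word in a document."""
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    """Group emitted values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Aggregate each key's values; here, sum the counts."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data big frameworks", "data processing"]
pairs = [p for doc in docs for p in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"])  # 2
```

In Hadoop or Spark, the map and reduce functions keep this shape while the framework handles partitioning, fault tolerance and the distributed shuffle.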
Decision support system based on DPSIR framework for a low flow Mediterranean river basin
NASA Astrophysics Data System (ADS)
Bangash, Rubab Fatima; Kumar, Vikas; Schuhmacher, Marta
2013-04-01
The application of decision-making practices is effectively enhanced by adopting a procedural approach that sets out a general methodological framework within which specific methods, models and tools can be integrated. Integrated Catchment Management is a process that recognizes the river catchment as a basic organizing unit for understanding and managing ecosystem processes. A decision support system becomes more complex when it considers unavoidable human activities within a catchment that are motivated by multiple and often competing criteria and/or constraints. DPSIR is a causal framework for describing the interactions between society and the environment. This framework has been adopted by the European Environment Agency, and its components are: Driving forces, Pressures, States, Impacts and Responses. The proposed decision support system is a two-step framework based on DPSIR. From the first three components of DPSIR (Driving forces, Pressures and States), hydrological and ecosystem services models are developed. The last two components, Impacts and Responses, guide the development of a Bayesian Network to integrate the models. This decision support system also takes account of social, economic and environmental aspects. A small river of Catalonia (northeastern Spain), the Francolí River, with a low flow (~2 m³/s), was selected for the integration of catchment assessment models and to improve knowledge transfer from research to stakeholders, with a view to improving the decision-making process. DHI's MIKE BASIN software is used to evaluate the low-flow Francolí River with respect to the water bodies' characteristics and to assess the impact of human activities, aiming to achieve good water status for all waters in compliance with the WFD's River Basin Management Plan.
Based on ArcGIS, MIKE BASIN is a versatile decision support tool that provides a simple and powerful framework for managers and stakeholders to address multisectoral allocation and environmental issues in river basins. InVEST, in turn, is a spatially explicit tool used to model and map a suite of ecosystem services affected by land cover changes or climate change impacts. The results obtained from the low-flow hydrological simulation and the ecosystem services models serve as inputs for developing the DPSIR-based decision support system by integrating the models. Bayesian Networks are used as a knowledge integration and visualization tool to summarize the outcomes of the hydrological and ecosystem services models at the "Response" stage of DPSIR. Bayesian Networks provide a framework for modelling the logical relationships between catchment variables and decision objectives by quantifying the strength of these relationships using conditional probabilities. The participatory nature of this framework can improve the communication of water research, particularly in the context of a perceived lack of future awareness-raising with the public, and thereby help develop more sustainable water management strategies. Acknowledgements: The present study was financially supported by the Spanish Ministry of Economy and Competitiveness through the project SCARCE (Consolider-Ingenio 2010 CSD2009-00065). R. F. Bangash also received a PhD fellowship from AGAUR (Commissioner for Universities and Research of the Department of Innovation, Universities and Enterprise of the "Generalitat de Catalunya") and the European Social Fund.
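The conditional-probability machinery behind such a Bayesian Network can be sketched minimally. The nodes, actions and probabilities below are invented for illustration and are not from the study:

```python
# Hypothetical two-node Bayesian network in the spirit of the DPSIR
# "Response" stage: a management action influences water status.
# All numbers are illustrative.

P_ACTION = {"reduce_abstraction": 0.5, "status_quo": 0.5}  # prior over responses
P_STATUS_GIVEN_ACTION = {                                   # CPT: P(good status | action)
    "reduce_abstraction": 0.7,
    "status_quo": 0.3,
}

# Marginal probability of good water status, summing over actions:
p_good = sum(P_ACTION[a] * P_STATUS_GIVEN_ACTION[a] for a in P_ACTION)

# Posterior over actions given that good status was observed (Bayes' rule):
posterior = {a: P_ACTION[a] * P_STATUS_GIVEN_ACTION[a] / p_good for a in P_ACTION}
print(round(p_good, 2))  # 0.5
```

A real catchment network would chain many such conditional probability tables (land use, flow, loads, status), but every link reduces to this kind of table.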
Integrating the New Immigrant: A Model for Social Work Practice in Transitional States
ERIC Educational Resources Information Center
Golan, Naomi; Gruschka, Ruth
1971-01-01
The authors of this paper cast the process of immigration in the prevention intervention framework and offer a model for activity in six key areas: income management, health, housing, education, leisure time activities, and citizenship, by which the integration absorption crisis can be successfully resolved. (Author)
Leisure Counseling: An Antidote for "The Living Death."
ERIC Educational Resources Information Center
Liptak, John J.
1991-01-01
Reviews the process of unemployment and the benefits of leisure, as well as proposes a broader conceptual framework for integrating leisure with work as a means of assisting unemployed workers to cope with unemployment. Notes that, by viewing leisure as integral component of individual's career, employment counselors can more effectively assist…
ERIC Educational Resources Information Center
American Civil Liberties Union, New York, NY.
This annotated bibliography provides a framework within which questions and answers about the school desegregation process can be formulated and addressed. A glossary of terms dealing with school integration is included. Among these are the following: ability grouping, annexation, bilingual education, clustering, consolidation, de facto and de…
Empathy: An Integral Model in the Counseling Process
ERIC Educational Resources Information Center
Clark, Arthur J.
2010-01-01
Expanding on a framework introduced by Carl Rogers, an integral model of empathy in counseling uses empathic understanding through 3 ways of knowing: Subjective empathy enables a counselor to momentarily experience what it is like to be a client, interpersonal empathy relates to understanding a client's phenomenological experiencing, and objective…
Integrated STEM: A New Primer for Teaching Technology Education
ERIC Educational Resources Information Center
Asunda, Paul A.; Mativo, John
2017-01-01
Part One of this article ("Technology and Engineering Teacher," 75(4), December/January, 2016) presented a process that science, math, engineering, and technology teachers could use to collaborate and design integrated STEM courses. A conceptual framework was discussed that could provide a premise that educators interested in delivery of…
NexGen PVAs: Incorporating Eco-Evolutionary Processes into Population Viability Models
We examine how the integration of evolutionary and ecological processes in population dynamics – an emerging framework in ecology – could be incorporated into population viability analysis (PVA). Driven by parallel, complementary advances in population genomics and computational ...
Developing a framework for a toolkit for carbon footprint that integrates transit (CFIT).
DOT National Transportation Integrated Search
2010-11-01
The purpose of this research was to evaluate five transportation planning processes used in Florida to determine how greenhouse gas (GHG) emissions considerations can be incorporated into the processes. These included the federal metropolitan plannin...
NASA Astrophysics Data System (ADS)
El-Wardany, Tahany; Lynch, Mathew; Gu, Wenjiong; Hsu, Arthur; Klecka, Michael; Nardi, Aaron; Viens, Daniel
This paper proposes an optimization framework enabling the integration of multi-scale / multi-physics simulation codes to perform structural optimization design for additively manufactured components. Cold spray was selected as the additive manufacturing (AM) process and its constraints were identified and included in the optimization scheme. The developed framework first utilizes topology optimization to maximize stiffness for conceptual design. The subsequent step applies shape optimization to refine the design for stress-life fatigue. The component weight was reduced by 20% while stresses were reduced by 75% and the rigidity was improved by 37%. The framework and analysis codes were implemented using Altair software as well as an in-house loading code. The optimized design was subsequently produced by the cold spray process.
Old Assumptions, New Paradigms: Technology, Group Process, and Continuing Professional Education.
ERIC Educational Resources Information Center
Healey, Kathryn N.; Lawler, Patricia A.
2002-01-01
Continuing educators must consider the impact of technology on group processes, including ways in which it affects group pressures, communication patterns, and social and emotional components of learning. Administrators and faculty should integrate group process frameworks with educational technologies in order to provide effective learning…
NASA Astrophysics Data System (ADS)
Dietrich, Jörg; Funke, Markus
Integrated water resources management (IWRM) redefines conventional water management approaches through a closer cross-linkage between environment and society. The role of public participation and socio-economic considerations becomes more important within the planning and decision making process. In this paper we address aspects of the integration of catchment models into such a process taking the implementation of the European Water Framework Directive (WFD) as an example. Within a case study situated in the Werra river basin (Central Germany), a systems analytic decision process model was developed. This model uses the semantics of the Unified Modeling Language (UML) activity model. As an example application, the catchment model SWAT and the water quality model RWQM1 were applied to simulate the effect of phosphorus emissions from non-point and point sources on water quality. The decision process model was able to guide the participants of the case study through the interdisciplinary planning and negotiation of actions. Further improvements of the integration framework include tools for quantitative uncertainty analyses, which are crucial for real life application of models within an IWRM decision making toolbox. For the case study, the multi-criteria assessment of actions indicates that the polluter pays principle can be met at larger scales (sub-catchment or river basin) without significantly compromising cost efficiency for the local situation.
Owens, Elizabeth Oesterling; Patel, Molini M; Kirrane, Ellen; Long, Thomas C; Brown, James; Cote, Ila; Ross, Mary A; Dutton, Steven J
2017-08-01
To inform regulatory decisions on the risk due to exposure to ambient air pollution, consistent and transparent communication of the scientific evidence is essential. The United States Environmental Protection Agency (U.S. EPA) develops the Integrated Science Assessment (ISA), which contains evaluations of the policy-relevant science on the effects of criteria air pollutants and conveys critical science judgments to inform decisions on the National Ambient Air Quality Standards. This article discusses the approach and causal framework used in the ISAs to evaluate and integrate various lines of scientific evidence and draw conclusions about the causal nature of air pollution-induced health effects. The framework has been applied to diverse pollutants and cancer and noncancer effects. To demonstrate its flexibility, we provide examples of causality judgments on relationships between health effects and pollutant exposures, drawing from recent ISAs for ozone, lead, carbon monoxide, and oxides of nitrogen. U.S. EPA's causal framework has increased transparency by establishing a structured process for evaluating and integrating various lines of evidence and a uniform approach for determining causality. The framework brings consistency and specificity to the conclusions in the ISA, and the flexibility of the framework makes it relevant for evaluations of evidence across media and health effects. Published by Elsevier Inc.
Sather, Mike R; Parsons, Sherry; Boardman, Kathy D; Warren, Stuart R; Davis-Karim, Anne; Griffin, Kevin; Betterton, Jane A; Jones, Mark S; Johnson, Stanley H; Vertrees, Julia E; Hickey, Jan H; Salazar, Thelma P; Huang, Grant D
2018-03-01
This paper presents the quality journey taken by a Federal organization over more than 20 years. These efforts have resulted in the implementation of a Total Integrated Performance Excellence System (TIPES) that combines key principles and practices of established quality systems. The Center has progressively integrated quality system frameworks including the Malcolm Baldrige National Quality Award (MBNQA) Framework and Criteria for Performance Excellence, ISO 9001, and the Organizational Project Management Maturity Model (OPM3), as well as the supplemental quality systems of ISO 15378 (packaging for medicinal products) and ISO 21500 (guide to project management), to systematically improve all areas of operations. These frameworks were selected for their applicability to Center processes and systems, consistency and reinforcement of complementary approaches, and international acceptance. External validations include the MBNQA, the highest quality award in the US, continued registration and conformance to ISO standards and guidelines, and multiple VA and state awards. With a focus on a holistic approach to quality involving processes, systems and personnel, this paper presents activities and lessons that were critical to building TIPES and establishing the quality environment for conducting clinical research in support of Veterans and national health care.
Whole systems shared governance: a model for the integrated health system.
Evan, K; Aubry, K; Hawkins, M; Curley, T A; Porter-O'Grady, T
1995-05-01
The healthcare system is under renovation and renewal. In the process, roles and structures are shifting to support a subscriber-based continuum of care. Alliances and partnerships are emerging as the models of integration for the future. But how do we structure to support these emerging integrated partnerships? As the nurse executive expands the role and assumes increasing responsibility for creating new frameworks for care, a structure that sustains the point-of-care innovations and interdisciplinary relationships must be built. Whole systems models of organization, such as shared governance, are expanding as demand grows for a sustainable structure for horizontal and partnered systems of healthcare delivery. The executive will have to apply these newer frameworks to the delivery of care to provide adequate support for the clinically integrated environment.
Description of the U.S. Geological Survey Geo Data Portal data integration framework
Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Lucido, Jessica M.
2012-01-01
The U.S. Geological Survey has developed an open-standard data integration framework for working efficiently and effectively with large collections of climate and other geoscience data. A web interface accesses catalog datasets to find data services. Data resources can then be rendered for mapping and dataset metadata are derived directly from these web services. Algorithm configuration and information needed to retrieve data for processing are passed to a server where all large-volume data access and manipulation takes place. The data integration strategy described here was implemented by leveraging existing free and open source software. Details of the software used are omitted; rather, emphasis is placed on how open-standard web services and data encodings can be used in an architecture that integrates common geographic and atmospheric data.
Lai, Chin-Feng; Chen, Min; Pan, Jeng-Shyang; Youn, Chan-Hyun; Chao, Han-Chieh
2014-03-01
As cloud computing and wireless body sensor network technologies gradually mature, ubiquitous healthcare services can prevent accidents instantly and effectively, as well as provide relevant information to reduce related processing time and cost. This study proposes a co-processing intermediary framework integrating cloud computing and wireless body sensor networks, mainly applied to fall detection and 3-D motion reconstruction. The main focuses of this study include distributed computing and resource allocation for processing sensing data over the computing architecture, network conditions and performance evaluation. Through this framework, the transmission and computing time of sensing data are reduced, enhancing overall performance for fall event detection and 3-D motion reconstruction services.
Multiobjective optimization of temporal processes.
Song, Zhe; Kusiak, Andrew
2010-06-01
This paper presents a dynamic predictive-optimization framework of a nonlinear temporal process. Data-mining (DM) and evolutionary strategy algorithms are integrated in the framework for solving the optimization model. DM algorithms learn dynamic equations from the process data. An evolutionary strategy algorithm is then applied to solve the optimization problem guided by the knowledge extracted by the DM algorithm. The concept presented in this paper is illustrated with the data from a power plant, where the goal is to maximize the boiler efficiency and minimize the limestone consumption. This multiobjective optimization problem can be either transformed into a single-objective optimization problem through preference aggregation approaches or into a Pareto-optimal optimization problem. The computational results have shown the effectiveness of the proposed optimization framework.
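The two aggregation routes the paper mentions, Pareto-optimal filtering versus collapsing to a single objective through preference weights, can be sketched on made-up candidate solutions (the numbers below are illustrative, not plant data):

```python
# Toy two-objective setting: maximize boiler efficiency, minimize limestone
# consumption. Candidate operating points are hypothetical.

candidates = [
    {"eff": 0.90, "limestone": 5.0},
    {"eff": 0.88, "limestone": 3.0},
    {"eff": 0.85, "limestone": 3.5},  # dominated by the second candidate
]

def dominates(a, b):
    """a dominates b: no worse on both objectives, strictly better on one."""
    no_worse = a["eff"] >= b["eff"] and a["limestone"] <= b["limestone"]
    strictly = a["eff"] > b["eff"] or a["limestone"] < b["limestone"]
    return no_worse and strictly

# Route 1: keep the Pareto-optimal set (no candidate dominates them).
pareto = [c for c in candidates
          if not any(dominates(o, c) for o in candidates if o is not c)]

# Route 2: preference aggregation into a single objective.
def scalarized(c, w_eff=0.5, w_lime=0.5):
    return w_eff * c["eff"] - w_lime * c["limestone"]

best = max(candidates, key=scalarized)
```

In the paper's framework, an evolutionary strategy would search the space of control settings while data-mined process models predict `eff` and `limestone` for each candidate.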
ERIC Educational Resources Information Center
Velliaris, Donna M.; Breen, Paul
2016-01-01
In this paper, the authors explore a holistic three-stage framework currently used by the Eynesbury Institute of Business and Technology (EIBT), focused on academic staff identification and remediation processes for the prevention of (un)intentional student plagiarism. As a pre-university pathway provider--whose student body is 98%…
NASA Technical Reports Server (NTRS)
Miller, R. E., Jr.; Hansen, S. D.; Redhed, D. D.; Southall, J. W.; Kawaguchi, A. S.
1974-01-01
Evaluation of the cost-effectiveness of integrated analysis/design systems, with particular attention to the Integrated Program for Aerospace-Vehicle Design (IPAD) project. An analysis of all the ingredients of IPAD indicates the feasibility of a significant cost and flowtime reduction in the product design process involved. It is also concluded that an IPAD-supported design process will provide a framework for configuration control, whereby the engineering costs for design, analysis and testing can be controlled during the air vehicle development cycle.
Understanding the Role of Numeracy in Health: Proposed Theoretical Framework and Practical Insights
Lipkus, Isaac M.; Peters, Ellen
2009-01-01
Numeracy, that is, how facile people are with mathematical concepts and their applications, is gaining importance in medical decision making and risk communication. This paper proposes six critical functions of health numeracy. These functions are integrated into a theoretical framework on health numeracy that has implications for risk-communication and medical-decision-making processes. We examine practical underpinnings for targeted interventions aimed at improving such processes as a function of health numeracy. It is hoped that the proposed functions and theoretical framework will spur more research to determine how an understanding of health numeracy can lead to more effective communication and decision outcomes. PMID:19834054
How to practice person-centred care: A conceptual framework.
Santana, Maria J; Manalili, Kimberly; Jolley, Rachel J; Zelinsky, Sandra; Quan, Hude; Lu, Mingshan
2018-04-01
Globally, health-care systems and organizations are looking to improve health system performance through the implementation of a person-centred care (PCC) model. While numerous conceptual frameworks for PCC exist, a gap remains in practical guidance on PCC implementation. Based on a narrative review of the PCC literature, a generic conceptual framework was developed in collaboration with a patient partner, which synthesizes evidence, recommendations and best practice from existing frameworks and implementation case studies. The Donabedian model for health-care improvement was used to classify PCC domains into the categories of "Structure," "Process" and "Outcome" for health-care quality improvement. The framework emphasizes the structural domain, which relates to the health-care system or context in which care is delivered, providing the foundation for PCC, and influencing the processes and outcomes of care. Structural domains identified include: the creation of a PCC culture across the continuum of care; co-designing educational programs, as well as health promotion and prevention programs with patients; providing a supportive and accommodating environment; and developing and integrating structures to support health information technology and to measure and monitor PCC performance. Process domains describe the importance of cultivating communication and respectful and compassionate care; engaging patients in managing their care; and integration of care. Outcome domains identified include: access to care and Patient-Reported Outcomes. This conceptual framework provides a step-wise roadmap to guide health-care systems and organizations in the provision of PCC across various health-care sectors. © 2017 The Authors Health Expectations published by John Wiley & Sons Ltd.
Morse, Wayde C; Hall, Troy E; Kruger, Linda E
2009-03-01
In this article, we examine how issues of scale affect the integration of recreation management with the management of other natural resources on public lands. We present two theories used to address scale issues in ecology and explore how they can improve the two most widely applied recreation-planning frameworks. The theory of patch dynamics and hierarchy theory are applied to the recreation opportunity spectrum (ROS) and the limits of acceptable change (LAC) recreation-planning frameworks. These frameworks have been widely adopted internationally, and improving their ability to integrate with other aspects of natural resource management has significant social and conservation implications. We propose that incorporating ecologic criteria and scale concepts into these recreation-planning frameworks will improve the foundation for integrated land management by resolving issues of incongruent boundaries, mismatched scales, and multiple-scale analysis. Specifically, we argue that whereas the spatially explicit process of the ROS facilitates integrated decision making, its lack of ecologic criteria, broad extent, and large patch size decrease its usefulness for integration at finer scales. The LAC provides explicit considerations for weighing competing values, but measurement of recreation disturbances within an LAC analysis is often done at too fine a grain and at too narrow an extent for integration with other recreation and resource concerns. We suggest that planners should perform analysis at multiple scales when making management decisions that involve trade-offs among competing values. The United States Forest Service is used as an example to discuss how resource-management agencies can improve this integration.
Nanosurveyor: a framework for real-time data processing
Daurer, Benedikt J.; Krishnan, Hari; Perciano, Talita; ...
2017-01-31
Background: The ever-improving brightness of accelerator-based sources is enabling novel observations and discoveries with faster frame rates, larger fields of view, higher resolution, and higher dimensionality. Results: Here we present an integrated software/algorithmic framework designed to capitalize on high-throughput experiments through efficient kernels and load-balanced workflows that are scalable in design. We describe the streamlined processing pipeline of ptychography data analysis. Conclusions: The pipeline provides throughput, compression, and resolution, as well as rapid feedback to the microscope operators.
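The load-balanced streaming design described above can be caricatured in a few lines of Python: a shared work queue distributes incoming "frames" across workers, which is the essence of a scalable processing pipeline. This is a toy sketch under stated assumptions, not the Nanosurveyor API; squaring stands in for per-frame reconstruction:

```python
import queue
import threading

def run_pipeline(frames, n_workers=3):
    """Toy streaming pipeline: a shared queue load-balances frames
    across workers, mimicking a scalable data-analysis workflow."""
    tasks, results = queue.Queue(), queue.Queue()
    for f in frames:
        tasks.put(f)

    def worker():
        while True:
            try:
                f = tasks.get_nowait()
            except queue.Empty:
                return  # queue drained, worker exits
            results.put(f * f)  # stand-in for per-frame reconstruction

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(results.get() for _ in range(results.qsize()))

out = run_pipeline(range(5))
```

A real pipeline would stream frames in as they arrive and feed results back to the operators; pre-filling the queue keeps the sketch deterministic.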
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2013-12-01
Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. 
Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
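The ontology-plus-SPARQL arrangement can be illustrated with a toy in-memory triple store. The process names and equation strings below are hypothetical stand-ins for HP-ontology terms, and the pattern matcher mimics a SPARQL basic graph pattern rather than using an RDF library:

```python
# Hypothetical triples in the spirit of the HP ontology: process
# definitions, method relations, and associated equations.
triples = [
    ("Infiltration", "isA", "HydrologicProcess"),
    ("GreenAmpt", "isMethodOf", "Infiltration"),
    ("Philip", "isMethodOf", "Infiltration"),
    ("GreenAmpt", "hasEquation", "f = K*(1 + psi*delta_theta/F)"),
    ("Philip", "hasEquation", "F = S*t**0.5 + K*t"),
]

def match(pattern, store):
    """Match a (subject, predicate, object) pattern against the store,
    with None acting as a variable, like a SPARQL basic graph pattern."""
    s, p, o = pattern
    return [t for t in store
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Which methods implement the infiltration process?"
methods = [s for s, _, _ in match((None, "isMethodOf", "Infiltration"), triples)]
```

In practice the same query would be expressed in SPARQL against the published service; the point here is only how a standardized vocabulary lets components discover processes and their equations mechanically.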
NASA Astrophysics Data System (ADS)
Urfianto, Mohammad Zalfany; Isshiki, Tsuyoshi; Khan, Arif Ullah; Li, Dongju; Kunieda, Hiroaki
This paper presents a Multiprocessor System-on-Chip (MPSoC) architecture used as an execution platform for the new C-language-based MPSoC design framework we are currently developing. The MPSoC architecture is based on an existing SoC platform with a commercial RISC core acting as the host CPU. We extend the existing SoC with a multiprocessor-array block that is used as the main engine to run parallel applications modeled in our design framework. Utilizing several optimizations provided by our compiler, efficient inter-communication between processing elements with minimum overhead is implemented. A host interface is designed to integrate the existing RISC core with the multiprocessor-array. The experimental results show that an effective integration is achieved, proving that the designed communication module can be used to efficiently incorporate off-the-shelf processors as processing elements for MPSoC architectures designed using our framework.
How Does Sexual Minority Stigma “Get Under the Skin”? A Psychological Mediation Framework
Hatzenbuehler, Mark L.
2009-01-01
Sexual minorities are at increased risk for multiple mental health burdens compared to heterosexuals. The field has identified two distinct determinants of this risk, including group-specific minority stressors and general psychological processes that are common across sexual orientations. The goal of the present paper is to develop a theoretical framework that integrates the important insights from these literatures. The framework postulates that (a) sexual minorities confront increased stress exposure resulting from stigma; (b) this stigma-related stress creates elevations in general emotion dysregulation, social/interpersonal problems, and cognitive processes conferring risk for psychopathology; and (c) these processes in turn mediate the relationship between stigma-related stress and psychopathology. It is argued that this framework can, theoretically, illuminate how stigma adversely affects mental health and, practically, inform clinical interventions. Evidence for the predictive validity of this framework is reviewed, with particular attention paid to illustrative examples from research on depression, anxiety, and alcohol use disorders. PMID:19702379
Liao, Chen; Seo, Seung-Oh; Celik, Venhar; Liu, Huaiwei; Kong, Wentao; Wang, Yi; Blaschek, Hans; Jin, Yong-Su; Lu, Ting
2015-07-07
Microbial metabolism involves complex, system-level processes implemented via the orchestration of metabolic reactions, gene regulation, and environmental cues. One canonical example of such processes is acetone-butanol-ethanol (ABE) fermentation by Clostridium acetobutylicum, during which cells convert carbon sources to organic acids that are later reassimilated to produce solvents as a strategy for cellular survival. The complexity and systems nature of the process have been largely underappreciated, rendering challenges in understanding and optimizing solvent production. Here, we present a system-level computational framework for ABE fermentation that combines metabolic reactions, gene regulation, and environmental cues. We developed the framework by decomposing the entire system into three modules, building each module separately, and then assembling them back into an integrated system. During the model construction, a bottom-up approach was used to link molecular events at the single-cell level into the events at the population level. The integrated model was able to successfully reproduce ABE fermentations of the WT C. acetobutylicum (ATCC 824), as well as its mutants, using data obtained from our own experiments and from literature. Furthermore, the model confers successful predictions of the fermentations with various network perturbations across metabolic, genetic, and environmental aspects. From foundation to applications, the framework advances our understanding of complex clostridial metabolism and physiology and also facilitates the development of systems engineering strategies for the production of advanced biofuels.
Zhou, Ronggang; Chan, Alan H S
2017-01-01
In order to compare existing usability data to ideal goals or to data for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. This paper presents a universal method of usability evaluation by combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed by using the fuzzy comprehensive evaluation technique model to characterize fuzzy human judgments. Then with the use of AHP, the weights of usability components were elicited from these experts. Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process.
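The combination of AHP weights with fuzzy comprehensive evaluation can be sketched numerically. The weights and membership degrees below are invented for illustration (a real study would elicit both from an expert panel), and the weighted-sum aggregation is one common choice of fuzzy operator:

```python
def fuzzy_comprehensive(weights, memberships):
    """Fuzzy comprehensive evaluation: aggregate membership degrees
    B = w * R across components, then normalize over grades."""
    grades = len(memberships[0])
    b = [sum(w * row[g] for w, row in zip(weights, memberships))
         for g in range(grades)]
    total = sum(b)
    return [x / total for x in b]

# Hypothetical AHP-derived weights: effectiveness, efficiency, satisfaction.
w = [0.5, 0.3, 0.2]
# Hypothetical expert membership degrees over grades (good, fair, poor),
# one row per usability component.
R = [[0.6, 0.3, 0.1],
     [0.5, 0.4, 0.1],
     [0.7, 0.2, 0.1]]
b = fuzzy_comprehensive(w, R)
verdict = ["good", "fair", "poor"][b.index(max(b))]
```

The resulting vector `b` keeps the spread of expert opinion visible rather than collapsing it to a single crisp score; the final grade is read off as the grade with the largest aggregated membership.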
Integrating complex business processes for knowledge-driven clinical decision support systems.
Kamaleswaran, Rishikesan; McGregor, Carolyn
2012-01-01
This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.
Espallargues, Mireia; Serra-Sutton, Vicky; Solans-Domènech, Maite; Torrente, Elena; Moharra, Montse; Benítez, Dolors; Robles, Noemí; Domíngo, Laia; Escarrabill Sanglas, Joan
2016-07-07
The aim was to develop a conceptual framework for the assessment of new healthcare initiatives on chronic diseases within the Spanish National Health System. A comprehensive literature review between 2002 and 2013, including systematic reviews, meta-analyses, and reports with evaluation frameworks and/or assessments of initiatives, was carried out; integrated care initiatives established in Catalonia were studied and described; and semistructured interviews with key stakeholders were performed. The scope and conceptual framework were defined by using the brainstorming approach. Of 910 abstracts identified, a total of 116 studies were included. They referred to several conceptual frameworks and/or assessment indicators at national and international levels. A total of 24 established chronic care initiatives were identified (9 integrated care initiatives); 10 in-depth interviews were carried out. The proposed conceptual framework envisages: 1) the target population according to complexity levels; 2) an evaluation approach covering structure, processes, and outcomes, considering the health status achieved, the recovery process and the maintenance of health; and 3) the dimensions or attributes to be assessed. The proposed conceptual framework has been useful for developing indicators and implementing them with a community-based, result-oriented approach and a territorial or population-based perspective within the Spanish Health System. This will be essential for determining which strategies are most effective, which key elements determine greater success, and which groups of patients can benefit most.
Master of Puppets: Cooperative Multitasking for In Situ Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukic, Zarija
2016-01-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
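The coroutine-based cooperative multitasking described above can be caricatured with Python generators: the simulation yields control (and its in-memory data) to the analysis instead of writing files, which is the essence of in situ processing. This is a toy sketch, not Henson's actual API:

```python
def simulation(n_steps):
    """Toy simulation: yields each time step's data instead of
    writing it to storage, handing control to the consumer."""
    state = 0
    for step in range(n_steps):
        state += step          # stand-in for advancing the physics
        yield step, state      # cooperative hand-off to analysis

def analyze(stream, log):
    """In situ analysis: consumes data while it is still 'in memory'."""
    for step, state in stream:
        log.append((step, state * 2))  # stand-in post-processing

log = []
analyze(simulation(4), log)
```

The same analysis function could equally consume data read back from disk, which mirrors the paper's point that one executable can both post-process stored output and process it on the fly.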
Kukona, Anuenue; Tabor, Whitney
2011-01-01
The visual world paradigm presents listeners with a challenging problem: they must integrate two disparate signals, the spoken language and the visual context, in support of action (e.g., complex movements of the eyes across a scene). We present Impulse Processing, a dynamical systems approach to incremental eye movements in the visual world that suggests a framework for integrating language, vision, and action generally. Our approach assumes that impulses driven by the language and the visual context impinge minutely on a dynamical landscape of attractors corresponding to the potential eye-movement behaviors of the system. We test three unique predictions of our approach in an empirical study in the visual world paradigm, and describe an implementation in an artificial neural network. We discuss the Impulse Processing framework in relation to other models of the visual world paradigm. PMID:21609355
Improved ADM1 model for anaerobic digestion process considering physico-chemical reactions.
Zhang, Yang; Piccard, Sarah; Zhou, Wen
2015-11-01
The "Anaerobic Digestion Model No. 1" (ADM1) was modified in the study by improving the bio-chemical framework and integrating a more detailed physico-chemical framework. Inorganic carbon and nitrogen balance terms were introduced to resolve the discrepancies in the original bio-chemical framework between the carbon and nitrogen contents in the degraders and substrates. More inorganic components and solids precipitation processes were included in the physico-chemical framework of ADM1. The modified ADM1 was validated with the experimental data and used to investigate the effects of calcium ions, magnesium ions, inorganic phosphorus and inorganic nitrogen on anaerobic digestion in batch reactor. It was found that the entire anaerobic digestion process might exist an optimal initial concentration of inorganic nitrogen for methane gas production in the presence of calcium ions, magnesium ions and inorganic phosphorus. Copyright © 2015 Elsevier Ltd. All rights reserved.
Multi-tasking arbitration and behaviour design for human-interactive robots
NASA Astrophysics Data System (ADS)
Kobayashi, Yuichi; Onishi, Masaki; Hosoe, Shigeyuki; Luo, Zhiwei
2013-05-01
Robots that interact with humans in household environments are required to handle multiple real-time tasks simultaneously, such as carrying objects, collision avoidance and conversation with humans. This article presents a design framework for the control and recognition processes needed to meet these requirements, taking into account stochastic human behaviour. The proposed design method first introduces a Petri net for synchronisation of multiple tasks. The Petri net formulation is converted to Markov decision processes and processed in an optimal control framework. Three tasks (safety confirmation, object conveyance and conversation) interact and are expressed by the Petri net. Using the proposed framework, tasks that normally tend to be designed by integrating many if-then rules can be designed in a systematic manner in a state-estimation and optimisation framework from the viewpoint of shortest-time optimal control. The proposed arbitration method was verified by simulations and experiments using RI-MAN, which was developed for interactive tasks with humans.
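The Petri-net-to-MDP step can be illustrated with a tiny decision problem solved by value iteration for minimum expected time-to-goal. The states, actions, transition probabilities and costs below are invented for illustration; a real marking graph derived from the three-task Petri net would be much larger:

```python
def value_iteration(states, actions, trans, cost, iters=200):
    """Minimum expected time-to-goal for a small Markov decision
    process, e.g. one derived from a Petri net's reachable markings."""
    V = {s: 0.0 for s in states}
    for _ in range(iters):
        for s in states:
            if not actions[s]:
                continue  # absorbing goal state, value stays 0
            V[s] = min(
                cost[(s, a)] + sum(p * V[s2]
                                   for s2, p in trans[(s, a)].items())
                for a in actions[s])
    return V

# Toy model: from "idle" the robot can convey an object (fast, but it
# may fail and stay idle) or take a slower deterministic route.
states = ["idle", "done"]
actions = {"idle": ["convey", "slow_route"], "done": []}
trans = {("idle", "convey"): {"done": 0.8, "idle": 0.2},
         ("idle", "slow_route"): {"done": 1.0}}
cost = {("idle", "convey"): 1.0, ("idle", "slow_route"): 2.0}
V = value_iteration(states, actions, trans, cost)
```

Here the stochastic "convey" action wins (expected time 1.25 versus 2.0), showing how the optimization framework arbitrates between task behaviours instead of hand-written if-then rules.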
Two frameworks for integrating knowledge in induction
NASA Technical Reports Server (NTRS)
Rosenbloom, Paul S.; Hirsh, Haym; Cohen, William W.; Smith, Benjamin D.
1994-01-01
The use of knowledge in inductive learning is critical for improving the quality of the concept definitions generated, reducing the number of examples required in order to learn effective concept definitions, and reducing the computation needed to find good concept definitions. Relevant knowledge may come in many forms (such as examples, descriptions, advice, and constraints) and from many sources (such as books, teachers, databases, and scientific instruments). How to extract the relevant knowledge from this plethora of possibilities, and then to integrate it together so as to appropriately affect the induction process is perhaps the key issue at this point in inductive learning. Here the focus is on the integration part of this problem; that is, how induction algorithms can, and do, utilize a range of extracted knowledge. Preliminary work on a transformational framework for defining knowledge-intensive inductive algorithms out of relatively knowledge-free algorithms is described, as is a more tentative problems-space framework that attempts to cover all induction algorithms within a single general approach. These frameworks help to organize what is known about current knowledge-intensive induction algorithms, and to point towards new algorithms.
Harnagea, Hermina; Lamothe, Lise; Couturier, Yves; Esfandiari, Shahrokh; Voyer, René; Charbonneau, Anne; Emami, Elham
2018-02-15
Despite its importance, the integration of oral health into primary care is still an emerging practice in the field of health care services. This scoping review aims to map the literature and provide a summary on the conceptual frameworks, policies and programs related to this concept. Using the Levac et al. six-stage framework, we performed a systematic search of electronic databases, organizational websites and grey literature from 1978 to April 2016. All relevant original publications with a focus on the integration of oral health into primary care were retrieved. Content analyses were performed to synthesize the results. From a total of 1619 citations, 67 publications were included in the review. Two conceptual frameworks were identified. Policies regarding oral health integration into primary care were mostly oriented toward a common risk factors approach and care coordination processes. In general, oral health integrated care programs were designed in the public health sector and based on partnerships with various private and public health organizations, governmental bodies and academic institutions. These programmes used various strategies to empower oral health integrated care, including building interdisciplinary networks, training non-dental care providers, oral health champion modelling, enabling care linkages and coordinated care processes, as well as the use of e-health technologies. The majority of studies on program outcomes were descriptive in nature without reporting long-term outcomes. This scoping review provided a comprehensive overview on the concept of integration of oral health in primary care. The findings identified major gaps in reported program outcomes, mainly because of the lack of related research. However, the results could be considered as a first step in the development of health care policies that support collaborative practices and patient-centred care in the field of the primary care sector.
Liyanage, H; Liaw, S-T; Di Iorio, C T; Kuziemsky, C; Schreiber, R; Terry, A L; de Lusignan, S
2016-11-10
Privacy, ethics, and data access issues pose significant challenges to the timely delivery of health research. Whilst the fundamental drivers to ensure that data access is ethical and satisfies privacy requirements are similar, they are often dealt with in varying ways by different approval processes. To achieve a consensus across an international panel of health care and informatics professionals on an integrated set of privacy and ethics principles that could accelerate health data access in data-driven health research projects, a three-round consensus development process was used. In round one, we developed a baseline framework for privacy, ethics, and data access based on a review of existing literature in the health, informatics, and policy domains. This was further developed using a two-round Delphi consensus building process involving 20 experts who were members of the International Medical Informatics Association (IMIA) and European Federation of Medical Informatics (EFMI) Primary Health Care Informatics Working Groups. To achieve consensus we required an extended Delphi process. The first round involved feedback on and development of the baseline framework. This consisted of four components: (1) ethical principles, (2) ethical guidance questions, (3) privacy and data access principles, and (4) privacy and data access guidance questions. Round two developed consensus in key areas of the revised framework, allowing the building of a new, more detailed and descriptive framework. In the final round, panel experts expressed their opinions, either as agreements or disagreements, on the ethics and privacy statements of the framework, finding some of the previous round's disagreements surprising in view of established ethical principles. This study develops a framework for an integrated approach to ethics and privacy. Privacy breach risk should not be considered in isolation but instead balanced against potential ethical benefit.
Miyoshi, Newton Shydeo Brandão; Pinheiro, Daniel Guariz; Silva, Wilson Araújo; Felipe, Joaquim Cezar
2013-06-06
The use of the knowledge produced by sciences to promote human health is the main goal of translational medicine. To make it feasible we need computational methods to handle the large amount of information that arises from bench to bedside and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however it lacks supporting representation of clinical and socio-demographic information. We have implemented an extension of Chado - the Clinical Module - to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: data level, to store the data; semantic level, to integrate and standardize the data by the use of ontologies; application level, to manage clinical databases, ontologies and data integration process; and web interface level, to allow interaction between the user and the system. The clinical module was built based on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from a factual clinical research database. Clinical and demographic data as well as biomaterial data were obtained from patients with tumors of head and neck. 
We implemented the IPTrans tool, a complete environment for data migration, which comprises: the construction of a model to describe the legacy clinical data, based on an ontology; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as other applications. Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information and are not integrated with existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information coming from different "omics" technologies with patients' clinical and socio-demographic data. This framework should present some features: flexibility, compression and robustness. The experiments carried out on a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
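The Entity-Attribute-Value layout underlying the Clinical Module can be sketched with SQLite. The table and attribute names here are invented for illustration, not Chado's actual schema; the point is that one generic table stores heterogeneous clinical and socio-demographic facts without schema changes:

```python
import sqlite3

# Minimal EAV sketch: one row per (patient, attribute, value) fact,
# so adding a new clinical attribute needs no ALTER TABLE.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE clinical_eav (
    entity_id  INTEGER,   -- the patient
    attribute  TEXT,      -- ideally a term from a reference ontology
    value      TEXT)""")
rows = [(1, "diagnosis", "head and neck tumor"),
        (1, "age", "57"),
        (2, "diagnosis", "head and neck tumor"),
        (2, "smoker", "yes")]
con.executemany("INSERT INTO clinical_eav VALUES (?, ?, ?)", rows)

# Pivot one entity back into a record, the typical EAV read pattern.
patient1 = dict(con.execute(
    "SELECT attribute, value FROM clinical_eav WHERE entity_id = 1"))
```

Binding the `attribute` column to ontology terms, as the framework's semantic level does, is what keeps such a flexible store integrable rather than a bag of free-text keys.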
DOE Office of Scientific and Technical Information (OSTI.GOV)
Badwan, Faris M.; Demuth, Scott F
Department of Energy’s Office of Nuclear Energy, Fuel Cycle Research and Development develops options to the current commercial fuel cycle management strategy to enable the safe, secure, economic, and sustainable expansion of nuclear energy while minimizing proliferation risks, by conducting research and development focused on used nuclear fuel recycling and waste management to meet U.S. needs. Used nuclear fuel is currently stored onsite in either wet pools or in dry storage systems, with disposal envisioned in an interim storage facility and, ultimately, in a deep-mined geologic repository. The safe management and disposition of used nuclear fuel and/or nuclear waste is a fundamental aspect of any nuclear fuel cycle. Integrating safety, security, and safeguards (3Ss) fully in the early stages of the design process for a new nuclear facility has the potential to effectively minimize safety, proliferation, and security risks. The 3Ss integration framework could become the new national and international norm and the standard process for designing future nuclear facilities. The purpose of this report is to develop a framework for integrating the safety, security and safeguards concept into the design of a Used Nuclear Fuel Storage Facility (UNFSF). The primary focus is on integration of safeguards and security into the UNFSF based on the existing Nuclear Regulatory Commission (NRC) approach to addressing the safety/security interface (10 CFR 73.58 and Regulatory Guide 5.73) for nuclear power plants. The methodology used for adaptation of the NRC safety/security interface will be used as the basis for development of the safeguards/security interface and later as the basis for development of the safety/safeguards interface. This will complete the integration cycle of safety, security, and safeguards.
The overall methodology for integration of the 3Ss is proposed, but only the integration of safeguards and security is applied to the design of the UNFSF. The framework for integrating safeguards and security into the UNFSF includes 1) identification of applicable regulatory requirements, 2) selection of a common system that shares dual safeguards and security functions, 3) development of functional design criteria and design requirements for the selected system, 4) identification and integration of the dual safeguards and security design requirements, and 5) assessment of the integration and its potential benefit.
Near real-time, on-the-move software PED using VPEF
NASA Astrophysics Data System (ADS)
Green, Kevin; Geyer, Chris; Burnette, Chris; Agarwal, Sanjeev; Swett, Bruce; Phan, Chung; Deterline, Diane
2015-05-01
The scope of the Micro-Cloud for Operational, Vehicle-Based EO-IR Reconnaissance System (MOVERS) development effort, managed by the Night Vision and Electronic Sensors Directorate (NVESD), is to develop, integrate, and demonstrate new sensor technologies and algorithms that improve improvised device/mine detection using efficient and effective exploitation and fusion of sensor data and target cues from existing and future Route Clearance Package (RCP) sensor systems. Unfortunately, the majority of forward looking Full Motion Video (FMV) and computer vision processing, exploitation, and dissemination (PED) algorithms are often developed using proprietary, incompatible software. This makes the insertion of new algorithms difficult due to the lack of standardized processing chains. In order to overcome these limitations, EOIR developed the Government off-the-shelf (GOTS) Video Processing and Exploitation Framework (VPEF) to be able to provide standardized interfaces (e.g., input/output video formats, sensor metadata, and detected objects) for exploitation software and to rapidly integrate and test computer vision algorithms. EOIR developed a vehicle-based computing framework within the MOVERS and integrated it with VPEF. VPEF was further enhanced for automated processing, detection, and publishing of detections in near real-time, thus improving the efficiency and effectiveness of RCP sensor systems.
An integrative neuroscience model of "significance" processing.
Williams, Leanne M
2006-03-01
The Gordon [37-40] framework of Integrative Neuroscience is used to develop a continuum model for understanding the central role of motivationally-determined "significance" in organizing human information processing. Significance is defined as the property which gives a stimulus relevance to our core motivation to minimize danger and maximize pleasure. Within this framework, the areas of cognition and emotion, theories of motivational arousal and orienting, and the current understanding of neural systems are brought together. The basis of integration is a temporal continuum in which significance processing extends from the most rapid millisecond time scale of automatic, nonconscious mechanisms to the time scale of seconds, in which memory is shaped, to the controlled and conscious mechanisms unfolding over minutes. Over this continuum, significant stimuli are associated with a spectrum of defensive (or consumptive) behaviors through to volitional regulatory behaviors for danger (versus pleasure) and associated brainstem, limbic, medial forebrain bundle and prefrontal circuits, all of which reflect a balance of excitatory (predominant at rapid time scales) to inhibitory mechanisms. Across the lifespan, the negative and positive outcomes of significance processing, coupled with constitutional and genetic factors, will contribute to plasticity, shaping individual adaptations and maladaptions in the balance of excitatory-inhibitory mechanisms.
Integration of a Self-Coherence Algorithm into DISAT for Forced Oscillation Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Follum, James D.; Tuffner, Francis K.; Amidan, Brett G.
2015-03-03
With the increasing number of phasor measurement units (PMUs) on the power system, behaviors typically not observable on the power system are becoming more apparent. Oscillatory behavior, notably forced oscillations, is one such behavior. However, the large amount of data coming from the PMUs makes manually detecting and locating these oscillations difficult. To automate portions of the process, an oscillation detection routine was coded into the Data Integrity and Situational Awareness Tool (DISAT) framework. Integration into the DISAT framework allows forced oscillations to be detected and information about the event provided to operational engineers. The oscillation detection algorithm integrates with the data handling and atypical data detection capabilities of DISAT, building off of a standard library of functions. This report details that integration, with information on the algorithm, some implementation issues, and some sample results from the western United States' power grid.
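For illustration, forced oscillations appear in PMU data as narrowband spectral peaks well above the broadband noise floor. The sketch below is a crude stand-in for the routine described (the report's actual algorithm is self-coherence based); the PMU rate, threshold, and test signal are all assumed:

```python
import numpy as np

def detect_forced_oscillation(signal, fs, snr_db=10.0):
    """Return the frequency of the strongest narrowband peak if its power
    exceeds the median (broadband) spectrum level by `snr_db` decibels,
    else None. Illustrative detector, not DISAT's self-coherence algorithm."""
    spec = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    floor = np.median(spec[1:])           # robust broadband noise estimate
    peak = np.argmax(spec[1:]) + 1        # skip the DC bin
    if 10 * np.log10(spec[peak] / floor) > snr_db:
        return freqs[peak]
    return None

fs = 30.0                                 # assumed PMU reporting rate, samples/s
t = np.arange(0, 60, 1 / fs)              # one minute of data
rng = np.random.default_rng(0)
pmu = 0.1 * rng.standard_normal(t.size) + np.sin(2 * np.pi * 1.5 * t)
f_detected = detect_forced_oscillation(pmu, fs)
```

Here a 1.5 Hz forced oscillation buried in measurement noise is recovered; a production detector would also have to localize the source, which is the harder problem the report discusses.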
Continuous integration for concurrent MOOSE framework and application development on GitHub
Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.; ...
2015-11-20
For the past several years, Idaho National Laboratory's MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repositories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software, vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors to the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project's development roadmap. Several specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process, are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.
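The cascading build process mentioned above amounts to rebuilding the framework first and then every application that depends on it, in dependency order. A minimal sketch, with an illustrative repository graph that is not MOOSE's actual project layout:

```python
# Hypothetical dependency graph: each application repo builds against the
# framework; a coupled app builds against two other apps.
DEPS = {
    "framework": [],
    "heat_app": ["framework"],
    "fuel_app": ["framework"],
    "coupled_app": ["heat_app", "fuel_app"],
}

def cascade_order(targets, deps):
    """Topologically order the repos so every dependency builds (and its
    regression tests run) before anything that depends on it."""
    order, seen = [], set()

    def visit(repo):
        if repo in seen:
            return
        seen.add(repo)
        for dep in deps[repo]:
            visit(dep)
        order.append(repo)

    for target in targets:
        visit(target)
    return order

build_order = cascade_order(["coupled_app"], DEPS)
```

A framework commit thus triggers rebuilds of every downstream application, which is how breakage introduced by a framework change surfaces before the change is merged.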
A Metacommunity Framework for Enhancing the Effectiveness of Biological Monitoring Strategies
Roque, Fabio O.; Cottenie, Karl
2012-01-01
Because of inadequate knowledge and funding, the use of biodiversity indicators is often suggested as a way to support management decisions. Consequently, many studies have analyzed the performance of certain groups as indicator taxa. However, in addition to knowing whether certain groups can adequately represent the biodiversity as a whole, we must also know whether they show similar responses to the main structuring processes affecting biodiversity. Here we present an application of the metacommunity framework for evaluating the effectiveness of biodiversity indicators. Although the metacommunity framework has contributed to a better understanding of biodiversity patterns, there is still limited discussion about its implications for conservation and biomonitoring. We evaluated the effectiveness of indicator taxa in representing spatial variation in macroinvertebrate community composition in Atlantic Forest streams, and the processes that drive this variation. We focused on analyzing whether some groups conform to environmental processes and other groups are more influenced by spatial processes, and on how this can help in deciding which indicator group or groups should be used. We showed that a relatively small subset of taxa from the metacommunity would represent 80% of the variation in community composition shown by the entire metacommunity. Moreover, this subset does not have to be composed of predetermined taxonomic groups, but rather can be defined based on random subsets. We also found that some random subsets composed of a small number of genera performed better in responding to major environmental gradients. There were also random subsets that seemed to be affected by spatial processes, which could indicate important historical processes. 
We were able to integrate, within the same theoretical and practical framework, the selection of biodiversity surrogates, indicators of environmental conditions and, more importantly, an explicit incorporation of environmental and spatial processes into the selection approach. PMID:22937068
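The random-subset evaluation can be sketched on simulated data. Everything below (community sizes, the shared environmental gradient, and the Mantel-style concordance score) is an illustrative stand-in for the paper's Atlantic Forest analysis:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated metacommunity: 30 sites x 40 genera, with abundances driven by a
# shared environmental gradient plus noise (clipped at zero).
gradient = np.linspace(0, 1, 30)[:, None]
responses = rng.uniform(-2, 2, (1, 40))
community = np.clip(gradient * responses + 0.3 * rng.standard_normal((30, 40)),
                    0, None)

def pairwise_dist(mat):
    """Euclidean distance matrix between the rows (sites) of `mat`."""
    diff = mat[:, None, :] - mat[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def subset_concordance(full, cols):
    """Mantel-style score: correlation between the site-distance matrices of
    the full community and of a taxon subset."""
    iu = np.triu_indices(full.shape[0], k=1)
    return np.corrcoef(pairwise_dist(full)[iu],
                       pairwise_dist(full[:, cols])[iu])[0, 1]

# Score 200 random subsets of 8 genera and keep the best.
scores = [subset_concordance(community, rng.choice(40, 8, replace=False))
          for _ in range(200)]
best = max(scores)
```

Even small random subsets can track most of the distance structure of the full community, which mirrors the paper's finding that indicator sets need not be predetermined taxonomic groups.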
Renn, Jürgen
2015-01-01
This paper introduces a conceptual framework for the evolution of complex systems based on the integration of regulatory network and niche construction theories. It is designed to apply equally to cases of biological, social and cultural evolution. Within the conceptual framework we focus especially on the transformation of complex networks through the linked processes of externalization and internalization of causal factors between regulatory networks and their corresponding niches and argue that these are an important part of evolutionary explanations. This conceptual framework extends previous evolutionary models and focuses on several challenges, such as the path-dependent nature of evolutionary change, the dynamics of evolutionary innovation and the expansion of inheritance systems. J. Exp. Zool. (Mol. Dev. Evol.) 324B: 565–577, 2015. © 2015 The Authors. Journal of Experimental Zoology Part B: Molecular and Developmental Evolution published by Wiley Periodicals, Inc. PMID:26097188
Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework
NASA Astrophysics Data System (ADS)
Wang, C.; Hu, F.; Sha, D.; Han, X.
2017-10-01
Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experimental results show that the proposed framework can efficiently manage and process big LiDAR data.
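The MapReduce pattern applied to point clouds can be shown in miniature. The two phases below are emulated in-process with toy points and an assumed spatial tiling key; in the proposed framework the points would live in HDFS and the phases would run as Hadoop jobs invoking PCL routines:

```python
from collections import defaultdict

# Toy point cloud: (x, y, z) triples.
points = [(0.2, 0.1, 5.0), (0.8, 0.3, 6.0), (1.4, 0.2, 9.0), (1.6, 1.7, 7.0)]
TILE = 1.0  # tile edge length used as the shuffle key (assumed)

def map_phase(pts):
    """Map: emit (tile_key, point) so spatially nearby points shuffle to the
    same reducer."""
    for x, y, z in pts:
        yield (int(x // TILE), int(y // TILE)), (x, y, z)

def reduce_phase(pairs):
    """Reduce: per-tile aggregate; here, point count and mean elevation
    (a real job would run a PCL algorithm on each tile instead)."""
    tiles = defaultdict(list)
    for key, pt in pairs:
        tiles[key].append(pt[2])
    return {k: (len(v), sum(v) / len(v)) for k, v in tiles.items()}

tile_stats = reduce_phase(map_phase(points))
```

Keying the shuffle on spatial tiles is what lets each reducer work on a contiguous patch of the cloud independently, which is the source of the parallelism.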
Enterprise and system of systems capability development life-cycle processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, David Franklin
2014-08-01
This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest in, and support for, moving from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added by adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architecture Framework (DoDAF), it is suitable for use with other architectural description frameworks.
Measuring the Impact of Data Mining on Churn Management.
ERIC Educational Resources Information Center
Lejeune, Miguel A. P. M.
2001-01-01
Churn management is a concern for businesses, particularly in the digital economy. A customer relationship framework is proposed to help deal with churn issues. The model integrates the electronic channel and involves four tools for enhancing data collection, data treatment, data analysis and data integration in the decision-making process.…
ERIC Educational Resources Information Center
Khuong, Cam Thi Hong
2016-01-01
This paper addresses the work-integrated learning (WIL) initiative embedded in selected tourism training programs in Vietnam. The research was grounded on the framework of stakeholder ethos. Drawing on tourism training curriculum analysis and interviews with lecturers, institutional leaders, industry managers and internship supervisors, this study…
Integration of Wireless Technologies in Smart University Campus Environment: Framework Architecture
ERIC Educational Resources Information Center
Khamayseh, Yaser; Mardini, Wail; Aljawarneh, Shadi; Yassein, Muneer Bani
2015-01-01
In this paper, the authors are particularly interested in enhancing the education process by integrating new tools to the teaching environments. This enhancement is part of an emerging concept, called smart campus. Smart University Campus will come up with a new ubiquitous computing and communication field and change people's lives radically by…
A Framework for Mobile Apps in Colleges and Universities: Data Mining Perspective
ERIC Educational Resources Information Center
Singh, Archana; Ranjan, Jayanthi
2016-01-01
The Enterprise mobility communication technology provides easy and quick accessibility to data and information integrated into one single touch point device. This device incorporates or integrates all the processes into small applications or App and thus increases the workforce capability of knowledge workers. "App" which is a small set…
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2011 CFR
2011-10-01
... management program must consist, at a minimum, of a framework that describes the process for implementing... Transmission Pipeline Integrity Management § 192.907 What must an operator do to implement this subpart? (a... follow a written integrity management program that contains all the elements described in § 192.911 and...
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2012 CFR
2012-10-01
... management program must consist, at a minimum, of a framework that describes the process for implementing... Transmission Pipeline Integrity Management § 192.907 What must an operator do to implement this subpart? (a... follow a written integrity management program that contains all the elements described in § 192.911 and...
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2013 CFR
2013-10-01
... management program must consist, at a minimum, of a framework that describes the process for implementing... Transmission Pipeline Integrity Management § 192.907 What must an operator do to implement this subpart? (a... follow a written integrity management program that contains all the elements described in § 192.911 and...
49 CFR 192.907 - What must an operator do to implement this subpart?
Code of Federal Regulations, 2014 CFR
2014-10-01
... management program must consist, at a minimum, of a framework that describes the process for implementing... Transmission Pipeline Integrity Management § 192.907 What must an operator do to implement this subpart? (a... follow a written integrity management program that contains all the elements described in § 192.911 and...
Contandriopoulos, Damien; Brousselle, Astrid; Dubois, Carl-Ardy; Perroux, Mélanie; Beaulieu, Marie-Dominique; Brault, Isabelle; Kilpatrick, Kelley; D'Amour, Danielle; Sansgter-Gormley, Esther
2015-02-27
Integrating nurse practitioners into primary care teams is a process that involves significant challenges. To be successful, nurse practitioner integration into primary care teams requires, among other things, a redefinition of professional boundaries, in particular those of medicine and nursing, a coherent model of inter- and intra-professional collaboration, and team-based work processes that make the best use of the subsidiarity principle. There have been numerous studies on nurse practitioner integration, and the literature provides a comprehensive list of barriers to, and facilitators of, integration. However, this literature is much less prolific in discussing the operational-level implications of those barriers and facilitators and in offering practical recommendations. In the context of a large-scale research project on the introduction of nurse practitioners in Quebec (Canada), we relied on a logic-analysis approach based, on the one hand, on a realist review of the literature and, on the other hand, on qualitative case studies in six primary healthcare teams in rural and urban areas of Quebec. Five core themes that need to be taken into account when integrating nurse practitioners into primary care teams were identified: planning, role definition, practice model, collaboration, and team support. The present paper has two objectives: to present the methods used to develop the themes, and to discuss an integrative model of nurse practitioner integration support centered around these themes. It concludes with a discussion of how this framework contributes to existing knowledge and some ideas for future avenues of study.
Towards a Cloud Based Smart Traffic Management Framework
NASA Astrophysics Data System (ADS)
Rahimi, M. M.; Hakimpour, F.
2017-09-01
Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing and analysis of traffic big data, may hinder its efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud computing based framework answers the technical challenges of efficient, real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we used real OpenStreetMap (OSM) trajectories and a road network on a distributed environment. Our evaluation results indicate that the speed of importing data into this framework exceeds 8000 records per second when the dataset size approaches 5 million records. We also evaluated the performance of data retrieval in the proposed framework: the retrieval speed exceeds 15000 records per second at the same dataset size. Finally, we evaluated the scalability and performance of the framework by parallelising a critical pre-analysis common in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.
Marenco, Luis N.; Wang, Rixin; Bandrowski, Anita E.; Grethe, Jeffrey S.; Shepherd, Gordon M.; Miller, Perry L.
2014-01-01
This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF’s data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO’s current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation. PMID:25018728
A path integral approach to closed-form pricing formulas in the Heston framework.
NASA Astrophysics Data System (ADS)
Lemmens, Damiaan; Wouters, Michiel; Tempere, Jacques; Foulon, Sven
2008-03-01
We present a path integral approach for finding closed-form formulas for option prices in the framework of the Heston model. The first model for determining option prices was the Black-Scholes model, which assumed that the logreturn followed a Wiener process with a given drift and constant volatility. To provide a realistic description of the market, the Black-Scholes results must be extended to include stochastic volatility. This is achieved by the Heston model, which assumes that the volatility follows a mean-reverting square root process. Current applications of the Heston model are hampered by the unavailability of fast numerical methods, due to a lack of closed-form formulae. Therefore the search for closed-form solutions is an essential step before the qualitatively better stochastic volatility models can be used in practice. To attain this goal we outline a simplified path integral approach yielding straightforward results for vanilla Heston options with correlation. Extensions to barrier options and other path-dependent options are discussed, and the new derivation is compared to existing results obtained from alternative path integral approaches (Dragulescu, Kleinert).
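As a numerical counterpoint to the closed-form results discussed, the Heston dynamics can always be simulated directly. The Monte Carlo sketch below (log-Euler scheme with full truncation of the variance; parameters are illustrative and not taken from the paper) is exactly the kind of slow method that closed-form formulas replace:

```python
import numpy as np

def heston_mc_call(S0, K, T, r, v0, kappa, theta, sigma, rho,
                   n_paths=20000, n_steps=200, seed=0):
    """Monte Carlo price of a vanilla call under Heston dynamics:
    dS = r S dt + sqrt(v) S dW1,  dv = kappa (theta - v) dt + sigma sqrt(v) dW2,
    with corr(dW1, dW2) = rho. Euler scheme, variance floored at zero."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    S = np.full(n_paths, float(S0))
    v = np.full(n_paths, float(v0))
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)                       # full truncation
        S *= np.exp((r - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
        v += kappa * (theta - vp) * dt + sigma * np.sqrt(vp * dt) * z2
    return np.exp(-r * T) * np.maximum(S - K, 0.0).mean()

# Illustrative at-the-money call (assumed parameters).
price = heston_mc_call(S0=100, K=100, T=1.0, r=0.02,
                       v0=0.04, kappa=1.5, theta=0.04, sigma=0.3, rho=-0.7)
```

A closed-form formula evaluates in microseconds where this simulation takes seconds, which is the practical motivation the abstract gives for seeking analytic results.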
ERIC Educational Resources Information Center
Bain, Kinsey; Rodriguez, Jon-Marc G.; Moon, Alena; Towns, Marcy H.
2018-01-01
Chemical kinetics is a highly quantitative content area that involves the use of multiple mathematical representations to model processes and is a context that is under-investigated in the literature. This qualitative study explored undergraduate student integration of chemistry and mathematics during problem solving in the context of chemical…
The Bologna Process between Structural Convergence and Institutional Diversity
ERIC Educational Resources Information Center
Dunkel, Torsten
2009-01-01
The merging of the Bologna and the Copenhagen processes into a single European education area appears appropriate, especially as general, vocational, adult and academic education are to be integrated in a future European Qualification Framework (EQF). This is the backdrop to the following description of the Bologna process, which was originally…
NASA Astrophysics Data System (ADS)
Qin, Rufu; Lin, Liangzhao
2017-06-01
Coastal seiches have become an increasingly important issue in coastal science and present many challenges, particularly when attempting to provide warning services. This paper presents the methodologies, techniques and integrated services adopted for the design and implementation of a Seiches Monitoring and Forecasting Integration Framework (SMAF-IF). The SMAF-IF is an integrated system with different types of sensors and numerical models that incorporates Geographic Information System (GIS) and web techniques, focusing on the detection of, and early warning for, coastal seiche events in the North Jiangsu shoal, China. The in situ sensors perform automatic and continuous monitoring of the marine environment, and the numerical models provide the meteorological and physical oceanographic parameter estimates. A model-output processing application was developed in C# using ArcGIS Engine functions, which provides the capability of automatically generating visualization maps and warning information. Leveraging the ArcGIS API for Flex and ASP.NET web services, a web-based GIS framework was designed to facilitate quasi real-time data access, interactive visualization and analysis, and the provision of early warning services for end users. The integrated framework proposed in this study enables decision-makers and the public to respond quickly to emergency coastal seiche events and allows easy adaptation to other regions and scientific domains related to real-time monitoring and forecasting.
Dynamical Systems Theory: Application to Pedagogy
NASA Astrophysics Data System (ADS)
Abraham, Jane L.
Theories of learning affect how cognition is viewed, and this subsequently leads to the style of pedagogical practice that is used in education. Traditionally, educators have relied on a variety of theories on which to base pedagogy. Behavioral learning theories influenced the teaching/learning process for over 50 years. In the 1960s, the information processing approach brought the mind back into the learning process. The current emphasis on constructivism integrates the views of Piaget, Vygotsky, and cognitive psychology. Additionally, recent scientific advances have allowed researchers to shift attention to biological processes in cognition. The problem is that these theories do not provide an integrated approach to understanding principles responsible for differences among students in cognitive development and learning ability. Dynamical systems theory offers a unifying theoretical framework to explain the wider context in which learning takes place and the processes involved in individual learning. This paper describes how principles of Dynamic Systems Theory can be applied to cognitive processes of students, the classroom community, motivation to learn, and the teaching/learning dynamic giving educational psychologists a framework for research and pedagogy.
A Spiritual Framework in Incest Survivors Treatment
ERIC Educational Resources Information Center
Beveridge, Kelli; Cheung, Monit
2004-01-01
Through an examination of recent incest treatment development, this article emphasizes the theoretical concept of "integration" within the treatment process for female adult incest survivors. Spirituality as a therapeutic foundation is discussed with examples of therapeutic techniques. A case study illustrates the psycho-spiritual process of…
ERIC Educational Resources Information Center
Smyth, Emer; Gangl, Markus; Raffe, David; Hannan, Damian F.; McCoy, Selina
This project aimed to develop a more comprehensive conceptual framework of school-to-work transitions in different national contexts and apply this framework to the empirical analysis of transition processes across European countries. It drew on these two data sources: European Community Labor Force Survey and integrated databases on national…
Mentoring within a Community of Practice for Faculty Development: Adding Value to a CTL Role
ERIC Educational Resources Information Center
Calderwood, Patricia E.; Klaf, Suzanna
2015-01-01
E. R. Smith, P. E. Calderwood, F. Dohm, and P. Gill Lopez's (2013) model of integrated mentoring within a community of practice framework draws attention to how mentoring as practice, identity, and process gives shape and character to a community of practice for higher education faculty and alerts us to several challenges such a framework makes…
Processes Asunder: Acquisition & Planning Misfits
2009-03-26
Establishing six Business Enterprise Priorities (BEPs) to focus the Department's business transformation efforts, which now guide DoD investment decisions... three phases which look very much like Milestones A, B, and C of the previously existing Life Cycle Management Framework. With this obvious redundancy... (February 2002). Defense Acquisition University, "Integrated Defense Acquisition, Technology, & Logistics Life Cycle Management Framework," version 5.2
SupportNet for Frontline Behavioral Health Providers
2014-06-30
social-cognitive theory perspective (Bandura, 1997), the proposed website and integrated treatment would enhance the perceived social environmental... Objective 2: We will evaluate the utility of social cognitive theory as a framework for understanding the stress process for military mental health... healthcare providers. SupportNet, based on the theoretical framework of social cognitive theory, utilizes a web-based support system with coaching to
ERIC Educational Resources Information Center
Grant, Cynthia; Osanloo, Azadeh
2014-01-01
The theoretical framework is one of the most important aspects in the research process, yet is often misunderstood by doctoral candidates as they prepare their dissertation research study. The importance of theory-driven thinking and acting is emphasized in relation to the selection of a topic, the development of research questions, the…
A Generic Ground Framework for Image Expertise Centres and Small-Sized Production Centres
NASA Astrophysics Data System (ADS)
Sellé, A.
2009-05-01
Initiated by the Pléiades Earth observation program, CNES (the French space agency) has developed a generic collaborative framework for its image quality centre that is highly customisable for any upcoming expertise centre. This collaborative framework has been designed to be used by a group of experts or scientists who want to share data and processing tools and manage interfaces with external entities. Its flexible and scalable architecture complies with the core requirements: defining a user data model with no impact on the software (generic data access), integrating user processing tools with a GUI builder and built-in APIs, and offering a scalable architecture that fits any performance requirement and accompanies growing projects. CNES has granted licences to two software companies that will be able to redistribute this framework to any customer.
Rapid development of Proteomic applications with the AIBench framework.
López-Fernández, Hugo; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Méndez Reboredo, José R; Santos, Hugo M; Carreira, Ricardo J; Capelo-Martínez, José L; Fdez-Riverola, Florentino
2011-09-15
In this paper we present two case studies of Proteomics application development using the AIBench framework, a Java desktop application framework mainly focused on scientific software development. The applications presented in this work are Decision Peptide-Driven, for rapid and accurate protein quantification, and Bacterial Identification, for Tuberculosis biomarker search and diagnosis. Both tools work with mass spectrometry data, specifically with MALDI-TOF spectra, minimizing the time required to process and analyze the experimental data. Copyright 2011 The Author(s). Published by Journal of Integrative Bioinformatics.
Neurocognitive mechanisms of perception-action coordination: a review and theoretical integration.
Ridderinkhof, K Richard
2014-10-01
The present analysis aims at a theoretical integration of, and a systems-neuroscience perspective on, a variety of historical and contemporary views on perception-action coordination (PAC). We set out to determine the common principles or lawful linkages between sensory and motor systems that explain how perception is action-oriented and how action is perceptually guided. To this end, we analyze the key ingredients of such an integrated framework, examine the architecture of dual-system conjectures of PAC, and undertake a historical analysis of the key characteristics, mechanisms, and phenomena of PACs. This analysis will reveal that dual-systems views are in need of fundamental re-thinking, and their elements will be amalgamated with current views on action-oriented predictive processing into a novel integrative theoretical framework (IMPPACT: Impetus, Motivation, and Prediction in Perception-Action Coordination theory). From this framework and its neurocognitive architecture we derive a number of non-trivial predictions regarding conative, motive-driven PAC. We end by presenting a brief outlook on how IMPPACT might offer novel insights into certain pathologies and into action expertise. Copyright © 2014 Elsevier Ltd. All rights reserved.
Integrating medical and research information: a big data approach.
Tilve Álvarez, Carlos M; Ayora Pais, Alberto; Ruíz Romero, Cristina; Llamas Gómez, Daniel; Carrajo García, Lino; Blanco García, Francisco J; Vázquez González, Guillermo
2015-01-01
Most of the information collected in different fields by the Instituto de Investigación Biomédica de A Coruña (INIBIC) is classified as unstructured due to its high volume and heterogeneity. This situation, linked to the recent requirement of integrating it with the medical information, makes it necessary to implement specific architectures to collect and organize it before it can be analysed. The purpose of this article is to present the Hadoop framework as a solution to the problem of integrating research information in the Business Intelligence field. This framework can collect, explore, process and structure the aforementioned information, which allows us to develop a function equivalent to a data mart in a Business Intelligence system.
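The map-and-consolidate style of processing that the abstract attributes to Hadoop can be illustrated in miniature. The sketch below is hypothetical (the record fields, sources, and aggregation rule are assumptions, not INIBIC's actual pipeline): heterogeneous records are mapped to key-value pairs and then reduced into a structured, data-mart-like summary.

```python
from collections import defaultdict

# Hypothetical heterogeneous research records from different sources
records = [
    {"source": "clinical", "patient": "p1", "marker": "CRP", "value": 3.2},
    {"source": "omics",    "patient": "p1", "marker": "CRP", "value": 2.9},
    {"source": "clinical", "patient": "p2", "marker": "CRP", "value": 7.1},
]

def map_phase(record):
    # Emit a (patient, marker) key so values from different sources collate
    yield (record["patient"], record["marker"]), record["value"]

def reduce_phase(pairs):
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    # Reduce each group to its mean, mimicking a data-mart style summary table
    return {key: sum(vals) / len(vals) for key, vals in grouped.items()}

pairs = [kv for r in records for kv in map_phase(r)]
summary = reduce_phase(pairs)  # {("p1","CRP"): 3.05, ("p2","CRP"): 7.1}
```

In a real Hadoop deployment the same map and reduce functions would run distributed over HDFS blocks; the single-process version only conveys the shape of the computation.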
Hamm, Jay A; Hasson-Ohayon, Ilanit; Kukla, Marina; Lysaker, Paul H
2013-01-01
Although the role and relative prominence of psychotherapy in the treatment of schizophrenia has fluctuated over time, an analysis of the history of psychotherapy for schizophrenia, focusing on findings from the recovery movement, reveals recent trends including the emergence of the development of integrative psychotherapy approaches. The authors suggest that the recovery movement has revealed limitations in traditional approaches to psychotherapy, and has provided opportunities for integrative approaches to emerge as a mechanism for promoting recovery in persons with schizophrenia. Five approaches to integrative psychotherapy for persons with schizophrenia are presented, and a shared conceptual framework that allows these five approaches to be compatible with one another is proposed. The conceptual framework is consistent with theories of recovery and emphasizes interpersonal attachment, personal narrative, and metacognitive processes. Implications for future research on integrative psychotherapy are considered.
A Mixed-Methods Research Framework for Healthcare Process Improvement.
Bastian, Nathaniel D; Munoz, David; Ventura, Marta
2016-01-01
The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.
Time perception impairs sensory-motor integration in Parkinson’s disease
2013-01-01
It is well known that perception and estimation of time are fundamental for the relationship between humans and their environment. However, this temporal information processing is inefficient in patients with Parkinson's disease (PD), resulting in temporal judgment deficits. In general, the pathophysiology of PD has been described as a dysfunction in the basal ganglia, which is a multisensory integration station. Thus, a deficit in the sensorimotor integration process could explain many Parkinsonian symptoms, such as changes in time perception. This physiological distortion may be better understood if we analyze the neurobiological model of interval timing, expressed within the conceptual framework of a traditional information-processing model called "Scalar Expectancy Theory". Therefore, in this review we discuss the pathophysiology and sensorimotor integration process in PD, the theories and basic neural mechanisms involved in temporal processing, and the main clinical findings about the impact of time perception in PD. PMID:24131660
Integration of CBIR in radiological routine in accordance with IHE
NASA Astrophysics Data System (ADS)
Welter, Petra; Deserno, Thomas M.; Fischer, Benedikt; Wein, Berthold B.; Ott, Bastian; Günther, Rolf W.
2009-02-01
Increasing use of digital image processing leads to an enormous amount of imaging data. Access to picture archiving and communication systems (PACS), however, is solely textual, leading to sparse retrieval results because of ambiguous or missing image descriptions. Content-based image retrieval (CBIR) systems can improve the clinical diagnostic outcome significantly. However, current CBIR systems are not able to integrate their results with the clinical workflow and PACS. Existing communication standards like DICOM and HL7 leave many options for implementation and do not ensure full interoperability. We present a concept for the standardized integration of a CBIR system into the radiology workflow in accordance with the Integrating the Healthcare Enterprise (IHE) framework. This is based on the IHE integration profile 'Post-Processing Workflow' (PPW), defining responsibilities as well as standardized communication and utilizing the DICOM Structured Report (DICOM SR). Because most PACS and RIS systems today are not yet fully IHE-compliant with PPW, we also suggest an intermediate approach based on the concepts of the CAD-PACS Toolkit. The integration is independent of the particular PACS and RIS system. Therefore, it supports the widespread application of CBIR in radiological routine. As a result, the approach is exemplarily applied to the Image Retrieval in Medical Applications (IRMA) framework.
Creating a process for incorporating epidemiological modelling into outbreak management decisions.
Akselrod, Hana; Mercon, Monica; Kirkeby Risoe, Petter; Schlegelmilch, Jeffrey; McGovern, Joanne; Bogucki, Sandy
2012-01-01
Modern computational models of infectious diseases greatly enhance our ability to understand new infectious threats and assess the effects of different interventions. The recently-released CDC Framework for Preventing Infectious Diseases calls for increased use of predictive modelling of epidemic emergence for public health preparedness. Currently, the utility of these technologies in preparedness and response to outbreaks is limited by gaps between modelling output and information requirements for incident management. The authors propose an operational structure that will facilitate integration of modelling capabilities into action planning for outbreak management, using the Incident Command System (ICS) and Synchronization Matrix framework. It is designed to be adaptable and scalable for use by state and local planners under the National Response Framework (NRF) and Emergency Support Function #8 (ESF-8). Specific epidemiological modelling requirements are described, and integrated with the core processes for public health emergency decision support. These methods can be used in checklist format to align prospective or real-time modelling output with anticipated decision points, and guide strategic situational assessments at the community level. It is anticipated that formalising these processes will facilitate translation of the CDC's policy guidance from theory to practice during public health emergencies involving infectious outbreaks.
A Framework for Integrating Oceanographic Data Repositories
NASA Astrophysics Data System (ADS)
Rozell, E.; Maffei, A. R.; Beaulieu, S. E.; Fox, P. A.
2010-12-01
Oceanographic research covers a broad range of science domains and requires a tremendous amount of cross-disciplinary collaboration. Advances in cyberinfrastructure are making it easier to share data across disciplines through the use of web services and community vocabularies. Best practices in the design of web services and vocabularies to support interoperability amongst science data repositories are only starting to emerge. Strategic design decisions in these areas are crucial to the creation of end-user data and application integration tools. We present S2S, a novel framework for deploying customizable user interfaces to support the search and analysis of data from multiple repositories. Our research methods follow the Semantic Web methodology and technology development process developed by Fox et al. This methodology stresses the importance of close scientist-technologist interactions when developing scientific use cases, keeping the project well scoped and ensuring the result meets a real scientific need. The S2S framework motivates the development of standardized web services with well-described parameters, as well as the integration of existing web services and applications in the search and analysis of data. S2S also encourages the use and development of community vocabularies and ontologies to support federated search and reduce the amount of domain expertise required in the data discovery process. S2S utilizes the Web Ontology Language (OWL) to describe the components of the framework, including web service parameters, and OpenSearch as a standard description for web services, particularly search services for oceanographic data repositories. We have created search services for an oceanographic metadata database, a large set of quality-controlled ocean profile measurements, and a biogeographic search service. 
S2S provides an application programming interface (API) that can be used to generate custom user interfaces, supporting data and application integration across these repositories and other web resources. Although initially targeted towards a general oceanographic audience, the S2S framework shows promise in many science domains, inspired in part by the broad disciplinary coverage of oceanography. This presentation will cover the challenges addressed by the S2S framework, the research methods used in its development, and the resulting architecture for the system. It will demonstrate how S2S is remarkably extensible, and can be generalized to many science domains. Given these characteristics, the framework can simplify the process of data discovery and analysis for the end user, and can help to shift the responsibility of search interface development away from data managers.
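The OpenSearch service descriptions that S2S builds on center on URL templates with named parameters. A minimal sketch of template expansion follows; the endpoint, parameter names, and query are hypothetical illustrations of the OpenSearch template syntax ({name} required, {name?} optional), not S2S's actual services.

```python
import re

# Hypothetical OpenSearch URL template describing a repository search service
TEMPLATE = "https://repo.example.org/search?q={searchTerms}&count={count?}"

def fill_template(template, params):
    """Substitute OpenSearch template parameters with supplied values."""
    def sub(match):
        name = match.group(1)
        optional = match.group(2) == "?"
        if name in params:
            return str(params[name])
        if optional:
            return ""  # optional parameters may be left empty
        raise KeyError(f"missing required OpenSearch parameter: {name}")
    return re.sub(r"\{([^}?]+)(\??)\}", sub, template)

url = fill_template(TEMPLATE, {"searchTerms": "ctd profiles"})
```

Describing each service this way is what lets a framework like S2S generate search widgets generically: the user interface only needs the template and the parameter metadata, not repository-specific code.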
NASA Astrophysics Data System (ADS)
Schmitt, R. J. P.; Castelletti, A.; Bizzi, S.
2014-12-01
Understanding sediment transport processes at the river basin scale, their temporal spectra and spatial patterns is key to identifying and minimizing morphologic risks associated with channel adjustment processes. This work contributes a stochastic framework for modeling bed-load connectivity based on recent advances in the field (e.g., Bizzi & Lerner, 2013; Czuba & Foufoula-Georgiou, 2014). It presents river managers with novel indicators of reach-scale vulnerability to channel adjustment in large river networks with sparse hydrologic and sediment observations. The framework comprises three steps. First, based on a distributed hydrological model and remotely sensed information, the framework identifies a representative grain size class for each reach. Second, sediment residence time distributions are calculated for each reach in a Monte-Carlo approach applying standard sediment transport equations driven by local hydraulic conditions. Third, a network analysis defines the up- and downstream connectivity for various travel times, resulting in characteristic up/downstream connectivity signatures for each reach. Channel vulnerability indicators quantify the imbalance between up- and downstream connectivity for each travel time domain, representing process-dependent latency of morphologic response. Last, based on the stochastic core of the model, a sensitivity analysis identifies drivers of change and major sources of uncertainty in order to target key detrimental processes and to guide effective gathering of additional data. The application, limitations and integration into a decision-analytic framework are demonstrated for a major part of the Red River Basin in Northern Vietnam (179,000 km²). Here, a plethora of anthropic alterations ranging from large reservoir construction to land-use changes results in major downstream deterioration and calls for deriving concerted sediment management strategies to mitigate current and limit future morphologic alterations.
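The Monte-Carlo step of such a framework can be caricatured as repeatedly sampling a discharge, converting it to a transport velocity, and deriving a residence time. Everything in the sketch below is a hypothetical stand-in (the log-normal discharge regime, the power-law rating curve, and all coefficients are assumptions, not the authors' calibrated model):

```python
import random

def residence_time_samples(reach_length_m, n=1000, seed=42):
    """Toy Monte-Carlo sketch: sample a discharge, convert it to a bed-load
    velocity via a hypothetical rating curve, and derive a residence time."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        discharge = rng.lognormvariate(3.0, 0.8)            # m^3/s, assumed regime
        velocity = 1e-5 * discharge ** 1.5                  # m/s, hypothetical rating
        samples.append(reach_length_m / velocity / 86400.0) # days spent in the reach
    return samples

times = residence_time_samples(reach_length_m=2000.0)
```

Repeating this per reach yields the residence time distributions whose network-wide accumulation defines the up- and downstream connectivity signatures described in the abstract.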
Sinden, Kathryn; MacDermid, Joy C
2014-03-01
Employers are tasked with developing injury management and return-to-work (RTW) programs in response to occupational health and safety policies. Physical demands analyses (PDAs) are the cornerstone of injury management and RTW program development. Synthesizing and contextualizing policy knowledge for use in occupational program development, including PDAs, is challenging due to multiple stakeholder involvement. Few studies have used a knowledge translation theoretical framework to facilitate policy-based interventions in occupational contexts. The primary aim of this case study was to identify how constructs of the knowledge-to-action (KTA) framework were reflected in employer stakeholder-researcher collaborations during development of a firefighter PDA. Four stakeholder meetings were conducted with employee participants who had experience using PDAs in their occupational role. Directed content analysis informed analyses of meeting minutes, stakeholder views and personal reflections recorded throughout the case. Existing knowledge sources including local data, stakeholder experiences, policies and priorities were synthesized and tailored to develop a PDA in response to the barriers and facilitators identified by the firefighters. The flexibility of the KTA framework and the synthesis of multiple knowledge sources were identified strengths. The KTA action cycle was useful in directing the overall process but insufficient for directing the specific aspects of PDA development. Integration of specific PDA guidelines into the process provided explicit direction on best practices in tailoring the PDA and knowledge synthesis. Although the themes of the KTA framework were confirmed in our analysis, modification of the order of the KTA components was required. Despite a complex context with divergent perspectives, successful implementation of a draft PDA was achieved.
The KTA framework facilitated knowledge synthesis and PDA development but specific standards and modifications to the KTA framework were needed to enhance process structure. Flexibility for modification and integration of PDA practice guidelines were identified as assets of the KTA framework during its application.
Erdoğdu, Utku; Tan, Mehmet; Alhajj, Reda; Polat, Faruk; Rokne, Jon; Demetrick, Douglas
2013-01-01
The availability of enough samples for effective analysis and knowledge discovery has been a challenge in the research community, especially in the area of gene expression data analysis. Thus, the approaches being developed for data analysis have mostly suffered from the lack of enough data to train and test the constructed models. We argue that the process of sample generation could be successfully automated by employing some sophisticated machine learning techniques. An automated sample generation framework could successfully complement the actual sample generation from real cases. This argument is validated in this paper by describing a framework that integrates multiple models (perspectives) for sample generation. We illustrate its applicability for producing new gene expression data samples, a highly demanding area that has not received attention. The three perspectives employed in the process are based on models that are not closely related. This independence eliminates the bias of having the produced approach cover only certain characteristics of the domain and lead to samples skewed towards one direction. The first model is based on the Probabilistic Boolean Network (PBN) representation of the gene regulatory network underlying the given gene expression data. The second model integrates a Hierarchical Markov Model (HIMM) and the third model employs a genetic algorithm in the process. Each model learns as many characteristics of the domain being analysed as possible and tries to incorporate the learned characteristics in generating new samples. In other words, the models base their analysis on domain knowledge implicitly present in the data itself. The developed framework has been extensively tested by checking how the new samples complement the original samples. The produced results are very promising in showing the effectiveness, usefulness and applicability of the proposed multi-model framework.
Operationalizing the Learning Health Care System in an Integrated Delivery System
Psek, Wayne A.; Stametz, Rebecca A.; Bailey-Davis, Lisa D.; Davis, Daniel; Darer, Jonathan; Faucett, William A.; Henninger, Debra L.; Sellers, Dorothy C.; Gerrity, Gloria
2015-01-01
Introduction: The Learning Health Care System (LHCS) model seeks to utilize sophisticated technologies and competencies to integrate clinical operations, research and patient participation in order to continuously generate knowledge, improve care, and deliver value. Transitioning from concept to practical application of an LHCS presents many challenges but can yield opportunities for continuous improvement. There is limited literature and practical experience available in operationalizing the LHCS in the context of an integrated health system. At Geisinger Health System (GHS) a multi-stakeholder group is undertaking to enhance organizational learning and develop a plan for operationalizing the LHCS system-wide. We present a framework for operationalizing continuous learning across an integrated delivery system and lessons learned through the ongoing planning process. Framework: The framework focuses attention on nine key LHCS operational components: Data and Analytics; People and Partnerships; Patient and Family Engagement; Ethics and Oversight; Evaluation and Methodology; Funding; Organization; Prioritization; and Deliverables. Definitions, key elements and examples for each are presented. The framework is purposefully broad for application across different organizational contexts. Conclusion: A realistic assessment of the culture, resources and capabilities of the organization related to learning is critical to defining the scope of operationalization. Engaging patients in clinical care and discovery, including quality improvement and comparative effectiveness research, requires a defensible ethical framework that undergirds a system of strong but flexible oversight. Leadership support is imperative for advancement of the LHCS model. Findings from our ongoing work within the proposed framework may inform other organizations considering a transition to an LHCS. PMID:25992388
Nano-Enriched and Autonomous Sensing Framework for Dissolved Oxygen.
Shehata, Nader; Azab, Mohammed; Kandas, Ishac; Meehan, Kathleen
2015-08-14
This paper investigates a nano-enhanced wireless sensing framework for dissolved oxygen (DO). The system integrates a nanosensor that employs cerium oxide (ceria) nanoparticles to monitor the concentration of DO in aqueous media via optical fluorescence quenching. We propose a comprehensive sensing framework with the nanosensor equipped with a digital interface where the sensor output is digitized and dispatched wirelessly to a trustworthy data collection and analysis framework for consolidation and information extraction. The proposed system collects and processes the sensor readings to provide clear indications about the current or the anticipated dissolved oxygen levels in the aqueous media.
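Optical fluorescence quenching by oxygen is conventionally described by the Stern-Volmer relation F0/F = 1 + Ksv·[O2], which the sensing framework above would invert to recover a DO concentration from a measured intensity. The sketch below is a generic illustration of that inversion; the Ksv calibration value and the intensity figures are hypothetical, not taken from the paper's ceria nanosensor.

```python
def dissolved_oxygen(f0, f, k_sv):
    """Invert the Stern-Volmer relation F0/F = 1 + Ksv*[O2] to estimate [O2].

    f0   -- fluorescence intensity with no quencher present
    f    -- measured (quenched) fluorescence intensity
    k_sv -- Stern-Volmer constant in L/mg (hypothetical calibration value)
    """
    if f <= 0 or f0 < f:
        raise ValueError("expected 0 < f <= f0")
    return (f0 / f - 1.0) / k_sv

# With a hypothetical Ksv of 0.15 L/mg, a 40% intensity drop maps to about 4.4 mg/L
do_mg_per_l = dissolved_oxygen(f0=1000.0, f=600.0, k_sv=0.15)
```

In a deployed system this conversion would run on (or near) the digital interface described in the abstract, so that what is dispatched wirelessly is already a calibrated concentration rather than a raw intensity.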
The ABLe change framework: a conceptual and methodological tool for promoting systems change.
Foster-Fishman, Pennie G; Watson, Erin R
2012-06-01
This paper presents a new approach to the design and implementation of community change efforts like a System of Care. Called the ABLe Change Framework, the model provides simultaneous attention to the content and process of the work, ensuring effective implementation and the pursuit of systems change. Three key strategies are employed in this model to ensure the integration of content and process efforts and effective mobilization of broad scale systems change: Systemic Action Learning Teams, Simple Rules, and Small Wins. In this paper we describe the ABLe Change Framework and present a case study in which we successfully applied this approach to one system of care effort in Michigan.
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and landuse change enco...
Indicators of Effective Policy Development & Implementation. Issue Brief #8
ERIC Educational Resources Information Center
Stonemeier, Jenny; Trader, Barb; Kaloi, Laura; Williams, Gabrielle
2016-01-01
Within the SWIFT framework, the Inclusive Policy Structure and Practice domain addresses the need for a supportive, reciprocal partnership between the school and its district or local educational agency. Therefore, intentional and effective policy decision-making processes are integral to SWIFT implementation. Such processes create opportunities…
NASA Technical Reports Server (NTRS)
Milburn, George
1992-01-01
The topics are presented in viewgraph form and include the following: National Center for Appropriate Technology (NCAT) history; technologies selection criteria; strategic plan status; implementation framework; forum composition; NCAT role as integrator; government/industry coordination; identification and selection process for demonstrations; criteria for demonstrations; criteria for non-selection; and future actions.
NASA Astrophysics Data System (ADS)
Li, Qing; Wang, Ze-yuan; Cao, Zhi-chao; Du, Rui-yang; Luo, Hao
2015-08-01
With the process of globalisation and the development of management models and information technology, enterprise cooperation and collaboration has developed from intra-enterprise integration, outsourcing and inter-enterprise integration, and supply chain management, to virtual enterprises and enterprise networks. Some midfielder enterprises begin to serve different supply chains. Therefore, they combine related supply chains into a complex enterprise network. The main challenges for enterprise network integration and collaboration are business process and data fragmentation beyond organisational boundaries. This paper reviews the requirements of enterprise network integration and collaboration, as well as the development of new information technologies. Based on service-oriented architecture (SOA), collaboration modelling and collaboration agents are introduced to solve problems of collaborative management for service convergence under the condition of process and data fragmentation. A model-driven methodology is developed to design and deploy the integrating framework. An industrial experiment is designed and implemented to illustrate the usage of the technologies developed in this paper.
NASA Astrophysics Data System (ADS)
Saleh, F.; Ramaswamy, V.; Georgas, N.; Blumberg, A. F.; Wang, Y.
2016-12-01
Advances in computational resources and modeling techniques are opening the path to effectively integrate existing complex models. In the context of flood prediction, recent extreme events have demonstrated the importance of integrating components of the hydrosystem to better represent the interactions amongst different physical processes and phenomena. As such, there is a pressing need to develop holistic and cross-disciplinary modeling frameworks that effectively integrate existing models and better represent the operative dynamics. This work presents a novel Hydrologic-Hydraulic-Hydrodynamic Ensemble (H3E) flood prediction framework that operationally integrates existing predictive models representing coastal (New York Harbor Observing and Prediction System, NYHOPS), hydrologic (US Army Corps of Engineers Hydrologic Modeling System, HEC-HMS) and hydraulic (2-dimensional River Analysis System, HEC-RAS) components. The state-of-the-art framework is forced with 125 ensemble meteorological inputs from numerical weather prediction models including the Global Ensemble Forecast System, the European Centre for Medium-Range Weather Forecasts (ECMWF), the Canadian Meteorological Centre (CMC), the Short Range Ensemble Forecast (SREF) and the North American Mesoscale Forecast System (NAM). The framework produces, within a 96-hour forecast horizon, on-the-fly Google Earth flood maps that provide critical information for decision makers and emergency preparedness managers. The utility of the framework was demonstrated by retrospectively forecasting an extreme flood event, Hurricane Sandy, in the Passaic and Hackensack watersheds (New Jersey, USA). Hurricane Sandy caused significant damage to a number of critical facilities in this area including the New Jersey Transit's main storage and maintenance facility. 
The results of this work demonstrate that ensemble-based frameworks provide improved flood predictions and useful information about associated uncertainties, thus improving the assessment of risks as compared to a deterministic forecast. The work offers perspectives for short-term flood forecasts, flood mitigation strategies and best management practices for climate change scenarios.
E-Services quality assessment framework for collaborative networks
NASA Astrophysics Data System (ADS)
Stegaru, Georgiana; Danila, Cristian; Sacala, Ioan Stefan; Moisescu, Mihnea; Mihai Stanescu, Aurelian
2015-08-01
In a globalised networked economy, collaborative networks (CNs) are formed to take advantage of new business opportunities. Collaboration involves shared resources and capabilities, such as e-Services that can be dynamically composed to automate CN participants' business processes. Quality is essential for the success of business process automation. Current approaches mostly focus on quality of service (QoS)-based service selection and ranking algorithms, overlooking the process of service composition, which requires interoperable, adaptable and secure e-Services to ensure seamless collaboration, data confidentiality and integrity. Lack of assessment of these quality attributes can result in e-Service composition failure. The quality of e-Service composition relies on the quality of each e-Service and on the quality of the composition process. Therefore, there is the need for a framework that addresses quality from both views: product and process. We propose a quality of e-Service composition (QoESC) framework for quality assessment of e-Service composition for CNs, which comprises a quality model for e-Service evaluation and guidelines for the quality of the e-Service composition process. We implemented a prototype considering a simplified telemedicine use case which involves a CN in the e-Healthcare domain. To validate the proposed quality-driven framework, we analysed service composition reliability with and without using the proposed framework.
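One widely used QoS aggregation rule for a sequential service composition multiplies the per-service reliabilities, since the chain succeeds only if every member succeeds. The sketch below illustrates that generic rule with hypothetical reliability figures; it is not the paper's QoESC quality model, which also covers interoperability, adaptability and security attributes.

```python
from math import prod

def sequential_reliability(reliabilities):
    """Reliability of services invoked in sequence.

    Assumes independent failures: the composition succeeds only if every
    member succeeds, so individual reliabilities multiply.
    """
    if not all(0.0 <= r <= 1.0 for r in reliabilities):
        raise ValueError("reliabilities must lie in [0, 1]")
    return prod(reliabilities)

# Three hypothetical e-Services in a telemedicine composition chain
composed = sequential_reliability([0.99, 0.97, 0.95])  # ~0.912
```

The multiplicative form makes the practical point behind composition-level quality assessment: even individually reliable e-Services can yield a noticeably less reliable composition as the chain grows.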
The interactions of multisensory integration with endogenous and exogenous attention
Tang, Xiaoyu; Wu, Jinglong; Shen, Yong
2016-01-01
Stimuli from multiple sensory organs can be integrated into a coherent representation through multiple phases of multisensory processing; this phenomenon is called multisensory integration. Multisensory integration can interact with attention. Here, we propose a framework in which attention modulates multisensory processing in both endogenous (goal-driven) and exogenous (stimulus-driven) ways. Moreover, multisensory integration exerts not only bottom-up but also top-down control over attention. Specifically, we propose the following: (1) endogenous attentional selectivity acts on multiple levels of multisensory processing to determine the extent to which simultaneous stimuli from different modalities can be integrated; (2) integrated multisensory events exert top-down control on attentional capture via multisensory search templates that are stored in the brain; (3) integrated multisensory events can capture attention efficiently, even in quite complex circumstances, due to their increased salience compared to unimodal events and can thus improve search accuracy; and (4) within a multisensory object, endogenous attention can spread from one modality to another in an exogenous manner. PMID:26546734
NASA Astrophysics Data System (ADS)
Symeonidis, Iphigenia Sofia
This paper aims to elucidate guiding concepts for the design of powerful undergraduate bioinformatics degrees, leading to a conceptual framework for the curriculum. "Powerful" here should be understood as having truly bioinformatics objectives, rather than enriching the existing computer science or life science degrees on which bioinformatics degrees are often based. As such, the conceptual framework aims to demonstrate intellectual honesty with regard to the field of bioinformatics. A synthesis/conceptual analysis approach was followed, as elaborated by Hurd (1983). The approach takes into account the following: bioinformatics educational needs and goals as expressed by different authorities, five case studies of undergraduate bioinformatics degrees, educational implications of bioinformatics as a technoscience, and approaches to curriculum design promoting interdisciplinarity and integration. Given these considerations, guiding concepts emerged and a conceptual framework was elaborated. The practice of bioinformatics was given a closer look, which led to defining tool-integration skills and tool-thinking capacity as crucial areas of the bioinformatics activities spectrum. It was argued, finally, that a process-based curriculum, as a variation of a concept-based curriculum (where the concepts are processes), might be more conducive to the teaching of bioinformatics, given a foundational first year of integrated science education as envisioned by Bialek and Botstein (2004). Furthermore, the curriculum design needs to define new avenues of communication and learning which bypass the traditional disciplinary barriers of academic settings, as undertaken by Tador and Tidmor (2005) for graduate studies.
NASA Astrophysics Data System (ADS)
El-Gafy, Mohamed Anwar
Transportation projects have an impact on the environment. The general environmental pollution and damage caused by roads is closely associated with the level of economic activity. Although Environmental Impact Assessments (EIAs) depend on geo-spatial information, there are no fixed rules for how to conduct an environmental assessment; the particular objective of each assessment is dictated case by case, based on what information and analyses are required. The conventional approach to an Environmental Impact Assessment (EIA) study is time-consuming because it involves a large number of dependent and independent variables, each with different consequences. With the emergence of satellite remote sensing technology and Geographic Information Systems (GIS), this research presents a new framework for the analysis phase of the EIA for transportation projects based on the integration of remote sensing technology, geographic information systems, and spatial modeling. By combining the merits of the map overlay method and the matrix method, the framework comprehensively analyzes the environmental vulnerability around the road and the road's impact on the environment. This framework is expected to: (1) improve the quality of the decision-making process, (2) be applicable both to urban and inter-urban projects, regardless of transport mode, and (3) present the data and provide the appropriate analysis to support decision-makers and allow them to present these data at public hearings in a simple manner. Case studies, transportation projects in the State of Florida, were analyzed to illustrate the use of the decision support framework and demonstrate its capabilities.
This cohesive and integrated system will facilitate rational decisions through cost effective coordination of environmental information and data management that can be tailored to specific projects. The framework would facilitate collecting, organizing, analyzing, archiving, and coordinating the information and data necessary to support technical and policy transportation decisions.
Hoekman, Berend; Breitling, Rainer; Suits, Frank; Bischoff, Rainer; Horvatovich, Peter
2012-01-01
Data processing forms an integral part of biomarker discovery and contributes significantly to the ultimate result. To compare and evaluate various publicly available open source label-free data processing workflows, we developed msCompare, a modular framework that allows the arbitrary combination of different feature detection/quantification and alignment/matching algorithms in conjunction with a novel scoring method to evaluate their overall performance. We used msCompare to assess the performance of workflows built from modules of publicly available data processing packages such as SuperHirn, OpenMS, and MZmine and our in-house developed modules on peptide-spiked urine and trypsin-digested cerebrospinal fluid (CSF) samples. We found that the quality of results varied greatly among workflows, and interestingly, heterogeneous combinations of algorithms often performed better than the homogeneous workflows. Our scoring method showed that the union of feature matrices of different workflows outperformed the original homogeneous workflows in some cases. msCompare is open source software (https://trac.nbic.nl/mscompare), and we provide a web-based data processing service for our framework by integration into the Galaxy server of the Netherlands Bioinformatics Center (http://galaxy.nbic.nl/galaxy) to allow scientists to determine which combination of modules provides the most accurate processing for their particular LC-MS data sets. PMID:22318370
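The "union of feature matrices" idea in the abstract can be sketched briefly. This is illustrative only (msCompare's actual matching and scoring are more involved): merge the features reported by two workflows, treating features whose m/z and retention time agree within a tolerance as the same feature. The tolerance values are invented for this example.

```python
# Illustrative sketch of a feature-matrix union across two LC-MS workflows.
# Features are (mz, rt) pairs; tolerance values are assumed, not from msCompare.

MZ_TOL = 0.01   # m/z tolerance (assumed)
RT_TOL = 0.5    # retention-time tolerance in minutes (assumed)

def same_feature(a, b):
    """Two features match if both m/z and retention time agree within tolerance."""
    return abs(a[0] - b[0]) <= MZ_TOL and abs(a[1] - b[1]) <= RT_TOL

def union_features(workflow_a, workflow_b):
    """Merge two feature lists, keeping one copy of matched features."""
    merged = list(workflow_a)
    for feat in workflow_b:
        if not any(same_feature(feat, kept) for kept in merged):
            merged.append(feat)
    return merged

a = [(500.30, 12.1), (610.42, 15.0)]
b = [(500.305, 12.3), (720.10, 20.2)]   # first entry duplicates a feature in `a`
print(len(union_features(a, b)))        # 3 distinct features after merging
```

The union can outperform either source list because each workflow detects some true features the other misses, which matches the abstract's observation about heterogeneous combinations.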
An APOS Analysis of Natural Science Students' Understanding of Integration
ERIC Educational Resources Information Center
Maharaj, Aneshkumar
2014-01-01
This article reports on a study which used the APOS (action-process-object-schema) Theory framework and a classification of errors to investigate university students' understanding of the integration concept and its applications. Research was done at the Westville Campus of the University of KwaZulu-Natal in South Africa. The relevant rules for…
PNNL Data-Intensive Computing for a Smarter Energy Grid
Carol Imhoff; Zhenyu (Henry) Huang; Daniel Chavarria
2017-12-09
The Middleware for Data-Intensive Computing (MeDICi) Integration Framework, an integrated platform to solve data analysis and processing needs, supports PNNL research on the U.S. electric power grid. MeDICi is enabling development of visualizations of grid operations and vulnerabilities, with the goal of near real-time analysis to aid operators in preventing and mitigating grid failures.
MeDICi Software Superglue for Data Analysis Pipelines
Ian Gorton
2017-12-09
The Middleware for Data-Intensive Computing (MeDICi) Integration Framework is an integrated middleware platform developed to solve data analysis and processing needs of scientists across many domains. MeDICi is scalable, easily modified, and robust to multiple languages, protocols, and hardware platforms, and in use today by PNNL scientists for bioinformatics, power grid failure analysis, and text analysis.
A New Biogeochemical Computational Framework Integrated within the Community Land Model
NASA Astrophysics Data System (ADS)
Fang, Y.; Li, H.; Liu, C.; Huang, M.; Leung, L.
2012-12-01
Terrestrial biogeochemical processes, particularly carbon cycle dynamics, have been shown to significantly influence regional and global climate changes. Modeling terrestrial biogeochemical processes within the land component of Earth System Models such as the Community Land Model (CLM), however, faces three major challenges: 1) extensive efforts in modifying modeling structures and rewriting computer programs to incorporate biogeochemical processes of increasing complexity, 2) expensive computational cost to solve the governing equations due to numerical stiffness arising from large variations in the rates of biogeochemical processes, and 3) lack of an efficient framework to systematically evaluate various mathematical representations of biogeochemical processes. To address these challenges, we introduce a new computational framework to incorporate biogeochemical processes into CLM, which consists of a new biogeochemical module with a generic algorithm and a reaction database. New and updated biogeochemical processes can be incorporated into CLM without significant code modification. To address the stiffness issue, algorithms and criteria will be developed to identify fast processes, which will be replaced with algebraic equations and decoupled from slow processes. This framework can serve as a generic and user-friendly platform to test different mechanistic process representations and datasets and gain new insight into the behavior of terrestrial ecosystems in response to climate change in a systematic way.
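The fast/slow partitioning described above can be illustrated with a toy example (not CLM code; all rates are invented): a fast reversible exchange between two pools is replaced by its algebraic equilibrium, leaving only the slow loss term to integrate numerically, so the time step no longer needs to resolve the fast rates.

```python
# Toy illustration of replacing a fast process with an algebraic equation:
# pools A <-> B exchange quickly (KF, KB), while B loses mass slowly (D).
# Holding A/B at equilibrium lets a large explicit step track the slow loss.

KF, KB = 100.0, 50.0   # fast exchange rates; stiff if integrated directly
D = 0.01               # slow loss rate from pool B

def step_partitioned(total, dt):
    """Advance total mass one step with the fast process held at equilibrium."""
    frac_b = KF / (KF + KB)             # equilibrium fraction of mass in B
    total += -D * frac_b * total * dt   # slow loss only, explicit Euler
    a, b = total * (1 - frac_b), total * frac_b
    return total, a, b

total = 1.0
for _ in range(1000):                   # dt = 0.1, far larger than 1/KF
    total, a, b = step_partitioned(total, 0.1)
print(round(total, 3))                  # roughly exp(-0.01 * (2/3) * 100)
```

Integrating the full stiff system explicitly would require dt well below 1/KF = 0.01; the partitioned step uses dt = 0.1 and remains stable because only the slow rate appears in the update.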
A Web-Based System for Monitoring and Controlling Multidisciplinary Design Projects
NASA Technical Reports Server (NTRS)
Salas, Andrea O.; Rogers, James L.
1997-01-01
In today's competitive environment, both industry and government agencies are under enormous pressure to reduce the time and cost of multidisciplinary design projects. A number of frameworks have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. An examination of current frameworks reveals weaknesses in various areas such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, in conjunction with an existing framework, can improve these areas of weakness. This paper describes a system that executes a sequence of programs, monitors and controls the design process through a Web-based interface, and visualizes intermediate and final results through the use of Java(Tm) applets. A small sample problem, which includes nine processes with two analysis programs that are coupled to an optimizer, is used to demonstrate the feasibility of this approach.
ERIC Educational Resources Information Center
Dixon, Raymond A.; Johnson, Scott D.
2012-01-01
A cognitive construct that is important when solving engineering design problems is executive control process, or metacognition. It is a central feature of human consciousness that enables one "to be aware of, monitor, and control mental processes." The framework for this study was conceptualized by integrating the model for creative design, which…
ERIC Educational Resources Information Center
Sun, Yan
2013-01-01
This dissertation reported three studies whose overarching purpose is to enhance our understanding about how teachers learn to teach by revealing the learning to teach process. Each of three studies revealed the learning to teach process from different perspectives. Guided by the Pedagogical Content Knowledge (PCK) framework, the first study…
Deserno, Thomas M; Haak, Daniel; Brandenburg, Vincent; Deserno, Verena; Classen, Christoph; Specht, Paula
2014-12-01
Especially for investigator-initiated research at universities and academic institutions, Internet-based rare disease registries (RDR) are required that integrate electronic data capture (EDC) with automatic image analysis or manual image annotation. We propose a modular framework merging alpha-numerical and binary data capture. In concordance with the Office of Rare Diseases Research recommendations, a requirement analysis was performed based on several RDR databases currently hosted at Uniklinik RWTH Aachen, Germany. With respect to the study management tool that is already successfully operating at the Clinical Trial Center Aachen, the Google Web Toolkit was chosen, with Hibernate and Gilead connecting a MySQL database management system. Image and signal data integration and processing are supported by the Apache Commons FileUpload library and ImageJ-based Java code, respectively. As a proof of concept, the framework is instantiated to the German Calciphylaxis Registry. The framework is composed of five mandatory core modules: (1) Data Core, (2) EDC, (3) Access Control, (4) Audit Trail, and (5) Terminology, as well as six optional modules: (6) Binary Large Object (BLOB), (7) BLOB Analysis, (8) Standard Operation Procedure, (9) Communication, (10) Pseudonymization, and (11) Biorepository. Modules 1-7 are implemented in the German Calciphylaxis Registry. The proposed RDR framework is easily instantiated and directly integrates image management and analysis. As open source software, it may assist improved data collection and analysis of rare diseases in the near future.
NASA Astrophysics Data System (ADS)
Arkhipkin, D.; Lauret, J.
2017-10-01
One of the STAR experiment’s modular Messaging Interface and Reliable Architecture framework (MIRA) integration goals is to provide seamless and automatic connections with the existing control systems. After an initial proof of concept and operation of the MIRA system as a parallel data collection system for online use and real-time monitoring, the STAR Software and Computing group is now working on the integration of the Experimental Physics and Industrial Control System (EPICS) with MIRA's interfaces. The goals of this integration are to allow functional interoperability and, later on, to replace the existing/legacy Detector Control System components at the service level. In this report, we describe the evolutionary integration process and, as an example, discuss the EPICS Alarm Handler conversion. We review the complete upgrade procedure, starting with the integration of EPICS-originated alarm signal propagation into MIRA, followed by the replacement of the existing operator interface based on the Motif Editor and Display Manager (MEDM) with a modern, portable web-based Alarm Handler interface. To achieve this aim, we built an EPICS-to-MQTT [8] bridging service and recreated the functionality of the original Alarm Handler using low-latency web messaging technologies. The integration of EPICS alarm handling into our messaging framework allowed STAR to improve the DCS alarm awareness of existing STAR DAQ and RTS services, which use MIRA as a primary source of experiment control information.
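The core of an EPICS-to-MQTT bridge is a translation step from alarm events to topics and payloads. The sketch below is hypothetical: the topic scheme and field names are invented for illustration and are not taken from the STAR bridge.

```python
# Hypothetical sketch of the bridging idea only: translate an EPICS-style
# alarm event into an MQTT topic and JSON payload. The "dcs/alarms/..." topic
# scheme and the payload field names are invented, not STAR's actual design.
import json

def alarm_to_mqtt(pv_name, severity, value, timestamp):
    """Map an alarm event to an (mqtt_topic, json_payload) pair."""
    topic = "dcs/alarms/" + pv_name.replace(":", "/")  # PV name -> topic levels
    payload = json.dumps({
        "pv": pv_name,
        "severity": severity,   # e.g. "MINOR", "MAJOR", "INVALID"
        "value": value,
        "ts": timestamp,
    })
    return topic, payload

topic, payload = alarm_to_mqtt("TPC:gas:pressure", "MAJOR", 1013.8, 1500000000)
print(topic)  # dcs/alarms/TPC/gas/pressure
```

Mapping PV name segments onto MQTT topic levels lets web clients subscribe with wildcards (for example, all alarms under one detector subsystem) without any knowledge of EPICS itself.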
Differentiated Technical Assistance for Sustainable Transformation. Technical Assistance Brief #2
ERIC Educational Resources Information Center
McCart, Amy; McSheehan, Michael; Sailor, Wayne
2015-01-01
Schoolwide Integrated Framework for Transformation (SWIFT) Center's technical assistance process supports states, districts, and schools as they become excellent and equitable teaching and learning environments for "all" students. Each school with support from its district begins this process from its own starting point and travels its…
Conceptualization of Enterprise Systems Education Using an Experiential Learning Framework
ERIC Educational Resources Information Center
Fathelrahman, Adil; Kabbar, Eltahir
2018-01-01
The use of enterprise systems to facilitate cross-functional integration within an organization's functional areas is becoming increasingly important. Business schools around the globe have realized the importance of using enterprise systems to facilitate the teaching of business processes and business processes transformation. The authors adopt…
Ecohydrological processes and ecosystem services in the Anthropocene: a review
Ge Sun; Dennis Hallema; Heidi Asbjornsen
2017-01-01
The framework for ecosystem services has been increasingly used in integrated watershed ecosystem management practices that involve scientists, engineers, managers, and policy makers. The objective of this review is to explore the intimate connections between ecohydrological processes and water-related ecosystem services in human-dominated ecosystems in the...
Complex multidisciplinary system composition for aerospace vehicle conceptual design
NASA Astrophysics Data System (ADS)
Gonzalez, Lex
Although there exists a vast amount of work concerning the analysis, design, and integration of aerospace vehicle systems, there is no standard for how this data and knowledge should be combined to create a synthesis system. Each institution creating a synthesis system has in-house vehicle and hardware components it is attempting to model and proprietary methods with which to model them. Consequently, synthesis systems begin as one-off creations meant to answer a specific problem. As the scope of the synthesis system grows to encompass more and more problems, so do its size and complexity; for a single synthesis system to answer multiple questions, the number of methods and method interfaces must increase. As a means to curtail the requirement that an increase in an aircraft synthesis system's capability leads to an increase in its size and complexity, this research effort focuses on the idea that each problem in aerospace requires its own analysis framework. By focusing on the creation of a methodology which centers on matching an analysis framework to the problem being solved, the complexity of the analysis framework is decoupled from the complexity of the system that creates it. The derived methodology allows for the composition of complex multidisciplinary systems (CMDS) through the automatic creation and implementation of system and disciplinary method interfaces. The CMDS Composition process follows a four-step methodology meant to take a problem definition and progress towards the creation of an analysis framework meant to answer said problem. The unique implementation of the CMDS Composition process takes user-selected disciplinary analysis methods and automatically integrates them to create a syntactically composable analysis framework. As a means of assessing the validity of the CMDS Composition process, a prototype system (AVDDBMS) has been developed.
AVDDBMS has been used to model the Generic Hypersonic Vehicle (GHV), an open source family of hypersonic vehicles originating from the Air Force Research Laboratory. AVDDBMS has been applied in three different ways in order to assess its validity: Verification using GHV disciplinary data, Validation using selected disciplinary analysis methods, and Application of the CMDS Composition Process to assess the design solution space for the GHV hardware. The research demonstrates the holistic effect that selection of individual disciplinary analysis methods has on the structure and integration of the analysis framework.
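The automatic interface creation at the heart of the CMDS Composition process can be sketched as dependency resolution over declared method inputs and outputs. This is a hedged illustration, not AVDDBMS itself; the method names and variables below are invented.

```python
# Illustrative sketch of interface-based composition: each disciplinary
# analysis method declares its inputs and outputs, and a simple resolver
# orders the methods so that every input is produced before it is consumed.
# Methods and variables are invented for this example.

methods = {
    "geometry":     {"inputs": set(),              "outputs": {"wing_area"}},
    "aerodynamics": {"inputs": {"wing_area"},      "outputs": {"lift", "drag"}},
    "propulsion":   {"inputs": {"drag"},           "outputs": {"thrust"}},
    "performance":  {"inputs": {"lift", "thrust"}, "outputs": {"range"}},
}

def compose(methods, known=frozenset()):
    """Return an execution order in which every method's inputs are available."""
    available, order, remaining = set(known), [], dict(methods)
    while remaining:
        ready = [m for m, spec in remaining.items() if spec["inputs"] <= available]
        if not ready:
            raise ValueError("cannot compose: unmet inputs " +
                             str({m: spec["inputs"] - available
                                  for m, spec in remaining.items()}))
        for m in sorted(ready):
            order.append(m)
            available |= remaining.pop(m)["outputs"]
    return order

print(compose(methods))
```

Because the ordering is derived from the declared interfaces rather than hand-wired, swapping one disciplinary method for another with the same inputs and outputs leaves the rest of the framework untouched, which is the decoupling the abstract argues for.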
Vago, David R; Silbersweig, David A
2012-01-01
Mindfulness, as a state, trait, process, type of meditation, and intervention, has proven to be beneficial across a diverse group of psychological disorders as well as for general stress reduction. Yet, there remains a lack of clarity in the operationalization of this construct, and underlying mechanisms. Here, we provide an integrative theoretical framework and systems-based neurobiological model that explains the mechanisms by which mindfulness reduces biases related to self-processing and creates a sustainable healthy mind. Mindfulness is described through systematic mental training that develops meta-awareness (self-awareness), an ability to effectively modulate one's behavior (self-regulation), and a positive relationship between self and other that transcends self-focused needs and increases prosocial characteristics (self-transcendence). This framework of self-awareness, -regulation, and -transcendence (S-ART) illustrates a method for becoming aware of the conditions that cause (and remove) distortions or biases. The development of S-ART through meditation is proposed to modulate self-specifying and narrative self-networks through an integrative fronto-parietal control network. Relevant perceptual, cognitive, emotional, and behavioral neuropsychological processes are highlighted as supporting mechanisms for S-ART, including intention and motivation, attention regulation, emotion regulation, extinction and reconsolidation, prosociality, non-attachment, and decentering. The S-ART framework and neurobiological model is based on our growing understanding of the mechanisms for neurocognition, empirical literature, and through dismantling the specific meditation practices thought to cultivate mindfulness. The proposed framework will inform future research in the contemplative sciences and target specific areas for development in the treatment of psychological disorders.
Framework for the Intelligent Transportation System (ITS) Evaluation : ITS Integration Activities
DOT National Transportation Integrated Search
2006-08-01
Intelligent Transportation Systems (ITS) represent a significant opportunity to improve the efficiency and safety of the surface transportation system. ITS includes technologies to support information processing, communications, surveillance and cont...
Climate Change, Nutrition, and Bottom-Up and Top-Down Food Web Processes.
Rosenblatt, Adam E; Schmitz, Oswald J
2016-12-01
Climate change ecology has focused on climate effects on trophic interactions through the lenses of temperature effects on organismal physiology and phenological asynchronies. Trophic interactions are also affected by the nutrient content of resources, but this topic has received less attention. Using concepts from nutritional ecology, we propose a conceptual framework for understanding how climate affects food webs through top-down and bottom-up processes impacted by co-occurring environmental drivers. The framework integrates climate effects on consumer physiology and feeding behavior with effects on resource nutrient content. It illustrates how studying responses of simplified food webs to simplified climate change might produce erroneous predictions. We encourage greater integrative complexity of climate change research on trophic interactions to resolve patterns and enhance predictive capacities. Copyright © 2016 Elsevier Ltd. All rights reserved.
Performance measurement: integrating quality management and activity-based cost management.
McKeon, T
1996-04-01
The development of an activity-based management system provides a framework for developing performance measures integral to quality and cost management. Performance measures that cross operational boundaries and embrace core processes provide a mechanism to evaluate operational results related to strategic intention and internal and external customers. The author discusses this measurement process that allows managers to evaluate where they are and where they want to be, and to set a course of action that closes the gap between the two.
Zhou, Ronggang; Chan, Alan H. S.
2016-01-01
BACKGROUND: In order to compare existing usability data to ideal goals or to that for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. OBJECTIVE: This paper presents a universal method of usability evaluation by combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. METHODS: With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed by using the fuzzy comprehensive evaluation technique model to characterize fuzzy human judgments. Then with the use of AHP, the weights of usability components were elicited from these experts. RESULTS AND CONCLUSIONS: Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process. PMID:28035943
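The combined AHP/fuzzy step can be shown numerically. The AHP weights and membership grades below are invented; the point is the aggregation itself: an overall appraisal vector is the weight vector applied to the fuzzy membership matrix (a weighted-average composition operator, one common choice).

```python
# Numerical sketch of fuzzy comprehensive evaluation with AHP-derived weights.
# All numbers are assumed for illustration, not taken from the paper.

weights = [0.5, 0.3, 0.2]   # AHP weights: effectiveness, efficiency, satisfaction
membership = [              # membership grades over ("poor", "fair", "good")
    [0.1, 0.3, 0.6],        # effectiveness
    [0.2, 0.5, 0.3],        # efficiency
    [0.4, 0.4, 0.2],        # satisfaction
]

def fuzzy_appraisal(weights, membership):
    """Weighted-average composition: b_j = sum_i w_i * r_ij per appraisal level j."""
    levels = len(membership[0])
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(levels)]

appraisal = fuzzy_appraisal(weights, membership)
print(appraisal)   # the highest grade falls on "good" for these inputs
```

Unlike a single crisp score, the resulting vector preserves how strongly the product belongs to each appraisal level, which is how the method carries judgment uncertainty through to the final index.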
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meshkati, N.; Buller, B.J.; Azadeh, M.A.
1995-04-01
The goal of this research is threefold: (1) use of the Skill-, Rule-, and Knowledge-based levels of cognitive control -- the SRK framework -- to develop an integrated information processing conceptual framework (for integration of workstation, job, and team design); (2) to evaluate the user interface component of this framework -- the Ecological display; and (3) to analyze the effect of operators' individual information processing behavior and decision styles on handling plant disturbances, plus their performance on, and preference for, Traditional and Ecological user interfaces. A series of studies were conducted. In Part I, a computer simulation model and a mathematical model were developed. In Part II, an experiment was designed and conducted at the EBR-II plant of the Argonne National Laboratory-West in Idaho Falls, Idaho. It is concluded that: the integrated SRK-based information processing model for control room operations is superior to the conventional rule-based model; operators' individual decision styles and the combination of their styles play a significant role in effective handling of nuclear power plant disturbances; use of the Ecological interface results in significantly more accurate event diagnosis and recall of various plant parameters, faster response to plant transients, and higher ratings of subject preference; and operators' decision styles affect both their performance and their preference for the Ecological interface.
Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago
2016-01-01
Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process.
Simple model averaging is used to integrate across the results and produce a single assessment that considers the multiple sources of uncertainty.
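The final reconciliation step can be sketched as simple model averaging. Weighting by AIC differences is one common choice (the framework itself is more general), and the biomass estimates and AIC values below are invented for illustration.

```python
# Minimal sketch of model averaging across candidate stock-assessment fits.
# AIC-based Akaike weights are an assumed weighting scheme; the numbers are
# hypothetical, not from the Iberian hake assessment.
import math

def aic_weights(aics):
    """Akaike weights: w_i proportional to exp(-0.5 * (AIC_i - AIC_min))."""
    best = min(aics)
    raw = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

def model_average(estimates, aics):
    """Weighted average of one quantity across candidate models."""
    return sum(w * e for w, e in zip(aic_weights(aics), estimates))

biomass = [12000.0, 15000.0, 13500.0]   # estimates from three model variants
aics = [210.0, 212.0, 211.0]
avg = model_average(biomass, aics)
print(round(avg, 1))
```

The averaged estimate reflects every plausible model in proportion to its support, rather than discarding all but the single 'best' fit, which is the underreporting problem the abstract identifies.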
Social research design: framework for integrating philosophical and practical elements.
Cunningham, Kathryn Burns
2014-09-01
To provide and elucidate a comprehensible framework for the design of social research. An abundance of information exists concerning the process of designing social research. The overall message that can be gleaned is that numerous elements - both philosophical (ontological and epistemological assumptions and theoretical perspective) and practical (issue to be addressed, purpose, aims and research questions) - are influential in the process of selecting a research methodology and methods, and that these elements and their inter-relationships must be considered and explicated to ensure a coherent research design that enables well-founded and meaningful conclusions. There is a lack of guidance concerning the integration of practical and philosophical elements, hindering their consideration and explication. The author's PhD research into loneliness and cancer. This is a methodology paper. A guiding framework that incorporates all of the philosophical and practical elements influential in social research design is presented. The chronological and informative relationships between the elements are discussed. The framework presented can be used by social researchers to consider and explicate the practical and philosophical elements influential in the selection of a methodology and methods. It is hoped that the framework presented will aid social researchers with the design and the explication of the design of their research, thereby enhancing the credibility of their projects and enabling their research to establish well-founded and meaningful conclusions.
Integrating count and detection–nondetection data to model population dynamics
Zipkin, Elise F.; Rossman, Sam; Yackulic, Charles B.; Wiens, David; Thorson, James T.; Davis, Raymond J.; Grant, Evan H. Campbell
2017-01-01
There is increasing need for methods that integrate multiple data types into a single analytical framework as the spatial and temporal scale of ecological research expands. Current work on this topic primarily focuses on combining capture–recapture data from marked individuals with other data types into integrated population models. Yet, studies of species distributions and trends often rely on data from unmarked individuals across broad scales where local abundance and environmental variables may vary. We present a modeling framework for integrating detection–nondetection and count data into a single analysis to estimate population dynamics, abundance, and individual detection probabilities during sampling. Our dynamic population model assumes that site-specific abundance can change over time according to survival of individuals and gains through reproduction and immigration. The observation process for each data type is modeled by assuming that every individual present at a site has an equal probability of being detected during sampling processes. We examine our modeling approach through a series of simulations illustrating the relative value of count vs. detection–nondetection data under a variety of parameter values and survey configurations. We also provide an empirical example of the model by combining long-term detection–nondetection data (1995–2014) with newly collected count data (2015–2016) from a growing population of Barred Owl (Strix varia) in the Pacific Northwest to examine the factors influencing population abundance over time. Our model provides a foundation for incorporating unmarked data within a single framework, even in cases where sampling processes yield different detection probabilities. This approach will be useful for survey design and to researchers interested in incorporating historical or citizen science data into analyses focused on understanding how demographic rates drive population abundance.
Integrating count and detection-nondetection data to model population dynamics.
Zipkin, Elise F; Rossman, Sam; Yackulic, Charles B; Wiens, J David; Thorson, James T; Davis, Raymond J; Grant, Evan H Campbell
2017-06-01
There is increasing need for methods that integrate multiple data types into a single analytical framework as the spatial and temporal scale of ecological research expands. Current work on this topic primarily focuses on combining capture-recapture data from marked individuals with other data types into integrated population models. Yet, studies of species distributions and trends often rely on data from unmarked individuals across broad scales where local abundance and environmental variables may vary. We present a modeling framework for integrating detection-nondetection and count data into a single analysis to estimate population dynamics, abundance, and individual detection probabilities during sampling. Our dynamic population model assumes that site-specific abundance can change over time according to survival of individuals and gains through reproduction and immigration. The observation process for each data type is modeled by assuming that every individual present at a site has an equal probability of being detected during sampling processes. We examine our modeling approach through a series of simulations illustrating the relative value of count vs. detection-nondetection data under a variety of parameter values and survey configurations. We also provide an empirical example of the model by combining long-term detection-nondetection data (1995-2014) with newly collected count data (2015-2016) from a growing population of Barred Owl (Strix varia) in the Pacific Northwest to examine the factors influencing population abundance over time. Our model provides a foundation for incorporating unmarked data within a single framework, even in cases where sampling processes yield different detection probabilities. This approach will be useful for survey design and to researchers interested in incorporating historical or citizen science data into analyses focused on understanding how demographic rates drive population abundance. 
© 2017 by the Ecological Society of America.
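The shared observation process described above can be sketched as two likelihood terms tied to the same latent abundance N and per-individual detection probability p: counts follow a Binomial(N, p), while a detection-nondetection survey records "at least one individual seen" with probability 1 - (1 - p)^N. A stdlib sketch (not the authors' code; values are illustrative):

```python
import math

# Both data types depend on the same latent site abundance N and
# per-individual detection probability p (illustrative sketch).

def loglik_count(y, N, p):
    """Binomial log-likelihood of counting y of N individuals."""
    if y > N:
        return float("-inf")
    return (math.log(math.comb(N, y))
            + y * math.log(p) + (N - y) * math.log(1.0 - p))

def loglik_detnondet(z, N, p):
    """Log-likelihood of detection (z=1) / nondetection (z=0):
    at least one individual is seen with probability 1-(1-p)**N."""
    return math.log(1.0 - (1.0 - p) ** N) if z == 1 else N * math.log(1.0 - p)

# A joint analysis simply sums the terms contributed by each survey:
N, p = 12, 0.3
joint = loglik_count(4, N, p) + loglik_detnondet(1, N, p)
```

With N = 1 the two observation models coincide (detecting "at least one" of one individual is the same event as counting it), which is a useful sanity check on the shared-parameter structure.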
IVHM Framework for Intelligent Integration for Vehicle Health Management
NASA Technical Reports Server (NTRS)
Paris, Deidre; Trevino, Luis C.; Watson, Michael D.
2005-01-01
Integrated Vehicle Health Management (IVHM) for aerospace vehicles is the process of assessing, preserving, and restoring system functionality across flight systems, combining techniques with sensor and communication technologies for spacecraft that can generate responses through detection, diagnosis, and reasoning, and adapt to system faults in support of Integrated Intelligent Vehicle Management (IIVM). These real-time responses allow the IIVM to modify the affected vehicle subsystem(s) prior to a catastrophic event. Furthermore, this framework integrates technologies which can provide a continuous, intelligent, and adaptive health state of a vehicle and use this information to improve safety and reduce costs of operations. Recent investments in avionics, health management, and controls have been directed towards IIVM. As this concept has matured, it has become clear that IIVM requires the same sensors and processing capabilities as the real-time avionics functions to support diagnosis of subsystem problems. New sensors have also been proposed to augment the avionics sensors and support better system monitoring and diagnostics. As the designs have been considered, a synergy has been realized where the real-time avionics can utilize sensors proposed for diagnostics and prognostics to make better real-time decisions in response to detected failures. IIVM provides for a single system allowing modularity of functions and hardware across the vehicle. The framework that supports IIVM consists of 11 major on-board functions necessary to fully manage a space vehicle while maintaining crew safety and mission objectives. These systems include the following: Guidance and Navigation; Communications and Tracking; Vehicle Monitoring; Information Transport and Integration; Vehicle Diagnostics; Vehicle Prognostics; Vehicle Mission Planning; Automated Repair and Replacement; Vehicle Control; Human Computer Interface; and Onboard Verification and Validation.
Furthermore, the presented framework provides complete vehicle management which not only allows for increased crew safety and mission success through new intelligence capabilities, but also yields a mechanism for more efficient vehicle operations.
Cross-scale integration of knowledge for predicting species ranges: a metamodeling framework
Talluto, Matthew V.; Boulangeat, Isabelle; Ameztegui, Aitor; Aubin, Isabelle; Berteaux, Dominique; Butler, Alyssa; Doyon, Frédérik; Drever, C. Ronnie; Fortin, Marie-Josée; Franceschini, Tony; Liénard, Jean; McKenney, Dan; Solarik, Kevin A.; Strigul, Nikolay; Thuiller, Wilfried; Gravel, Dominique
2016-01-01
Aim: Current interest in forecasting changes to species ranges has resulted in a multitude of approaches to species distribution models (SDMs). However, most approaches include only a small subset of the available information, and many ignore smaller-scale processes such as growth, fecundity, and dispersal. Furthermore, different approaches often produce divergent predictions with no simple method to reconcile them. Here, we present a flexible framework for integrating models at multiple scales using hierarchical Bayesian methods. Location: Eastern North America (as an example). Methods: Our framework builds a metamodel that is constrained by the results of multiple sub-models and provides probabilistic estimates of species presence. We applied our approach to a simulated dataset to demonstrate the integration of a correlative SDM with a theoretical model. In a second example, we built an integrated model combining the results of a physiological model with presence-absence data for sugar maple (Acer saccharum), an abundant tree native to eastern North America. Results: For both examples, the integrated models successfully included information from all data sources and substantially improved the characterization of uncertainty. For the second example, the integrated model outperformed the source models with respect to uncertainty when modelling the present range of the species. When projecting into the future, the model provided a consensus view of two models that differed substantially in their predictions. Uncertainty was reduced where the models agreed and was greater where they diverged, providing a more realistic view of the state of knowledge than either source model. Main conclusions: We conclude by discussing the potential applications of our method and its accessibility to applied ecologists. In ideal cases, our framework can be easily implemented using off-the-shelf software.
The framework has wide potential for use in species distribution modelling and can drive better integration of multi-source and multi-scale data into ecological decision-making. PMID:27499698
Cross-scale integration of knowledge for predicting species ranges: a metamodeling framework.
Talluto, Matthew V; Boulangeat, Isabelle; Ameztegui, Aitor; Aubin, Isabelle; Berteaux, Dominique; Butler, Alyssa; Doyon, Frédérik; Drever, C Ronnie; Fortin, Marie-Josée; Franceschini, Tony; Liénard, Jean; McKenney, Dan; Solarik, Kevin A; Strigul, Nikolay; Thuiller, Wilfried; Gravel, Dominique
2016-02-01
Current interest in forecasting changes to species ranges has resulted in a multitude of approaches to species distribution models (SDMs). However, most approaches include only a small subset of the available information, and many ignore smaller-scale processes such as growth, fecundity, and dispersal. Furthermore, different approaches often produce divergent predictions with no simple method to reconcile them. Here, we present a flexible framework for integrating models at multiple scales using hierarchical Bayesian methods. Eastern North America (as an example). Our framework builds a metamodel that is constrained by the results of multiple sub-models and provides probabilistic estimates of species presence. We applied our approach to a simulated dataset to demonstrate the integration of a correlative SDM with a theoretical model. In a second example, we built an integrated model combining the results of a physiological model with presence-absence data for sugar maple (Acer saccharum), an abundant tree native to eastern North America. For both examples, the integrated models successfully included information from all data sources and substantially improved the characterization of uncertainty. For the second example, the integrated model outperformed the source models with respect to uncertainty when modelling the present range of the species. When projecting into the future, the model provided a consensus view of two models that differed substantially in their predictions. Uncertainty was reduced where the models agreed and was greater where they diverged, providing a more realistic view of the state of knowledge than either source model. We conclude by discussing the potential applications of our method and its accessibility to applied ecologists. In ideal cases, our framework can be easily implemented using off-the-shelf software.
The framework has wide potential for use in species distribution modelling and can drive better integration of multi-source and multi-scale data into ecological decision-making.
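How an integrated model narrows uncertainty where sources agree can be illustrated with precision-weighted pooling of independent Gaussian estimates, a deliberately simplified stand-in for the paper's hierarchical Bayesian metamodel (the sub-model means and variances below are hypothetical):

```python
# Pool independent Gaussian estimates (mean, variance); precision
# (1/variance) acts as the weight. This mimics, in miniature, how a
# metamodel constrained by several sub-models shrinks its uncertainty.

def combine_gaussian(estimates):
    """Return the precision-weighted (mean, variance) of the inputs."""
    total_precision = sum(1.0 / v for _, v in estimates)
    mean = sum(m / v for m, v in estimates) / total_precision
    return mean, 1.0 / total_precision

correlative_sdm = (0.6, 0.04)   # hypothetical occurrence-probability estimates
physiological = (0.7, 0.09)
mean, var = combine_gaussian([correlative_sdm, physiological])
# pooled variance is smaller than either source's variance
```

Where the sub-models disagree strongly, the true posterior would be wider than this pooled variance suggests; the paper's hierarchical approach captures that inflation, which this simple formula does not.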
Knowledge mapping as a technique to support knowledge translation.
Ebener, S.; Khan, A.; Shademani, R.; Compernolle, L.; Beltran, M.; Lansang, Ma; Lippman, M.
2006-01-01
This paper explores the possibility of integrating knowledge mapping into a conceptual framework that could serve as a tool for understanding the many complex processes, resources and people involved in a health system, and for identifying potential gaps within knowledge translation processes in order to address them. After defining knowledge mapping, this paper presents various examples of the application of this process in health, before looking at the steps that need to be taken to identify potential gaps, to determine to what extent these gaps affect the knowledge translation process and to establish their cause. This is followed by proposals for interventions aimed at strengthening the overall process. Finally, potential limitations on the application of this framework at the country level are addressed. PMID:16917651
Close relationship processes and health: implications of attachment theory for health and disease.
Pietromonaco, Paula R; Uchino, Bert; Dunkel Schetter, Christine
2013-05-01
Health psychology has contributed significantly to understanding the link between psychological factors and health and well-being, but it has not often incorporated advances in relationship science into hypothesis generation and study design. We present one example of a theoretical model, following from a major relationship theory (attachment theory) that integrates relationship constructs and processes with biopsychosocial processes and health outcomes. We briefly describe attachment theory and present a general framework linking it to dyadic relationship processes (relationship behaviors, mediators, and outcomes) and health processes (physiology, affective states, health behavior, and health outcomes). We discuss the utility of the model for research in several health domains (e.g., self-regulation of health behavior, pain, chronic disease) and its implications for interventions and future research. This framework revealed important gaps in knowledge about relationships and health. Future work in this area will benefit from taking into account individual differences in attachment, adopting a more explicit dyadic approach, examining more integrated models that test for mediating processes, and incorporating a broader range of relationship constructs that have implications for health. A theoretical framework for studying health that is based in relationship science can accelerate progress by generating new research directions designed to pinpoint the mechanisms through which close relationships promote or undermine health. Furthermore, this knowledge can be applied to develop more effective interventions to help individuals and their relationship partners with health-related challenges. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Close Relationship Processes and Health: Implications of Attachment Theory for Health and Disease
Pietromonaco, Paula R.; Uchino, Bert; Dunkel Schetter, Christine
2013-01-01
Objectives: Health psychology has contributed significantly to understanding the link between psychological factors and health and well-being, but it has not often incorporated advances in relationship science into hypothesis generation and study design. We present one example of a theoretical model following from a major relationship theory (attachment theory) that integrates relationship constructs and processes with biopsychosocial processes and health outcomes. Methods: We briefly describe attachment theory and present a general framework linking it to dyadic relationship processes (relationship behaviors, mediators and outcomes) and health processes (physiology, affective states, health behavior and health outcomes). We discuss the utility of the model for research in several health domains (e.g., self-regulation of health behavior, pain, chronic disease) and its implications for interventions and future research. Results: This framework revealed important gaps in knowledge about relationships and health. Future work in this area will benefit from taking into account individual differences in attachment, adopting a more explicit dyadic approach, examining more integrated models that test for mediating processes, and incorporating a broader range of relationship constructs that have implications for health. Conclusions: A theoretical framework for studying health that is based in relationship science can accelerate progress by generating new research directions designed to pinpoint the mechanisms through which close relationships promote or undermine health. Furthermore, this knowledge can be applied to develop more effective interventions to help individuals and their relationship partners with health-related challenges. PMID:23646833
Hamm, Jay A; Hasson-Ohayon, Ilanit; Kukla, Marina; Lysaker, Paul H
2013-01-01
Although the role and relative prominence of psychotherapy in the treatment of schizophrenia has fluctuated over time, an analysis of the history of psychotherapy for schizophrenia, focusing on findings from the recovery movement, reveals recent trends including the emergence of the development of integrative psychotherapy approaches. The authors suggest that the recovery movement has revealed limitations in traditional approaches to psychotherapy, and has provided opportunities for integrative approaches to emerge as a mechanism for promoting recovery in persons with schizophrenia. Five approaches to integrative psychotherapy for persons with schizophrenia are presented, and a shared conceptual framework that allows these five approaches to be compatible with one another is proposed. The conceptual framework is consistent with theories of recovery and emphasizes interpersonal attachment, personal narrative, and metacognitive processes. Implications for future research on integrative psychotherapy are considered. PMID:23950665
The experience of agency: an interplay between prediction and postdiction
Synofzik, Matthis; Vosgerau, Gottfried; Voss, Martin
2013-01-01
The experience of agency, i.e., the registration that I am the initiator of my actions, is a basic and constant underpinning of our interaction with the world. Whereas several accounts have underlined predictive processes as the central mechanism (e.g., the comparator model by C. Frith), others emphasized postdictive inferences (e.g., post-hoc inference account by D. Wegner). Based on increasing evidence that both predictive and postdictive processes contribute to the experience of agency, we here present a unifying but at the same time parsimonious approach that reconciles these accounts: predictive and postdictive processes are both integrated by the brain according to the principles of optimal cue integration. According to this framework, predictive and postdictive processes each serve as authorship cues that are continuously integrated and weighted depending on their availability and reliability in a given situation. Both sensorimotor and cognitive signals can serve as predictive cues (e.g., internal predictions based on an efferency copy of the motor command or cognitive anticipations based on priming). Similarly, other sensorimotor and cognitive cues can each serve as post-hoc cues (e.g., visual feedback of the action or the affective valence of the action outcome). Integration and weighting of these cues might not only differ between contexts and individuals, but also between different subject and disease groups. For example, schizophrenia patients with delusions of influence seem to rely less on (probably imprecise) predictive motor signals of the action and more on post-hoc action cues like e.g., visual feedback and, possibly, the affective valence of the action outcome. Thus, the framework of optimal cue integration offers a promising approach that directly stimulates a wide range of experimentally testable hypotheses on agency processing in different subject groups. PMID:23508565
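The optimal cue integration principle invoked above has a standard closed form: each cue contributes in proportion to its reliability, i.e. its inverse variance. A minimal sketch, with hypothetical agency-cue values on a 0-1 "self-caused" scale:

```python
# Reliability-weighted (inverse-variance) cue combination, the standard
# form of optimal cue integration. Cue values and variances below are
# hypothetical illustrations of predictive vs. postdictive authorship cues.

def integrate_cues(cues):
    """Combine (value, variance) pairs; weight of each cue is 1/variance."""
    weights = [1.0 / var for _, var in cues]
    total = sum(weights)
    return sum(w * val for (val, _), w in zip(cues, weights)) / total

predictive = (0.9, 0.05)   # efference-copy prediction, fairly reliable here
postdictive = (0.4, 0.20)  # visual feedback of the action, noisier here
agency = integrate_cues([predictive, postdictive])  # ≈ 0.8: reliable cue dominates
```

Raising the predictive cue's variance (as hypothesized for patients with delusions of influence) shifts the integrated estimate toward the postdictive cue, which is exactly the re-weighting the framework predicts.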
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auld, Joshua; Hope, Michael; Ley, Hubert
This paper discusses the development of an agent-based modelling software development kit, and the implementation and validation of a model using it that integrates dynamic simulation of travel demand, network supply and network operations. A description is given of the core utilities in the kit: a parallel discrete event engine, interprocess exchange engine, and memory allocator, as well as a number of ancillary utilities: visualization library, database IO library, and scenario manager. The overall framework emphasizes the design goals of: generality, code agility, and high performance. This framework allows the modeling of several aspects of the transportation system that are typically done with separate stand-alone software applications, in a high-performance and extensible manner. The issue of integrating such models as dynamic traffic assignment and disaggregate demand models has been a long-standing issue for transportation modelers. The integrated approach shows a possible way to resolve this difficulty. The simulation model built from the POLARIS framework is a single, shared-memory process for handling all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on various network events such as accidents, congestion and weather events, show the potential of the system.
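The discrete event engine at the kit's core can be sketched, in miniature, as a priority queue that fires actions in timestamp order (a single-threaded stdlib sketch; the actual POLARIS engine is parallel and adds interprocess exchange on top):

```python
import heapq
import itertools

# Miniature discrete-event core: events are (time, action) pairs
# executed strictly in timestamp order.

class EventEngine:
    def __init__(self):
        self._queue = []
        self._tie = itertools.count()  # breaks ties between equal timestamps
        self.now = 0.0

    def schedule(self, time, action):
        heapq.heappush(self._queue, (time, next(self._tie), action))

    def run(self):
        while self._queue:
            self.now, _, action = heapq.heappop(self._queue)
            action(self)  # actions may schedule further events

log = []
eng = EventEngine()
eng.schedule(5.0, lambda e: log.append(("arrive", e.now)))
eng.schedule(2.0, lambda e: log.append(("depart", e.now)))
eng.run()  # fires in timestamp order: depart at 2.0, then arrive at 5.0
```

Because actions receive the engine, an event handler can schedule follow-on events (an arrival scheduling the next arrival, for example), which is the basic loop of any traffic or demand simulation built on such an engine.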
COBRApy: COnstraints-Based Reconstruction and Analysis for Python.
Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R
2013-08-08
COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
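COBRApy itself solves a linear program over a stoichiometric matrix; as a stdlib-only illustration of the constraint-based idea (this is not the COBRApy API), consider a linear pathway at steady state, where every reaction must carry the same flux and the maximal objective flux is therefore the tightest reaction bound. Reaction names and bounds below are hypothetical:

```python
# Stdlib-only illustration of constraint-based reasoning (NOT the
# COBRApy API). At steady state every reaction in a linear pathway
# A -> B -> C carries the same flux, so maximising the objective flux
# reduces to finding the tightest upper bound along the chain.

def max_linear_pathway_flux(upper_bounds):
    """Maximal steady-state flux through a chain of reactions."""
    return min(upper_bounds)

# Hypothetical reaction bounds (mmol/gDW/h):
bounds = {
    "EX_A_uptake": 10.0,
    "A_to_B": 8.5,
    "B_to_C": 1000.0,
    "EX_C_secretion": 1000.0,
}
optimum = max_linear_pathway_flux(bounds.values())  # limited by A_to_B
```

In COBRApy proper, the analogous step builds a `Model` from `Reaction` and `Metabolite` objects and calls `model.optimize()`, delegating the general (branched) case to a linear-programming solver.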
Altieri, Nicholas; Pisoni, David B.; Townsend, James T.
2012-01-01
Summerfield (1987) proposed several accounts of audiovisual speech perception, a field of research that has burgeoned in recent years. The proposed accounts included the integration of discrete phonetic features, vectors describing the values of independent acoustical and optical parameters, the filter function of the vocal tract, and articulatory dynamics of the vocal tract. The latter two accounts assume that the representations of audiovisual speech perception are based on abstract gestures, while the former two assume that the representations consist of symbolic or featural information obtained from visual and auditory modalities. Recent converging evidence from several different disciplines reveals that the general framework of Summerfield’s feature-based theories should be expanded. An updated framework building upon the feature-based theories is presented. We propose a processing model arguing that auditory and visual brain circuits provide facilitatory information when the inputs are correctly timed, and that auditory and visual speech representations do not necessarily undergo translation into a common code during information processing. Future research on multisensory processing in speech perception should investigate the connections between auditory and visual brain regions, and utilize dynamic modeling tools to further understand the timing and information processing mechanisms involved in audiovisual speech integration. PMID:21968081
Altieri, Nicholas; Pisoni, David B; Townsend, James T
2011-01-01
Summerfield (1987) proposed several accounts of audiovisual speech perception, a field of research that has burgeoned in recent years. The proposed accounts included the integration of discrete phonetic features, vectors describing the values of independent acoustical and optical parameters, the filter function of the vocal tract, and articulatory dynamics of the vocal tract. The latter two accounts assume that the representations of audiovisual speech perception are based on abstract gestures, while the former two assume that the representations consist of symbolic or featural information obtained from visual and auditory modalities. Recent converging evidence from several different disciplines reveals that the general framework of Summerfield's feature-based theories should be expanded. An updated framework building upon the feature-based theories is presented. We propose a processing model arguing that auditory and visual brain circuits provide facilitatory information when the inputs are correctly timed, and that auditory and visual speech representations do not necessarily undergo translation into a common code during information processing. Future research on multisensory processing in speech perception should investigate the connections between auditory and visual brain regions, and utilize dynamic modeling tools to further understand the timing and information processing mechanisms involved in audiovisual speech integration.
NASA Astrophysics Data System (ADS)
Saunders, Vance M.
1999-06-01
The downsizing of the Department of Defense (DoD) and the associated reduction in budgets has re-emphasized the need for commonality, reuse, and standards with respect to the way DoD does business. DoD has implemented significant changes in how it buys weapon systems. The new emphasis is on concurrent engineering with Integrated Product and Process Development and collaboration with Integrated Product Teams. The new DoD vision includes Simulation Based Acquisition (SBA), a process supported by robust, collaborative use of simulation technology that is integrated across acquisition phases and programs. This paper discusses the Air Force Research Laboratory's efforts to use Modeling and Simulation (M&S) resources within a Collaborative Enterprise Environment to support SBA and other Collaborative Enterprise and Virtual Prototyping (CEVP) applications. The paper will discuss four technology areas: (1) a Processing Ontology that defines a hierarchically nested set of collaboration contexts needed to organize and support multi-disciplinary collaboration using M&S, (2) a partial taxonomy of intelligent agents needed to manage different M&S resource contributions to advancing the state of product development, (3) an agent-based process for interfacing disparate M&S resources into a CEVP framework, and (4) a Model-View-Controller-based approach to defining 'a new way of doing business' for users of CEVP frameworks/systems.
Digital case-based learning system in school.
Gu, Peipei; Guo, Jiayang
2017-01-01
With the continuing growth of multi-media learning resources, it is important to offer methods helping learners to explore and acquire relevant learning information effectively. As a service that organizes multi-media learning materials to support programming learning, a digital case-based learning system is needed. In order to create a case-oriented e-learning system, this paper concentrates on the digital case study of multi-media resources and learning processes within an integrated framework. An integration of multi-media resources, testing and learning strategies recommendation as the learning unit is proposed in the digital case-based learning framework. The learning mechanism of learning guidance, multi-media materials learning and testing feedback is supported in our project. An improved personalized genetic algorithm which incorporates preference information and usage degree into the crossover and mutation process is proposed to assemble the personalized test sheet for each learner. A recommendation solution is proposed to suggest learning strategies that help learners learn. Experiments were conducted to show that the proposed approaches are capable of constructing personalized test sheets and that the framework is effective.
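The test-sheet assembly step can be sketched as a small genetic algorithm over subsets of an item bank (stdlib Python; item ids, difficulties, and the preference-matching fitness are hypothetical, and this sketch uses standard crossover and mutation rather than the paper's preference- and usage-aware operators):

```python
import random

# GA sketch: evolve a test sheet (subset of item ids) whose item
# difficulties match a learner's preference. Illustrative only.

def assemble_sheet(item_bank, sheet_size, prefer, generations=200,
                   pop_size=30, seed=1):
    rng = random.Random(seed)
    ids = list(item_bank)

    def fitness(sheet):
        # Closer difficulty match to the learner's preference is better.
        return -sum(abs(item_bank[i] - prefer) for i in sheet)

    population = [rng.sample(ids, sheet_size) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, sheet_size)           # one-point crossover
            child = (a[:cut] + [i for i in b if i not in a[:cut]])[:sheet_size]
            if rng.random() < 0.1:                       # swap-in mutation
                unused = [i for i in ids if i not in child]
                child[rng.randrange(sheet_size)] = rng.choice(unused)
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)

bank = {f"q{k}": k / 10 for k in range(10)}  # item id -> difficulty
sheet = assemble_sheet(bank, sheet_size=3, prefer=0.5)
```

The paper's improvement would replace the uniform `rng.choice` / `rng.randrange` steps with draws biased by learner preference and item usage degree.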
Evaluating the impact of virtualization characteristics on SaaS adoption
NASA Astrophysics Data System (ADS)
Tomás, Sara; Thomas, Manoj; Oliveira, Tiago
2018-03-01
Software as a service (SaaS) is a service model in which applications are accessible from various client devices through the internet. Several studies report possible factors driving the adoption of SaaS, but none have considered the perception of SaaS features together with the organization's context. We propose an integrated research model that combines the process virtualization theory (PVT), the technology-organization-environment (TOE) framework and institutional theory (INT). PVT seeks to explain whether processes are suitable for migration into virtual environments via an information technology-based mechanism such as SaaS. The TOE framework seeks to explain the effects of the intra-organizational factors, while INT seeks to explain the effects of the inter-organizational factors on technology adoption. This research addresses a gap in the SaaS adoption literature by studying the internal perception of the technical features of SaaS and the technology, organization, and environment perspectives. Additionally, the integration of PVT, the TOE framework, and INT contributes to the information system (IS) discipline, deepening the applicability and strengths of these theories.
Digital case-based learning system in school
Gu, Peipei
2017-01-01
With the continuing growth of multi-media learning resources, it is important to offer methods helping learners to explore and acquire relevant learning information effectively. As a service that organizes multi-media learning materials to support programming learning, a digital case-based learning system is needed. In order to create a case-oriented e-learning system, this paper concentrates on the digital case study of multi-media resources and learning processes within an integrated framework. An integration of multi-media resources, testing and learning strategies recommendation as the learning unit is proposed in the digital case-based learning framework. The learning mechanism of learning guidance, multi-media materials learning and testing feedback is supported in our project. An improved personalized genetic algorithm which incorporates preference information and usage degree into the crossover and mutation process is proposed to assemble the personalized test sheet for each learner. A recommendation solution is proposed to suggest learning strategies that help learners learn. Experiments were conducted to show that the proposed approaches are capable of constructing personalized test sheets and that the framework is effective. PMID:29107965
Semantic and Syntactic Bases of Text Comprehension.
1985-07-25
processing. Psychological Review, 82, 407-428. Craik, F. & Lockhart, R. (1972). Levels of processing: A framework for memory research. Journal of...development, 55, 2083-2093. BBN Laboratories Incorporated. Perfetti, C. (1979). Levels of language and levels of processing. In L. Cermak & F. Craik ... processing cycle. Thus, the activation level of those representations that are used in ongoing cycles of integration (e.g. those related to the central
Norrman, Jenny; Volchko, Yevheniya; Hooimeijer, Fransje; Maring, Linda; Kain, Jaan-Henrik; Bardos, Paul; Broekx, Steven; Beames, Alistair; Rosén, Lars
2016-09-01
This paper presents a holistic approach to sustainable urban brownfield redevelopment where specific focus is put on the integration of a multitude of subsurface qualities in the early phases of the urban redevelopment process, i.e. in the initiative and plan phases. Achieving sustainability in brownfield redevelopment projects may be constrained by a failure of engagement between two key expert constituencies: urban planners/designers and subsurface engineers, leading to missed opportunities and unintended outcomes in the plan realisation phase. A more integrated approach delivers greater benefits. Three case studies in the Netherlands, Belgium and Sweden were used to test different sustainability assessment instruments in terms of the possibility for knowledge exchange between the subsurface and the surface sectors and in terms of cooperative learning among experts and stakeholders. Based on the lessons learned from the case studies, a generic decision process framework is suggested that supports holistic decision making. The suggested framework focuses on stakeholder involvement, communication, knowledge exchange and learning and provides an inventory of instruments that can support these processes. Copyright © 2016 Elsevier B.V. All rights reserved.
AthenaMT: upgrading the ATLAS software framework for the many-core world with multi-threading
NASA Astrophysics Data System (ADS)
Leggett, Charles; Baines, John; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; van Gemmeren, Peter; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; ATLAS Collaboration
2017-10-01
ATLAS’s current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single-threaded design has been recognized for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run 2. After concluding a rigorous requirements phase, in which many design components were examined in detail, ATLAS has begun the migration to a new data-flow-driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread-unsafe legacy Algorithms; cloned Algorithms that execute concurrently in their own threads with different Event contexts; and fully re-entrant, thread-safe Algorithms. In this paper we report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying handling of features such as event- and time-dependent data, asynchronous callbacks, metadata, integration with the online High Level Trigger for partial processing in certain regions of interest, and concurrent I/O, as well as ensuring the thread safety of core services. We also report on upgrading the framework to handle Algorithms that are fully re-entrant.
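AthenaMT itself is a C++ framework built on Gaudi; purely as an illustration of the scheduling distinction the abstract draws, the Python sketch below contrasts a thread-unsafe legacy algorithm (serialized behind a lock) with a re-entrant one (no mutable state), while a thread pool processes several events concurrently. All class and function names are hypothetical.

```python
import threading
from concurrent.futures import ThreadPoolExecutor

class LegacyAlg:
    # Thread-unsafe singleton: it keeps mutable scratch state, so concurrent
    # calls must be serialized behind a lock (the "legacy Algorithm" case).
    def __init__(self):
        self.lock = threading.Lock()
        self.scratch = None

    def execute(self, event):
        with self.lock:
            self.scratch = event * 2
            return self.scratch

class ReentrantAlg:
    # Fully re-entrant: no mutable state, safe to run concurrently as-is.
    def execute(self, event):
        return event + 1

def process(event, legacy, reentrant):
    # Each task carries its own event context (here simply the event number).
    return legacy.execute(event) + reentrant.execute(event)

legacy, reentrant = LegacyAlg(), ReentrantAlg()
with ThreadPoolExecutor(max_workers=4) as pool:
    results = sorted(pool.map(lambda e: process(e, legacy, reentrant), range(8)))
```

The middle case in the abstract (cloned Algorithms, one instance per thread) would correspond to giving each worker its own `LegacyAlg` instance instead of sharing one behind a lock.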
A framework for integration of scientific applications into the OpenTopography workflow
NASA Astrophysics Data System (ADS)
Nandigam, V.; Crosby, C.; Baru, C.
2012-12-01
The NSF-funded OpenTopography facility provides online access to Earth science-oriented high-resolution LIDAR topography data, online processing tools, and derivative products. The underlying cyberinfrastructure employs a multi-tier service-oriented architecture comprised of an infrastructure tier, a processing services tier, and an application tier. The infrastructure tier consists of storage and compute resources as well as supporting databases. The services tier consists of the set of processing routines, each deployed as a Web service. The applications tier provides client interfaces to the system (e.g. the portal). We propose a "pluggable" infrastructure design that will allow new scientific algorithms and processing routines, developed and maintained by the community, to be integrated into the OpenTopography system so that the wider Earth science community can benefit from their availability. All core components in OpenTopography are available as Web services using a customized open-source Opal toolkit. The Opal toolkit provides mechanisms to manage and track job submissions with the help of a back-end database, and allows monitoring of job and system status through charting tools. All core components in OpenTopography have been developed, maintained and wrapped as Web services using Opal by OpenTopography developers. However, as the scientific community develops new processing and analysis approaches, this integration approach does not scale efficiently. Most new scientific applications will have their own active development teams performing regular updates, maintenance and other improvements. It would be optimal to have each application co-located where its developers can continue to actively work on it while still making it accessible within the OpenTopography workflow for processing capabilities. We will utilize a software framework for remote integration of these scientific applications into the OpenTopography system.
This will be accomplished by virtually extending the OpenTopography service over the various infrastructures running these scientific applications and processing routines. This involves packaging and distributing a customized instance of the Opal toolkit that will wrap the software application as an Opal-based web service and integrate it into the OpenTopography framework. We plan to make this as automated as possible. A structured specification of service inputs and outputs, along with metadata annotations encoded in XML, can be utilized to automate the generation of user interfaces, with appropriate tool tips and user help features, and the generation of other internal software. The OpenTopography Opal toolkit will also include the customizations that enable security authentication, authorization and the ability to write application usage and job statistics back to the OpenTopography databases. This usage information could then be reported to the original service providers and used for auditing and performance improvements. This pluggable framework will enable application developers to continue enhancing their applications while making the latest iteration available in a timely manner to the earth sciences community. This will also help us establish an overall framework that other scientific application providers will be able to use going forward.
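The abstract proposes generating portal user interfaces automatically from an XML specification of service inputs and outputs. The actual Opal specification schema is not shown here, so the element and attribute names below are hypothetical; the sketch only illustrates the general idea of deriving form-field metadata (labels, types, defaults, tool tips) from such a spec.

```python
import xml.etree.ElementTree as ET

# Hypothetical service specification; Opal's real schema differs.
SPEC = """<service name="hydroFlow">
  <input id="dem" type="file" help="Input DEM in GeoTIFF format"/>
  <input id="threshold" type="float" default="0.5" help="Flow accumulation threshold"/>
  <output id="flowdir" type="file"/>
</service>"""

def build_form(spec_xml):
    # Turn a structured service spec into form-field metadata that a portal
    # could render automatically, tool tips included.
    root = ET.fromstring(spec_xml)
    return [{"id": el.get("id"),
             "type": el.get("type"),
             "default": el.get("default"),
             "tooltip": el.get("help")}
            for el in root.findall("input")]
```

In the envisioned framework, the same parsed structure could also drive validation of job submissions before they are dispatched to the remote application.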
Bainbridge, Daryl; Brazil, Kevin; Ploeg, Jenny; Krueger, Paul; Taniguchi, Alan
2016-06-01
Healthcare integration is a priority in many countries, yet there remains little direction on how to systematically evaluate this construct to inform further development. The examination of community-based palliative care networks provides an ideal opportunity for the advancement of integration measures, in consideration of how fundamental provider cohesion is to effective care at end of life. This article presents a variable-oriented analysis from a theory-based case study of a palliative care network to help bridge the knowledge gap in integration measurement. Data from a mixed-methods case study were mapped to a conceptual framework for evaluating integrated palliative care and a visual array depicting the extent of key factors in the represented palliative care network was formulated. The study included data from 21 palliative care network administrators, 86 healthcare professionals, and 111 family caregivers, all from an established palliative care network in Ontario, Canada. The framework used to guide this research proved useful in assessing qualities of integration and functioning in the palliative care network. The resulting visual array of elements illustrates that while this network performed relatively well at the multiple levels considered, room for improvement exists, particularly in terms of interventions that could facilitate the sharing of information. This study, along with the other evaluative examples mentioned, represents important initial attempts at empirically and comprehensively examining network-integrated palliative care and healthcare integration in general. © The Author(s) 2016.
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
The following reports are presented on this project: A first year progress report on: Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; A second year progress report on: Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.
A Search Algorithm for Generating Alternative Process Plans in Flexible Manufacturing System
NASA Astrophysics Data System (ADS)
Tehrani, Hossein; Sugimura, Nobuhiro; Tanimizu, Yoshitaka; Iwamura, Koji
The capabilities and complexity of manufacturing systems are increasing, driving the move toward an integrated manufacturing environment. The availability of alternative process plans is a key factor for the integration of design, process planning and scheduling. This paper describes an algorithm for the generation of alternative process plans by extending the existing framework of process plan networks. A class diagram is introduced for generating process plans and process plan networks from the viewpoint of integrated process planning and scheduling systems. An incomplete search algorithm is developed for generating and searching the process plan networks. The benefit of this algorithm is that the whole process plan network does not have to be generated before the search starts. The algorithm is applicable to large process plan networks and can search wide areas of the network based on user requirements. It can generate alternative process plans and select a suitable one based on the objective functions.
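The paper's incomplete search algorithm is not specified in the abstract beyond its key property: the full process plan network need not be generated before searching. A generic best-first search with lazy node expansion and a per-node beam cut (which is what makes it deliberately incomplete) can illustrate that property. The `expand`, `cost`, and `is_goal` callbacks below are hypothetical stand-ins for the process plan network operations.

```python
import heapq

def incomplete_search(start, expand, cost, is_goal, beam=3):
    # Best-first search that builds the plan network lazily: successors are
    # generated only when a node is popped, so the whole network never exists
    # in memory. `beam` caps the successors kept per expansion, trading
    # completeness for scalability on large networks.
    frontier = [(cost(start), start)]
    seen = {start}
    while frontier:
        c, node = heapq.heappop(frontier)
        if is_goal(node):
            return node, c
        for s in sorted(expand(node), key=cost)[:beam]:
            if s not in seen:
                seen.add(s)
                heapq.heappush(frontier, (cost(s), s))
    return None, float("inf")
```

A toy usage: nodes are tuples of completed machining operations, the cost is the sum of per-operation costs, and the goal is a plan covering all operations.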
A framework for development of an intelligent system for design and manufacturing of stamping dies
NASA Astrophysics Data System (ADS)
Hussein, H. M. A.; Kumar, S.
2014-07-01
An integration of computer aided design (CAD), computer aided process planning (CAPP) and computer aided manufacturing (CAM) is required for the development of an intelligent system to design and manufacture stamping dies in sheet metal industries. In this paper, a framework for the development of an intelligent system for the design and manufacturing of stamping dies is proposed. In the proposed framework, the intelligent system is structured as a set of expert system modules for the different activities of die design and manufacturing. All system modules are integrated with each other. The proposed system takes its input in the form of a CAD file of the sheet metal part, and the system modules then automate all tasks related to the design and manufacturing of stamping dies. The modules are coded in Visual Basic (VB) and developed on the platform of AutoCAD software.
Yu, Yinan; Diamantaras, Konstantinos I; McKelvey, Tomas; Kung, Sun-Yuan
2018-02-01
In kernel-based classification models, given limited computational power and storage capacity, operations over the full kernel matrix become prohibitive. In this paper, we propose a new supervised learning framework using kernel models for sequential data processing. The framework is based on two components that both aim at enhancing classification capability through a subset selection scheme. The first part is a subspace projection technique in the reproducing kernel Hilbert space using a CLAss-specific Subspace Kernel representation for kernel approximation. In the second part, we propose a novel structural risk minimization algorithm, called adaptive margin slack minimization, that iteratively improves classification accuracy through adaptive data selection. We motivate each part separately and then integrate them into learning frameworks for large-scale data. We propose two such frameworks: memory-efficient sequential processing for sequential data processing, and parallelized sequential processing for distributed computing with sequential data acquisition. We test our methods on several benchmark data sets and compare them with state-of-the-art techniques to verify the validity of the proposed approaches.
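The CLAss-specific Subspace Kernel and the adaptive margin slack minimization algorithm are not specified in the abstract. As a generic illustration of the broader idea (sequential kernel learning that keeps only a selected subset of the data rather than the full kernel matrix), the sketch below implements a budgeted kernel perceptron: a one-pass learner that retains only misclassified examples, capped at a fixed budget. This is a standard technique used purely for illustration, not the paper's method.

```python
import math

def rbf(x, y, gamma=1.0):
    # Gaussian (RBF) kernel between two equal-length feature vectors.
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def budget_kernel_perceptron(stream, budget=10):
    # One-pass kernel perceptron with subset selection: an example enters the
    # support set only when misclassified, and the oldest support vector is
    # dropped once the budget is exceeded, bounding memory and compute.
    support = []  # list of (x, y) pairs, y in {-1, +1}
    for x, y in stream:
        score = sum(sy * rbf(sx, x) for sx, sy in support)
        if y * score <= 0:  # misclassified (or no evidence yet) -> add
            support.append((x, y))
            if len(support) > budget:
                support.pop(0)
    return support

def predict(support, x):
    s = sum(sy * rbf(sx, x) for sx, sy in support)
    return 1 if s >= 0 else -1
```

Because the kernel is evaluated only against the retained subset, cost per example is O(budget) rather than O(n), which is the property that makes such schemes viable for sequential and distributed settings.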
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.
Graham, Phillip W; Kim, Mimi M; Clinton-Sherrod, A Monique; Yaros, Anna; Richmond, Alan N; Jackson, Melvin; Corbie-Smith, Giselle
2016-03-01
Concepts of culture and diversity are necessary considerations in the scientific application of theory generation and developmental processes of preventive interventions; yet, culture and/or diversity are often overlooked until later stages (e.g., adaptation [T3] and dissemination [T4]) of the translational science process. Here, we present a conceptual framework focused on the seamless incorporation of culture and diversity throughout the various stages of the translational science process (T1-T5). Informed by a community-engaged research approach, this framework guides integration of cultural and diversity considerations at each phase with emphasis on the importance and value of "citizen scientists" being research partners to promote ecological validity. The integrated partnership covers the first phase of intervention development through final phases that ultimately facilitate more global, universal translation of changes in attitudes, norms, and systems. Our comprehensive model for incorporating culture and diversity into translational research provides a basis for further discussion and translational science development.
SenSyF Experience on Integration of EO Services in a Generic, Cloud-Based EO Exploitation Platform
NASA Astrophysics Data System (ADS)
Almeida, Nuno; Catarino, Nuno; Gutierrez, Antonio; Grosso, Nuno; Andrade, Joao; Caumont, Herve; Goncalves, Pedro; Villa, Guillermo; Mangin, Antoine; Serra, Romain; Johnsen, Harald; Grydeland, Tom; Emsley, Stephen; Jauch, Eduardo; Moreno, Jose; Ruiz, Antonio
2016-08-01
SenSyF is a cloud-based data processing framework for EO-based services. It has been a pioneer in addressing Big Data issues from the Earth Observation point of view, and is a precursor of several of the technologies and methodologies that will be deployed in ESA's Thematic Exploitation Platforms and other related systems. The SenSyF system focuses on developing fully automated data management, together with access to a processing and exploitation framework, including Earth Observation-specific tools. SenSyF is both a development and validation platform for data-intensive applications using Earth Observation data. With SenSyF, scientific, institutional or commercial institutions developing EO-based applications and services can take advantage of distributed computational and storage resources, tailored for applications dependent on big Earth Observation data, without resorting to deep infrastructure and technological investments. This paper describes the integration process and the experience gathered from different EO service providers during the project.
ERIC Educational Resources Information Center
Uluay, Gulsah; Dogan, Alev
2016-01-01
The main purpose of the study is to introduce Kodu Game Lab, created by Microsoft, to pre-service science teachers as an example of technology integration into the learning process, using the MAGDAIRE framework. The participants were enrolled in a special teaching methods course at a university in Turkey during the fall 2015 semester. Mixed-methods research…
Toward an Integrated Gender-Linked Model of Aggression Subtypes in Early and Middle Childhood
ERIC Educational Resources Information Center
Ostrov, Jamie M.; Godleski, Stephanie A.
2010-01-01
An integrative model is proposed for understanding the development of physical and relational aggression in early and middle childhood. The central goal was to posit a new theoretical framework that expands on existing social-cognitive and gender schema models (i.e., Social Information-Processing Model of Children's Adjustment [N. R. Crick & K. A.…
Xu, Elvis G B; Leung, Kenneth M Y; Morton, Brian; Lee, Joseph H W
2015-02-01
Marine protected areas (MPAs), such as marine parks and reserves, contain natural resources of immense value to the environment and mankind. Since MPAs may be situated in close proximity to urbanized areas and influenced by anthropogenic activities (e.g. continuous discharges of contaminated waters), the marine organisms contained in such waters are probably at risk. This study aimed at developing an integrated environmental risk assessment and management (IERAM) framework for enhancing the sustainability of such MPAs. The IERAM framework integrates conventional environmental risk assessment methods with a multi-layer DPSIR (Driver-Pressure-State-Impact-Response) conceptual approach, which can simplify the complex issues embraced by environmental management strategies and provide logical and concise management information. The IERAM process can generate a useful database, offer timely updates on the status of MPAs, and assist in the prioritization of management options. We use the Cape d'Aguilar Marine Reserve in Hong Kong as an example to illustrate the IERAM framework. A comprehensive set of indicators was selected, aggregated and analyzed using this framework. The effects of management practices and programs were also assessed by comparing the temporal distributions of these indicators over a certain timeframe. Based on the obtained results, we have identified the most significant components for safeguarding the integrity of the marine reserve, and indicated the existing information gaps concerning the management of the reserve. Apart from assessing the MPA's present condition, a successful implementation of the IERAM framework as advocated here would also facilitate better-informed decision-making and, hence, indirectly enhance the protection and conservation of the MPA's marine biodiversity. Copyright © 2014 Elsevier B.V. All rights reserved.
DKIST visible broadband imager data processing pipeline
NASA Astrophysics Data System (ADS)
Beard, Andrew; Cowan, Bruce; Ferayorni, Andrew
2014-07-01
The Daniel K. Inouye Solar Telescope (DKIST) Data Handling System (DHS) provides the technical framework and building blocks for developing on-summit instrument quality assurance and data reduction pipelines. The DKIST Visible Broadband Imager (VBI) is a first light instrument that alone will create two data streams with a bandwidth of 960 MB/s each. The high data rate and data volume of the VBI require near-real time processing capability for quality assurance and data reduction, and will be performed on-summit using Graphics Processing Unit (GPU) technology. The VBI data processing pipeline (DPP) is the first designed and developed using the DKIST DHS components, and therefore provides insight into the strengths and weaknesses of the framework. In this paper we lay out the design of the VBI DPP, examine how the underlying DKIST DHS components are utilized, and discuss how integration of the DHS framework with GPUs was accomplished. We present our results of the VBI DPP alpha release implementation of the calibration, frame selection reduction, and quality assurance display processing nodes.
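The abstract mentions a frame-selection reduction node; the actual DKIST implementation runs on GPUs at near-real-time rates. Purely as a toy CPU illustration of the underlying idea (ranking frames by an image-quality metric and keeping the sharpest), here is a minimal sketch using RMS contrast; the choice of metric is an assumption, not a description of the VBI pipeline.

```python
def rms_contrast(frame):
    # Simple image-quality metric: RMS contrast of a 2-D frame (list of rows).
    # Blurred (seeing-degraded) frames have lower contrast than sharp ones.
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    var = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    return (var ** 0.5) / mean if mean else 0.0

def select_frames(frames, keep):
    # Keep the indices of the `keep` highest-contrast frames, in capture order.
    ranked = sorted(range(len(frames)),
                    key=lambda i: rms_contrast(frames[i]), reverse=True)
    return sorted(ranked[:keep])
```

In a real pipeline the metric would be evaluated per sub-field on full-resolution detector frames, which is why GPU throughput matters at 960 MB/s per stream.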
Fundamental awareness: A framework for integrating science, philosophy and metaphysics
Theise, Neil D.; Kafatos, Menas C.
2016-01-01
The ontologic framework of Fundamental Awareness proposed here assumes that non-dual Awareness is foundational to the universe, not arising from the interactions or structures of higher level phenomena. The framework allows comparison and integration of views from the three investigative domains concerned with understanding the nature of consciousness: science, philosophy, and metaphysics. In this framework, Awareness is the underlying reality, not reducible to anything else. Awareness and existence are the same. As such, the universe is non-material, self-organizing throughout, a holarchy of complementary, process-driven, recursive interactions. The universe is both its own first observer and subject. Considering the world to be non-material and comprised, a priori, of Awareness is to privilege information over materiality, action over agency and to understand that qualia are not a “hard problem,” but the foundational elements of all existence. These views fully reflect mainstream Western philosophical traditions, insights from culturally diverse contemplative and mystical traditions, and are in keeping with current scientific thinking, expressible mathematically. PMID:27489576
Lo Storto, Corrado
2013-11-01
This paper presents an integrative framework to evaluate e-commerce website efficiency from the user viewpoint using Data Envelopment Analysis (DEA). This framework is inspired by concepts drawn from theories of information processing and cognition and considers website efficiency as a measure of the site's quality and performance. When users interact with the website interface to perform a task, they are involved in a cognitive effort, sustaining a cognitive cost to search, interpret and process information, and experiencing either a sense of satisfaction or dissatisfaction as a result. The amount of ambiguity and uncertainty they perceive, and the search (over-)time during navigation, determine the size of the effort, and consequently the cognitive cost, they have to bear to perform their task. Conversely, task performance and result achievement provide users with cognitive benefits, making interaction with the website potentially attractive, satisfying, and useful. In total, 9 variables are measured, classified into a set of 3 website macro-dimensions (user experience, site navigability and structure). The framework is implemented to compare 52 e-commerce websites that sell products in the information technology and media market. A stepwise regression is performed to assess which cognitive costs and benefits most affect website efficiency. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
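In the general multi-input, multi-output case DEA solves a linear program per website; the abstract does not give the model's exact variable assignment. As a minimal illustration of the idea, in the single-input, single-output constant-returns-to-scale (CCR) case efficiency reduces to each unit's output/input ratio relative to the best ratio on the frontier. The sketch below computes that, treating cognitive cost as the input and cognitive benefit as the output, with hypothetical numbers.

```python
def dea_efficiency(inputs, outputs):
    # Single-input, single-output CCR efficiency: each decision-making unit's
    # output/input ratio, normalized by the best ratio, yielding a score in
    # (0, 1] where 1 marks units on the efficient frontier.
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

With several cost and benefit variables, as in the paper's 9-variable model, each score instead comes from a linear program maximizing the weighted output/input ratio subject to all units' ratios staying at or below 1.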
Framework for integration of informal waste management sector with the formal sector in Pakistan.
Masood, Maryam; Barlow, Claire Y
2013-10-01
Historically, waste pickers around the globe have utilised urban solid waste as a principal source of livelihood. Formal waste management sectors usually perceive the informal waste collection/recycling networks as backward, unhygienic and generally incompatible with modern waste management systems. It is proposed here that through careful planning and administration, these seemingly troublesome informal networks can be integrated into formal waste management systems in developing countries, providing mutual benefits. A theoretical framework for integration based on a case study in Lahore, Pakistan, is presented. The proposed solution suggests that the municipal authority should draw up and agree on a formal work contract with the group of waste pickers already operating in the area. The proposed system is assessed using the integration radar framework to classify and analyse possible intervention points between the sectors. The integration of the informal waste workers with the formal waste management sector is not a one dimensional or single step process. An ideal solution might aim for a balanced focus on all four categories of intervention, although this may be influenced by local conditions. Not all the positive benefits will be immediately apparent, but it is expected that as the acceptance of such projects increases over time, the informal recycling economy will financially supplement the formal system in many ways.
An Evaluation of Wellness Assessment Visualizations for Older Adults
Reeder, Blaine; Yoo, Daisy; Aziz, Rafae; Thompson, Hilaire J.; Demiris, George
2015-01-01
Abstract Background Smart home technologies provide a valuable resource for unobtrusively monitoring health and wellness within an older adult population. However, the breadth and density of the available data, along with aging-associated decreases in working memory, prospective memory, spatial cognition, and processing speed, can make these data challenging for older adults to comprehend. We developed visualizations of smart home health data integrated into a framework of wellness. We evaluated the visualizations through focus groups with older adults and identified recommendations to guide the future development of visualizations. Materials and Methods We conducted four focus groups with older adult participants (n=31) at an independent retirement community. Participants were presented with three different visualizations from a wellness pilot study. A qualitative descriptive analysis was conducted to identify thematic content. Results We identified three themes related to the processing and application of visualizations: (1) the value of visualizations for wellness assessment, (2) cognitive processing approaches to visualizations, and (3) integration of health data for visualization. In addition, the focus groups highlighted key design considerations of visualizations important for supporting decision-making and evaluation assessments within integrated health displays. Conclusions Participants found inherent value in having visualizations available to proactively engage with their healthcare provider. Integrating the visualizations into a wellness framework helped reduce the complexity of raw smart home data. There has been limited work on health visualizations from a consumer perspective, in particular for an older adult population. Creating appropriately designed visualizations is valuable for promoting consumer involvement in the shared decision-making process of care. PMID:25401414
Integrative mental health care: from theory to practice, Part 2.
Lake, James
2008-01-01
Integrative approaches will lead to more accurate and different understandings of mental illness. Beneficial responses to complementary and alternative therapies provide important clues about the phenomenal nature of the human body in space-time and disparate biological, informational, and energetic factors associated with normal and abnormal psychological functioning. The conceptual framework of contemporary Western psychiatry includes multiple theoretical viewpoints, and there is no single best explanatory model of mental illness. Future theories of mental illness causation will not depend exclusively on empirical verification of strictly biological processes but will take into account both classically described biological processes and non-classical models, including complexity theory, resulting in more complete explanations of the characteristics and causes of symptoms and mechanisms of action that result in beneficial responses to treatments. Part 1 of this article examined the limitations of the theory and contemporary clinical methods employed in Western psychiatry and discussed implications of emerging paradigms in physics and the biological sciences for the future of psychiatry. In part 2, a practical methodology for planning integrative assessment and treatment strategies in mental health care is proposed. Using this methodology, the integrative management of moderate and severe psychiatric symptoms is reviewed in detail. As the conceptual framework of Western medicine evolves toward an increasingly integrative perspective, novel understanding of complex relationships between biological, informational, and energetic processes associated with normal psychological functioning and mental illness will lead to more effective integrative assessment and treatment strategies addressing the causes or meanings of symptoms at multiple hierarchic levels of body-brain-mind.
Integrative mental health care: from theory to practice, part 1.
Lake, James
2007-01-01
Integrative approaches will lead to more accurate and different understandings of mental illness. Beneficial responses to complementary and alternative therapies provide important clues about the phenomenal nature of the human body in space-time and disparate biological, informational, and energetic factors associated with normal and abnormal psychological functioning. The conceptual framework of contemporary Western psychiatry includes multiple theoretical viewpoints, and there is no single best explanatory model of mental illness. Future theories of mental illness causation will not depend exclusively on empirical verification of strictly biological processes but will take into account both classically described biological processes and non-classical models, including complexity theory, resulting in more complete explanations of the characteristics and causes of symptoms and mechanisms of action that result in beneficial responses to treatments. Part 1 of this article examines the limitations of the theory and contemporary clinical methods employed in Western psychiatry and discusses implications of emerging paradigms in physics and the biological sciences for the future of psychiatry. In part 2, a practical methodology for planning integrative assessment and treatment strategies in mental health care is proposed. Using this methodology the integrative management of moderate and severe psychiatric symptoms is reviewed in detail. As the conceptual framework of Western medicine evolves toward an increasingly integrative perspective, novel understandings of complex relationships between biological, informational, and energetic processes associated with normal psychological functioning and mental illness will lead to more effective integrative assessment and treatment strategies addressing the causes or meanings of symptoms at multiple hierarchic levels of body-brain-mind.
Towards a Decision Support System for Space Flight Operations
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Hogle, Charles; Ruszkowski, James
2013-01-01
The Mission Operations Directorate (MOD) at the Johnson Space Center (JSC) has put in place a Model Based Systems Engineering (MBSE) technological framework for the development and execution of the Flight Production Process (FPP). This framework has provided much added value and return on investment to date. This paper describes a vision for a model based Decision Support System (DSS) for the development and execution of the FPP and its design and development process. The envisioned system extends the existing MBSE methodology and technological framework which is currently in use. The MBSE technological framework currently in place enables the systematic collection and integration of data required for building an FPP model for a diverse set of missions. This framework includes the technology, people and processes required for rapid development of architectural artifacts. It is used to build a feasible FPP model for the first flight of spacecraft and for recurrent flights throughout the life of the program. This model greatly enhances our ability to effectively engage with a new customer. It provides a preliminary work breakdown structure, data flow information and a master schedule based on its existing knowledge base. These artifacts are then refined and iterated upon with the customer for the development of a robust end-to-end, high-level integrated master schedule and its associated dependencies. The vision is to enhance this framework to enable its application for uncertainty management, decision support and optimization of the design and execution of the FPP by the program. Furthermore, this enhanced framework will enable the agile response and redesign of the FPP based on observed system behavior. 
The discrepancy between the anticipated system behavior and the observed behavior may be due to the processing of tasks internally, or to external factors such as changes in program requirements or conditions associated with organizations outside of MOD. The paper provides a roadmap for the three increments of this vision: (1) hardware and software system components and interfaces with the NASA ground system, (2) uncertainty management and (3) re-planning and automated execution. Each of these increments provides value independently, but some may also enable the building of a subsequent increment.
Automated event generation for loop-induced processes
Hirschi, Valentin; Mattelaer, Olivier
2015-10-22
We present the first fully automated implementation of cross-section computation and event generation for loop-induced processes. This work is integrated in the MadGraph5_aMC@NLO framework. We describe the optimisations implemented at the level of the matrix element evaluation, phase space integration and event generation allowing for the simulation of large multiplicity loop-induced processes. Along with some selected differential observables, we illustrate our results with a table showing inclusive cross-sections for all loop-induced hadronic scattering processes with up to three final states in the SM as well as for some relevant 2 → 4 processes. Furthermore, many of these are computed here for the first time.
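The phase-space integration step can be illustrated in miniature. The sketch below is a toy example and not the MadGraph5_aMC@NLO API: plain Monte Carlo integration of a Breit-Wigner-shaped "matrix element squared" over a one-dimensional stand-in for phase space, with all values illustrative.

```python
import random

def mc_integrate(f, a, b, n=100_000, seed=1):
    """Plain Monte Carlo estimate of the integral of f over [a, b]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += f(a + (b - a) * rng.random())
    return (b - a) * total / n

def me_squared(s, m=91.0, width=2.5):
    """Toy 'matrix element squared': a Breit-Wigner peak in the variable s."""
    return 1.0 / ((s - m * m) ** 2 + (m * width) ** 2)

# Integrate over a one-dimensional stand-in for the phase space.
sigma = mc_integrate(me_squared, 5000.0, 12000.0)
```

Real event generators refine this with importance sampling and adaptive grids, since a flat sampler wastes most points away from the resonance peak.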
Glennon, Tara J.; Ausderau, Karla; Bendixen, Roxanna M.; Kuhaneck, Heather Miller; Pfeiffer, Beth; Watling, Renee; Wilkinson, Kimberly; Bodison, Stefanie C.
2017-01-01
Pediatric occupational therapy practitioners frequently provide interventions for children with differences in sensory processing and integration. Confusion exists regarding how best to intervene with these children and about how to describe and document methods. Some practitioners hold the misconception that Ayres Sensory Integration intervention is the only approach that can and should be used with this population. The issue is that occupational therapy practitioners must treat the whole client in varied environments; to do so effectively, multiple approaches to intervention often are required. This article presents a framework for conceptualizing interventions for children with differences in sensory processing and integration that incorporates multiple evidence-based approaches. To best meet the needs of the children and families seeking occupational therapy services, interventions must be focused on participation and should be multifaceted. PMID:28218599
ENVIRONMENTAL IMPACT ASSESSMENT OF A HEALTH TECHNOLOGY: A SCOPING REVIEW.
Polisena, Julie; De Angelis, Gino; Kaunelis, David; Gutierrez-Ibarluzea, Iñaki
2018-06-13
The Health Technology Expert Review Panel is an advisory body to Canadian Agency for Drugs and Technologies in Health (CADTH) that develops recommendations on health technology assessments (HTAs) for nondrug health technologies using a deliberative framework. The framework spans several domains, including the environmental impact of the health technology(ies). Our research objective was to identify articles on frameworks, methods or case studies on the environmental impact assessment of health technologies. A literature search in major databases and a focused gray literature search were conducted. The main search concepts were HTA and environmental impact/sustainability. Eligible articles were those that described a conceptual framework or methods used to conduct an environmental assessment of health technologies, and case studies on the application of an environmental assessment. From the 1,710 citations identified, thirteen publications were included. Two articles presented a framework to incorporate environmental assessment in HTAs. Other approaches described weight of evidence practices and comprehensive and integrated environmental impact assessments. Central themes derived include transparency and repeatability, integration of components in a framework or of evidence into a single outcome, data availability to ensure the accuracy of findings, and familiarity with the approach used. Each framework and methods presented have different foci related to the ecosystem, health economics, or engineering practices. Their descriptions suggested transparency, repeatability, and the integration of components or of evidence into a single outcome as their main strengths. Our review is an initial step of a larger initiative by CADTH to develop the methods and processes to address the environmental impact question in an HTA.
Aguilar-Arredondo, Andrea; Arias, Clorinda; Zepeda, Angélica
2015-01-01
Hippocampal neurogenesis occurs in the adult brain in various species, including humans. A compelling question that arose when neurogenesis was accepted to occur in the adult dentate gyrus (DG) is whether new neurons become functionally relevant over time, which is key for interpreting their potential contributions to synaptic circuitry. The functional state of adult-born neurons has been evaluated using various methodological approaches, which have, in turn, yielded seemingly conflicting results regarding the timing of maturation and functional integration. Here, we review the contributions of different methodological approaches to addressing the maturation process of adult-born neurons and their functional state, discussing the contributions and limitations of each method. We aim to provide a framework for interpreting results based on the approaches currently used in neuroscience for evaluating functional integration. As shown by the experimental evidence, adult-born neurons are prone to respond from early stages, even when they are not yet fully integrated into circuits. The ongoing integration process for the newborn neurons is characterised by different features. However, they may contribute differently to the network depending on their maturation stage. When combined, the strategies used to date convey a comprehensive view of the functional development of newly born neurons while providing a framework for approaching the critical time at which new neurons become functionally integrated and influence brain function.
A Framework for Integration of IVHM Technologies for Intelligent Integration for Vehicle Management
NASA Technical Reports Server (NTRS)
Paris, Deidre E.; Trevino, Luis; Watson, Mike
2005-01-01
As a part of the overall goal of developing Integrated Vehicle Health Management (IVHM) systems for aerospace vehicles, the NASA Faculty Fellowship Program (NFFP) at Marshall Space Flight Center has performed a pilot study on IVHM principles which integrates researched IVHM technologies in support of Integrated Intelligent Vehicle Management (IIVM). IVHM is the process of assessing, preserving, and restoring system functionality across flight and ground systems (NASA NGLT 2004). The framework presented in this paper integrates advanced computational techniques with sensor and communication technologies for spacecraft that can generate responses through detection, diagnosis, and reasoning, and adapt to system faults in support of IIVM. These real-time responses allow the IIVM to modify the affected vehicle subsystem(s) prior to a catastrophic event. Furthermore, the objective of this pilot program is to develop and integrate technologies which can provide a continuous, intelligent, and adaptive health state of a vehicle and use this information to improve safety and reduce the cost of operations. Recent investments in avionics, health management, and controls have been directed towards IIVM. As this concept has matured, it has become clear that IIVM requires the same sensors and processing capabilities as the real-time avionics functions to support diagnosis of subsystem problems. In addition, new sensors have been proposed to augment the avionics sensors to support better system monitoring and diagnostics. As the designs have been considered, a synergy has been realized whereby the real-time avionics can utilize sensors proposed for diagnostics and prognostics to make better real-time decisions in response to detected failures. IIVM provides for a single system allowing modularity of functions and hardware across the vehicle.
The framework that supports IIVM consists of 11 major on-board functions necessary to fully manage a space vehicle while maintaining crew safety and mission objectives: Guidance and Navigation; Communications and Tracking; Vehicle Monitoring; Information Transport and Integration; Vehicle Diagnostics; Vehicle Prognostics; Vehicle Mission Planning; Automated Repair and Replacement; Vehicle Control; Human Computer Interface; and Onboard Verification and Validation. Furthermore, the presented framework provides complete vehicle management, which not only allows for increased crew safety and mission success through new intelligence capabilities but also yields a mechanism for more efficient vehicle operations. The representative IVHM technologies for IIVM include: 1) robust controllers for use in re-usable launch vehicles, 2) scaleable/flexible computer platform using heterogeneous communication, 3) coupled electromagnetic oscillators for enhanced communications, 4) Linux-based real-time systems, 5) genetic algorithms, 6) Bayesian Networks, 7) evolutionary algorithms, 8) dynamic systems control modeling, and 9) advanced sensing capabilities. This paper presents the IVHM technologies developed under NASA's NFFP pilot project. The integration of these IVHM technologies forms the framework for IIVM.
Interoperable cross-domain semantic and geospatial framework for automatic change detection
NASA Astrophysics Data System (ADS)
Kuo, Chiao-Ling; Hong, Jung-Hong
2016-01-01
With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. The framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of data from different domains, which has long been considered an essential strength of GIS but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make such integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success as far as the improvement of efficiency and level of automation is concerned. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and even open a new dimension for future GIS.
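The bridge-ontology idea can be caricatured in a few lines. All class names below are hypothetical and not taken from the paper: a change is flagged only when two domain-specific classes map to different shared concepts.

```python
# Bridge ontology: map domain-specific classes to shared concepts
# (all labels are made up for illustration).
BRIDGE = {
    "topo:ForestSymbol": "landcover:Forest",
    "topo:BuildingFootprint": "landcover:BuiltUp",
    "landuse:Woodland": "landcover:Forest",
    "landuse:Residential": "landcover:BuiltUp",
}

def detect_change(old_class, new_class):
    """Flag a semantic change only when the shared concepts differ."""
    old_concept = BRIDGE.get(old_class)
    new_concept = BRIDGE.get(new_class)
    if old_concept is None or new_concept is None:
        return "unknown"      # class not covered by the bridge ontology
    return "changed" if old_concept != new_concept else "unchanged"
```

A forest symbol on a topographic map and a woodland parcel in a land-use register then compare as "unchanged", even though their domain vocabularies differ.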
ERIC Educational Resources Information Center
Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.
2018-01-01
Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…
ERIC Educational Resources Information Center
Landolfi, Adrienne M.
2016-01-01
As accountability measures continue to increase within education, public school systems have integrated standards-based evaluation systems to formally assess professional practices among educators. The purpose of this study was to explore the extent in which the communication process between evaluators and teachers impacts teacher performance…
Antecedents of Absorptive Capacity: A New Model for Developing Learning Processes
ERIC Educational Resources Information Center
Rezaei-Zadeh, Mohammad; Darwish, Tamer K.
2016-01-01
Purpose: The purpose of this paper is to provide an integrated framework to indicate which antecedents of absorptive capacity (AC) influence its learning processes, and to propose testing of this model in future work. Design/methodology/approach: Relevant literature into the antecedents of AC was critically reviewed and analysed with the objective…
ERIC Educational Resources Information Center
London, Manuel; Sessa, Valerie I.
2007-01-01
This article integrates the literature on group interaction process analysis and group learning, providing a framework for understanding how patterns of interaction develop. The model proposes how adaptive, generative, and transformative learning processes evolve and vary in their functionality. Environmental triggers for learning, the group's…
Quiroga-Campano, Ana L; Panoskaltsis, Nicki; Mantalaris, Athanasios
2018-03-02
Demand for high-value biologics, a rapidly growing pipeline, and pressure from competition, time-to-market and regulators, necessitate novel biomanufacturing approaches, including Quality by Design (QbD) principles and Process Analytical Technologies (PAT), to facilitate accelerated, efficient and effective process development platforms that ensure consistent product quality and reduced lot-to-lot variability. Herein, QbD and PAT principles were incorporated within an innovative in vitro-in silico integrated framework for upstream process development (UPD). The central component of the UPD framework is a mathematical model that predicts dynamic nutrient uptake and average intracellular ATP content, based on biochemical reaction networks, to quantify and characterize energy metabolism and its adaptive response, metabolic shifts, to maintain ATP homeostasis. The accuracy and flexibility of the model depends on critical cell type/product/clone-specific parameters, which are experimentally estimated. The integrated in vitro-in silico platform and the model's predictive capacity reduced burden, time and expense of experimentation resulting in optimal medium design compared to commercially available culture media (80% amino acid reduction) and a fed-batch feeding strategy that increased productivity by 129%. The framework represents a flexible and efficient tool that transforms, improves and accelerates conventional process development in biomanufacturing with wide applications, including stem cell-based therapies. Copyright © 2018. Published by Elsevier Inc.
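A minimal sketch of the kind of dynamic nutrient-uptake model such frameworks build on (plain Monod kinetics with a yield coefficient, not the paper's actual reaction-network or ATP model; all parameter values are illustrative):

```python
def monod_uptake(s, x, vmax, ks):
    """Volumetric nutrient uptake rate: Monod kinetics times biomass."""
    return vmax * s / (ks + s) * x

def simulate_batch(s0, x0, vmax, ks, yield_coeff, dt, steps):
    """Explicit-Euler batch culture: biomass x grows on one limiting nutrient s."""
    s, x = s0, x0
    for _ in range(steps):
        q = monod_uptake(s, x, vmax, ks)
        s = max(s - q * dt, 0.0)       # nutrient consumed
        x += yield_coeff * q * dt      # biomass formed from consumed nutrient
    return s, x
```

Calibrating the rate parameters against measured uptake data is what makes such a model clone-specific; the same skeleton then supports in silico medium design and feeding-strategy optimization.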
NASA Astrophysics Data System (ADS)
Leu, Jun-Der; Lee, Larry Jung-Hsing
2017-09-01
Enterprise resource planning (ERP) is a software solution that integrates the operational processes of the business functions of an enterprise. However, implementing ERP systems is a complex process. In addition to the technical issues, companies must address problems associated with business process re-engineering, time and budget control, and organisational change. Numerous industrial studies have shown that the failure rate of ERP implementation is high, even for well-designed systems. Thus, ERP projects typically require a clear methodology to support the project execution and effectiveness. In this study, we propose a theoretical model for ERP implementation. The value engineering (VE) method forms the basis of the proposed framework, which integrates Six Sigma tools. The proposed framework encompasses five phases: knowledge generation, analysis, creation, development and execution. In the VE method, potential ERP problems related to software, hardware, consultation and organisation are analysed in a group-decision manner and in relation to value, and Six Sigma tools are applied to avoid any project defects. We validate the feasibility of the proposed model by applying it to an international manufacturing enterprise in Taiwan. The results show improvements in customer response time and operational efficiency in terms of work-in-process and turnover of materials. Based on the evidence from the case study, the theoretical framework is discussed together with the study's limitations and suggestions for future research.
An Integrated Framework for Process-Driven Model Construction in Disease Ecology and Animal Health
Mancy, Rebecca; Brock, Patrick M.; Kao, Rowland R.
2017-01-01
Process models that focus on explicitly representing biological mechanisms are increasingly important in disease ecology and animal health research. However, the large number of process modelling approaches makes it difficult to decide which is most appropriate for a given disease system and research question. Here, we discuss different motivations for using process models and present an integrated conceptual analysis that can be used to guide the construction of infectious disease process models and comparisons between them. Our presentation complements existing work by clarifying the major differences between modelling approaches and their relationship with the biological characteristics of the epidemiological system. We first discuss distinct motivations for using process models in epidemiological research, identifying the key steps in model design and use associated with each. We then present a conceptual framework for guiding model construction and comparison, organised according to key aspects of epidemiological systems. Specifically, we discuss the number and type of disease states, whether to focus on individual hosts (e.g., cows) or groups of hosts (e.g., herds or farms), how space or host connectivity affect disease transmission, whether demographic and epidemiological processes are periodic or can occur at any time, and the extent to which stochasticity is important. We use foot-and-mouth disease and bovine tuberculosis in cattle to illustrate our discussion and support explanations of cases in which different models are used to address similar problems. The framework should help those constructing models to structure their approach to modelling decisions and facilitate comparisons between models in the literature. PMID:29021983
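For instance, the individual-host, stochastic, continuous-time corner of this design space can be sketched with a standard Gillespie-style SIR simulation (an illustrative sketch, not code from the paper; parameter values are arbitrary):

```python
import random

def sir_gillespie(s, i, r, beta, gamma, t_max, seed=0):
    """Stochastic SIR epidemic via the Gillespie algorithm:
    discrete hosts, continuous time, events at random times."""
    rng = random.Random(seed)
    n = s + i + r                              # closed population
    t = 0.0
    while i > 0 and t < t_max:
        rate_infection = beta * s * i / n      # S + I -> 2I
        rate_recovery = gamma * i              # I -> R
        total_rate = rate_infection + rate_recovery
        t += rng.expovariate(total_rate)       # time to next event
        if rng.random() < rate_infection / total_rate:
            s, i = s - 1, i + 1
        else:
            i, r = i - 1, r + 1
    return s, i, r
```

Swapping in herd-level states, spatial kernels, or deterministic rate equations changes the corner of the framework being occupied without altering this basic structure.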
NASA Astrophysics Data System (ADS)
Tarumi, Shinya; Kozaki, Kouji; Kitamura, Yoshinobu; Mizoguchi, Riichiro
In recent materials research, much work aims at realizing "functional materials" by changing structure and/or manufacturing processes with nanotechnology. However, knowledge about the relationships among function, structure, and manufacturing process is not well organized, so material designers have to consider many things at the same time, and computer support for their design process would be very helpful. In this article, we discuss a conceptual design support system for nano-materials. First, we consider a framework for representing functional structures and manufacturing processes of nano-materials together with the relationships among them. We extend our former framework for representing functional knowledge based on our investigations through discussion with experts on nano-materials. The extended framework has two features: 1) it represents functional structures and manufacturing processes comprehensively, and 2) it expresses parameters of functions and ways together with their dependencies, because these are important for material design. Next, we describe a conceptual design support system we developed based on the framework, along with its functionalities. Lastly, we evaluate the utility of our system in terms of its support for design. For this purpose, we represented two real examples of material design and then conducted an evaluation experiment on the conceptual design of materials using our system in collaboration with domain experts.
A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services
NASA Astrophysics Data System (ADS)
Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.
2015-12-01
Serviced-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environment, thus making it easy for modelers to maintain and update the models. In integrated modeling, the serviced-oriented loose-coupling approach requires (1) a set of models as web services, (2) the model metadata describing the external features of a model (e.g., variable name, unit, computational grid, etc.) and (3) a model integration framework. We present the architecture of coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing by uncovering models' metadata through BMI functions. After a BMI-enabled model is serviced, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing model interface using BMI as well as providing a set of utilities smoothing the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of the BMI-enabled web service models. By using the revised EMELI, an example will be presented on integrating a set of topoflow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. 
(2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014, 11th International Conf. on Hydroinformatics, New York, NY.
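The shape of such a BMI wrapper can be sketched as follows. This is a simplified, hypothetical subset of the CSDMS BMI (the real interface reads a configuration file and fills caller-supplied destination arrays), wrapping a toy linear-reservoir model:

```python
class LinearReservoirBMI:
    """Sketch of a BMI-style component (simplified subset of the CSDMS BMI).
    The wrapped model is a toy linear reservoir: dS/dt = -k * S."""

    def initialize(self, config):
        self.storage = config.get("storage", 100.0)
        self.k = config.get("k", 0.1)
        self.dt = config.get("dt", 1.0)
        self.time = 0.0

    def update(self):
        # One explicit-Euler step of dS/dt = -k * S.
        self.storage -= self.k * self.storage * self.dt
        self.time += self.dt

    def get_component_name(self):
        return "LinearReservoir"

    def get_current_time(self):
        return self.time

    def get_value(self, name):
        if name == "reservoir__storage":
            return self.storage
        raise KeyError(name)

    def finalize(self):
        pass

# A coupling framework (or a web service wrapper) drives the component
# only through these interface calls, never through model internals:
model = LinearReservoirBMI()
model.initialize({"storage": 100.0, "k": 0.1, "dt": 1.0})
for _ in range(2):
    model.update()
```

Because every component answers the same calls, a framework like EMELI can initialize, step, and interrogate models it has never seen before, whether they run locally or behind a web service.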
The dynamics of multimodal integration: The averaging diffusion model.
Turner, Brandon M; Gao, Juan; Koenig, Scott; Palfy, Dylan; McClelland, James L
2017-12-01
We combine extant theories of evidence accumulation and multimodal integration to develop an integrated framework for modeling multimodal integration as a process that unfolds in real time. Many studies have formulated sensory processing as a dynamic process where noisy samples of evidence are accumulated until a decision is made. However, these studies are often limited to a single sensory modality. Studies of multimodal stimulus integration have focused on how best to combine different sources of information to elicit a judgment. These studies are often limited to a single time point, typically after the integration process has occurred. We address these limitations by combining the two approaches. Experimentally, we present data that allow us to study the time course of evidence accumulation within each of the visual and auditory domains as well as in a bimodal condition. Theoretically, we develop a new Averaging Diffusion Model in which the decision variable is the mean rather than the sum of evidence samples and use it as a base for comparing three alternative models of multimodal integration, allowing us to assess the optimality of this integration. The outcome reveals rich individual differences in multimodal integration: while some subjects' data are consistent with adaptive optimal integration, reweighting sources of evidence as their relative reliability changes during evidence integration, others exhibit patterns inconsistent with optimality.
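The core contrast can be sketched in simulation (an illustrative sketch, not the authors' code): the decision variable is the running mean of noisy evidence samples, checked against a symmetric bound.

```python
import random

def averaging_diffusion_trial(drift, noise_sd, bound, max_steps=10_000, seed=0):
    """Single trial in which the decision variable is the running MEAN of the
    evidence samples (a standard diffusion model accumulates their SUM)."""
    rng = random.Random(seed)
    total = 0.0
    for n in range(1, max_steps + 1):
        total += rng.gauss(drift, noise_sd)   # one noisy evidence sample
        mean = total / n                      # averaging decision variable
        if abs(mean) >= bound:
            return ("upper" if mean > 0 else "lower"), n
    return "none", max_steps
```

Because the mean converges to the drift as samples accrue, a fixed bound becomes harder to cross late in a trial, which is exactly how this model's dynamics diverge from sum-based accumulation.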
Elementary Integrated Curriculum Framework
ERIC Educational Resources Information Center
Montgomery County Public Schools, 2010
2010-01-01
The Elementary Integrated Curriculum (EIC) Framework is the guiding curriculum document for the Elementary Integrated Curriculum and represents the elementary portion of the Montgomery County (Maryland) Public Schools (MCPS) Pre-K-12 Curriculum Frameworks. The EIC Framework contains the detailed indicators and objectives that describe what…
OpenDanubia - An integrated, modular simulation system to support regional water resource management
NASA Astrophysics Data System (ADS)
Muerth, M.; Waldmann, D.; Heinzeller, C.; Hennicker, R.; Mauser, W.
2012-04-01
The now-completed, multi-disciplinary research project GLOWA-Danube developed a regional-scale, integrated modeling system, which was successfully applied to the 77,000 km2 Upper Danube basin to investigate the impact of Global Change on both the natural and anthropogenic water cycle. At the end of the last project phase, the integrated modeling system was transferred into the open source project OpenDanubia, which now provides both the core system and all major model components to the general public. First, this enables decision makers from government, business and management to use OpenDanubia as a tool for proactive management of water resources in the context of global change. Second, the modeling framework for integrated simulations and all simulation models developed for OpenDanubia within GLOWA-Danube remain available for future developments and research questions. OpenDanubia allows for the investigation of water-related scenarios considering different ecological and economic aspects, supporting both scientists and policy makers in designing policies for sustainable environmental management. OpenDanubia is designed as a framework-based, distributed system. The model system couples spatially distributed physical and socio-economic processes at run-time, taking into account their mutual influence. To simulate the potential future impacts of Global Change on agriculture, industrial production, water supply, households and tourism businesses, so-called deep actor models are implemented in OpenDanubia. All important water-related fluxes and storages in the natural environment are implemented in OpenDanubia as spatially explicit, process-based modules. These include the land surface water and energy balance, dynamic plant water uptake, groundwater recharge and flow, as well as river routing and reservoirs.
Although the complete system is relatively demanding in terms of data and hardware requirements, the modular structure and the generic core system (Core Framework, Actor Framework) allow application in new regions and the selection of a reduced set of modules for simulation. As part of the Open Source Initiative in GLOWA-Danube (opendanubia.glowa-danube.de), comprehensive documentation for the system installation was created, and the program code of both the framework and all major components is licensed under the GNU General Public License. In addition, helpful programs and scripts necessary for the operation and the processing of input and result data sets are provided.
Epanchin-Niell, Rebecca S.; Boyd, James W.; Macauley, Molly K.; Scarlett, Lynn; Shapiro, Carl D.; Williams, Byron K.
2018-05-07
Executive Summary—Overview. Natural resource managers must make decisions that affect broad-scale ecosystem processes involving large spatial areas, complex biophysical interactions, numerous competing stakeholder interests, and highly uncertain outcomes. Natural and social science information and analyses are widely recognized as important for informing effective management. Chief among the systematic approaches for improving the integration of science into natural resource management are two emergent science concepts, adaptive management and ecosystem services. Adaptive management (also referred to as “adaptive decision making”) is a deliberate process of learning by doing that focuses on reducing uncertainties about management outcomes and system responses to improve management over time. Ecosystem services is a conceptual framework that refers to the attributes and outputs of ecosystems (and their components and functions) that have value for humans. This report explores how ecosystem services can be moved from concept into practice through connection to a decision framework—adaptive management—that accounts for inherent uncertainties. Simultaneously, the report examines the value of incorporating ecosystem services framing and concepts into adaptive management efforts. Adaptive management and ecosystem services analyses have not typically been used jointly in decision making. However, as frameworks, they have a natural—but to date underexplored—affinity. Both are policy and decision oriented in that they attempt to represent the consequences of resource management choices on outcomes of interest to stakeholders. Both adaptive management and ecosystem services analysis take an empirical approach to the analysis of ecological systems. This systems orientation is a byproduct of the fact that natural resource actions affect ecosystems—and corresponding societal outcomes—often across large geographic scales.
Moreover, because both frameworks focus on resource systems, both must confront the analytical challenges of systems modeling in terms of complexity, dynamics, and uncertainty. Given this affinity, the integration of ecosystem services analysis and adaptive management poses few conceptual hurdles. In this report, we synthesize discussions from two workshops that considered ways in which adaptive management approaches and ecosystem service concepts may be complementary, such that integrating them into a common framework may lead to improved natural resource management outcomes. Although the literature on adaptive management and ecosystem services is vast and growing, the report focuses specifically on the integration of these two concepts rather than aiming to provide new definitions or an in-depth review or primer of the concepts individually. Key issues considered include the bidirectional links between adaptive decision making and ecosystem services, as well as the potential benefits and inevitable challenges arising in the development and use of an integrated framework. Specifically, the workshops addressed the following questions: How can application of ecosystem service analysis within an adaptive decision process improve the outcomes of management and advance understanding of ecosystem service identification, production, and valuation? How can these concepts be integrated in concept and practice? What are the constraints and challenges to integrating adaptive management and ecosystem services? And should the integration of these concepts be moved forward to wider application, and if so, how?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukić, Zarija
2016-04-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The developers present a novel design for running multiple codes in situ: using coroutines and position-independent executables, they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of this design, and illustrate its versatility by tackling analysis tasks with different computational requirements. The design differs significantly from existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.
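The cooperative-multitasking idea behind this design can be sketched with ordinary Python generators; everything here (function names, the toy "physics") is illustrative, not Henson's actual API:

```python
def simulation(n_steps):
    """Toy simulation: advances its state, then yields it after each time step."""
    state = 0.0
    for step in range(n_steps):
        state += 1.0          # stand-in for real physics
        yield step, state     # cooperative yield: hand control to the analyses

def run_in_situ(n_steps, analyses):
    """Drive the simulation and run each analysis while data is still in memory."""
    results = []
    for step, state in simulation(n_steps):
        for analyze in analyses:
            results.append(analyze(step, state))
    return results

# One toy analysis task: running mean of the state.
mean_tracker = lambda step, state: ("mean", step, state / (step + 1))
out = run_in_situ(3, [mean_tracker])
```

In Henson itself the simulation is an unmodified position-independent executable and control transfer happens via real coroutines; the generator here only mimics the pattern of yielding control to analysis code without ever writing the state to disk.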
Design of Knowledge Management System for Diabetic Complication Diseases
NASA Astrophysics Data System (ADS)
Fiarni, Cut
2017-01-01
This paper examines how to develop a model for a Knowledge Management System (KMS) for diabetes complication diseases. People with diabetes have a higher risk of developing a series of serious health problems, and each patient has a different condition that can lead to different diseases and health problems. With the right information, however, a patient can benefit from early detection so that health risks are minimized or avoided. The objective of this research is therefore to propose a conceptual framework that integrates a social network model, knowledge management activities, and case-based reasoning (CBR) for designing such a diabetes health and complication disease KMS. The framework indicates that the critical knowledge management activities lie in the process of finding similar cases and in the index table that lets the retrieval algorithm fit the framework to social media content. With this framework, KMS developers can work with healthcare providers to easily identify the IT suitable for the CBR process when developing a diabetes KMS.
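The "find similar case" step of a CBR-based KMS can be sketched as nearest-neighbour retrieval over a small case base; the features, values, and outcome labels below are invented for illustration only:

```python
import math

# Hypothetical patient cases: feature vector -> previously observed complication.
CASE_BASE = [
    ({"hba1c": 9.1, "bp": 150, "bmi": 31}, "retinopathy"),
    ({"hba1c": 7.2, "bp": 128, "bmi": 24}, "none"),
    ({"hba1c": 8.5, "bp": 160, "bmi": 29}, "nephropathy"),
]

def distance(a, b):
    """Euclidean distance over the shared numeric features."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def retrieve_similar(query, case_base):
    """CBR 'retrieve' step: return the stored case most similar to the query."""
    return min(case_base, key=lambda case: distance(query, case[0]))

# New patient: retrieval suggests which prior case (and outcome) is closest.
features, outcome = retrieve_similar({"hba1c": 8.9, "bp": 152, "bmi": 30}, CASE_BASE)
```

A production system would normalise features, weight them clinically, and index the case base for scale; this sketch only shows the similarity lookup that the paper identifies as the critical activity.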
Conceptual design of distillation-based hybrid separation processes.
Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang
2013-01-01
Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.
NASA Astrophysics Data System (ADS)
Beller, E.; Robinson, A.; Grossinger, R.; Grenier, L.; Davenport, A.
2015-12-01
Adaptation to climate change requires redesigning our landscapes and watersheds to maximize ecological resilience at large scales and integrated across urban areas, wildlands, and a diversity of ecosystem types. However, it can be difficult for environmental managers and designers to access, interpret, and apply resilience concepts at meaningful scales and across a range of settings. To address this gap, we produced a Landscape Resilience Framework that synthesizes the latest science on the qualitative mechanisms that drive resilience of ecological functions to climate change and other large-scale stressors. The framework is designed to help translate resilience science into actionable ecosystem conservation and restoration recommendations and adaptation strategies by providing a concise but comprehensive list of considerations that will help integrate resilience concepts into urban design, conservation planning, and natural resource management. The framework is composed of seven principles that represent core attributes which determine the resilience of ecological functions within a landscape. These principles are: setting, process, connectivity, redundancy, diversity/complexity, scale, and people. For each principle we identify several key operationalizable components that help illuminate specific recommendations and actions that are likely to contribute to landscape resilience for locally appropriate species, habitats, and biological processes. We are currently using the framework to develop landscape-scale recommendations for ecological resilience in the heavily urbanized Silicon Valley, California, in collaboration with local agencies, companies, and regional experts. The resilience framework is being applied across the valley, including urban, suburban, and wildland areas and terrestrial and aquatic ecosystems. 
Ultimately, the framework will underpin the development of strategies that can be implemented to bolster ecological resilience from a site to landscape scale.
An integrative health information systems approach for facilitating strategic planning in hospitals.
Killingsworth, Brenda; Newkirk, Henry E; Seeman, Elaine
2006-01-01
This article presents a framework for developing strategic information systems (SISs) for hospitals. It proposes a SIS formulation process which incorporates complexity theory, strategic/organizational analysis theory, and conventional MIS development concepts. Within the formulation process, four dimensions of SIS are proposed as well as an implementation plan. A major contribution of this article is the development of a hospital SIS framework which permits an organization to fluidly respond to external, interorganizational, and intraorganizational influences. In addition, this article offers a checklist which managers can utilize in developing an SIS in health care.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao
In this paper, we apply an advanced safeguards approach and associated process monitoring methods to a hypothetical nuclear material processing system. The assessment of the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework, which provides an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time-series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete-event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved, because decision making can benefit not only from known time-series relationships among measured signals but also from known event-sequence relationships among generated events. This available knowledge at both the time-series and discrete-event layers can then be used effectively to synthesize observation solutions that optimally balance sensor and data processing requirements. The proposed approach is then applied to an illustrative monitored system based on pyroprocessing, and results are discussed.
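A minimal sketch of the hybrid idea combines a time-driven threshold check with an event-driven sequence check; the threshold and the expected operation sequence here are made up, not taken from the paper:

```python
# Event layer: the known, legitimate order of plant operations (illustrative).
EXPECTED_SEQUENCE = ["load", "heat", "separate", "unload"]

def time_driven_alarm(signal, limit):
    """Time-series layer: flag any measurement outside the allowed limit."""
    return any(abs(x) > limit for x in signal)

def event_driven_alarm(events):
    """Event layer: flag operations that deviate from the known sequence."""
    return events != EXPECTED_SEQUENCE[: len(events)]

def anomaly(signal, events, limit=10.0):
    """System-level decision: anomalous if either layer raises a flag."""
    return time_driven_alarm(signal, limit) or event_driven_alarm(events)

ok = anomaly([1.2, 3.4, 2.2], ["load", "heat"])       # both layers normal
bad = anomaly([1.2, 3.4, 2.2], ["load", "separate"])  # out-of-order event
```

The point of the unified framework is exactly this OR at the decision level: a signal excursion or an out-of-sequence operation each suffices to trigger scrutiny, and in the real system the two layers also share knowledge to place sensors efficiently.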
A Conceptual Framework for SAHRA Integrated Multi-resolution Modeling in the Rio Grande Basin
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Springer, E.; Wagener, T.; Brookshire, D.; Duffy, C.
2004-12-01
The sustainable management of water resources in a river basin requires an integrated analysis of the social, economic, environmental and institutional dimensions of the problem. Numerical models are commonly used for integration of these dimensions and for communication of the analysis results to stakeholders and policy makers. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated multi-resolution models to assess impacts of climate variability and land use change on water resources in the Rio Grande Basin. These models not only couple natural systems such as surface and ground waters, but will also include engineering, economic and social components that may be involved in water resources decision-making processes. This presentation will describe the conceptual framework being developed by SAHRA to guide and focus the multiple modeling efforts and to assist the modeling team in planning, data collection and interpretation, communication, evaluation, etc. One of the major components of this conceptual framework is a Conceptual Site Model (CSM), which describes the basin and its environment based on existing knowledge and identifies what additional information must be collected to develop technically sound models at various resolutions. The initial CSM is based on analyses of basin profile information that has been collected, including a physical profile (e.g., topographic and vegetative features), a man-made facility profile (e.g., dams, diversions, and pumping stations), and a land use and ecological profile (e.g., demographics, natural habitats, and endangered species). Based on the initial CSM, a Conceptual Physical Model (CPM) is developed to guide and evaluate the selection of a model code (or numerical model) for each resolution to conduct simulations and predictions. 
A CPM identifies, conceptually, all the physical processes and engineering and socio-economic activities occurring (or to occur) in the real system that the corresponding numerical models are required to address, such as riparian evapotranspiration responses to vegetation change and groundwater pumping impacts on soil moisture contents. Simulation results from different resolution models and observations of the real system will then be compared to evaluate the consistency among the CSM, the CPMs, and the numerical models, and feedbacks will be used to update the models. In a broad sense, the evaluation of the models (conceptual or numerical), as well as the linkages between them, can be viewed as a part of the overall conceptual framework. As new data are generated and understanding improves, the models will evolve and the overall conceptual framework will be refined; its development is thus an ongoing process. We will describe the current state of this framework and the open questions that have to be addressed in the future.
Olowoporoku, Dotun; Hayes, Enda; Longhurst, James; Parkhurst, Graham
2012-06-30
Regardless of its intent and purposes, the first decade of the Local Air Quality Management (LAQM) framework had little or no effect in reducing traffic-related air pollution in the UK. Apart from the impact of increased traffic volumes, the major factor attributed to this failure is a policy disconnect between the process of diagnosing air pollution and its management, which limits the capability of local authorities to control traffic-related sources of air pollution. Integrating air quality management into the Local Transport Plan (LTP) process therefore presents opportunities for enabling political will, funding and a joined-up policy approach to reduce this limitation. However, despite the increased access to resources for air quality measures within the LTP process, there are local institutional, political and funding constraints which reduce the impact of these policy interventions on air quality management. This paper illustrates the policy implementation gaps between central government policy intentions and the local government process by providing evidence of the deprioritisation of air quality management compared to the other shared priorities in the LTP process. We draw conclusions on the policy and practice of integrating air quality management into transport planning. The evidence indicates the need for a policy shift from a solely localised hotspot management approach, within which the LAQM framework operates, to a more holistic management of vehicular emissions within wider spatial administrative areas. Copyright © 2012 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tao Tang; Tan Zhu; He Xu
China has put forward 'striving to build an environmentally friendly society' as one of its most important development goals, and the land administration authorities face the challenge of effectively incorporating environmental considerations into their planning system. This paper aims to investigate why and how Strategic Environmental Assessment (SEA) is enacted as an effective tool to integrate the environment into land-use planning during the construction of an environmentally friendly society in China, and to identify factors that influence the integration. It presents characteristics of the land-use planning system and reviews the progress and current state of SEA in China. Results show that SEA provides many benefits in promoting environmental considerations in the land-use planning process. The legal frameworks and operational procedures, in the context of land-use master planning SEA, are summarized and an assessment made of their effectiveness. Some barriers are highlighted through examination of the latest case studies, and several recommendations are presented to overcome these obstacles.
Mahomed, Ozayr Haroon; Asmall, Shaidah
2015-01-01
Background: South Africa is facing a complex burden of disease arising from a combination of chronic infectious illness and non-communicable diseases. As the burden of chronic diseases (communicable and non-communicable) increases, providing affordable and effective care to the increasing numbers of chronic patients will be an immense challenge. Methods: The framework recommended by the Medical Research Council of the United Kingdom for the development and evaluation of complex health interventions was used to conceptualise the intervention. The breakthrough series was utilised for the implementation process. These two frameworks were embedded within the clinical practice improvement model that served as the overarching framework for the development and implementation of the model. Results: The Chronic Care Model was ideally suited to improve the facility component and patient experience; however, the deficiencies in other aspects of the health system building blocks necessitated a hybrid model. An integrated chronic disease management model using a health systems approach was initiated across 42 primary health care facilities. The interventions were implemented in a phased approach using learning sessions and action periods to introduce the planned and targeted changes. Conclusion: The implementation of the integrated chronic disease management model is feasible at primary care in South Africa provided that systemic challenges and change management are addressed during the implementation process. PMID:26528101
A universal deep learning approach for modeling the flow of patients under different severities.
Jiang, Shancheng; Chin, Kwai-Sang; Tsui, Kwok L
2018-02-01
The Accident and Emergency Department (A&ED) is the frontline for providing emergency care in hospitals. Unfortunately, A&ED resources have failed to keep up with continuously increasing demand in recent years, which leads to overcrowding in A&EDs. Knowing the fluctuation of patient arrival volume in advance is an important prerequisite for relieving this pressure. Motivated by this, the objective of this study is to explore an integrated framework with high accuracy for predicting A&ED patient flow under different triage levels, by combining a novel feature selection process with deep neural networks. Administrative data were collected from an actual A&ED and categorized into five groups based on triage level. A genetic algorithm (GA)-based feature selection algorithm is improved and implemented as a pre-processing step for this time-series prediction problem, in order to identify the key features affecting patient flow. In the improved GA, a fitness-based crossover is proposed to maintain the joint information of multiple features during the iterative process, instead of traditional point-based crossover. Deep neural networks (DNNs) are employed as the prediction model because of their universal adaptability and high flexibility. In the model-training process, the learning algorithm is configured around a parallel stochastic gradient descent algorithm, and two effective regularization strategies are integrated in one DNN framework to avoid overfitting. All introduced hyper-parameters are optimized efficiently by grid search in one pass. As for feature selection, the improved GA-based feature selection algorithm outperformed a typical GA and four state-of-the-art feature selection algorithms (mRMR, SAFS, VIFR, and CFR). 
As for the prediction accuracy of the proposed integrated framework, compared with frequently used statistical models (GLM, seasonal ARIMA, ARIMAX, and ANN) and modern machine learning models (SVM-RBF, SVM-linear, RF, and R-LASSO), the proposed integrated "DNN-I-GA" framework achieves higher prediction accuracy on both MAPE and RMSE metrics in pairwise comparisons. The contribution of this study is two-fold. Theoretically, the traditional GA-based feature selection process is improved to have fewer hyper-parameters and higher efficiency, the joint information of multiple features is maintained by the fitness-based crossover operator, and the universal property of DNNs is further enhanced by merging different regularization strategies. Practically, features selected by the improved GA can be used to uncover the underlying relationship between patient flows and input features. Predicted values are significant indicators of patient demand and can be used by A&ED managers for resource planning and allocation. The high accuracy achieved by the present framework in different cases enhances the reliability of downstream decision making. Copyright © 2017 Elsevier B.V. All rights reserved.
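A toy version of GA-based feature selection with a fitness-weighted crossover can be sketched as follows; the fitness function, the weighting scheme, and all parameter values are guesses at the general idea, not the authors' actual operator:

```python
import random

random.seed(0)

def fitness(mask, relevance):
    """Toy fitness: summed relevance of selected features, penalizing subset size."""
    return sum(r for m, r in zip(mask, relevance) if m) - 0.1 * sum(mask)

def fitness_crossover(p1, p2, f1, f2):
    """Fitness-based crossover: each gene is inherited from a parent with
    probability weighted by that parent's fitness (an illustration of keeping
    joint feature information, rather than cutting at a single point)."""
    w = f1 / (f1 + f2) if (f1 + f2) > 0 else 0.5
    return [a if random.random() < w else b for a, b in zip(p1, p2)]

def ga_select(relevance, pop_size=20, generations=30):
    n = len(relevance)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda m: fitness(m, relevance), reverse=True)
        parents = scored[: pop_size // 2]          # elitism: keep the top half
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            child = fitness_crossover(p1, p2,
                                      fitness(p1, relevance),
                                      fitness(p2, relevance))
            if random.random() < 0.1:              # point mutation
                i = random.randrange(n)
                child[i] = 1 - child[i]
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda m: fitness(m, relevance))

# Features 0 and 2 are the only informative ones in this toy relevance vector,
# so a working GA should select them.
best = ga_select([0.9, 0.0, 0.8, 0.05, 0.0])
```

In the paper the fitness would instead be the downstream DNN's prediction error on held-out data, which is what makes GA wrapping expensive and the reduced hyper-parameter count valuable.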
NASA Astrophysics Data System (ADS)
Xu, Xiao-Shuang; Wang, Hong-Lv
2018-03-01
Starting from the formulas of cigarette products, a synergized business framework is established on the basis of cross-enterprise synergies for tobacco leaf threshing and redrying, through the introduction of batch management, remote quality-data sharing and consistent processes, among others. The functions of the business framework are realized, and a platform for synergies is built, by applying IoT, cross-enterprise system integration and big-data processing technologies. The result is a new pattern of intensive interaction and synergy between China Tobacco Zhejiang (CTZ) and the tobacco redrying plants, giving more delicate management of the redrying process, more interactive information flows and more stable tobacco strip quality.
NASA Astrophysics Data System (ADS)
Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang
2010-11-01
This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to accommodate multiple algorithms, helping to decrease simulation time at low expense. Imaging simulation for a satellite-mounted TDI-CCD comprises four processes: 1) degradation due to the atmosphere, 2) degradation due to the optical system, 3) degradation due to the TDI-CCD's electronic system plus a re-sampling process, and 4) data integration. Processes 1) to 3) use data-intensive algorithms such as FFT, convolution and Lagrange interpolation, which require powerful CPUs. Even on an Intel Xeon X5550 processor, the conventional serial method takes more than 30 hours for a simulation whose result image is 1500 * 1462 pixels. A literature study found no mature distributed HPC framework in this field. The authors therefore developed a distributed computing framework for TDI-CCD imaging simulation based on WCF [1]; it uses a client/server (C/S) architecture and recruits idle CPU resources on the LAN, with the server pushing the tasks of processes 1) to 3) to this free computing capacity, ultimately achieving HPC at low cost. In an experiment with 4 symmetric nodes and 1 server, the framework reduced simulation time by about 74%, and adding more asymmetric nodes to the computing network decreased the time further. In conclusion, this framework can provide virtually unlimited computation capacity provided the network and task-management server are affordable, offering a new HPC solution for TDI-CCD imaging simulation and similar applications.
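The strategy-pattern arrangement described above can be sketched as interchangeable degradation stages behind one interface; the "algorithms" here are trivial stand-ins for the real FFT/convolution stages, and all names are illustrative:

```python
class DegradationStrategy:
    """Common interface: each simulation stage transforms an image."""
    def apply(self, image):
        raise NotImplementedError

class AtmosphereDegradation(DegradationStrategy):
    def apply(self, image):
        return [0.9 * p for p in image]   # stand-in for an FFT-based blur

class OpticsDegradation(DegradationStrategy):
    def apply(self, image):
        return [p + 0.05 for p in image]  # stand-in for a PSF convolution

class ImagingPipeline:
    """Processes 1) to 3) as swappable strategies, applied in order.
    A distributed version would push each apply() call to an idle node."""
    def __init__(self, strategies):
        self.strategies = strategies

    def run(self, image):
        for s in self.strategies:
            image = s.apply(image)
        return image

pipeline = ImagingPipeline([AtmosphereDegradation(), OpticsDegradation()])
result = pipeline.run([1.0, 2.0])
```

The benefit the paper relies on is that swapping one stage's algorithm (or moving it to another machine) changes only which strategy object is plugged in, not the pipeline code.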
Development of a new approach to cumulative effects assessment: a northern river ecosystem example.
Dubé, Monique; Johnson, Brian; Dunn, Gary; Culp, Joseph; Cash, Kevin; Munkittrick, Kelly; Wong, Isaac; Hedley, Kathlene; Booty, William; Lam, David; Resler, Oskar; Storey, Alex
2006-02-01
If sustainable development of Canadian waters is to be achieved, a realistic and manageable framework is required for assessing cumulative effects. The objective of this paper is to describe an approach for aquatic cumulative effects assessment that was developed under the Northern Rivers Ecosystem Initiative. The approach is based on a review of existing monitoring practices in Canada and the presence of existing thresholds for aquatic ecosystem health assessments. It suggests that a sustainable framework is possible for cumulative effects assessment of Canadian waters that would result in integration of national indicators of aquatic health, integration of national initiatives (e.g., water quality index, environmental effects monitoring), and provide an avenue where long-term monitoring programs could be integrated with baseline and follow-up monitoring conducted under the environmental assessment process.
A human-oriented framework for developing assistive service robots.
McGinn, Conor; Cullinan, Michael F; Culleton, Mark; Kelly, Kevin
2018-04-01
Multipurpose robots that can perform a range of useful tasks have the potential to increase the quality of life for many people living with disabilities. Owing to factors such as high system complexity, as-yet unresolved research questions and current technology limitations, there is a need for effective strategies to coordinate the development process. Integrating established methodologies based on human-centred design and universal design, a framework was formulated to coordinate the robot design process over successive iterations of prototype development. An account is given of how the framework was practically applied to the problem of developing a personal service robot. Application of the framework led to the formation of several design goals which addressed a wide range of identified user needs. The resultant prototype solution, which consisted of several component elements, succeeded in demonstrating the performance stipulated by all of the proposed metrics. Application of the framework resulted in the development of a complex prototype that addressed many aspects of the functional and usability requirements of a personal service robot. Following the process led to several important insights which directly benefit the development of subsequent prototypes. Implications for Rehabilitation: This research shows how universal design might be used to formulate usability requirements for assistive service robots. A framework is presented that guides the process of designing service robots in a human-centred way. Through practical application of the framework, a prototype robot system that addressed a range of identified user needs was developed.
Colonius, Hans; Diederich, Adele
2011-07-01
The concept of a "time window of integration" holds that information from different sensory modalities must not be perceived too far apart in time in order to be integrated into a multisensory perceptual event. Empirical estimates of window width differ widely, however, ranging from 40 to 600 ms depending on context and experimental paradigm. Searching for a theoretical derivation of window width, Colonius and Diederich (Front Integr Neurosci 2010) developed a decision-theoretic framework using a decision rule that is based on the prior probability of a common source, the likelihood of temporal disparities between the unimodal signals, and the payoff for making right or wrong decisions. Here, this framework is extended to the focused attention task, where subjects are asked to respond to signals from a target modality only. Invoking the framework of the time-window-of-integration (TWIN) model, an explicit expression for optimal window width is obtained. The approach is probed on two published focused attention studies. The first is a saccadic reaction time study assessing the efficiency with which multisensory integration varies as a function of aging. Although the window widths for young and older adults differ by nearly 200 ms, presumably due to their different peripheral processing speeds, neither deviates significantly from the optimal values. In the second study, head saccadic reaction times to a perfectly aligned audiovisual stimulus pair had been shown to depend on the prior probability of spatial alignment. Intriguingly, they reflected the magnitude of the time-window widths predicted by the decision-theoretic framework: a larger time window is associated with a higher prior probability.
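The decision rule at the heart of such a framework can be illustrated with a small Bayesian calculation: integrate two signals only if the posterior probability of a common source, given their temporal disparity, exceeds a payoff-determined criterion. The Gaussian likelihoods and every parameter value below are illustrative choices, not the model's fitted values:

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def posterior_common(delta_t, prior=0.5, sigma_common=50.0, sigma_separate=200.0):
    """Posterior probability that two signals share a common source, given
    their temporal disparity delta_t (ms). Common sources produce tightly
    clustered disparities (small sigma); separate sources produce broad ones."""
    l_common = gaussian(delta_t, 0.0, sigma_common)
    l_separate = gaussian(delta_t, 0.0, sigma_separate)
    num = prior * l_common
    return num / (num + (1 - prior) * l_separate)

def integrate(delta_t, criterion=0.5, **kw):
    """Decision rule: integrate only if the common-source posterior exceeds a
    criterion, which in the full model is set by the payoffs for right and
    wrong decisions."""
    return posterior_common(delta_t, **kw) > criterion

near = integrate(20.0)    # small disparity: integrate
far = integrate(300.0)    # large disparity: keep the signals separate
```

Raising the prior probability of a common source widens the set of disparities for which the posterior clears the criterion, which is the qualitative effect the head-saccade study observed.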
Genomic perspectives in microbial oceanography.
DeLong, Edward F; Karl, David M
2005-09-15
The global ocean is an integrated living system where energy and matter transformations are governed by interdependent physical, chemical and biotic processes. Although the fundamentals of ocean physics and chemistry are well established, comprehensive approaches to describing and interpreting oceanic microbial diversity and processes are only now emerging. In particular, the application of genomics to problems in microbial oceanography is significantly expanding our understanding of marine microbial evolution, metabolism and ecology. Integration of these new genome-enabled insights into the broader framework of ocean science represents one of the great contemporary challenges for microbial oceanographers.
NASA Astrophysics Data System (ADS)
Bongartz, K.; Flügel, W. A.
2003-04-01
In the joint research project “Development of an integrated methodology for the sustainable management of river basins: The Saale River Basin example”, coordinated by the Centre of Environmental Research (UFZ), concepts and tools for the integrated management of large river basins are developed and applied to the Saale river basin. The ultimate objective of the project is to contribute to the holistic assessment and benchmarking approaches in water resource planning required by the European Water Framework Directive. The study presented here deals (1) with the development of a river basin information and modelling system and (2) with the refinement of a regionalisation approach adapted for integrated basin modelling. The approach combines a user-friendly basin disaggregation method, preserving the catchment’s physiographic heterogeneity, with a process-oriented hydrological basin assessment for scale-bridging integrated modelling. The well-tested regional distribution concept of Response Units (RUs) will be enhanced by landscape metrics and decision support tools for objective, scale-independent and problem-oriented RU delineation, providing the spatial modelling entities for process-oriented and distributed simulation of vertical and lateral hydrological transport processes. On the basis of these RUs, suitable hydrological modelling approaches will be further developed, with particular attention to a more detailed simulation of the lateral surface and subsurface flows as well as the channel flow. This methodical enhancement of the well-recognised RU concept will be applied to the river basin of the Saale (Ac: 23 179 km2) and validated by a nested catchment approach, which allows multi-response validation and estimation of the uncertainties of the modelling results. 
Integrated modelling of such a complex basin strongly influenced by manifold human activities (reservoirs, agriculture, urban areas and industry) can only be achieved by coupling the various modelling approaches within a well defined model framework system. The latter is interactively linked with a sophisticated geo-relational database (DB) serving all research teams involved in the project. This interactive linkage is a core element comprising an object-oriented, internet-based modelling framework system (MFS) for building interdisciplinary modelling applications and offering different analysis and visualisation tools.
Maximizing your Process Improvement ROI through Harmonization
2008-03-01
… ISO 12207) provide comprehensive guidance on what system and software engineering processes are needed. The frameworks of Six Sigma provide specific … reductions. Their veloci-Q Enterprise integrated system includes ISO 9001, CMM, P-CMM, TL9000, British Standard 7799, and Six Sigma. They estimate a 30 … at their discretion. And they chose to blend process maturity models and ISO standards to support their objective regarding the establishment of …
Integrating Data Clustering and Visualization for the Analysis of 3D Gene Expression Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Data Analysis and Visualization; International Research Training Group "Visualization of Large and Unstructured Data Sets," University of Kaiserslautern, Germany; Computational Research Division, Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720, USA
2008-05-12
The recent development of methods for extracting precise measurements of spatial gene expression patterns from three-dimensional (3D) image data opens the way for new analyses of the complex gene regulatory networks controlling animal development. We present an integrated visualization and analysis framework that supports user-guided data clustering to aid exploration of these new complex datasets. The interplay of data visualization and clustering-based data classification leads to improved visualization and enables a more detailed analysis than previously possible. We discuss (i) integration of data clustering and visualization into one framework; (ii) application of data clustering to 3D gene expression data; (iii) evaluation of the number of clusters k in the context of 3D gene expression clustering; and (iv) improvement of overall analysis quality via dedicated post-processing of clustering results based on visualization. We discuss the use of this framework to objectively define spatial pattern boundaries and temporal profiles of genes and to analyze how mRNA patterns are controlled by their regulatory transcription factors.
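Point (iii) above, choosing the number of clusters k, can be made concrete with a small sketch. The snippet below is illustrative only: plain k-means with deterministic farthest-point seeding on synthetic expression-like vectors, scored with a silhouette coefficient. It is not the authors' framework, and all data and names are invented.

```python
import numpy as np

def init_centroids(X, k):
    """Deterministic farthest-point seeding: spreads starting centroids out."""
    centroids = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[d.argmax()])
    return np.array(centroids)

def kmeans(X, k, iters=100):
    """Plain Lloyd's algorithm; returns a cluster label for each row of X."""
    centroids = init_centroids(X, k)
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels

def silhouette(X, labels):
    """Mean silhouette coefficient: high when clusters are tight and far apart."""
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    scores = []
    for i in range(n):
        own = (labels == labels[i]) & (np.arange(n) != i)
        if not own.any():                      # singleton cluster: score 0
            scores.append(0.0)
            continue
        a = D[i, own].mean()                   # mean distance within own cluster
        b = min(D[i, labels == c].mean()       # ...to the nearest other cluster
                for c in set(labels) if c != labels[i])
        scores.append((b - a) / max(a, b))
    return float(np.mean(scores))

# Synthetic "expression profiles": three well-separated groups of 30 cells.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(30, 4))
               for m in (0.0, 3.0, 6.0)])
best_k = max(range(2, 7), key=lambda k: silhouette(X, kmeans(X, k)))
print(best_k)  # the three separated groups make k = 3 the clear winner
```

In the paper's setting the vectors would be per-cell expression measurements rather than synthetic blobs, and the cluster evaluation would be steered interactively through the visualization rather than by a single score.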
Satellites, tweets, forecasts: the future of flood disaster management?
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Kalas, Milan; Lorini, Valerio; Wania, Annett; Pappenberger, Florian; Salamon, Peter; Ramos, Maria Helena; Cloke, Hannah; Castillo, Carlos
2017-04-01
Floods have devastating effects on lives and livelihoods around the world. Structural flood defence measures such as dikes and dams can help protect people. However, it is the emerging science and technologies for flood disaster management and preparedness, such as increasingly accurate flood forecasting systems, high-resolution satellite monitoring, rapid risk mapping, and the unique strength of social media information and crowdsourcing, that are most promising for reducing the impacts of flooding. Here, we describe an innovative framework which integrates in real time two components of the Copernicus Emergency Management Service, namely the European Flood Awareness System and the satellite-based Rapid Mapping, with new procedures for rapid risk assessment and social media and news monitoring. The integrated framework enables improved flood impact forecasts, thanks to the real-time integration of forecasting and monitoring components, and increases the timeliness and efficiency of satellite mapping, with the aim of capturing flood peaks and following the evolution of flooding processes. Thanks to the proposed framework, emergency responders will have access to a broad range of timely and accurate information for more effective and robust planning, decision-making, and resource allocation.
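The decision logic that fuses the two real-time streams described here, a forecast signal and a social-media signal, can be sketched as a simple trigger rule. The thresholds, function name, and fusion rule below are invented for illustration; the actual framework's criteria are not public in this abstract.

```python
# Hypothetical trigger combining a forecast exceedance probability with a
# burst of flood-related social-media reports from the same area.
FORECAST_PROB_MIN = 0.6   # ensemble probability of exceeding a flood threshold
REPORTS_MIN = 25          # flood-related posts per hour from the area

def should_task_rapid_mapping(forecast_prob, reports_per_hour):
    """Request satellite rapid mapping when either stream is confident,
    or when both are moderately elevated (combined evidence)."""
    if forecast_prob >= FORECAST_PROB_MIN or reports_per_hour >= REPORTS_MIN:
        return True
    return forecast_prob >= 0.4 and reports_per_hour >= 10

print(should_task_rapid_mapping(0.7, 3))    # forecast alone triggers: True
print(should_task_rapid_mapping(0.45, 12))  # combined evidence: True
print(should_task_rapid_mapping(0.2, 4))    # neither: False
```

The point of such a rule is timeliness: satellite tasking can start from either source, so a flood peak is less likely to be missed while waiting for the other signal to confirm.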
A systematic review of socio-economic assessments in support of coastal zone management (1992-2011).
Le Gentil, Eric; Mongruel, Rémi
2015-02-01
Cooperation between the social and natural sciences has become essential in order to encompass all the dimensions of coastal zone management. Socio-economic approaches are increasingly recommended to complement integrated assessment in support of these initiatives. A systematic review of the academic literature was carried out in order to analyze the main types of socio-economic assessments used to inform the coastal zone management process as well as their effectiveness. A corpus of 1682 articles published between 1992 and 2011 was identified by means of the representative coverage approach, from which 170 were selected by applying inclusion/exclusion criteria and then classified using a content analysis methodology. The percentage of articles that mention the use of socio-economic assessment in support of coastal zone management initiatives is increasing but remains relatively low. The review examines the links between the issues addressed by integrated assessments and the chosen analytical frameworks as well as the various economic assessment methods which are used in the successive steps of the coastal zone management process. The results show that i) analytical frameworks such as 'risk and vulnerability', 'DPSIR', 'valuation', 'ecosystem services' and 'preferences' are likely to lead to effective integration of social sciences in coastal zone management research while 'integration', 'sustainability' and 'participation' remain difficult to operationalize, ii) risk assessments are insufficiently implemented in developing countries, and iii) indicator systems in support of multi-criteria analyses could be used during more stages of the coastal zone management process. Finally, it is suggested that improved collaboration between science and management would require that scientists currently involved in coastal zone management processes further educate themselves in integrated assessment approaches and participatory methodologies.
Caring as an Imperative for Nursing Education.
ERIC Educational Resources Information Center
Cook, Patricia R.; Cullen, Janice A.
2003-01-01
An associate nursing degree program threads caring across the curriculum using Watson's framework of interpersonal/transpersonal processes for caring and a taxonomy of affective competencies. Ways of caring are integrated into classroom and clinical experiences. (Contains 20 references.)
A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
1998-01-01
This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis, and (3) optimization. The product and process definitions are part of input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depend largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.
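The optimization step described above, minimizing an objective subject to manufacturability, cost, and performance constraints, has the generic shape sketched below. The spar model, numbers, constraint forms, and units are invented for illustration; they are not TASPI's cost models or manufacturability measures.

```python
from scipy.optimize import minimize

# Toy spar sizing in the IPPD-MDO spirit: minimize a weight proxy subject to
# a performance (stress) constraint and a manufacturability/cost constraint.
LOAD = 50.0  # applied bending load, arbitrary units

def weight(x):
    t, h = x                       # cap thickness and web height (cm)
    return t * h                   # weight proxy ~ cross-section area

def stress_margin(x):              # >= 0 means the stress limit is met
    t, h = x
    return 10.0 - LOAD / (t * h ** 2 / 6.0)

def cost_margin(x):                # >= 0 means the cost budget is met
    t, h = x
    return 100.0 - (20.0 * t + 4.0 * h)

res = minimize(weight, x0=[2.0, 10.0], method="SLSQP",
               bounds=[(0.5, 5.0), (5.0, 20.0)],
               constraints=[{"type": "ineq", "fun": stress_margin},
                            {"type": "ineq", "fun": cost_margin}])
print(res.success, round(float(weight(res.x)), 3))
```

In the report's framework the constraint functions would be backed by the material/process database and the variable-complexity cost models rather than closed-form toys, but the optimizer sees the same interface: design variables, an objective, and inequality margins.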
Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2010-01-01
Purpose: Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods: A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results: The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion: A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2011-07-01
Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
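The key idea in this work, decoupling processes so each runs at the rate it needs (around 1,000 Hz for haptics, far less for rendering), can be sketched without any haptics hardware. The scheduler below is a deterministic toy, not the authors' MVC framework; the process names and the 30 Hz graphics rate are illustrative, and only the roughly 1,000 Hz haptics figure comes from the abstract.

```python
from dataclasses import dataclass, field

# Deterministic sketch of decoupled simulation: each process has its own
# update rate, and a master loop dispatches them from a shared clock.

@dataclass
class Process:
    name: str
    rate_hz: float
    updates: int = 0
    _acc: float = field(default=0.0, repr=False)

    def step(self, dt):
        """Accumulate elapsed time; fire an update per elapsed period."""
        self._acc += dt
        period = 1.0 / self.rate_hz
        while self._acc >= period:
            self._acc -= period
            self.updates += 1          # per-process work would go here

def run(processes, duration_s, tick_s=1e-4):
    t = 0.0
    while t < duration_s:
        for p in processes:
            p.step(tick_s)
        t += tick_s

haptics = Process("haptics", rate_hz=1000.0)   # fast force-feedback loop
graphics = Process("graphics", rate_hz=30.0)   # slower rendering loop
run([haptics, graphics], duration_s=1.0)
print(haptics.updates, graphics.updates)       # roughly 1000 and 30 updates
```

A real framework would run the loops on separate threads or hosts and exchange state through the controller layer; the point of the sketch is only that each process advances at its own rate against one shared clock.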
Metacognition and evidence analysis instruction: an educational framework and practical experience.
Parrott, J Scott; Rubinstein, Matthew L
2015-08-21
The role of metacognitive skills in the evidence analysis process has received little attention in the research literature. While the steps of the evidence analysis process are well defined, the role of higher-level cognitive operations (metacognitive strategies) in integrating the steps of the process is not well understood. In part, this is because it is not clear where and how metacognition is implicated in the evidence analysis process nor how these skills might be taught. The purposes of this paper are to (a) suggest a model for identifying critical thinking and metacognitive skills in evidence analysis instruction grounded in current educational theory and research and (b) demonstrate how freely available systematic review/meta-analysis tools can be used to focus on higher-order metacognitive skills, while providing a framework for addressing common student weaknesses. The final goal of this paper is to provide an instructional framework that can generate critique and elaboration while providing the conceptual basis and rationale for future research agendas on this topic.
Integrating UIMA annotators in a web-based text processing framework.
Chen, Xiang; Arnold, Corey W
2013-01-01
The Unstructured Information Management Architecture (UIMA) [1] framework is a growing platform for natural language processing (NLP) applications. However, such applications may be difficult for non-technical users to deploy. This project presents a web-based framework that wraps UIMA-based annotator systems into a graphical user interface for researchers and clinicians, and a web service for developers. An annotator that extracts data elements from lung cancer radiology reports is presented to illustrate the use of the system. Annotation results from the web system can be exported to multiple formats for users to utilize in other aspects of their research and workflow. This project demonstrates the benefits of a lay-user interface for complex NLP applications. Efforts such as this can lead to increased interest in and support for NLP work in the clinical domain.
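The extract-and-export idea behind such an annotator can be sketched with a toy stand-in. UIMA annotators are Java components with their own type system; the Python snippet below only illustrates pulling data elements out of a radiology-style sentence with regular expressions and serializing them. The report text, patterns, and field names are all invented.

```python
import json
import re

# Toy stand-in for an annotator pipeline: extract simple data elements,
# with character offsets, then export them (here as JSON).
REPORT = "Impression: 2.3 cm spiculated nodule in the right upper lobe."

PATTERNS = {
    "size_cm": r"(\d+(?:\.\d+)?)\s*cm",
    "laterality": r"\b(right|left)\b",
    "lobe": r"\b(upper|middle|lower)\s+lobe\b",
}

def annotate(text):
    """Return one annotation dict per pattern match, with character offsets."""
    annotations = []
    for label, pattern in PATTERNS.items():
        for m in re.finditer(pattern, text, flags=re.IGNORECASE):
            annotations.append({"type": label, "text": m.group(1),
                                "begin": m.start(), "end": m.end()})
    return annotations

anns = annotate(REPORT)
print(json.dumps(anns, indent=2))  # CSV/XMI exporters would plug in here
```

A web wrapper of the kind the paper describes would expose `annotate` behind an HTTP endpoint and render the offsets as highlights in the browser, so clinicians never touch the pipeline internals.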
A Framework for Corporate Strategic Planning: Philosophy, Process, and Practice. Paper P-97.
ERIC Educational Resources Information Center
Amara, Roy
The objective of this booklet is to present an integrated picture of the philosophy, process, and practices of strategic planning in an organizational context. It is based on the premise that planning includes the design of a desired future as well as effective ways of bringing it about. Specifically, the document illustrates a planning…
Integrated Tales of Policies, Teaching and Teacher Education: Reflecting on an Ongoing Process
ERIC Educational Resources Information Center
Reddy, C.
2009-01-01
Changing times in teacher education has been a long mantra and many changes have been occurring globally in this sector of higher education. In South Africa teacher education change has been linked to changes in the broader education processes and includes policy changes and the development of regulatory frameworks which all impacted on practice…
Nitrogen Oxides (NOx) Primary NAAQS REVIEW: Integrated ...
The NOx Integrated Review Plan is the first document generated as part of the National Ambient Air Quality Standards (NAAQS) review process. The Plan presents background information, the schedule for the review, the process to be used in conducting the review, and the key policy-relevant science issues that will guide the review. The integrated review plan also discusses the frameworks for the various assessment documents to be prepared by the EPA as part of the review, including an Integrated Science Assessment (ISA), and as warranted, a Risk/Exposure Assessment (REA), and a Policy Assessment (PA). The primary purpose of the NOx Integrated Review Plan is to highlight the key policy-relevant issues to be considered in the Review of the NO2 primary NAAQS. A draft of the integrated review plan will be the subject of an advisory review with the Clean Air Scientific Advisory Committee (CASAC) and made available to the public for review and comment.
Sulfur Dioxide (SO2) Primary NAAQS Review: Integrated ...
The SO2 Integrated Review Plan is the first document generated as part of the National Ambient Air Quality Standards (NAAQS) review process. The Plan presents background information, the schedule for the review, the process to be used in conducting the review, and the key policy-relevant science issues that will guide the review. The integrated review plan also discusses the frameworks for the various assessment documents to be prepared by the EPA as part of the review, including an Integrated Science Assessment (ISA), and as warranted, a Risk/Exposure Assessment (REA), and a Policy Assessment (PA). The primary purpose of the SO2 Integrated Review Plan is to highlight the key policy-relevant issues to be considered in the Review of the SO2 primary NAAQS. A draft of the integrated review plan will be the subject of an advisory review with the Clean Air Scientific Advisory Committee (CASAC) and made available to the public for review and comment.
Cheng, Ching-Min; Hwang, Sheue-Ling
2015-03-01
This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. In addition, the integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid to develop much safer operational processes and can be used to predict human error rates on critical tasks in the plant.
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
Concurrent enterprise: a conceptual framework for enterprise supply-chain network activities
NASA Astrophysics Data System (ADS)
Addo-Tenkorang, Richard; Helo, Petri T.; Kantola, Jussi
2017-04-01
Supply-chain management (SCM) in manufacturing industries has evolved significantly over the years. Recent research has increasingly focused on the development of integrated solutions: those seeking a collaborative optimisation of the geographical, just-in-time (JIT), quality (customer demand/satisfaction) and return-on-investment (profit) aspects of organisational management and planning through 'best practice' business-process management concepts and applications, employing system tools such as enterprise resource planning (ERP) and SCM information technology (IT) enablers to enhance integrated product development and concurrent engineering principles. This article draws on three main organisation-theory applications in positioning its assumptions. It proposes a feasible, industry-specific framework not currently included within the SCOR model's level-four implementation level or within other existing SCM integration reference models, such as the MIT Process Handbook's Process Interchange Format (PIF) and the TOVE project, which could also be replicated in other SCs. The wider focus of this paper's contribution, however, is a complementary framework to the Supply Chain Council's SCOR reference model. Quantitative empirical closed-ended questionnaires, in addition to the main data collected from a qualitative, real-life industrial pilot case study, were used to propose a conceptual concurrent-enterprise framework for SCM network activities. This research adopts a design structure matrix simulation approach to propose an optimal enterprise SCM-networked, value-adding, customised master data-management platform/portal for efficient SCM network information exchange and an effective supply-chain (SC) network systems-design team structure.
Furthermore, social network theory analysis will be employed in a triangulation approach with statistical correlation analysis to assess the frequency, importance, degree of collaboration, and mutual trust, as well as the roles and responsibilities, within the enterprise SCM network's systems product development (PD) design teams' technical communication network, alongside extensive literature reviews.
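The design structure matrix (DSM) approach mentioned above operates on a square task-dependency matrix. A minimal, invented example of DSM sequencing, reordering tasks so information flows forward and detecting coupled blocks, might look like:

```python
import numpy as np

# DSM sketch: rows/columns are design tasks; entry [i][j] = 1 means task i
# needs information from task j. Tasks and dependencies are invented.
tasks = ["requirements", "schematic", "layout", "test_plan"]
dsm = np.array([
    [0, 0, 0, 0],   # requirements depends on nothing
    [1, 0, 0, 0],   # schematic needs requirements
    [1, 1, 0, 0],   # layout needs requirements and schematic
    [1, 1, 1, 0],   # test_plan needs everything above
])

def sequence(dsm):
    """Topological order: repeatedly schedule tasks whose inputs are done."""
    n, done, order = len(dsm), set(), []
    while len(order) < n:
        ready = [i for i in range(n) if i not in done
                 and all(j in done for j in np.nonzero(dsm[i])[0])]
        if not ready:
            raise ValueError("coupled block: remaining tasks are mutually dependent")
        done.update(ready)
        order.extend(ready)
    return order

print([tasks[i] for i in sequence(dsm)])
```

In a real SCM/PD setting the interesting cases are the coupled blocks the `ValueError` branch detects: mutually dependent tasks that must be iterated together, which is where the paper's simulation and team-structure analysis come in.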
Shaikh, Nusratnaaz M; Kersten, Paula; Siegert, Richard J; Theadom, Alice
2018-03-06
Despite increasing emphasis on the importance of community integration as an outcome for acquired brain injury (ABI), there is still no consensus on the definition of community integration. The aim of this study was to complete a concept analysis of community integration in people with ABI. The method of concept clarification was used to guide concept analysis of community integration based on a literature review. Articles were included if they explored community integration in people with ABI. Data extraction was performed by the initial coding of (1) the definition of community integration used in the articles, (2) attributes of community integration recognized in the articles' findings, and (3) the process of community integration. This information was synthesized to develop a model of community integration. Thirty-three articles were identified that met the inclusion criteria. The construct of community integration was found to be a non-linear process reflecting recovery over time, sequential goals, and transitions. Community integration was found to encompass six components including: independence, sense of belonging, adjustment, having a place to live, involved in a meaningful occupational activity, and being socially connected into the community. Antecedents to community integration included individual, injury-related, environmental, and societal factors. The findings of this concept analysis suggest that the concept of community integration is more diverse than previously recognized. New measures and rehabilitation plans capturing all attributes of community integration are needed in clinical practice. Implications for rehabilitation Understanding of perceptions and lived experiences of people with acquired brain injury through this analysis provides basis to ensure rehabilitation meets patients' needs. 
This model highlights the need for clinicians to be aware of and assess the role of antecedents as well as the attributes of community integration itself, to ensure all aspects are addressed in a manner that will enhance recovery and improve the level of integration into the community. The finding that community integration is a non-linear process also highlights the need for rehabilitation professionals to review and revise plans over time in response to a person's changing circumstances and recovery journey. This analysis provides the groundwork for an operational model of community integration for the development of a measure of community integration that assesses all six attributes revealed in this review, not recognized in previous frameworks.
Chanda, Emmanuel; Ameneshewa, Birkinesh; Mihreteab, Selam; Berhane, Araia; Zehaie, Assefash; Ghebrat, Yohannes; Usman, Abdulmumini
2015-12-02
Contemporary malaria vector control relies on the use of insecticide-based indoor residual spraying (IRS) and long-lasting insecticidal nets (LLINs). However, malaria-endemic countries, including Eritrea, have struggled to effectively deploy these tools due to technical and operational challenges, including the selection of insecticide resistance in malaria vectors. This manuscript outlines the processes undertaken in consolidating strategic planning and operational frameworks for vector control to expedite malaria elimination in Eritrea. The effort to strengthen strategic frameworks for vector control in Eritrea was the 'case' for this study. The integrated vector management (IVM) strategy was developed in 2010 but was not well executed, resulting in a rise in malaria transmission and prompting a process to redefine and relaunch the IVM strategy with the integration of other vector-borne diseases (VBDs) as the focus. The information sources for this study included all available data and accessible archived documentary records on malaria vector control in Eritrea. Structured literature searches of published, peer-reviewed sources using online scientific bibliographic databases, Google Scholar, PubMed and WHO, and a combination of search terms were utilized to gather data. The literature was reviewed, adapted to the local context and translated into the consolidated strategic framework. In Eritrea, communities are grappling with the challenge of VBDs of public health concern, including malaria. The Global Fund financed the scale-up of IRS and LLIN programmes in 2014.
Eritrea is transitioning towards malaria elimination and strategic frameworks for vector control have been consolidated by: developing an integrated vector management (IVM) strategy (2015-2019); updating IRS and larval source management (LSM) guidelines; developing training manuals for IRS and LSM; training of national staff in malaria entomology and vector control, including insecticide resistance monitoring techniques; initiating the global plan for insecticide resistance management; conducting needs' assessments and developing standard operating procedure for insectaries; developing a guidance document on malaria vector control based on eco-epidemiological strata, a vector surveillance plan and harmonized mapping, data collection and reporting tools. Eritrea has successfully consolidated strategic frameworks for vector control. Rational decision-making remains critical to ensure that the interventions are effective and their choice is evidence-based, and to optimize the use of resources for vector control. Implementation of effective IVM requires proper collaboration and coordination, consistent technical and financial capacity and support to offer greater benefits.
NASA Astrophysics Data System (ADS)
Gupta, H.; Liu, Y.; Wagener, T.; Durcik, M.; Duffy, C.; Springer, E.
2005-12-01
Water resources in arid and semi-arid regions are highly sensitive to climate variability and change. As the demand for water continues to increase due to economic and population growth, planning and management of available water resources under climate uncertainties becomes increasingly critical in order to achieve basin-scale water sustainability (i.e., to ensure a long-term balance between supply and demand of water). The tremendous complexity of the interactions between the natural hydrologic system and the human environment means that modeling is the only available mechanism for properly integrating new knowledge into the decision-making process. Basin-scale integrated models have the potential to allow us to study the feedback processes between the physical and human systems (including institutional, engineering, and behavioral components); and an integrated assessment of the potential second- and higher-order effects of political and management decisions can aid in the selection of a rational water-resources policy. Data and information, especially hydrological and water-use data, are critical to the integrated modeling and assessment for water resources management of any region. To this end we are in the process of developing a multi-resolution integrated modeling and assessment framework for the south-western USA, which can be used to generate simulations of the probable effects of human actions while taking into account the uncertainties brought about by future climatic variability and change. Data are being collected (including the development of a hydro-geospatial database) and used in support of the modeling and assessment activities. This paper will present a blueprint of the modeling framework, describe achievements so far and discuss the science questions which still require answers, with a particular emphasis on issues related to dry regions.
NASA Astrophysics Data System (ADS)
Chin, A.; Simpfendorfer, C. A.; White, W. T.; Johnson, G. J.; McAuley, R. B.; Heupel, M. R.
2017-04-01
Conservation and management of migratory species can be complex and challenging. International agreements such as the Convention on Migratory Species (CMS) provide policy frameworks, but assessments and management can be hampered by lack of data and tractable mechanisms to integrate disparate datasets. An assessment of scalloped (Sphyrna lewini) and great (Sphyrna mokarran) hammerhead population structure and connectivity across northern Australia, Indonesia and Papua New Guinea (PNG) was conducted to inform management responses to CMS and Convention on International Trade in Endangered Species listings of these species. An Integrated Assessment Framework (IAF) was devised to systematically incorporate data across jurisdictions and create a regional synopsis, and amalgamated a suite of data from the Australasian region. Scalloped hammerhead populations are segregated by sex and size, with Australian populations dominated by juveniles and small adult males, while Indonesian and PNG populations included large adult females. The IAF process introduced genetic and tagging data to produce conceptual models of stock structure and movement. Several hypotheses were produced to explain stock structure and movement patterns, but more data are needed to identify the most likely hypothesis. This study demonstrates a process for assessing migratory species connectivity and highlights priority areas for hammerhead management and research.
Chin, A; Simpfendorfer, C A; White, W T; Johnson, G J; McAuley, R B; Heupel, M R
2017-04-21
Conservation and management of migratory species can be complex and challenging. International agreements such as the Convention on Migratory Species (CMS) provide policy frameworks, but assessments and management can be hampered by lack of data and tractable mechanisms to integrate disparate datasets. An assessment of scalloped (Sphyrna lewini) and great (Sphyrna mokarran) hammerhead population structure and connectivity across northern Australia, Indonesia and Papua New Guinea (PNG) was conducted to inform management responses to CMS and Convention on International Trade in Endangered Species listings of these species. An Integrated Assessment Framework (IAF) was devised to systematically incorporate data across jurisdictions and create a regional synopsis, and amalgamated a suite of data from the Australasian region. Scalloped hammerhead populations are segregated by sex and size, with Australian populations dominated by juveniles and small adult males, while Indonesian and PNG populations included large adult females. The IAF process introduced genetic and tagging data to produce conceptual models of stock structure and movement. Several hypotheses were produced to explain stock structure and movement patterns, but more data are needed to identify the most likely hypothesis. This study demonstrates a process for assessing migratory species connectivity and highlights priority areas for hammerhead management and research.
NASA Astrophysics Data System (ADS)
Kido, Kentaro; Kasahara, Kento; Yokogawa, Daisuke; Sato, Hirofumi
2015-07-01
In this study, we report the development of a new quantum mechanics/molecular mechanics (QM/MM)-type framework to describe chemical processes in solution by combining standard molecular-orbital calculations with a three-dimensional formalism of integral equation theory for molecular liquids (the multi-center molecular Ornstein-Zernike (MC-MOZ) method). The theoretical procedure is very similar to the three-dimensional reference interaction site model self-consistent field (3D-RISM-SCF) approach. Since the MC-MOZ method is highly parallelized for computation, the present approach has the potential to be one of the most efficient procedures for treating chemical processes in solution. Benchmark tests to check the validity of this approach were performed for two solute systems (water and formaldehyde) and a simple SN2 reaction (Cl- + CH3Cl → ClCH3 + Cl-) in aqueous solution. The results for solute molecular properties and solvation structures obtained by the present approach were in reasonable agreement with those obtained by other hybrid frameworks and experiments. In particular, the results of the proposed approach are in excellent agreement with those of 3D-RISM-SCF.
INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorensek, M.; Hamm, L.; Garcia, H.
2011-07-18
Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
Ecological Principles for Invasive Plant Management
USDA-ARS?s Scientific Manuscript database
Invasive annual grasses continue to advance at an alarming rate despite efforts of control by land managers. Ecologically-based invasive plant management (EBIPM) is a holistic framework that integrates ecosystem health assessment, knowledge of ecological processes and adaptive management into a succ...
NASA Astrophysics Data System (ADS)
Parshin, D. A.; Manzhirov, A. V.
2018-04-01
Quasistatic mechanical problems for additively manufactured aging viscoelastic solids are investigated. The processes of piecewise-continuous accretion of such solids are considered within the framework of the linear mechanics of growing solids. A theorem is proved on the commutativity of integration over an arbitrary surface that expands as the solid grows with the time integral operator of viscoelasticity, whose limit depends on the point of the solid. This theorem provides an efficient way to construct, on the basis of the Saint-Venant principle, solutions of nonclassical boundary-value problems describing the mechanical behaviour of additively formed solids, with integral satisfaction of the boundary conditions on the surfaces that expand due to the influx of additional material to the formed solid. The constructed solutions trace the evolution of the stress-strain state of the solids under consideration during and after the processes of their additive formation. An example of applying the proved theorem is given.
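As a schematic illustration of the commutativity statement (the kernel, limits, and symbols below are generic placeholders assumed for exposition, not the paper's own notation), a theorem of this kind asserts that integration over the growing surface commutes with a Volterra-type viscoelasticity operator whose lower limit depends on the material point:

```latex
% Illustrative notation only: \mathcal{N} is a generic Volterra-type
% viscoelastic operator whose lower limit \tau^{*}(x) depends on the
% material point x (its accretion time); K is a relaxation kernel.
(\mathcal{N}f)(x,t) \;=\; f(x,t) \;-\; \int_{\tau^{*}(x)}^{t} K(t,\tau)\, f(x,\tau)\,\mathrm{d}\tau ,
\qquad
\int_{S} (\mathcal{N}f)(x,t)\,\mathrm{d}S \;=\; \mathcal{N}\!\left[\,\int_{S} f(x,t)\,\mathrm{d}S\right].
```

The nontrivial content is that the lower limit varies over the surface of integration, so the interchange is not an immediate consequence of Fubini's theorem.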
The Fundamentals of Care Framework as a Point-of-Care Nursing Theory.
Kitson, Alison L
Nursing theories have attempted to shape the everyday practice of clinical nurses and patient care. However, many theories, because of their level of abstraction and distance from everyday caring activity, have failed to help nurses undertake the routine practical aspects of nursing care in a theoretically informed way. The purpose of the paper is to present a point-of-care theoretical framework, called the fundamentals of care (FOC) framework, which explains, guides, and potentially predicts the quality of care nurses provide to patients, their carers, and family members. The theoretical framework centers on person-centered fundamental care (PCFC), which is both the outcome for the patient and the nurse and the goal of the FOC framework. PCFC is achieved through the active management of the practice process, which involves the nurse and the patient working together to integrate three core dimensions: establishing the nurse-patient relationship, integrating the FOC into the patient's care plan, and ensuring that the setting or context where care is transacted and coordinated is conducive to achieving PCFC outcomes. Each dimension has multiple elements and subelements, which require unique assessment for each nurse-patient encounter. The FOC framework is presented along with two scenarios to demonstrate its usefulness. The dimensions, elements, and subelements are described, and next steps in the development are articulated.
Ecosystem Services and Climate Change Considerations for ...
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and landuse change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework “iemWatersheds” has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water
Framework for Integrating Science Data Processing Algorithms Into Process Control Systems
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Crichton, Daniel J.; Chang, Albert Y.; Foster, Brian M.; Freeborn, Dana J.; Woollard, David M.; Ramirez, Paul M.
2011-01-01
A software framework called PCS Task Wrapper is responsible for standardizing the setup, process initiation, execution, and file management tasks surrounding the execution of science data algorithms, which are referred to by NASA as Product Generation Executives (PGEs). PGEs codify a scientific algorithm, some step in the overall scientific process involved in a mission science workflow. The PCS Task Wrapper provides a stable operating environment to the underlying PGE during its execution lifecycle. If the PGE requires a file, or metadata regarding the file, the PCS Task Wrapper is responsible for delivering that information to the PGE in a manner that meets its requirements. If the PGE requires knowledge of upstream or downstream PGEs in a sequence of executions, that information is also made available. Finally, if information regarding disk space, or node information such as CPU availability, etc., is required, the PCS Task Wrapper provides this information to the underlying PGE. After this information is collected, the PGE is executed, and its output Product file and Metadata generation is managed via the PCS Task Wrapper framework. The innovation is responsible for marshalling output Products and Metadata back to a PCS File Management component for use in downstream data processing and pedigree. In support of this, the PCS Task Wrapper leverages the PCS Crawler Framework to ingest (during pipeline processing) the output Product files and Metadata produced by the PGE. The architectural components of the PCS Task Wrapper framework include PGE Task Instance, PGE Config File Builder, Config File Property Adder, Science PGE Config File Writer, and PCS Met file Writer. This innovative framework is really the unifying bridge between the execution of a step in the overall processing pipeline, and the available PCS component services as well as the information that they collectively manage.
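The wrapper lifecycle described above (stage required inputs, execute the wrapped program, then crawl the workspace for output products) can be sketched as follows. This is a minimal illustration with hypothetical names, not the actual PCS Task Wrapper API:

```python
import os
import subprocess
import tempfile

def run_wrapped(command, input_files):
    """Minimal sketch of a task-wrapper lifecycle (hypothetical names,
    not the real PCS API): stage inputs into a fresh working directory,
    execute the wrapped command there, then report any new files that
    appeared as output products."""
    workdir = tempfile.mkdtemp(prefix="pge_")
    staged = set()
    for path in input_files:
        dest = os.path.join(workdir, os.path.basename(path))
        with open(path, "rb") as src, open(dest, "wb") as dst:
            dst.write(src.read())  # stage a copy of the input file
        staged.add(os.path.basename(path))
    result = subprocess.run(command, cwd=workdir)  # run the "PGE"
    # Crawl the workspace: anything that was not staged is a product.
    products = sorted(p for p in os.listdir(workdir) if p not in staged)
    return result.returncode, products
```

A real wrapper would also deliver metadata about the staged files, expose upstream/downstream pipeline context, and hand the products to a file management component rather than just listing them.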
Rütten, A; Wolff, A; Streber, A
2016-03-01
This article discusses 2 current issues in the field of public health research: (i) transfer of scientific knowledge into practice and (ii) sustainable implementation of good practice projects. It also supports integration of scientific and practice-based evidence production. Furthermore, it supports utilisation of interactive models that transcend deductive approaches to the process of knowledge transfer. Existing theoretical approaches, pilot studies and thoughtful conceptual considerations are incorporated into a framework showing the interplay of science, politics and prevention practice, which fosters a more sustainable implementation of health promotion programmes. The framework depicts 4 key processes of interaction between science and prevention practice: interactive knowledge to action, capacity building, programme adaptation and adaptation of the implementation context. Ensuring sustainability of health promotion programmes requires a concentrated process of integrating scientific and practice-based evidence production in the context of implementation. Central to the integration process is the approach of interactive knowledge to action, which especially benefits from capacity building processes that facilitate participation and systematic interaction between relevant stakeholders. Intense cooperation also induces a dynamic interaction between multiple actors and components such as health promotion programmes, target groups, relevant organisations and social, cultural and political contexts. The reciprocal adaptation of programmes and key components of the implementation context can foster effectiveness and sustainability of programmes. Sustainable implementation of evidence-based health promotion programmes requires alternatives to recent deductive models of knowledge transfer. Interactive approaches prove to be promising alternatives. Simultaneously, they change the responsibilities of science, policy and public health practice. 
Existing boundaries within disciplines and sectors are overcome by arranging transdisciplinary teams as well as by developing common agendas and procedures. Such approaches also require adaptations of the structure of research projects such as extending the length of funding. © Georg Thieme Verlag KG Stuttgart · New York.
Mood disorders: neurocognitive models.
Malhi, Gin S; Byrow, Yulisha; Fritz, Kristina; Das, Pritha; Baune, Bernhard T; Porter, Richard J; Outhred, Tim
2015-12-01
In recent years, a number of neurocognitive models stemming from psychiatry and psychology schools of thought have conceptualized the pathophysiology of mood disorders in terms of dysfunctional neural mechanisms that underpin and drive neurocognitive processes. Though these models have been useful for advancing our theoretical understanding and facilitating important lines of research, translation of these models and their application within the clinical arena have been limited, partly because of a lack of integration and synthesis. Cognitive neuroscience provides a novel perspective for understanding and modeling mood disorders. This selective review of influential neurocognitive models develops an integrative approach that can serve as a template for future research and the development of a clinically meaningful framework for investigating, diagnosing, and treating mood disorders. A selective literature search was conducted using PubMed and PsychINFO to identify prominent neurobiological and neurocognitive models of mood disorders. Most models identify similar neural networks, brain regions, and neuropsychological processes in the neurocognition of mood; however, they differ in terms of the specific functions attached to neural processes and how these interact. Furthermore, cognitive biases, reward processing and motivation, rumination, and mood stability, which play significant roles in the manner in which attention, appraisal, and response processes are deployed in mood disorders, are not sufficiently integrated. The inclusion of interactions between these additional components enhances our understanding of the etiology and pathophysiology of mood disorders. Through integration of key cognitive functions and understanding of how these interface with neural functioning within neurocognitive models of mood disorders, a framework for research can be created for translation to diagnosis and treatment of mood disorders. © 2015 John Wiley & Sons A/S.
Published by John Wiley & Sons Ltd.
A Hybrid Constraint Representation and Reasoning Framework
NASA Technical Reports Server (NTRS)
Golden, Keith; Pang, Wanlin
2004-01-01
In this paper, we introduce JNET, a novel constraint representation and reasoning framework that supports procedural constraints and constraint attachments, providing a flexible way of integrating the constraint system with a runtime software environment and improving its applicability. We describe how JNET is applied to a real-world problem - NASA's Earth-science data processing domain, and demonstrate how JNET can be extended, without any knowledge of how it is implemented, to meet the growing demands of real-world applications.
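The idea of procedural constraints, arbitrary code attached to variables as the constraint check, can be sketched with a minimal arc-consistency loop. This is illustrative only and does not reproduce JNET's actual design:

```python
class Var:
    """A variable with a finite domain."""
    def __init__(self, name, domain):
        self.name, self.domain = name, list(domain)

def revise(x, y, check):
    """Prune values of x that have no supporting value in y under 'check'."""
    keep = [a for a in x.domain if any(check(a, b) for b in y.domain)]
    changed = keep != x.domain
    x.domain = keep
    return changed

def ac_propagate(constraints):
    """Arc consistency over binary procedural constraints: each constraint
    is (x, y, check), where 'check' is an arbitrary Python callable
    attached to the variable pair, echoing the notion of procedural
    constraint attachments. A toy sketch, not JNET itself."""
    # Add the reversed arc for each constraint so both ends get pruned.
    arcs = list(constraints) + [
        (y, x, (lambda a, b, c=check: c(b, a)))
        for x, y, check in constraints]
    changed = True
    while changed:
        changed = False
        for x, y, check in arcs:
            if revise(x, y, check):
                changed = True
```

For example, attaching the procedure `lambda a, b: a < b` between two variables with domains 0..9 prunes 9 from the first domain and 0 from the second.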
Friendly Extensible Transfer Tool Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, William P.; Gutierrez, Kenneth M.; McRee, Susan R.
2016-04-15
Often, data transfer software is designed to meet specific requirements or apply to specific environments. Frequently, this requires source code integration for added functionality. An extensible data transfer framework is needed to more easily incorporate new capabilities in modular fashion. Using the FrETT framework, functionality may be incorporated (in many cases without need of source code) to handle new platform capabilities: I/O methods (e.g., platform-specific data access), network transport methods, and data processing (e.g., data compression).
Project management techniques for highly integrated programs
NASA Technical Reports Server (NTRS)
Stewart, J. F.; Bauer, C. A.
1983-01-01
The management and control of a representative, highly integrated high-technology project, the X-29A aircraft flight test project, is addressed. The X-29A research aircraft required the development and integration of eight distinct technologies in one aircraft. The project management system developed for the X-29A flight test program focuses on the dynamic interactions and the intercommunication among components of the system. The insights gained from the new conceptual framework permitted subordination of departments to more functional units of decision-making, information processing, and communication networks. These processes were used to develop a project management system for the X-29A around the information flows that minimized the effects inherent in sampled-data systems and exploited the closed-loop multivariable nature of highly integrated projects.
Service Contract Compliance Management in Business Process Management
NASA Astrophysics Data System (ADS)
El Kharbili, Marwane; Pulvermueller, Elke
Compliance management is a critical concern for corporations, which are required to respect contracts. This concern is particularly relevant in the context of business process management (BPM), as this paradigm is being adopted more widely for designing and building IT systems. Enforcing contractual compliance needs to be modeled at different levels of a BPM framework, which also includes the service layer. In this paper, we discuss requirements and methods for modeling contractual compliance for an SOA-supported BPM. We also show how business rule management integrated into an industry BPM tool allows modeling and processing functional and non-functional property constraints that may be extracted from business process contracts. This work proposes a framework that responds to the requirements identified and proposes an architecture implementing it. Our approach is also illustrated by an example.
What kind of computation is intelligence. A framework for integrating different kinds of expertise
NASA Technical Reports Server (NTRS)
Chandrasekaran, B.
1989-01-01
The view is elaborated that the deliberative aspect of intelligent behavior is a distinct type of algorithm: in particular, a goal-seeking exploratory process using qualitative representations of knowledge and inference. There are other kinds of algorithms that also embody expertise in domains. The different types of expertise, and how they can and should be integrated to give a full account of expert behavior, are discussed.
AI/OR computational model for integrating qualitative and quantitative design methods
NASA Technical Reports Server (NTRS)
Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor
1990-01-01
A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.
Using Business Process Specification and Agent to Integrate a Scenario Driven Supply Chain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, Hyunbo; Kulvatunyou, Boonserm; Jeong, Hanil
2004-07-01
In today's increasingly competitive global market, most enterprises place high priority on reducing order-fulfillment costs, minimizing time-to-market, and maximizing product quality. The desire of businesses to achieve these goals has seen a shift from a make-to-stock paradigm to a make-to-order paradigm. The success of this new paradigm requires robust and efficient supply chain integration and the ability to operate in the business-to-business (B2B) environment. Recent internet-based approaches have enabled instantaneous and secure information sharing among trading partners (i.e., customers, manufacturers, and suppliers). In this paper, we present a framework that enables both integration and B2B operations. This framework uses pre-defined business process specifications (BPS) and agent technologies. The BPS, which specifies a message choreography among the trading partners, is modeled using a modified Unified Modeling Language (UML). The behavior of the enterprise applications within each trading partner -- how they respond to external events specified in the BPS -- is modeled using Petri-nets and implemented as a collection of agents. The concepts and models proposed in this paper should provide the starting point for the formulation of a structured approach to B2B supply chain integration and implementation.
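Petri-net behavior of the kind mentioned above reduces to a simple token game: a transition is enabled when its input places hold enough tokens, and firing it moves tokens from inputs to outputs. A toy sketch (the place names are hypothetical, not the paper's models):

```python
def fire(marking, transition):
    """One Petri-net firing step. 'transition' is (inputs, outputs),
    each a dict mapping place -> required/produced token count.
    Returns the new marking, or None if the transition is disabled.
    A toy illustration, not the paper's enterprise models."""
    inputs, outputs = transition
    if any(marking.get(p, 0) < n for p, n in inputs.items()):
        return None  # not enough tokens: transition is disabled
    new = dict(marking)
    for p, n in inputs.items():
        new[p] -= n  # consume input tokens
    for p, n in outputs.items():
        new[p] = new.get(p, 0) + n  # produce output tokens
    return new
```

For instance, a "confirm order" transition consuming a token from an `order_received` place and producing one in `order_confirmed` models an enterprise application reacting to a B2B message event.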
Jimenez-Molina, Angel; Gaete-Villegas, Jorge; Fuentes, Javier
2018-06-01
New advances in telemedicine, ubiquitous computing, and artificial intelligence have supported the emergence of more advanced applications and support systems for chronic patients. This trend addresses the important problem of chronic illnesses, highlighted by multiple international organizations as a core issue in future healthcare. Despite the myriad of exciting new developments, each application and system is designed and implemented for specific purposes and lacks the flexibility to support different healthcare concerns. Some of the known problems of such developments are the integration issues between applications and existing healthcare systems, the reusability of technical knowledge in the creation of new and more sophisticated systems, and the usage of data gathered from multiple sources in the generation of new knowledge. This paper proposes a framework for the development of chronic disease support systems and applications as an answer to these shortcomings. Through this framework, we pursue a common-ground methodology upon which new developments can be created and easily integrated to provide better support to chronic patients, medical staff, and other relevant participants. General requirements for any support system are inferred from the primary care process for chronic patients, modeled with the Business Process Management Notation. Numerous technical approaches are proposed to design a general architecture that considers the medical organizational requirements in the treatment of a patient. A framework is presented for any application in support of chronic patients and evaluated through a case study to test the applicability and pertinence of the solution. Copyright © 2018 Elsevier Inc. All rights reserved.
Critical appraisal of rigour in interpretive phenomenological nursing research.
de Witt, Lorna; Ploeg, Jenny
2006-07-01
This paper reports a critical review of published nursing research for expressions of rigour in interpretive phenomenology, and a new framework of rigour specific to this methodology is proposed. The rigour of interpretive phenomenology is an important nursing research methods issue that has direct implications for the legitimacy of nursing science. The use of a generic set of qualitative criteria of rigour for interpretive phenomenological studies is problematic because it is philosophically inconsistent with the methodology and creates obstacles to full expression of rigour in such studies. A critical review was conducted of the published theoretical interpretive phenomenological nursing literature from 1994 to 2004, and the expressions of rigour in this literature were identified. We used three sources to inform the derivation of a proposed framework of expressions of rigour for interpretive phenomenology: the phenomenological scholar van Manen, the theoretical interpretive phenomenological nursing literature, and Madison's criteria of rigour for hermeneutic phenomenology. The nursing literature reveals a broad range of criteria for judging the rigour of interpretive phenomenological research. The proposed framework for evaluating rigour in this kind of research contains the following five expressions: balanced integration, openness, concreteness, resonance, and actualization. Balanced integration refers to the intertwining of philosophical concepts in the study methods and findings and a balance between the voices of study participants and the philosophical explanation. Openness is related to a systematic, explicit process of accounting for the multiple decisions made throughout the study process. Concreteness relates to the usefulness of study findings for practice. Resonance encompasses the experiential or felt effect of reading study findings upon the reader. Finally, actualization refers to the future realization of the resonance of study findings.
Adoption of this or similar frameworks of expressions of rigour could help to preserve the integrity and legitimacy of interpretive phenomenological nursing research.
Integrated Technology Assessment Center (ITAC) Update
NASA Technical Reports Server (NTRS)
Taylor, J. L.; Neely, M. A.; Curran, F. M.; Christensen, E. R.; Escher, D.; Lovell, N.; Morris, Charles (Technical Monitor)
2002-01-01
The Integrated Technology Assessment Center (ITAC) has developed a flexible systems analysis framework to identify long-term technology needs, quantify payoffs for technology investments, and assess the progress of ASTP-sponsored technology programs in the hypersonics area. For this, ITAC has assembled an experienced team representing a broad sector of the aerospace community and developed a systematic assessment process complete with supporting tools. Concepts for transportation systems are selected based on relevance to the ASTP, and integrated concept models (ICMs) of these concepts are developed. Key technologies of interest are identified, and projections are made of their characteristics with respect to their impacts on key aspects of the specific concepts of interest. Both the models and technology projections are then fed into ITAC's probabilistic systems analysis framework in ModelCenter. This framework permits rapid sensitivity analysis, single point design assessment, and a full probabilistic assessment of each concept with respect to both embedded and enhancing technologies. Probabilistic outputs are weighed against metrics of interest to the ASTP using a multivariate decision-making process to provide inputs for technology prioritization within the ASTP. The ITAC program is currently finishing the assessment of a two-stage-to-orbit (TSTO), rocket-based combined cycle (RBCC) concept and a TSTO turbine-based combined cycle (TBCC) concept developed by the team with inputs from NASA. A baseline all-rocket TSTO concept is also being developed for comparison. Boeing has recently submitted a performance model for their Flexible Aerospace System Solution for Tomorrow (FASST) concept, and the ISAT program will provide inputs for a single-stage-to-orbit (SSTO) TBCC-based concept in the near term. Both of these latter concepts will be analyzed within the ITAC framework over the summer. This paper provides a status update of the ITAC program.
TRENCADIS--a WSRF grid MiddleWare for managing DICOM structured reporting objects.
Blanquer, Ignacio; Hernandez, Vicente; Segrelles, Damià
2006-01-01
The adoption of digital processing of medical data, especially in radiology, has led to the availability of millions of records (images and reports). However, this information is mainly used at the patient level and is organised according to administrative criteria, which makes the extraction of knowledge difficult. Moreover, legal constraints make the direct integration of information systems complex or even impossible. On the other hand, the widespread adoption of the DICOM format has led to the inclusion of information beyond radiological images alone. The possibility of coding radiology reports in a structured form, adding semantic information about the data contained in the DICOM objects, eases the process of structuring images according to content. DICOM Structured Reporting (DICOM-SR) is a specification of tags and sections to code and integrate radiology reports, with seamless references to findings and regions of interest in the associated images, movies, waveforms, signals, etc. The work presented in this paper aims at developing a framework to efficiently and securely share medical images and radiology reports, as well as to provide high-throughput processing services. This system is based on an architecture previously developed in the framework of the TRENCADIS project, and uses other components, such as the security system and the Grid processing service, developed in previous activities. The work presented here introduces a semantic structuring and an ontology framework to organise medical images according to standard terminology and disease coding formats (SNOMED, ICD9, LOINC, ...).
Choueri, R B; Cesar, A; Abessa, D M S; Torres, R J; Riba, I; Pereira, C D S; Nascimento, M R L; Morais, R D; Mozeto, A A; DelValls, T A
2010-04-01
This paper presents a harmonised framework for sediment quality assessment and dredging material characterisation for estuaries and port zones of the North and South Atlantic. This framework, based on the weight-of-evidence approach, provides a structure and a process for conducting sediment/dredging material assessment that leads to a decision. The main structure consists of "step 1" (examination of available data); "step 2" (chemical characterisation and toxicity assessment); "decision 1" (any chemical level higher than reference values? are sediments toxic?); "step 3" (assessment of benthic community structure); "step 4" (integration of the results); "decision 2" (are sediments toxic or benthic community impaired?); "step 5" (construction of the decision matrix) and "decision 3" (is there environmental risk?). The sequence of assessments may be interrupted when the information obtained is judged to be sufficient for a correct characterisation of the risk posed by the sediments/dredging material. This framework brings novel features compared with other sediment/dredging material risk assessment frameworks: data integration through multivariate analysis allows the identification of which samples are toxic and/or related to impaired benthic communities; it also discriminates the chemicals responsible for negative biological effects; and the framework dispenses with the use of a reference area. We demonstrate the successful application of this framework in different port and estuarine zones of the North (Gulf of Cádiz) and South Atlantic (Santos and Paranaguá Estuarine Systems).
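The stepwise sequence above, with its early exits, is essentially a small decision procedure. A simplified, hypothetical reading of it (the actual framework uses a multivariate decision matrix, not boolean flags):

```python
def assess_sediment(chem_exceeds, is_toxic, benthos_impaired=None):
    """Sketch of the stepwise weight-of-evidence sequence: stop early
    when chemistry and toxicity already settle the question, otherwise
    bring in benthic community data. The decision logic here is a
    simplified, hypothetical reading, not the framework's real matrix."""
    # Decision 1: no chemical exceedance and no toxicity -> stop early.
    if not chem_exceeds and not is_toxic:
        return "no environmental risk"
    # Step 3 is needed only when the early evidence flags a concern.
    if benthos_impaired is None:
        return "assess benthic community structure"
    # Decisions 2-3: integrate toxicity and benthic lines of evidence.
    if is_toxic or benthos_impaired:
        return "environmental risk"
    return "no environmental risk"
```

The early-exit structure mirrors the framework's provision that the assessment sequence may be interrupted once the available information is sufficient.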
Evaluation of the causal framework used for setting national ambient air quality standards.
Goodman, Julie E; Prueitt, Robyn L; Sax, Sonja N; Bailey, Lisa A; Rhomberg, Lorenz R
2013-11-01
Abstract A scientifically sound assessment of the potential hazards associated with a substance requires a systematic, objective and transparent evaluation of the weight of evidence (WoE) for causality of health effects. We critically evaluated the current WoE framework for causal determination used in the United States Environmental Protection Agency's (EPA's) assessments of the scientific data on air pollutants for the National Ambient Air Quality Standards (NAAQS) review process, including its methods for literature searches; study selection, evaluation and integration; and causal judgments. The causal framework used in recent NAAQS evaluations has many valuable features, but it could be more explicit in some cases, and some features are missing that should be included in every WoE evaluation. Because of this, it has not always been applied consistently in evaluations of causality, leading to conclusions that are not always supported by the overall WoE, as we demonstrate using EPA's ozone Integrated Science Assessment as a case study. We propose additions to the NAAQS causal framework based on best practices gleaned from a previously conducted survey of available WoE frameworks. A revision of the NAAQS causal framework so that it more closely aligns with these best practices and the full and consistent application of the framework will improve future assessments of the potential health effects of criteria air pollutants by making the assessments more thorough, transparent, and scientifically sound.
Telearch - Integrated visual simulation environment for collaborative virtual archaeology.
NASA Astrophysics Data System (ADS)
Kurillo, Gregorij; Forte, Maurizio
Archaeologists collect vast amounts of digital data around the world; however, they lack tools for integration and collaborative interaction to support the reconstruction and interpretation process. The TeleArch software aims to integrate different data sources and provide real-time interaction tools for remote collaboration among geographically distributed scholars inside a shared virtual environment. The framework also includes audio, 2D and 3D video streaming technology to facilitate the remote presence of users. In this paper, we present several experimental case studies to demonstrate the integration and interaction with 3D models and geographical information system (GIS) data in this collaborative environment.
High-Performance Integrated Control of water quality and quantity in urban water reservoirs
NASA Astrophysics Data System (ADS)
Galelli, S.; Castelletti, A.; Goedbloed, A.
2015-11-01
This paper contributes a novel High-Performance Integrated Control framework to support the real-time operation of urban water supply storages affected by water quality problems. We use a 3-D, high-fidelity simulation model to predict the main water quality dynamics and inform a real-time controller based on Model Predictive Control. The integration of the simulation model into the control scheme is performed by a model reduction process that identifies a low-order, dynamic emulator running 4 orders of magnitude faster. The model reduction, which relies on a semiautomatic procedural approach integrating time series clustering and variable selection algorithms, generates a compact and physically meaningful emulator that can be coupled with the controller. The framework is used to design the hourly operation of Marina Reservoir, a 3.2 Mm³ storm-water-fed reservoir located in the center of Singapore, operated for drinking water supply and flood control. Because of its recent formation from a former estuary, the reservoir suffers from high salinity levels, whose behavior is modeled with Delft3D-FLOW. Results show that our control framework reduces the minimum salinity levels by nearly 40% and cuts the average annual deficit of drinking water supply by about 2 times the active storage of the reservoir (about 4% of the total annual demand).
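The control scheme can be illustrated with a toy receding-horizon loop: a low-order emulator stands in for the high-fidelity model, and the controller enumerates candidate release sequences to minimize predicted salinity. The emulator coefficients, cost weights and candidate releases below are invented for illustration and bear no relation to the actual Marina Reservoir model:

```python
from itertools import product

def emulator(salinity, release):
    """Toy low-order surrogate: releases flush salinity, inflow adds it."""
    return 0.9 * salinity + 0.5 - 0.3 * release

def mpc_step(salinity, horizon=3, candidates=(0.0, 0.5, 1.0)):
    """Return the first release of the cheapest sequence over the horizon."""
    best_cost, best_first = float("inf"), None
    for seq in product(candidates, repeat=horizon):
        s, cost = salinity, 0.0
        for u in seq:
            s = emulator(s, u)
            cost += s ** 2 + 0.1 * u  # penalise salinity and water use
        if cost < best_cost:
            best_cost, best_first = cost, seq[0]
    return best_first
```

At high salinity the controller selects the maximum release; only the first decision is applied before the horizon is re-optimized, as in standard Model Predictive Control.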
NASA Technical Reports Server (NTRS)
Westmoreland, Sally; Stow, Douglas A.
1992-01-01
A framework is proposed for analyzing ancillary data and developing procedures for incorporating ancillary data to aid interactive identification of land-use categories in land-use updates. The procedures were developed for use within an integrated image processing/geographic information system (GIS) environment that permits simultaneous display of digital image data with the vector land-use data to be updated. With such systems and procedures, automated techniques are integrated with visual-based manual interpretation to exploit the capabilities of both. The procedural framework developed was applied as part of a case study to update a portion of the land-use layer in a regional-scale GIS. About 75 percent of the area in the study site that experienced a change in land use was correctly labeled into 19 categories using the combination of automated and visual interpretation procedures developed in the study.
SU-G-JeP3-08: Robotic System for Ultrasound Tracking in Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhlemann, I; Graduate School for Computing in Medicine and Life Sciences, University of Luebeck; Jauer, P
Purpose: For safe and accurate real-time tracking of tumors for IGRT using 4D ultrasound, it is necessary to make use of novel, high-end force-sensitive lightweight robots designed for human-machine interaction. Such a robot will be integrated into an existing robotized ultrasound system for non-invasive 4D live tracking, using a newly developed real-time control and communication framework. Methods: The new KUKA LWR iiwa robot is used for robotized ultrasound real-time tumor tracking. Besides more precise probe contact pressure detection, this robot provides an additional 7th link, enhancing the dexterity of the kinematics and the mounted transducer. Several integrated, certified safety features create a safe environment for the patients during treatment. However, to remotely control the robot for the ultrasound application, a real-time control and communication framework had to be developed. Based on a client/server concept, client-side control commands are received and processed by a central server unit and are implemented by a client module running directly on the robot's controller. Several special functionalities for robotized ultrasound applications are integrated, and the robot can now be used for real-time control of the image quality by adjusting the transducer position and contact pressure. The framework was evaluated looking at overall real-time capability for communication and processing of three different standard commands. Results: Due to inherent, certified safety modules, the new robot ensures a safe environment for patients during tumor tracking. Furthermore, the developed framework shows overall real-time capability with a maximum average latency of 3.6 ms (minimum 2.5 ms; 5000 trials). Conclusion: The novel KUKA LBR iiwa robot will advance the current robotized ultrasound tracking system with important features.
With the developed framework, it is now possible to remotely control this robot and use it for robotized ultrasound tracking applications, including image quality control and target tracking.
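The client/server control concept can be sketched in a few lines. The command names, handler signatures and latency bookkeeping below are illustrative assumptions, not the actual KUKA interface:

```python
import time

class CommandServer:
    """Minimal sketch of a central server unit dispatching control commands."""

    def __init__(self):
        self.handlers = {}

    def register(self, name, fn):
        """Register a command handler (in the real system, a client module
        running on the robot's controller would implement it)."""
        self.handlers[name] = fn

    def execute(self, name, *args):
        """Dispatch a command and report its processing latency in ms."""
        start = time.perf_counter()
        result = self.handlers[name](*args)
        latency_ms = (time.perf_counter() - start) * 1000.0
        return result, latency_ms

server = CommandServer()
server.register("adjust_pressure", lambda target: f"pressure set to {target} N")
result, latency_ms = server.execute("adjust_pressure", 5.0)
```

Evaluating the framework then amounts to collecting such per-command latencies over many trials, as reported in the abstract.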
Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William
2014-01-01
Background: Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. Objective: The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. Methods: The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. Results: The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development.
Conclusions: This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases. PMID:24641991
Smits, Rochelle; Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William
2014-03-14
Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases.
NASA Astrophysics Data System (ADS)
Kim, J.
2016-12-01
Considering high levels of uncertainty, epistemological conflicts over facts and values, and a sense of urgency, normal paradigm-driven science will be insufficient to mobilize people and nations toward sustainability. The conceptual framework to bridge societal system dynamics with that of the natural ecosystems in which humanity operates remains deficient. The key to understanding their coevolution is to understand 'self-organization.' An information-theoretic approach may shed light on a potential framework that makes it possible not only to bridge humans and nature but also to generate useful knowledge for understanding and sustaining the integrity of ecological-societal systems. How can information theory help us understand the interface between ecological and social systems? How can self-organizing processes be delineated and ensured to fulfil sustainability? How should the flow of information from data through models to decision-makers be evaluated? These are the core questions posed by sustainability science, in which visioneering (i.e., the engineering of vision) is an essential framework. Yet visioneering has neither a quantitative measure nor an information-theoretic framework to work with and teach. This presentation is an attempt to accommodate the framework of self-organizing hierarchical open systems with visioneering into a common information-theoretic framework. A case study is presented with the UN/FAO's communal vision of climate-smart agriculture (CSA), which pursues a trilemma of efficiency, mitigation, and resilience. Challenges of delineating and facilitating self-organizing systems are discussed using transdisciplinary tools such as complex systems thinking, dynamic process network analysis and multi-agent systems modeling. Acknowledgments: This study was supported by the Korea Meteorological Administration Research and Development Program under Grant KMA-2012-0001-A (WISE project).
Efficient in-situ visualization of unsteady flows in climate simulation
NASA Astrophysics Data System (ADS)
Vetter, Michael; Olbrich, Stephan
2017-04-01
The simulation of climate data tends to produce very large data sets, which can hardly be processed in classical post-processing visualization applications. Typically, the visualization pipeline consisting of the processes data generation, visualization mapping and rendering is distributed into two parts over the network or separated via file transfer. In most traditional post-processing scenarios the simulation is done on a supercomputer, whereas the data analysis and visualization is done on a graphics workstation. That way, temporary data sets of huge volume have to be transferred over the network, which leads to bandwidth bottlenecks and volume limitations. The solution to this issue is the avoidance of temporary storage, or at least a significant reduction of data complexity. Within the Climate Visualization Lab - as part of the Cluster of Excellence "Integrated Climate System Analysis and Prediction" (CliSAP) at the University of Hamburg, in cooperation with the German Climate Computing Center (DKRZ) - we develop and integrate an in-situ approach. Our software framework DSVR is based on the separation of the process chain between the mapping and the rendering processes. It couples the mapping process directly to the simulation by calling methods of a parallelized data extraction library, which creates a time-based sequence of geometric 3D scenes. This sequence is stored on a special streaming server with an interactive post-filtering option and then played out asynchronously in a separate 3D viewer application. Since the rendering is part of this viewer application, the scenes can be navigated interactively. In contrast to other in-situ approaches, where 2D images are created as part of the simulation or synchronous co-visualization takes place, our method supports interaction in 3D space and in time, as well as fixed frame rates.
To integrate in-situ processing based on our DSVR framework and methods in the ICON climate model, we are continuously evolving the data structures and mapping algorithms of the framework to support the ICON model's native grid structures, since DSVR was originally designed for rectilinear grids only. We have now implemented a new output module for ICON to take advantage of DSVR visualization. The visualization can be configured, like most output modules, using a specific namelist, and is integrated, as an example, within the non-hydrostatic atmospheric model time loop. With the integration of a DSVR-based in-situ pathline extraction within ICON, a further milestone has been reached. The pathline algorithm as well as the grid data structures have been optimized for the domain decomposition used for the parallelization of ICON based on MPI and OpenMP. The software implementation and evaluation is done on the supercomputers at DKRZ. In principle, the data complexity is reduced from O(n³) to O(m), where n is the grid resolution and m the number of supporting points of all pathlines. The stability and scalability evaluation is done using Atmospheric Model Intercomparison Project (AMIP) runs. We will give a short introduction to our software framework, as well as a short overview of the implementation and usage of DSVR within ICON. Furthermore, we will present visualization and evaluation results of sample applications.
Complete integrability of information processing by biochemical reactions
Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio
2016-01-01
Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling – based on spin systems – has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis–Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy – based on completely integrable hydrodynamic-type systems of PDEs – which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions. PMID:27812018
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tallis, Heather, E-mail: htallis@tnc.org; Kennedy, Christina M., E-mail: ckennedy@tnc.org; Ruckelshaus, Mary
Emerging development policies and lending standards call for consideration of ecosystem services when mitigating impacts from development, yet little guidance exists to inform this process. Here we propose a comprehensive framework for advancing both biodiversity and ecosystem service mitigation. We have clarified a means for choosing representative ecosystem service targets alongside biodiversity targets, identified servicesheds as a useful spatial unit for assessing ecosystem service avoidance, impact, and offset options, and discuss methods for consistent calculation of biodiversity and ecosystem service mitigation ratios. We emphasize the need to move away from area- and habitat-based assessment methods for both biodiversity and ecosystem services towards functional assessments at landscape or seascape scales. Such comprehensive assessments more accurately reflect cumulative impacts and variation in environmental quality, social needs and value preferences. The integrated framework builds on the experience of biodiversity mitigation while addressing the unique opportunities and challenges presented by ecosystem service mitigation. These advances contribute to growing potential for economic development planning and execution that will minimize impacts on nature and maximize human wellbeing. - Highlights: • This is the first framework for biodiversity and ecosystem service mitigation. • Functional, landscape scale assessments are ideal for avoidance and offsets. • Servicesheds define the appropriate spatial extent for ecosystem service mitigation. • Mitigation ratios should be calculated consistently and based on standard factors. • Our framework meets the needs of integrated mitigation assessment requirements.
Heathcote, Andrew
2016-01-01
In the real world, decision making processes must be able to integrate non-stationary information that changes systematically while the decision is in progress. Although theories of decision making have traditionally been applied to paradigms with stationary information, non-stationary stimuli are now of increasing theoretical interest. We use a random-dot motion paradigm along with cognitive modeling to investigate how the decision process is updated when a stimulus changes. Participants viewed a cloud of moving dots, where the motion switched directions midway through some trials, and were asked to determine the direction of motion. Behavioral results revealed a strong delay effect: after presentation of the initial motion direction there is a substantial time delay before the changed motion information is integrated into the decision process. To further investigate the underlying changes in the decision process, we developed a Piecewise Linear Ballistic Accumulator model (PLBA). The PLBA is efficient to simulate, enabling it to be fit to participant choice and response-time distribution data in a hierarchical modeling framework using a non-parametric approximate Bayesian algorithm. Consistent with behavioral results, PLBA fits confirmed the presence of a long delay between presentation and integration of new stimulus information, but did not support increased response caution in reaction to the change. We also found the decision process was not veridical, as symmetric stimulus change had an asymmetric effect on the rate of evidence accumulation. Thus, the perceptual decision process was slow to react to, and underestimated, new contrary motion information. PMID:26760448
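The core idea of the PLBA, ballistic accumulation whose rates change only after the stimulus switch plus an integration delay, can be sketched as follows. Parameter names and values are illustrative; the fitted model also includes start-point and rate variability, which are omitted here:

```python
def plba_trial(v1, v2, v1_post, v2_post, switch_time, delay,
               threshold=1.0, dt=0.001):
    """Simulate one deterministic Piecewise Linear Ballistic Accumulator trial.

    Two accumulators race to a common threshold; their drift rates switch
    from (v1, v2) to (v1_post, v2_post) only after switch_time + delay,
    reflecting the integration delay reported above.
    """
    x1 = x2 = 0.0
    t = 0.0
    while x1 < threshold and x2 < threshold:
        r1, r2 = (v1, v2) if t < switch_time + delay else (v1_post, v2_post)
        x1 += r1 * dt
        x2 += r2 * dt
        t += dt
    return ("choice1" if x1 >= threshold else "choice2"), t
```

With no switch, the faster accumulator wins at roughly threshold/rate; with a mid-trial reversal, the delayed rate change gives the initially favored response a head start, producing the slow reaction to contrary motion described above.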
NASA Astrophysics Data System (ADS)
O'Neill, B. C.; Lawrence, P.; Ren, X.
2016-12-01
Collaboration between the integrated assessment modeling (IAM) and earth system modeling (ESM) communities is increasing, driven by a growing interest in research questions that require analysis integrating both social and natural science components. This collaboration often takes the form of integrating their respective models. There are a number of approaches available to implement this integration, ranging from one-way linkages to full two-way coupling, as well as approaches that retain a single modeling framework but improve the representation of processes from the other framework. We discuss the pros and cons of these different approaches and the conditions under which a two-way coupling of IAMs and ESMs would be favored over a one-way linkage. We propose a criterion that is necessary and sufficient to motivate two-way coupling: A human process must have an effect on an earth system process that is large enough to cause a change in the original human process that is substantial compared to other uncertainties in the problem being investigated. We then illustrate a test of this criterion for land use-climate interactions based on work using the Community Earth System Model (CESM) and land use scenarios from the Representative Concentration Pathways (RCPs), in which we find that the land use effect on regional climate is unlikely to meet the criterion. We then show an example of implementing a one-way linkage of land use and agriculture between an IAM, the integrated Population-Economy-Technology-Science (iPETS) model, and CESM that produces fully consistent outcomes between iPETS and the CESM land surface model. We use the linked system to model the influence of climate change on crop yields, agricultural land use, crop prices and food consumption under two alternative future climate scenarios. This application demonstrates the ability to link an IAM to a global land surface and climate model in a computationally efficient manner.
A new structural framework for integrating replication protein A into DNA processing machinery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brosey, Chris A; Yan, Chunli; Tsutakawa, Susan E
2013-01-01
By coupling the protection and organization of ssDNA with the recruitment and alignment of DNA processing factors, Replication Protein A (RPA) lies at the heart of dynamic multi-protein DNA processing machinery. Nevertheless, how RPA manages to coordinate the biochemical functions of its eight domains remains unknown. We examined the structural biochemistry of RPA's DNA binding activity, combining small-angle x-ray and neutron scattering with all-atom molecular dynamics simulations to investigate the architecture of RPA's DNA-binding core. It has long been held that RPA engages ssDNA in three stages, but our data reveal that RPA undergoes two rather than three transitions as it binds ssDNA. In contrast to previous models, RPA is more compact when fully engaged on 20-30 nucleotides of ssDNA than when DNA-free, and there is no evidence for significant population of a highly compacted structure in the initial 8-10 nucleotide binding mode. These results provide a new framework for understanding the integration of ssDNA into DNA processing machinery and how binding partners may manipulate RPA architecture to gain access to the substrate.
Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S
2013-01-01
Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.
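The "database table format" idea, one row per lipid species per sample so the lipidome can be navigated by any selected feature, can be illustrated with Python's built-in sqlite3. The table layout, species names and intensities below are invented and are not the ALEX schema:

```python
import sqlite3

# One row per measured lipid species per sample enables flexible filtering.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE lipidome (sample TEXT, region TEXT, species TEXT, intensity REAL)"
)
rows = [
    ("wt_1", "hippocampus", "PC 34:1", 1200.0),
    ("wt_1", "hippocampus", "PE 38:4", 800.0),
    ("ko_1", "hippocampus", "PC 34:1", 950.0),
]
con.executemany("INSERT INTO lipidome VALUES (?, ?, ?, ?)", rows)

# Rapid lipidome navigation: compare one species across genotypes.
pc_rows = con.execute(
    "SELECT sample, intensity FROM lipidome "
    "WHERE species = 'PC 34:1' ORDER BY intensity DESC"
).fetchall()
```

Because every annotation (sample, region, genotype) is just another column, the same table supports arbitrary slicing for statistical testing or visualization downstream.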
A Framework for Modeling Competitive and Cooperative Computation in Retinal Processing
NASA Astrophysics Data System (ADS)
Moreno-Díaz, Roberto; de Blasio, Gabriel; Moreno-Díaz, Arminda
2008-07-01
The structure of the retina suggests that it should be treated (at least from a computational point of view) as a layered computer. Different retinal cells contribute to the coding of the signals down to the ganglion cells. Also, because of the nature of the specialization of some ganglion cells, the structure suggests that all these specialization processes should take place at the inner plexiform layer and should be of a local character, prior to a global integration and frequency-spike coding by the ganglion cells. The framework we propose consists of a layered computational structure, where outer layers essentially provide band-pass space-time filtered signals which are progressively delayed, at least for their formal treatment. Specialization is supposed to take place at the inner plexiform layer by the action of spatio-temporal microkernels (acting very locally) having a center-periphery space-time structure. The resulting signals are then integrated by the ganglion cells through macrokernel structures. Practically all types of specialization found in different vertebrate retinas, as well as the quasilinear behavior in some higher vertebrates, can be modeled and simulated within this framework. Finally, possible feedback from central structures is considered. Though its relevance to retinal processing is not definitive, it is included here for the sake of completeness, since it is a formal requisite for recursiveness.
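A standard stand-in for the center-periphery kernels described above is a difference of Gaussians, sketched here in plain Python; the kernel size and widths are arbitrary illustrative values, not parameters from the model:

```python
import math

def dog_kernel(size, sigma_center, sigma_surround):
    """Difference-of-Gaussians sketch of a center-periphery spatial kernel:
    excitatory at the center, inhibitory in the periphery."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            r2 = x * x + y * y
            center = math.exp(-r2 / (2 * sigma_center ** 2)) / (
                2 * math.pi * sigma_center ** 2)
            surround = math.exp(-r2 / (2 * sigma_surround ** 2)) / (
                2 * math.pi * sigma_surround ** 2)
            row.append(center - surround)
        kernel.append(row)
    return kernel

k = dog_kernel(7, 1.0, 2.0)
```

Applying such microkernels locally and integrating the results through broader macrokernels reproduces the layered band-pass filtering the framework postulates; a temporal kernel of the same shape would add the space-time structure.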
Executive Function in Education: From Theory to Practice
ERIC Educational Resources Information Center
Meltzer, Lynn, Ed.
2007-01-01
This uniquely integrative book brings together leading researchers and practitioners from education, neuroscience, and psychology. It presents a theoretical framework for understanding executive function difficulties together with a range of effective approaches to assessment and instruction. Coverage includes executive function processes in…
USDA-ARS?s Scientific Manuscript database
Qualitative Rangeland Health Assessments are extremely useful because they provide a relative indication of resource problems on rangelands. Additionally, the Successional Management framework identifies three primary causes of plant community change, ecological processes, and factors that modify thes...
Software Reviews Since Acquisition Reform - The Artifact Perspective
2004-01-01
Slide excerpts (Acquisition of Software Intensive Systems, Peter Hantos, 2004): risk management (old vs. new); the single, basic software paradigm on a single processor; software-risk-mitigation trade-offs that must be made together; integral software engineering activities; process maturity and quality frameworks.
DOT National Transportation Integrated Search
1999-09-01
This report highlights cross-cutting findings and perspectives gleaned from a series of case studies that examined the development processes of regional and statewide Intelligent Transportation Systems (ITS) architectures. Each of the case studies is...
An Integrative Data Mining Approach to Identify Adverse Outcome Pathway Signatures
Adverse Outcome Pathways (AOPs) provide a formal framework for describing the mechanisms underlying the toxicity of chemicals in our environment. This process improves our ability to incorporate high-throughput toxicity testing (HTT) results and biomarker information on early key...
Complexities of Organization Dynamics and Development: Leaders and Managers
ERIC Educational Resources Information Center
Nderu-Boddington, Eulalee
2008-01-01
This article presents the theoretical framework for understanding organizational dynamics and development: change theory and subordinate relationships within contemporary organizations. The emphasis is on power strategies and their relationship to organizational dynamics and development. The integrative process broadens the understanding of…
Software framework for the upcoming MMT Observatory primary mirror re-aluminization
NASA Astrophysics Data System (ADS)
Gibson, J. Duane; Clark, Dusty; Porter, Dallan
2014-07-01
Details of the software framework for the upcoming in situ re-aluminization of the 6.5m MMT Observatory (MMTO) primary mirror are presented. This framework includes: 1) a centralized key-value store and data structure server for data exchange between software modules, 2) a newly developed hardware-software interface for faster data sampling and better hardware control, 3) automated control algorithms based upon empirical testing, modeling, and simulation of the aluminization process, 4) re-engineered graphical user interfaces (GUIs) that use state-of-the-art web technologies, and 5) redundant relational databases for data logging. Redesign of the software framework has several objectives: 1) automated process control to provide more consistent and uniform mirror coatings, 2) optional manual control of the aluminization process, 3) modular design to allow flexibility in process control and software implementation, 4) faster data sampling and logging rates to better characterize the approximately 100-second aluminization event, and 5) synchronized "real-time" web application GUIs that provide all users with exactly the same data. The framework has been implemented as four modules interconnected by a data store/server. The four modules are integrated into two Linux system services that start automatically at boot time and remain running at all times. Performance of the software framework is assessed through extensive testing within 2.0-meter and smaller coating chambers at the Sunnyside Test Facility. The redesigned software framework helps ensure that a better-performing and longer-lasting coating will be achieved during the re-aluminization of the MMTO primary mirror.
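The hub-and-spoke data exchange described above, modules communicating through a centralized key-value store rather than directly, can be sketched as follows. This is a minimal hypothetical illustration, not the MMTO implementation or its API.

```python
class DataStore:
    """Minimal in-memory key-value store with change notification,
    sketching a data-exchange hub between software modules
    (hypothetical API, not the MMTO framework's)."""
    def __init__(self):
        self._data = {}
        self._subs = {}              # key -> list of callbacks

    def set(self, key, value):
        self._data[key] = value
        for cb in self._subs.get(key, []):
            cb(key, value)           # push updates so all clients stay in sync

    def get(self, key, default=None):
        return self._data.get(key, default)

    def subscribe(self, key, callback):
        self._subs.setdefault(key, []).append(callback)

# Two modules exchange data through the store instead of directly:
# a sensor module publishes, a GUI module receives the update.
store = DataStore()
seen = []
store.subscribe("chamber/pressure", lambda k, v: seen.append(v))
store.set("chamber/pressure", 2.4e-6)
print(store.get("chamber/pressure"), seen)
```

Decoupling modules through such a store is what makes "synchronized" GUIs cheap: every subscriber observes exactly the same value at the same time.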
ASSIP Study of Real-Time Safety-Critical Embedded Software-Intensive System Engineering Practices
2008-02-01
Slide excerpts (CMU/SEI-2008-SR-001): assessment, product engineering, and tooling processes; process standards, including ISO/IEC 12207 (software, with technical effort to align with 12207), ISO/IEC 15026 (system and software integrity levels), generic safety, and SAE ARP 4754 (certification considerations); process frameworks in revision (ISO 9001, ISO 9004, ISO 15288/ISO 12207 harmonization, RTCA DO-178B, UK MOD Standard 00-56/3, …); and methods and tools.
An automated qualification framework for the MeerKAT CAM (Control-And-Monitoring)
NASA Astrophysics Data System (ADS)
van den Heever, Lize; Marais, Neilen; Slabber, Martin
2016-08-01
This paper introduces and discusses the design of an Automated Qualification Framework (AQF) that was developed to automate as much as possible of the formal qualification testing of the Control And Monitoring (CAM) subsystem of the 64-dish MeerKAT radio telescope currently under construction in the Karoo region of South Africa. The AQF allows each Integrated CAM Test to reference the MeerKAT CAM requirement and associated verification requirement it covers, and automatically produces the Qualification Test Procedure and Qualification Test Report from the test steps and evaluation steps annotated in the Integrated CAM Tests. The MeerKAT System Engineers are very pleased not only with the AQF results, but especially with the approach and process it enforces.
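The core AQF idea, tests annotated with the requirement they verify and their steps, from which procedure and report are generated automatically, can be sketched in a few lines. Names, requirement IDs, and the report format below are hypothetical illustrations, not the MeerKAT AQF's actual design.

```python
# Hypothetical sketch: each qualification test declares the requirement
# it verifies and its annotated steps; the report is generated from
# those annotations plus the recorded pass/fail outcome.
results = []

def qualification_test(requirement, steps):
    def decorate(fn):
        def run():
            ok = fn()
            results.append({"requirement": requirement,
                            "steps": steps, "passed": ok})
            return ok
        return run
    return decorate

@qualification_test("CAM-REQ-042", ["Start subsystem", "Check heartbeat"])
def test_heartbeat():
    heartbeat_period_s = 1.0          # stand-in for a real measurement
    return heartbeat_period_s <= 5.0

test_heartbeat()

def report(results):
    lines = []
    for r in results:
        status = "PASS" if r["passed"] else "FAIL"
        lines.append(f"{r['requirement']}: {status}")
        lines.extend(f"  step: {s}" for s in r["steps"])
    return "\n".join(lines)

print(report(results))
```

Because procedure and report are derived from the same annotations, they cannot drift out of sync with the tests, which is the discipline the paper credits the AQF with enforcing.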
Rooting Theories of Plant Community Ecology in Microbial Interactions
Bever, James D.; Dickie, Ian A.; Facelli, Evelina; Facelli, Jose M.; Klironomos, John; Moora, Mari; Rillig, Matthias C.; Stock, William D.; Tibbett, Mark; Zobel, Martin
2010-01-01
Predominant frameworks for understanding plant ecology have an aboveground bias that neglects soil micro-organisms. This is inconsistent with recent work illustrating the importance of soil microbes in terrestrial ecology. Microbial effects have been incorporated into plant community dynamics using ideas of niche modification and plant-soil community feedbacks. Here, we expand and integrate qualitative conceptual models of plant niche and feedback to explore implications of microbial interactions for understanding plant community ecology. At the same time we review the empirical evidence for these processes. We also consider common mycorrhizal networks, and suggest these are best interpreted within the feedback framework. Finally, we apply our integrated model of niche and feedback to understanding plant coexistence, monodominance, and invasion ecology. PMID:20557974
NASA Astrophysics Data System (ADS)
Hahm, W.; Riebe, C. S.; Ferrier, K.; Kirchner, J. W.
2011-12-01
Traditional frameworks for conceptualizing hillslope denudation distinguish between the movement of mass in solution (chemical erosion) and mass moved via mechanical processes (physical erosion). At the hillslope scale, physical and chemical erosion rates can be quantified by combining measurements of regolith chemistry with cosmogenic nuclide concentrations in bedrock and sediment, while basin-scale rates are often inferred from riverine solute and sediment loads. These techniques integrate the effects of numerous weathering and erosion mechanisms and do not provide prima facie information about the precise nature and scale of those mechanisms. For insight into erosional process, physical erosion has been considered in terms of two limiting regimes. When physical erosion outpaces weathering front advance, regolith is mobilized downslope as soon as it is sufficiently loosened by weathering, and physical erosion rates are limited by rates of mobile regolith production. This is commonly termed weathering-limited erosion. Conversely, when weathering front advance outpaces erosion, the mobile regolith layer grows thicker over time, and physical erosion rates are limited by the efficiency of downslope transport processes. This is termed transport-limited erosion. This terminology brings the description of hillslope evolution closer to the realm of essential realism, to the extent that measurable quantities from the field can be cast in a process-based framework. An analogous process-limitation framework describes chemical erosion. In supply-limited chemical erosion, chemical weathering depletes regolith of its reactive phases during residence on a hillslope, and chemical erosion rates are limited by the supply of fresh minerals to the weathering zone. Alternatively, hillslopes may exhibit kinetic-limited chemical erosion, where physical erosion transports regolith downslope before weatherable phases are completely removed by chemical erosion. 
We show how supply- and kinetic-limited chemical erosion can be distinguished from one another using data from a global compilation of physical and chemical erosion rates. As a step towards understanding these rates at the level of essential realism, we explore how the hillslope-scale regimes of supply- and kinetic-limited chemical erosion relate to existing conceptual frameworks that interpret weathering rates in terms of transport- and kinetic-limitation at the mineral scale.
Semantics-Based Interoperability Framework for the Geosciences
NASA Astrophysics Data System (ADS)
Sinha, A.; Malik, Z.; Raskin, R.; Barnes, C.; Fox, P.; McGuinness, D.; Lin, K.
2008-12-01
Interoperability between heterogeneous data, tools and services is required to transform data to knowledge. To meet geoscience-oriented societal challenges such as forcing of climate change induced by volcanic eruptions, we suggest the need to develop semantic interoperability for data, services, and processes. Because such scientific endeavors require integration of multiple databases associated with global enterprises, implicit semantic-based integration is impossible. Instead, explicit semantics are needed to facilitate interoperability and integration. Although different types of integration models are available (syntactic or semantic), we suggest that semantic interoperability is likely to be the most successful pathway. Clearly, the geoscience community would benefit from utilization of existing XML-based data models, such as GeoSciML, WaterML, etc., to rapidly advance semantic interoperability and integration. We recognize that such integration will require a "meanings-based search, reasoning and information brokering", which will be facilitated through inter-ontology relationships (ontologies defined for each discipline). We suggest that markup languages (MLs) and ontologies can be seen as "data integration facilitators", working at different abstraction levels. Therefore, we propose to use an ontology-based data registration and discovery approach to complement markup languages through semantic data enrichment. Ontologies allow the use of formal and descriptive logic statements, which permits expressive query capabilities for data integration through reasoning. We have developed domain ontologies (EPONT) to capture the concept behind data. EPONT ontologies are associated with existing ontologies such as SUMO, DOLCE and SWEET. Although significant efforts have gone into developing data (object) ontologies, we advance the idea of developing semantic frameworks for additional ontologies that deal with processes and services.
This evolutionary step will facilitate the integrative capabilities of scientists as we examine the relationships between data and external factors such as processes that may influence our understanding of "why" certain events happen. We emphasize the need to go from analysis of data to concepts related to scientific principles of thermodynamics, kinetics, heat flow, mass transfer, etc. Towards meeting these objectives, we report on a pair of related service engines: DIA (Discovery, integration and analysis), and SEDRE (Semantically-Enabled Data Registration Engine) that utilize ontologies for semantic interoperability and integration.
Reframing the challenges to integrated care: a complex-adaptive systems perspective.
Tsasis, Peter; Evans, Jenna M; Owen, Susan
2012-01-01
Despite over two decades of international experience and research on health systems integration, integrated care has not developed widely. We hypothesized that part of the problem may lie in how we conceptualize the integration process and the complex systems within which integrated care is enacted. This study aims to contribute to discourse regarding the relevance and utility of a complex-adaptive systems (CAS) perspective on integrated care. In the Canadian province of Ontario, government mandated the development of fourteen Local Health Integration Networks in 2006. Against the backdrop of these efforts to integrate care, we collected focus group data from a diverse sample of healthcare professionals in the Greater Toronto Area using convenience and snowball sampling. A semi-structured interview guide was used to elicit participant views and experiences of health systems integration. We use a CAS framework to describe and analyze the data, and to assess the theoretical fit of a CAS perspective with the dominant themes in participant responses. Our findings indicate that integration is challenged by system complexity, weak ties and poor alignment among professionals and organizations, a lack of funding incentives to support collaborative work, and a bureaucratic environment based on a command and control approach to management. Using a CAS framework, we identified several characteristics of CAS in our data, including diverse, interdependent and semi-autonomous actors; embedded co-evolutionary systems; emergent behaviours and non-linearity; and self-organizing capacity. One possible explanation for the lack of systems change towards integration is that we have failed to treat the healthcare system as complex-adaptive. The data suggest that future integration initiatives must be anchored in a CAS perspective, and focus on building the system's capacity to self-organize. 
We conclude that integrating care requires policies and management practices that promote system awareness, relationship-building and information-sharing, and that recognize change as an evolving learning process rather than a series of programmatic steps.
Social cognitive neuroscience and humanoid robotics.
Chaminade, Thierry; Cheng, Gordon
2009-01-01
We believe that humanoid robots provide new tools to investigate human social cognition, the processes underlying everyday interactions between individuals. Resonance is an emerging framework to understand social interactions that is based on the finding that cognitive processes involved when experiencing a mental state and when perceiving another individual experiencing the same mental state overlap, both at the behavioral and neural levels. We will first review important aspects of this framework. In a second part, we will discuss how this framework is used to address questions pertaining to artificial agents' social competence. We will focus on two types of paradigm, one derived from experimental psychology and the other using neuroimaging, that have been used to investigate humans' responses to humanoid robots. Finally, we will speculate on the consequences of resonance in natural social interactions if humanoid robots are to become an integral part of our societies.
A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework
NASA Astrophysics Data System (ADS)
Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo
An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Database Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks (a general one and an aircraft-oriented one) were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
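The AHP step mentioned above turns qualitative pairwise judgments into numeric priorities. A minimal sketch, using the standard row-geometric-mean approximation to the principal eigenvector and hypothetical framework criteria (not the paper's actual hierarchy):

```python
import math

def ahp_priorities(pairwise):
    """Approximate AHP priority vector from a pairwise-comparison matrix
    via row geometric means (a common approximation to the principal
    eigenvector)."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical comparison of three MDO-framework criteria on Saaty's
# 1-9 scale: usability vs. integration vs. extensibility.
pairwise = [
    [1.0,   3.0,   5.0],   # usability
    [1/3.0, 1.0,   3.0],   # integration
    [1/5.0, 1/3.0, 1.0],   # extensibility
]
w = ahp_priorities(pairwise)
print([round(x, 3) for x in w])   # usability receives the largest weight
```

QFD would then map these criterion weights onto candidate framework solutions, yielding the quantitative scores used to compare the in-house frameworks.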
A review of cognitive therapy in acute medical settings. Part I: therapy model and assessment.
Levin, Tomer T; White, Craig A; Kissane, David W
2013-04-01
Although cognitive therapy (CT) has established outpatient utility, there is no integrative framework for using CT in acute medical settings where most psychosomatic medicine (P-M) clinicians practice. Biopsychosocial complexity challenges P-M clinicians who want to use CT as the a priori psychotherapeutic modality. For example, how should clinicians modify the data gathering and formulation process to support CT in acute settings? Narrative review methodology is used to describe the framework for a CT-informed interview, formulation, and assessment in acute medical settings. Because this review is aimed largely at P-M trainees and educators, exemplary dialogues model the approach (specific CT strategies for common P-M scenarios appear in the companion article). Structured data gathering needs to be tailored by focusing on cognitive processes informed by the cognitive hypothesis. Agenda setting, Socratic questioning, and adaptations to the mental state examination are necessary. Specific attention is paid to the CT formulation, Folkman's Cognitive Coping Model, self-report measures, data-driven evaluations, and collaboration (e.g., sharing the formulation with the patient). Integrative CT-psychopharmacological approaches and the importance of empathy are emphasized. The value of implementing psychotherapy in parallel with data gathering because of time urgency is advocated, but this is a significant departure from usual outpatient approaches in which psychotherapy follows evaluation. This conceptual approach offers a novel integrative framework for using CT in acute medical settings, but future challenges include demonstrating clinical outcomes and training P-M clinicians so as to demonstrate fidelity.
Selecting essential information for biosurveillance--a multi-criteria decision analysis.
Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as being "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
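The weighted additive form of Multi-Attribute Utility Theory can be sketched in a few lines; the criteria, weights, and data-stream names below are hypothetical illustrations, not the paper's actual evaluation.

```python
def maut_score(attribute_utilities, weights):
    """Weighted additive multi-attribute utility: per-criterion
    utilities in [0, 1], weights summing to 1. A minimal sketch of
    the kind of scoring used to rank candidate data streams."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * attribute_utilities[c] for c in weights)

# Hypothetical criteria weights and utility assessments per stream.
weights = {"timeliness": 0.4, "coverage": 0.35, "cost": 0.25}
streams = {
    "ProMED reports":   {"timeliness": 0.9, "coverage": 0.6, "cost": 0.8},
    "Lab surveillance": {"timeliness": 0.4, "coverage": 0.9, "cost": 0.3},
}
ranking = sorted(streams, key=lambda s: maut_score(streams[s], weights),
                 reverse=True)
print(ranking)
```

The value of the structured approach is that the tradeoffs (here timeliness against coverage and cost) are made explicit and auditable rather than implicit in an analyst's judgment.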
Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned
NASA Technical Reports Server (NTRS)
Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.
2001-01-01
Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.
Semantics-enabled service discovery framework in the SIMDAT pharma grid.
Qu, Cangtao; Zimmermann, Falk; Kumpf, Kai; Kamuzinzi, Richard; Ledent, Valérie; Herzog, Robert
2008-03-01
We present the design and implementation of a semantics-enabled service discovery framework in the data Grids for process and product development using numerical simulation and knowledge discovery (SIMDAT) Pharma Grid, an industry-oriented Grid environment for integrating thousands of Grid-enabled biological data services and analysis services. The framework consists of three major components: the Web ontology language (OWL)-description logic (DL)-based biological domain ontology, OWL Web service ontology (OWL-S)-based service annotation, and semantic matchmaker based on the ontology reasoning. Built upon the framework, workflow technologies are extensively exploited in the SIMDAT to assist biologists in (semi)automatically performing in silico experiments. We present a typical usage scenario through the case study of a biological workflow: IXodus.
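Ontology-based service matchmaking of the kind described above is often expressed through degrees of match between advertised and requested concepts (exact, plug-in, subsumes, fail). The sketch below illustrates that common scheme with a toy subsumption hierarchy; it is not SIMDAT's OWL-DL reasoner, and the concept names are invented.

```python
# Toy subsumption hierarchy: child concept -> parent concept.
PARENT = {"ProteinSequence": "Sequence", "DNASequence": "Sequence",
          "Sequence": "BioData"}

def ancestors(concept):
    out = set()
    while concept in PARENT:
        concept = PARENT[concept]
        out.add(concept)
    return out

def degree_of_match(advertised, requested):
    """Common degree-of-match scheme for semantic service discovery."""
    if advertised == requested:
        return "exact"
    if requested in ancestors(advertised):
        return "plugin"    # advertised output more specific than requested
    if advertised in ancestors(requested):
        return "subsumes"  # advertised output more general than requested
    return "fail"

print(degree_of_match("ProteinSequence", "Sequence"))   # a plug-in match
```

A matchmaker ranks candidate services by these degrees, so a service advertising `ProteinSequence` output still satisfies a request for any `Sequence`.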
[Challenges in geriatric rehabilitation: the development of an integrated care pathway].
Everink, Irma Helga Johanna; van Haastregt, Jolanda C M; Kempen, Gertrudis I J M; Dielis, Leen M J; Maessen, José M C; Schols, Jos M G A
2015-04-01
Coordination and continuity of care within geriatric rehabilitation is challenging. To tackle these challenges, an integrated care pathway within geriatric rehabilitation care (hospital, geriatric rehabilitation and follow-up care in the home situation) has been developed. The aim of this article is to expound the process of developing the integrated care pathway, and to describe and discuss the result of this process (the integrated care pathway itself). Development of the integrated care pathway was guided by the first four steps of the theoretical framework for implementation of change of Grol and Wensing: (1) development of a specific proposal for change in practice; (2) analysis of current care practice; (3) analysis of the target group and setting; and (4) development and selection of interventions/strategies for change. The organizations involved in geriatric rehabilitation argued that the integrated care pathway should focus on improving the process of care, including transfer of patients, handovers and communication between care organizations. Current practice, barriers and incentives for change were analyzed through literature research, expert consultation, interviews with the involved caregivers, and by establishing working groups of health care professionals, patients and informal caregivers. This resulted in valuable proposals for improvement of the care process, which were gathered and combined in the integrated care pathway. The integrated care pathway entails agreements on (a) the triage process in the hospital; (b) active engagement of patients and informal caregivers in the care process; (c) timely and high-quality handovers; and (d) improved communication between caregivers.
Van Hoey, Gert; Borja, Angel; Birchenough, Silvana; Buhl-Mortensen, Lene; Degraer, Steven; Fleischer, Dirk; Kerckhof, Francis; Magni, Paolo; Muxika, Iñigo; Reiss, Henning; Schröder, Alexander; Zettler, Michael L
2010-12-01
The Water Framework Directive (WFD) and the Marine Strategy Framework Directive (MSFD) are the European umbrella regulations for water systems. It is a challenge for the scientific community to translate the principles of these directives into realistic and accurate approaches. The aim of this paper, conducted by the Benthos Ecology Working Group of ICES, is to describe how the principles have been translated, which were the challenges and best way forward. We have tackled the following principles: the ecosystem-based approach, the development of benthic indicators, the definition of 'pristine' or sustainable conditions, the detection of pressures and the development of monitoring programs. We concluded that testing and integrating the different approaches was facilitated during the WFD process, which led to further insights and improvements, which the MSFD can rely upon. Expert involvement in the entire implementation process proved to be of vital importance. Copyright © 2010 Elsevier Ltd. All rights reserved.
Binot, Aurelie; Duboz, Raphaël; Promburom, Panomsak; Phimpraphai, Waraphon; Cappelle, Julien; Lajaunie, Claire; Goutard, Flavie Luce; Pinyopummintr, Tanu; Figuié, Muriel; Roger, François Louis
2015-12-01
As Southeast Asia (SEA) is characterized by high human and domestic-animal densities, growing intensification of trade, drastic land-use changes and biodiversity erosion, this region appears to be a hotspot for studying the complex dynamics of zoonosis emergence and health issues at the animal-human-environment interface. Zoonotic diseases and environmental health issues can have devastating socioeconomic and wellbeing impacts. Assessing and managing the related risks implies taking into account the ecological and social dynamics at play, in connection with epidemiological patterns. The implementation of a One Health (OH) approach in this context calls for improved integration among disciplines and improved cross-sectoral collaboration, involving stakeholders at different levels. Such integration is not achieved spontaneously; it requires methodological guidelines and carries transaction costs. We explore pathways for implementing such collaboration in the SEA context, highlighting the main challenges to be faced by researchers and other target groups involved in OH actions. On this basis, we propose a conceptual framework of OH integration. Through three components (field-based data management, professional training workshops and higher education), we suggest developing a new culture of networking involving actors from various disciplines, sectors and levels (from the municipality to the Ministries) through a participatory modelling process, fostering synergies and cooperation. This framework could stimulate a long-term dialogue process, based on the combination of case-study implementation and capacity building. It aims to implement both institutional OH dynamics (multi-stakeholder and cross-sectoral) and research approaches that promote systems thinking and involve the social sciences for follow-up and strengthening of collective action.
Modeling Sea-Level Change using Errors-in-Variables Integrated Gaussian Processes
NASA Astrophysics Data System (ADS)
Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin
2014-05-01
We perform Bayesian inference on historical and late Holocene (last 2000 years) rates of sea-level change. The data that form the input to our model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. To accurately estimate rates of sea-level change and reliably compare tide-gauge compilations with proxy reconstructions it is necessary to account for the uncertainties that characterize each dataset. Many previous studies used simple linear regression models (most commonly polynomial regression) resulting in overly precise rate estimates. The model we propose uses an integrated Gaussian process approach, where a Gaussian process prior is placed on the rate of sea-level change and the data itself is modeled as the integral of this rate process. The non-parametric Gaussian process model is known to be well suited to modeling time series data. The advantage of using an integrated Gaussian process is that it allows for the direct estimation of the derivative of a one dimensional curve. The derivative at a particular time point will be representative of the rate of sea level change at that time point. The tide gauge and proxy data are complicated by multiple sources of uncertainty, some of which arise as part of the data collection exercise. Most notably, the proxy reconstructions include temporal uncertainty from dating of the sediment core using techniques such as radiocarbon. As a result of this, the integrated Gaussian process model is set in an errors-in-variables (EIV) framework so as to take account of this temporal uncertainty. The data must be corrected for land-level change known as glacio-isostatic adjustment (GIA) as it is important to isolate the climate-related sea-level signal. The correction for GIA introduces covariance between individual age and sea level observations into the model. 
The proposed integrated Gaussian process model allows for the estimation of instantaneous rates of sea-level change and accounts for all available sources of uncertainty in tide-gauge and proxy-reconstruction data. Our response variable is sea level after correction for GIA. By embedding the integrated process in an errors-in-variables (EIV) framework, and removing the estimate of GIA, we can quantify rates with better estimates of uncertainty than previously possible. The model provides a flexible fit and enables us to estimate rates of change at any given time point, thus observing how rates have been evolving from the past to present day.
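The core modeling idea above, observed sea level as the integral of a latent rate process, can be illustrated on a discrete time grid. This is a numerical sketch of the forward map only (a running trapezoid-rule integral), with made-up rates; it is not the authors' Bayesian errors-in-variables implementation.

```python
# Sketch of "data = integral of the rate process": given the rate r(t)
# on a grid, successive level increments are trapezoid-rule integrals.
dt = 1.0                                # years between grid points
rate = [2.0, 2.0, 1.0, 1.0, 3.0]        # hypothetical rates, mm/yr

def integrate(rate, dt):
    level = [0.0]
    for i in range(1, len(rate)):
        level.append(level[-1] + 0.5 * dt * (rate[i - 1] + rate[i]))
    return level

level = integrate(rate, dt)
# Finite differences of the level recover the average rate per step,
# which is how a posterior over the rate yields instantaneous trends.
avg_rates = [(level[i + 1] - level[i]) / dt for i in range(len(level) - 1)]
print(level, avg_rates)
```

In the actual model this linear forward map is what transfers a Gaussian-process prior on the rate into a distribution over the observable levels, so the derivative (the rate) is estimated directly rather than by differentiating a fitted curve.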
Integrating the environment in local strategic planning : Guidelines (Case of Morocco)
NASA Astrophysics Data System (ADS)
Benbrahim, Hafsa
2018-05-01
Since 2010, an advanced regionalization project has been initiated by Morocco, which plans to consolidate the processes of decentralization and deconcentration by extending the powers of the regions and other local authorities. This project, institutionalized in the 2011 Constitution, defines the territorial organization of the Kingdom and reinforces decentralization according to a model of advanced regionalization. Through advanced regionalization, Morocco aims at integrated and sustainable development in economic, social, cultural and environmental terms, through the development of the potential and resources of each region. However, in order to honor this commitment of advanced regionalization, local authorities must be assisted in adopting a local strategic planning approach, allowing them to develop territorial plans for sustainable development in accordance with the national legal framework, specifically the Framework law 99-12, and international commitments in terms of environmental protection. This research deals with the issue of environmental governance in relation to the role and duties of local authorities. Thus, the main goal of our study is to present the guidelines to be followed by the local authorities to improve the quality of the environment integration process in the local strategic planning with the aim of putting it in a perspective of sustainable development.
Multi-threaded integration of HTC-Vive and MeVisLab
NASA Astrophysics Data System (ADS)
Gunacker, Simon; Gall, Markus; Schmalstieg, Dieter; Egger, Jan
2018-03-01
This work presents how Virtual Reality (VR) can easily be integrated into medical applications via a plugin for a medical image processing framework called MeVisLab. A multi-threaded plugin has been developed using OpenVR, a VR library that can be used for developing vendor and platform independent VR applications. The plugin is tested using the HTC Vive, a head-mounted display developed by HTC and Valve Corporation.
NASA Astrophysics Data System (ADS)
Laban, Shaban; El-Desouky, Aly
2014-05-01
To achieve rapid, simple and reliable parallel processing of different types of tasks and big-data processing on any compute cluster, a lightweight messaging-based framework model for distributed application processing and workflow execution is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistent-messaging and integration-patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk to each other and to ActiveMQ easily. To use the message broker efficiently, a unified message and topic naming pattern is employed. Only three Python programs and a simple library, used to unify and simplify the implementation of the ActiveMQ and STOMP protocols, are needed to use the framework. A watchdog program monitors, removes, adds, starts and stops any machine and/or its tasks when necessary. For every machine, a single dedicated zookeeper program starts the different functions or tasks (stompShell programs) needed to execute the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems; JSON is also used for configuration and for communication between machines and programs. The framework is platform independent. Although the framework is built in Python, the actual workflow programs or jobs can be implemented in any programming language.
The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). It is also possible to extend the use of the framework to monitoring the IDC pipeline. The detailed design, implementation, conclusions and future work of the proposed framework will be presented.
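The JSON message structure the authors describe is not reproduced in the abstract; the sketch below shows one plausible shape for a workflow task message, with field names that are purely hypothetical. A real deployment would publish such messages to ActiveMQ over STOMP (for example with the stomp.py client); here we only build and parse the payload.

```python
import json

def make_task_message(workflow, task, command, depends_on=()):
    """Serialise one workflow job as a JSON text message suitable for
    publishing to a broker topic. Field names are invented for
    illustration; the paper's actual schema is not given."""
    return json.dumps({
        "workflow": workflow,
        "task": task,
        "command": command,
        "depends_on": list(depends_on),
    })

msg = make_task_message("idc-pipeline", "phase-pick", "python pick.py", ["fetch"])
decoded = json.loads(msg)  # what a stompShell instance would receive
```

Because the payload is plain JSON text, a stompShell written in any language can decode it, which is consistent with the paper's claim that workflow jobs need not be written in Python.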
A framework for monitoring social process and outcomes in environmental programs.
Chapman, Sarah
2014-12-01
When environmental programs frame their activities as being in the service of human wellbeing, social variables need to be integrated into monitoring and evaluation (M&E) frameworks. This article draws upon ecosystem services theory to develop a framework to guide the M&E of collaborative environmental programs with anticipated social benefits. The framework has six components: program need, program activities, pathway process variables, moderating process variables, outcomes, and program value. Needs are defined in terms of ecosystem services, as well as other human needs that must be addressed to achieve outcomes. The pathway variable relates to the development of natural resource governance capacity in the target community. Moderating processes can be externalities such as the inherent capacity of the natural system to service ecosystem needs, local demand for natural resources, policy or socio-economic drivers. Internal program-specific processes relate to program service delivery, targeting and participant responsiveness. Ecological outcomes are expressed in terms of changes in landscape structure and function, which in turn influence ecosystem service provision. Social benefits derived from the program are expressed in terms of the value of the eco-social service to user-specified goals. The article provides suggestions from the literature for identifying indicators and measures for components and component variables, and concludes with an example of how the framework was used to inform the M&E of an adaptive co-management program in western Kenya. Copyright © 2014 Elsevier Ltd. All rights reserved.
Saluja, Saurabh; Silverstein, Allison; Mukhopadhyay, Swagoto; Lin, Yihan; Raykar, Nakul; Keshavjee, Salmaan; Samad, Lubna; Meara, John G
2017-01-01
The Lancet Commission on Global Surgery defined six surgical indicators and a framework for a national surgical plan that aimed to incorporate surgical care as a part of global public health. Multiple countries have since begun national surgical planning; each faces unique challenges in doing so. Implementation science can be used to more systematically explain this heterogeneous process, guide implementation efforts and ultimately evaluate progress. We describe our intervention using the Consolidated Framework for Implementation Research. This framework requires identifying characteristics of the intervention, the individuals involved, the inner and outer setting of the intervention, and finally describing implementation processes. By hosting a consultative symposium with clinicians and policy makers from around the world, we are able to specify key aspects of each element of this framework. We define our intervention as the incorporation of surgical care into public health planning, identify local champions as the key individuals involved, and describe elements of the inner and outer settings. Ultimately we describe top-down and bottom-up models that are distinct implementation processes. With the Consolidated Framework for Implementation Research, we are able to identify specific strategic models that can be used by implementers in various settings. While the integration of surgical care into public health throughout the world may seem like an insurmountable challenge, this work adds to a growing effort that seeks to find a way forward. PMID:29225930
SWIFT Differentiated Technical Assistance. White Paper
ERIC Educational Resources Information Center
McCart, Amy; McSheehan, Michael; Sailor, Wayne; Mitchiner, Melinda; Quirk, Carol
2016-01-01
The Schoolwide Integrated Framework for Transformation (SWIFT) employs six technical assistance (TA) practices that support an initial transformation process while simultaneously building system capacity to sustain and scale up equity-based inclusion in additional schools and districts over time. This paper explains these individual practices and…
DOT National Transportation Integrated Search
1999-09-01
This is one of seven studies exploring processes for developing Intelligent Transportation Systems (ITS) architectures for regional, statewide, or commercial vehicle applications. This study was prepared for a broad-based, non-technical audience. The...
Categorization and Affect: Evidence for Intra-Hemispheric Interactions
ERIC Educational Resources Information Center
Ramon, Dan; Doron, Yonit; Faust, Miriam
2007-01-01
Both emotional reactivity and categorization have long been studied within the framework of hemispheric asymmetry. However, little attempt has been made to integrate both research areas using any form of neuropsychological research, despite behavioral data suggesting a consistent relationship between affective and categorization processes. The…
Federated Simulations for Systems of Systems Integration
2008-12-01
coordination of the other SysHub research areas. A more comprehensive systems engineering process has been proposed in (Tolk, Litwin, and Kewley 2008)...missions and means framework. Technical Report TR-756, Army Materiel Systems Analysis Activity. Tolk, A., T. Litwin, and R. Kewley. 2008, December. A
DOT National Transportation Integrated Search
1999-09-01
This is one of seven studies exploring processes for developing Intelligent Transportation Systems (ITS) architectures for regional, statewide, or commercial vehicle applications. This study was prepared for a broad-based, non-technical audience. In ...
DOT National Transportation Integrated Search
1999-07-01
This report presents an examination of the process used in preparing electronic credentials for commercial vehicle operations in Kentucky Maryland, and Virginia. It describes the experience of using the Commercial Vehicle Information Systems & Networ...
Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic met...
Virtual Plant Tissue: Building Blocks for Next-Generation Plant Growth Simulation
De Vos, Dirk; Dzhurakhalov, Abdiravuf; Stijven, Sean; Klosiewicz, Przemyslaw; Beemster, Gerrit T. S.; Broeckhove, Jan
2017-01-01
Motivation: Computational modeling of plant developmental processes is becoming increasingly important. Cellular resolution plant tissue simulators have been developed, yet they typically describe physiological processes in an isolated way, strongly delimited in space and time. Results: With plant systems biology moving toward an integrative perspective on development, we have built the Virtual Plant Tissue (VPTissue) package to couple functional modules or models in the same framework and across different frameworks. Multiple levels of model integration and coordination enable combining existing and new models from different sources, with diverse options in terms of input/output. Besides the core simulator, the toolset also comprises a tissue editor for manipulating tissue geometry and cell, wall, and node attributes in an interactive manner. A parameter exploration tool is available to study parameter dependence of simulation results by distributing calculations over multiple systems. Availability: Virtual Plant Tissue is available as open source (EUPL license) on Bitbucket (https://bitbucket.org/vptissue/vptissue). The project has a website https://vptissue.bitbucket.io. PMID:28523006
Employee commitment and motivation: a conceptual analysis and integrative model.
Meyer, John P; Becker, Thomas E; Vandenberghe, Christian
2004-12-01
Theorists and researchers interested in employee commitment and motivation have not made optimal use of each other's work. Commitment researchers seldom address the motivational processes through which commitment affects behavior, and motivation researchers have not recognized important distinctions in the forms, foci, and bases of commitment. To encourage greater cross-fertilization, the authors present an integrative framework in which commitment is presented as one of several energizing forces for motivated behavior. E. A. Locke's (1997) model of the work motivation process and J. P. Meyer and L. Herscovitch's (2001) model of workplace commitments serve as the foundation for the development of this new framework. To facilitate the merger, a new concept, goal regulation, is derived from self-determination theory (E. L. Deci & R. M. Ryan, 1985) and regulatory focus theory (E. T. Higgins, 1997). By including goal regulation, it is acknowledged that motivated behavior can be accompanied by different mindsets that have particularly important implications for the explanation and prediction of discretionary work behavior. 2004 APA, all rights reserved
Almén, Anja; Båth, Magnus
2016-06-01
The overall aim of the present work was to develop a conceptual framework for managing radiation dose in diagnostic radiology with the intention of supporting optimisation. An optimisation process was first derived. The framework for managing radiation dose, based on the derived optimisation process, was then outlined. The optimisation process starts from four stages: providing equipment, establishing methodology, performing examinations and ensuring quality. The optimisation process comprises a series of activities and actions at these stages. The current system of diagnostic reference levels is an activity in the last stage, ensuring quality. The system thus becomes a reactive activity, only to a certain extent engaging the core activity in the radiology department, performing examinations. Three reference dose levels (possible, expected and established) were assigned to the first three stages of the optimisation process, excluding ensuring quality. A reasonably achievable dose range is also derived, indicating an acceptable deviation from the established dose level. A reasonable radiation dose for a single patient is within this range. The suggested framework for managing radiation dose should be regarded as one part of the optimisation process. The optimisation process constitutes a variety of complementary activities, where managing radiation dose is only one part. This emphasises the need to take a holistic approach, integrating the optimisation process in different clinical activities. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
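The established reference level and the "reasonably achievable dose range" suggest a simple decision rule for a single examination. The sketch below is an illustrative reading, not the authors' formal definition; the ±25% tolerance is an invented placeholder.

```python
def assess_dose(dose, established, tolerance=0.25):
    """Compare a patient dose with the established reference level.
    The 'reasonably achievable dose range' is modelled here, purely
    for illustration, as +/- tolerance around the established level."""
    lo, hi = established * (1 - tolerance), established * (1 + tolerance)
    if dose < lo:
        return "below range - check image quality"
    if dose > hi:
        return "above range - investigate"
    return "within reasonably achievable range"
```

The point of the rule is that deviations in either direction matter: a dose far below the range may signal inadequate image quality, not just good practice.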
On Cognition, Structured Sequence Processing, and Adaptive Dynamical Systems
NASA Astrophysics Data System (ADS)
Petersson, Karl Magnus
2008-11-01
Cognitive neuroscience approaches the brain as a cognitive system: a system that functionally is conceptualized in terms of information processing. We outline some aspects of this concept and consider a physical system to be an information processing device when a subclass of its physical states can be viewed as representational/cognitive and transitions between these can be conceptualized as a process operating on these states by implementing operations on the corresponding representational structures. We identify a generic and fundamental problem in cognition: sequentially organized structured processing. Structured sequence processing provides the brain, in an essential sense, with its processing logic. In an approach addressing this problem, we illustrate how to integrate levels of analysis within a framework of adaptive dynamical systems. We note that the dynamical system framework lends itself to a description of asynchronous event-driven devices, which is likely to be important in cognition because the brain appears to be an asynchronous processing system. We use the human language faculty and natural language processing as a concrete example throughout.
A cognitive information processing framework for distributed sensor networks
NASA Astrophysics Data System (ADS)
Wang, Feiyi; Qi, Hairong
2004-09-01
In this paper, we present a cognitive agent framework (CAF) based on swarm intelligence and self-organization principles, and demonstrate it through collaborative processing for target classification in sensor networks. The framework involves integrated designs to provide both cognitive behavior at the organization level, to conquer complexity, and reactive behavior at the individual agent level, to retain simplicity. The design tackles various problems in current information processing systems, including overly complex systems, maintenance difficulties, increasing vulnerability to attack, lack of capability to tolerate faults, and inability to identify and cope with low-frequency patterns. An important point distinguishing the presented work from classical AI research is that the acquired intelligence does not pertain to distinct individuals but to groups. It also deviates from multi-agent systems (MAS) owing to the sheer quantity of extremely simple agents we are able to accommodate, to the degree that the loss of some coordination messages and the behavior of faulty/compromised agents will not affect the collective decision made by the group.
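The collective-decision property claimed above, that a group verdict survives lost messages and a minority of faulty agents, can be sketched as a plain majority vote with lost messages modeled as None votes. This is a caricature of the framework, not its actual decision mechanism.

```python
import random
from collections import Counter

def collective_classify(agent_votes):
    # Drop lost messages (None) and take the group's majority label
    valid = [v for v in agent_votes if v is not None]
    return Counter(valid).most_common(1)[0][0]

# 100 simple agents: 60 correct, 25 faulty/compromised, 15 messages lost
votes = ["tank"] * 60 + ["truck"] * 25 + [None] * 15
random.shuffle(votes)  # arrival order does not matter
decision = collective_classify(votes)
```

No individual agent needs to be reliable; as long as correct agents outnumber faulty ones among the messages that arrive, the group classification is stable.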
Java Tool Framework for Automation of Hardware Commissioning and Maintenance Procedures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, J C; Fisher, J M; Gordon, J B
2007-10-02
The National Ignition Facility (NIF) is a 192-beam laser system designed to study high energy density physics. Each beam line contains a variety of line replaceable units (LRUs) that contain optics, stepping motors, sensors and other devices to control and diagnose the laser. During commissioning and subsequent maintenance of the laser, LRUs undergo a qualification process using the Integrated Computer Control System (ICCS) to verify and calibrate the equipment. The commissioning processes are both repetitive and tedious when we use remote manual computer controls, making them ideal candidates for software automation. Maintenance and Commissioning Tool (MCT) software was developed to improve the efficiency of the qualification process. The tools are implemented in Java, leveraging ICCS services and CORBA to communicate with the control devices. The framework provides easy-to-use mechanisms for handling configuration data, task execution, task progress reporting, and generation of commissioning test reports. The tool framework design and application examples will be discussed.
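The task-execution and progress-reporting mechanisms the MCT framework provides (in Java, over CORBA) can be suggested by a small Python sketch; the task names, the report format, and the pass/fail structure here are invented, not NIF's actual API.

```python
def run_procedure(tasks, report=print):
    """Run an ordered list of (name, callable) qualification steps,
    reporting progress after each one; returns pass/fail per step."""
    results = {}
    for i, (name, task) in enumerate(tasks, 1):
        ok = bool(task())
        results[name] = ok
        report(f"[{i}/{len(tasks)}] {name}: {'PASS' if ok else 'FAIL'}")
    return results

progress = []
results = run_procedure(
    [("home-motor", lambda: True), ("calibrate-sensor", lambda: False)],
    report=progress.append,
)
```

Separating the progress callback from the step logic is the design point: the same procedure can report to a console during commissioning or into a generated test report afterwards.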
Interpersonal emotion regulation.
Zaki, Jamil; Williams, W Craig
2013-10-01
Contemporary emotion regulation research emphasizes intrapersonal processes such as cognitive reappraisal and expressive suppression, but people experiencing affect commonly choose not to go it alone. Instead, individuals often turn to others for help in shaping their affective lives. How and under what circumstances does such interpersonal regulation modulate emotional experience? Although scientists have examined allied phenomena such as social sharing, empathy, social support, and prosocial behavior for decades, there have been surprisingly few attempts to integrate these data into a single conceptual framework of interpersonal regulation. Here we propose such a framework. We first map a "space" differentiating classes of interpersonal regulation according to whether an individual uses an interpersonal regulatory episode to alter their own or another person's emotion. We then identify 2 types of processes--response-dependent and response-independent--that could support interpersonal regulation. This framework classifies an array of processes through which interpersonal contact fulfills regulatory goals. More broadly, it organizes diffuse, heretofore independent data on "pieces" of interpersonal regulation, and identifies growth points for this young and exciting research domain.
Arora, Prerna G; Connors, Elizabeth H; Blizzard, Angela; Coble, Kelly; Gloff, Nicole; Pruitt, David
2017-02-01
Increased attention has been placed on evaluating the extent to which clinical programs that support the behavioral health needs of youth have effective processes and result in improved patient outcomes. Several theoretical frameworks from dissemination and implementation (D&I) science have been put forth to guide the evaluation of behavioral health programs implemented in the context of real-world settings. Although a strong rationale exists for integrating D&I science in program evaluation, few examples are available to guide the evaluator in integrating D&I science in the planning and execution of evaluation activities. This paper seeks to inform program evaluation efforts by outlining two D&I frameworks and describing their integration in program evaluation design. Specifically, this paper seeks to support evaluation efforts by illustrating the use of these frameworks via a case example of a telemental health consultation program in pediatric primary care designed to improve access to behavioral health care for children and adolescents in rural settings. Lessons learned from this effort, as well as recommendations regarding the future evaluation of programs using D&I science to support behavioral health care in community-based settings, are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Big data and high-performance analytics in structural health monitoring for bridge management
NASA Astrophysics Data System (ADS)
Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed
2016-04-01
Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the datasets is made possible by four technologies: cloud computing, relational database processing, NoSQL database support, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics provide insights that assist bridge owners in addressing problems faster.
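The reliability indices mentioned for bridge components are commonly computed, under a first-order assumption of independent, normally distributed capacity R and demand S, as beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2). The paper does not give its formula; the sketch below shows only that textbook form, with made-up numbers.

```python
import math

def reliability_index(mu_capacity, sd_capacity, mu_demand, sd_demand):
    # First-order reliability index for independent normal capacity (R)
    # and demand (S): beta = E[R - S] / sd[R - S]
    return (mu_capacity - mu_demand) / math.hypot(sd_capacity, sd_demand)

beta = reliability_index(10.0, 1.0, 6.0, 1.0)  # illustrative values
```

In a monitoring context, sensor streams update the demand statistics (e.g., strains under load) while inspection data updates capacity, so beta can be tracked per component over time.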
Intelligent Integrated Health Management for a System of Systems
NASA Technical Reports Server (NTRS)
Smith, Harvey; Schmalzel, John; Figueroa, Fernando
2008-01-01
An intelligent integrated health management system (IIHMS) incorporates major improvements over prior such systems. The particular IIHMS is implemented for any system defined as a hierarchical distributed network of intelligent elements (HDNIE), comprising primarily: (1) an architecture (Figure 1), (2) intelligent elements, (3) a conceptual framework and taxonomy (Figure 2), and (4) an ontology that defines standards and protocols. Some definitions of terms are prerequisite to a further brief description of this innovation: A system-of-systems (SoS) is an engineering system that comprises multiple subsystems (e.g., a system of multiple, possibly interacting, flow subsystems that include pumps, valves, tanks, ducts, sensors, and the like); 'Intelligent' is used here in the sense of artificial intelligence. An intelligent element may be physical or virtual, it is network enabled, and it is able to manage data, information, and knowledge (DIaK) focused on determining its condition in the context of the entire SoS; As used here, 'health' signifies the functionality and/or structural integrity of an engineering system, subsystem, or process (leading to determination of the health of components); 'Process' can signify either a physical process in the usual sense of the word or an element into which functionally related sensors are grouped; 'Element' can signify a component (e.g., an actuator, a valve), a process, a controller, a subsystem, or a system; The term Integrated System Health Management (ISHM) describes a capability that focuses on determining the condition (health) of every element in a complex system (detecting anomalies, diagnosing causes, and prognosing future anomalies) and on providing data, information, and knowledge (DIaK), not just data, to control systems for safe and effective operation. A major novel aspect of the present development is the concept of intelligent integration.
The purpose of intelligent integration, as defined and implemented in the present IIHMS, is to enable automated analysis of physical phenomena in imitation of human reasoning, including the use of qualitative methods. Intelligent integration is said to occur in a system in which all elements are intelligent and can acquire, maintain, and share knowledge and information. In the HDNIE of the present IIHMS, an SoS is represented as being operationally organized in a hierarchical-distributed format. The elements of the SoS are considered to be intelligent in that they determine their own conditions within an integrated scheme that involves consideration of data, information, knowledge bases, and methods that reside in all elements of the system. The conceptual framework of the HDNIE and the methodologies of implementing it enable the flow of information and knowledge among the elements so as to make possible the determination of the condition of each element. The necessary information and knowledge are made available to each affected element at the desired time, satisfying a need to prevent information overload while providing context-sensitive information at the proper level of detail. Provision of high-quality data is a central goal in designing this or any IIHMS. In pursuit of this goal, functionally related sensors are logically assigned to groups denoted processes. An aggregate of processes is considered to form a system. Alternatively or in addition to what has been said thus far, the HDNIE of this IIHMS can be regarded as consisting of a framework containing object models that encapsulate all elements of the system, their individual and relational knowledge bases, generic methods and procedures based on models of the applicable physics, and communication processes (Figure 2).
The framework enables implementation of a paradigm inspired by how expert operators monitor the health of systems with the help of (1) DIaK from various sources, (2) software tools that assist in rapid visualization of the condition of the system, (3) analytical software tools that assist in reasoning about the condition, (4) sharing of information via network communication hardware and software, and (5) software tools that aid in making decisions to remedy unacceptable conditions or improve performance.
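The hierarchical determination of element condition can be caricatured in a few lines: each element owns a health value and reports the worst condition found in its subtree, so an anomaly at a sensor surfaces at every level above it. This is a drastic simplification of the DIaK-sharing architecture described above, and all names and values are invented.

```python
class Element:
    """One 'intelligent element' in the hierarchy: it holds its own
    health (1.0 = nominal) and aggregates the worst condition found
    among its children."""
    def __init__(self, name, health=1.0):
        self.name, self.health, self.children = name, health, []

    def add(self, child):
        self.children.append(child)
        return child

    def assess(self):
        # A leaf reports itself; an interior element reports the
        # minimum over itself and its entire subtree
        if not self.children:
            return self.health
        return min(self.health, min(c.assess() for c in self.children))

system = Element("flow-system")
pump = system.add(Element("pump"))
pump.add(Element("pressure-sensor", health=0.4))  # anomalous sensor
pump.add(Element("valve"))
```

In the real architecture each element also carries the knowledge needed to diagnose and prognose its condition; here the propagation path alone is illustrated.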
NASA Astrophysics Data System (ADS)
Reis, S.; Fleming, L. E.; Beck, S.; Austen, M.; Morris, G.; White, M.; Taylor, T. J.; Orr, N.; Osborne, N. J.; Depledge, M.
2014-12-01
Conceptual models for problem framing in environmental impact assessment (EIA) and health impact assessment (HIA) share similar concepts, but differ in their scientific or policy focus, methodologies and underlying causal chains, and the degree of complexity and scope. The Driver-Pressure-State-Impact-Response (DPSIR) framework used by the European Environment Agency, the OECD and others, and the Integrated Science for Society and the Environment (ISSE) framework, are widely applied in policy appraisal and impact assessments. While DPSIR is applied across different policy domains, the ISSE framework is used in ecosystem services assessments. The modified Driver-Pressure-State-Exposure-Effect-Action (DPSEEA) model extends DPSIR by separating exposure from effect, adding context as a modifier of effect, and susceptibility to exposures due to socio-economic, demographic or other determinants. While continuously evolving, the application of conceptual frameworks in policy appraisals mainly occurs within established discipline boundaries. However, drivers and environmental states, as well as policy measures and actions, affect both human and ecosystem receptors. Furthermore, unintended consequences of policy actions are seldom constrained within discipline or policy silos. Thus, an integrated conceptual model is needed, accounting for the full causal chain affecting human and ecosystem health in any assessment. We propose a novel model integrating HIA methods and ecosystem services in an attempt to operationalise the emerging concept of "Ecological Public Health." The conceptual approach of the ecosystem-enriched DPSEEA model ("eDPSEEA") has stimulated widespread debate and feedback. We will present eDPSEEA as a stakeholder engagement process and a conceptual model, using illustrative case studies of climate change as a starting point, not a complete solution, for the integration of human and ecosystem health impact assessment as a key challenge in a rapidly changing world.
Rayner G, Lang T. Ecological Public Health: Reshaping the Conditions for Good Health. Routledge; 2012. Reis S, Morris G, Fleming LE, Beck S, Taylor T, White M, Depledge MH, Steinle S, Sabel CE, Cowie H, Hurley F, Dick JMcP, Smith RI, Austen M. Integrating Health & Environmental Impact Analysis. Public Health; 2013.
NASA Astrophysics Data System (ADS)
Maity, Debotyam
This study is aimed at an improved understanding of unconventional reservoirs which include tight reservoirs (such as shale oil and gas plays), geothermal developments, etc. We provide a framework for improved fracture zone identification and mapping of the subsurface for a geothermal system by integrating data from different sources. The proposed ideas and methods were tested primarily on data obtained from North Brawley geothermal field and the Geysers geothermal field apart from synthetic datasets which were used to test new algorithms before actual application on the real datasets. The study has resulted in novel or improved algorithms for use at specific stages of data acquisition and analysis including improved phase detection technique for passive seismic (and teleseismic) data as well as optimization of passive seismic surveys for best possible processing results. The proposed workflow makes use of novel integration methods as a means of making best use of the available geophysical data for fracture characterization. The methodology incorporates soft computing tools such as hybrid neural networks (neuro-evolutionary algorithms) as well as geostatistical simulation techniques to improve the property estimates as well as overall characterization efficacy. The basic elements of the proposed characterization workflow involves using seismic and microseismic data to characterize structural and geomechanical features within the subsurface. We use passive seismic data to model geomechanical properties which are combined with other properties evaluated from seismic and well logs to derive both qualitative and quantitative fracture zone identifiers. The study has resulted in a broad framework highlighting a new technique for utilizing geophysical data (seismic and microseismic) for unconventional reservoir characterization. 
It provides an opportunity to optimally develop the resources in question by incorporating data from different sources and using their temporal and spatial variability as a means to better understand the reservoir behavior. As part of this study, we have developed the following elements which are discussed in the subsequent chapters: 1. An integrated characterization framework for unconventional settings with adaptable workflows for all stages of data processing, interpretation and analysis. 2. A novel autopicking workflow for noisy passive seismic data used for improved accuracy in event picking as well as for improved velocity model building. 3. Improved passive seismic survey design optimization framework for better data collection and improved property estimation. 4. Extensive post-stack seismic attribute studies incorporating robust schemes applicable in complex reservoir settings. 5. Uncertainty quantification and analysis to better quantify property estimates over and above the qualitative interpretations made and to validate observations independently with quantified uncertainties to prevent erroneous interpretations. 6. Property mapping from microseismic data including stress and anisotropic weakness estimates for integrated reservoir characterization and analysis. 7. Integration of results (seismic, microseismic and well logs) from analysis of individual data sets for integrated interpretation using predefined integration framework and soft computing tools.
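As a concrete point of reference for the autopicking stage, the classic STA/LTA trigger compares short-term to long-term signal energy and fires when the ratio exceeds a threshold. The study's improved picker for noisy passive seismic data is more sophisticated; the sketch below is only that standard baseline, with invented window lengths, threshold, and synthetic trace.

```python
import numpy as np

def sta_lta_pick(trace, sta=10, lta=100, threshold=4.0):
    """Return the first sample where the short-term/long-term energy
    ratio exceeds the threshold, or None if nothing triggers."""
    e = trace ** 2
    for i in range(lta, len(trace) - sta):
        lta_val = e[i - lta:i].mean()   # trailing long-term window
        sta_val = e[i:i + sta].mean()   # leading short-term window
        if lta_val > 0 and sta_val / lta_val > threshold:
            return i
    return None

rng = np.random.default_rng(0)
trace = 0.01 * rng.standard_normal(800)      # background noise
trace[500:] += np.sin(np.arange(300) / 5.0)  # emergent arrival at sample 500
pick = sta_lta_pick(trace)
```

On noisy microseismic records this baseline mispicks emergent arrivals, which is precisely the weakness the study's improved workflow targets.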
A negotiation methodology and its application to cogeneration planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, S.M.; Liu, C.C.; Luu, S.
Power system planning has become a complex process in utilities today. This paper presents a methodology for integrated planning with multiple objectives. The methodology uses a graphical representation, the Goal-Decision Network (GDN), to capture the planning knowledge. The planning process is viewed as a negotiation process that applies three negotiation operators to search for beneficial decisions in a GDN. The negotiation framework is also applied to the problem of planning for cogeneration interconnection. Simulation results are presented to illustrate the cogeneration planning process.
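The abstract does not specify the three negotiation operators, so the following is only a hypothetical toy illustrating the underlying idea of searching a goal-decision structure for beneficial decisions under multiple objectives; all names, scores, and weights are invented:

```python
# Toy goal-decision network: each planning goal maps to candidate
# decisions scored on two objectives, (cost, reliability).
gdn = {
    "meet_demand": {"build_cogen": (8, 0.9), "buy_power": (5, 0.7)},
    "cut_emissions": {"build_cogen": (8, 0.9), "retrofit": (6, 0.6)},
}

def negotiate(gdn, w_cost=0.5, w_rel=0.5):
    """One simplistic 'negotiation' pass: for each goal, pick the decision
    with the best weighted trade-off (lower cost, higher reliability)."""
    choice = {}
    for goal, options in gdn.items():
        choice[goal] = min(
            options,
            key=lambda d: w_cost * options[d][0] - w_rel * 10 * options[d][1],
        )
    return choice

result = negotiate(gdn)
```

A real GDN search would instead apply the paper's negotiation operators iteratively, trading concessions between objectives rather than scoring each goal independently.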
Pedrini, Paolo; Bragalanti, Natalia; Groff, Claudio
2017-01-01
Recently-developed methods that integrate multiple data sources arising from the same ecological processes have typically utilized structured data from well-defined sampling protocols (e.g., capture-recapture and telemetry). Despite this new methodological focus, the value of opportunistic data for improving inference about spatial ecological processes is unclear and, perhaps more importantly, no procedures are available to formally test whether parameter estimates are consistent across data sources and whether they are suitable for integration. Using data collected on the reintroduced brown bear population in the Italian Alps, a population of conservation importance, we combined data from three sources: traditional spatial capture-recapture data, telemetry data, and opportunistic data. We developed a fully integrated spatial capture-recapture (SCR) model that included a model-based test for data consistency to first compare model estimates using different combinations of data, and then, by acknowledging data-type differences, evaluate parameter consistency. We demonstrate that opportunistic data lend themselves naturally to integration within the SCR framework and highlight the value of opportunistic data for improving inference about space use and population size. This is particularly relevant in studies of rare or elusive species, where the number of spatial encounters is usually small and where additional observations are of high value. In addition, our results highlight the importance of testing and accounting for inconsistencies in spatial information from structured and unstructured data so as to avoid the risk of spurious or averaged estimates of space use and, consequently, of population size. Our work supports the use of a single modeling framework to combine spatially-referenced data while also accounting for parameter consistency. PMID:28973034
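At the core of any SCR model of this kind is a detection function linking an animal's latent activity centre to its encounter probability at each trap. A minimal sketch of the standard half-normal form (a common SCR default; the paper's exact specification is not given in the abstract) is:

```python
import numpy as np

def halfnormal_detection(center, traps, p0, sigma):
    """Half-normal encounter model standard in spatial capture-recapture:
    p_j = p0 * exp(-d_j**2 / (2 * sigma**2)), where d_j is the distance
    from the animal's activity centre to trap j, p0 the baseline
    detection probability, and sigma the spatial scale of space use."""
    d = np.linalg.norm(np.asarray(traps, float) - np.asarray(center, float), axis=1)
    return p0 * np.exp(-d ** 2 / (2 * sigma ** 2))

# Detection probability decays with distance from the activity centre:
p = halfnormal_detection(center=(0.0, 0.0),
                         traps=[(0.0, 0.0), (1.0, 0.0), (3.0, 0.0)],
                         p0=0.3, sigma=1.0)
```

Testing parameter consistency across data types then amounts to asking whether, e.g., sigma estimated from telemetry agrees with sigma estimated from capture-recapture or opportunistic encounters.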
Intersection of migration and turnover theories-What can we learn?
Brewer, Carol S; Kovner, Christine T
2014-01-01
The international migration of nurses has become a major issue in international health and workforce policy circles, but analyses are not based on a comprehensive theory. The purpose of this article was to compare the concepts of an integrated nursing turnover theory with the concepts of one international migration framework. An integrated turnover theory is compared with a frequently used migration framework using examples of each. Migration concepts relate well to turnover concepts, but the relative importance and strength of various concepts may differ. For example, identification, development, and measurement of the concept of national commitment, if it exists, parallels organizational commitment and may be fruitful in understanding the processes that lead to nurse migration. The turnover theory provides a framework for examining migration concepts and considering how these concepts could relate to each other in a future theory of migration. Ultimately, a better understanding of the relationships and strengths of these concepts could lead to more effective policy. Copyright © 2014 Elsevier Inc. All rights reserved.
Ramstead, Maxwell J. D.; Veissière, Samuel P. L.; Kirmayer, Laurence J.
2016-01-01
In this paper we outline a framework for the study of the mechanisms involved in the engagement of human agents with cultural affordances. Our aim is to better understand how culture and context interact with human biology to shape human behavior, cognition, and experience. We attempt to integrate several related approaches in the study of the embodied, cognitive, and affective substrates of sociality and culture and the sociocultural scaffolding of experience. The integrative framework we propose bridges cognitive and social sciences to provide (i) an expanded concept of ‘affordance’ that extends to sociocultural forms of life, and (ii) a multilevel account of the socioculturally scaffolded forms of affordance learning and the transmission of affordances in patterned sociocultural practices and regimes of shared attention. This framework provides an account of how cultural content and normative practices are built on a foundation of contentless basic mental processes that acquire content through immersive participation of the agent in social practices that regulate joint attention and shared intentionality. PMID:27507953
The Gendered Family Process Model: An Integrative Framework of Gender in the Family.
Endendijk, Joyce J; Groeneveld, Marleen G; Mesman, Judi
2018-05-01
This article reviews and integrates research on gender-related biological, cognitive, and social processes that take place in or between family members, resulting in a newly developed gendered family process (GFP) model. The GFP model serves as a guiding framework for research on gender in the family context, calling for the integration of biological, social, and cognitive factors. Biological factors in the model are prenatal, postnatal, and pubertal androgen levels of children and parents, and genetic effects on parent and child gendered behavior. Social factors are family sex composition (i.e., parent sex, sexual orientation, marriage status, sibling sex composition) and parental gender socialization, such as modeling, gender-differentiated parenting, and gender talk. Cognitive factors are implicit and explicit gender-role cognitions of parents and children. Our review and the GFP model confirm that gender is an important organizer of family processes, but also highlight that much is still unclear about the mechanisms underlying gender-related processes within the family context. Therefore, we stress the need for (1) longitudinal studies that take into account the complex bidirectional relationship between parent and child gendered behavior and cognitions, in which within-family comparisons (comparing behavior of parents toward a boy and a girl in the same family) are made instead of between-family comparisons (comparing parenting between all-boy families and all-girl families, or between mixed-gender families and same-gender families), (2) experimental studies on the influence of testosterone on human gender development, and (3) studies examining the interplay of biology with gender socialization and gender-role cognitions in humans.
Modeling and Advanced Control for Sustainable Process ...
This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically inspired, multi-agent-based method. The sustainability and performance assessment of process operating points is carried out using the U.S. EPA's GREENSCOPE assessment tool, which provides scores for the selected economic, material management, environmental, and energy indicators. The indicator results supply information on whether the implementation of the controller is moving the process toward more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous bioethanol fermentation process whose dynamics are characterized by steady-state multiplicity and oscillatory behavior. This chapter demonstrates the application of novel process control strategies for sustainability by improving material management, energy efficiency, and pollution prevention, as needed for SHC Sustainable Uses of Wastes and Materials Management.
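GREENSCOPE-style indicators are typically reported as a percent score between a worst-case and a best-case limit. A minimal sketch of that normalisation (illustrative only; the actual limits and indicator definitions are specific to each GREENSCOPE indicator) is:

```python
def percent_score(actual, worst, best):
    """Best/worst-case normalisation in the style of GREENSCOPE indicator
    scoring: 100% at the best-case value, 0% at the worst case, clipped
    to the [0, 100] range. Works whether 'better' means higher or lower,
    since the sign of (best - worst) handles the direction."""
    score = 100.0 * (actual - worst) / (best - worst)
    return max(0.0, min(100.0, score))

# Hypothetical energy-intensity indicator where lower is better:
# worst case 100 MJ/kg, best case 20 MJ/kg, measured 60 MJ/kg.
s = percent_score(60.0, worst=100.0, best=20.0)
```

A controller "moving the process toward more sustainable operation" then corresponds to these percent scores increasing over successive operating points.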
Naghettini, Alessandra V; Bollela, Valdes R; Costa, Nilce M S C; Salgado, Luciana M R
2011-01-01
To describe the process of integration and revision of a pediatric program curriculum, which resulted in the creation of a competency-based framework as recommended in the Brazilian National Curricular Guidelines. Qualitative and quantitative analysis of an intervention evaluating students' and professors' perceptions of the pediatric program curriculum (focus groups and semi-structured interviews). Results were discussed during teaching development workshops. A competency-based framework was suggested for the pediatric program from the 3rd to the 6th year. The new curriculum was approved, implemented, and reevaluated six months later. Twelve students (12%) from the 3rd to the 6th year participated in the focus groups, and 11 professors (78.5%) answered the questionnaire. Most participants reported a lack of integration among the courses, lack of knowledge about the learning goals of the internships, few opportunities for practice, and a predominance of theoretical evaluation. In the training workshops, a competency-based curriculum was created after pediatrics and collective health professors reached an agreement. The new curriculum was focused on general competencies, learning goals, opportunities available to achieve these goals, and the evaluation system. After six months, 93% (104/112) of students and 79% (11/14) of professors reported greater integration of the program and highlighted the inclusion of the clinical performance evaluation. The collective creation of a competency-based curriculum promoted higher satisfaction among students and professors. After being implemented, the new curriculum was considered to integrate the teaching practices and contents, improving the quality of the clinical performance evaluation.
Debaveye, Sam; De Soete, Wouter; De Meester, Steven; Vandijck, Dominique; Heirman, Bert; Kavanagh, Shane; Dewulf, Jo
2016-01-01
The effects of a pharmaceutical treatment have until now been evaluated by the field of Health Economics in terms of the patient health benefits, expressed in Quality-Adjusted Life Years (QALYs), versus the monetary costs. However, there is also a Human Health burden associated with this process, resulting from emissions that originate from the pharmaceutical production processes, Use Phase, and End of Life (EoL) disposal of the medicine. This Human Health burden is evaluated by the research field of Life Cycle Assessment (LCA) and expressed in Disability-Adjusted Life Years (DALYs), a metric similar to the QALY. A new framework is therefore needed in which both the positive and negative health effects of a pharmaceutical treatment are integrated into a net Human Health effect. To do so, this article reviews the methodologies of both Health Economics and the area of protection Human Health of the LCA methodology and proposes a conceptual framework on which to base an integration of both health effects. Methodological issues such as the inclusion of future costs and benefits, discounting, and age weighting are discussed. It is suggested to use the structure of an LCA as a backbone to cover all methodological challenges involved in the integration. The possibility of monetizing both Human Health benefits and burdens is explored. The suggested approach covers the main methodological aspects that should be considered in an integrated assessment of the health effects of a pharmaceutical treatment. Copyright © 2015 Elsevier Inc. All rights reserved.
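One of the methodological issues named above, discounting, is mechanically simple: future health effects (or costs) are divided by a compounding discount factor. A minimal sketch of the standard formula PV = Σ_t e_t / (1 + r)^t, common to both Health Economics and LCA valuation:

```python
def present_value(effects, rate):
    """Discount a yearly stream of health effects (e.g. QALYs gained,
    DALYs averted, or monetary costs) back to year 0:
    PV = sum over t of effects[t] / (1 + rate)**t."""
    return sum(e / (1.0 + rate) ** t for t, e in enumerate(effects))

# One QALY gained each year for three years, discounted at 3% per year:
pv = present_value([1.0, 1.0, 1.0], rate=0.03)
```

The contested question in an integrated QALY/DALY framework is not this arithmetic but whether, and at what rate, future health should be discounted at all, and whether the same rate should apply to benefits and burdens.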
2014-01-01
Background The Medical Research Council’s framework for complex interventions has been criticized for not including theory-driven approaches to evaluation. Although the framework does include broad guidance on the use of theory, it contains little practical guidance for implementers and there have been calls to develop a more comprehensive approach. A prospective, theory-driven process of intervention design and evaluation is required to develop complex healthcare interventions which are more likely to be effective, sustainable and scalable. Methods We propose a theory-driven approach to the design and evaluation of complex interventions by adapting and integrating a programmatic design and evaluation tool, Theory of Change (ToC), into the MRC framework for complex interventions. We provide a guide to what ToC is, how to construct one, and how to integrate its use into research projects seeking to design, implement and evaluate complex interventions using the MRC framework. We test this approach by using ToC within two randomized controlled trials and one non-randomized evaluation of complex interventions. Results Our application of ToC in three research projects has shown that ToC can strengthen key stages of the MRC framework. It can aid the development of interventions by providing a framework for enhanced stakeholder engagement and by explicitly designing an intervention that is embedded in the local context. For the feasibility and piloting stage, ToC enables the systematic identification of knowledge gaps to generate research questions that strengthen intervention design. ToC may improve the evaluation of interventions by providing a comprehensive set of indicators to evaluate all stages of the causal pathway through which an intervention achieves impact, combining evaluations of intervention effectiveness with detailed process evaluations into one theoretical framework. 
Conclusions Incorporating a ToC approach into the MRC framework holds promise for improving the design and evaluation of complex interventions, thereby increasing the likelihood that the intervention will be ultimately effective, sustainable and scalable. We urge researchers developing and evaluating complex interventions to consider using this approach, to evaluate its usefulness and to build an evidence base to further refine the methodology. Trial registration Clinical trials.gov: NCT02160249 PMID:24996765
De Silva, Mary J; Breuer, Erica; Lee, Lucy; Asher, Laura; Chowdhary, Neerja; Lund, Crick; Patel, Vikram
2014-07-05