Sample records for information framework model

  1. PACS/information systems interoperability using Enterprise Communication Framework.

    PubMed

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).

  2. Moral judgment as information processing: an integrative review.

    PubMed

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

  3. Moral judgment as information processing: an integrative review

    PubMed Central

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  4. A Framework to Manage Information Models

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.

    2008-05-01

    The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. 
The modeling information can also be exported to semantic web languages such as OWL and RDF and written to XML Metadata Interchange (XMI) files for import into UML tools.

  5. Model-theoretic framework for sensor data fusion

    NASA Astrophysics Data System (ADS)

    Zavoleas, Kyriakos P.; Kokar, Mieczyslaw M.

    1993-09-01

    The main goal of our research in sensory data fusion (SDF) is the development of a systematic approach (a methodology) to designing systems for interpreting sensory information and for reasoning about the situation based upon this information and upon available data bases and knowledge bases. To achieve such a goal, two kinds of subgoals have been set: (1) develop a theoretical framework in which rational design/implementation decisions can be made, and (2) design a prototype SDF system along the lines of the framework. Our initial design of the framework has been described in our previous papers. In this paper we concentrate on the model-theoretic aspects of this framework. We postulate that data are embedded in data models, and information processing mechanisms are embedded in model operators. The paper is devoted to analyzing the classes of model operators and their significance in SDF. We investigate transformation, abstraction, and fusion operators. A prototype SDF system, fusing data from range and intensity sensors, is presented, exemplifying the structures introduced. Our framework is justified by the fact that it provides modularity, traceability of information flow, and a basis for a specification language for SDF.
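
    The operator classes named in this record lend themselves to a small illustration. The sketch below is an invented example of the general idea only: the data-model classes, field names, and thresholds are hypothetical and are not taken from the authors' prototype.

```python
from dataclasses import dataclass

@dataclass
class RangeModel:
    depths: list       # per-pixel distances from a range sensor

@dataclass
class IntensityModel:
    levels: list       # per-pixel brightness from an intensity sensor

@dataclass
class SurfaceModel:
    occupied: list     # fused per-pixel estimate of nearby surface presence

def transform(model: RangeModel, scale: float) -> RangeModel:
    """Transformation operator: maps a data model to another model of the same class."""
    return RangeModel([d * scale for d in model.depths])

def abstract(model: RangeModel, threshold: float) -> list:
    """Abstraction operator: discards detail, keeping only a coarser description."""
    return [d < threshold for d in model.depths]

def fuse(r: RangeModel, i: IntensityModel, d_max: float) -> SurfaceModel:
    """Fusion operator: combines two data models into one richer model."""
    return SurfaceModel([d < d_max and lvl > 0.5
                         for d, lvl in zip(r.depths, i.levels)])
```

    Here `transform` stays within one model class, `abstract` reduces detail, and `fuse` combines the range and intensity models into a richer one, mirroring the roles the abstract assigns to the operator classes.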

  6. The Smoothed Dirichlet Distribution: Understanding Cross-Entropy Ranking in Information Retrieval

    DTIC Science & Technology

    2006-07-01

    Unigram language modeling is a successful probabilistic framework for Information Retrieval (IR) that uses...the Relevance model (RM), a state-of-the-art model for IR in the language modeling framework that uses the same cross-entropy as its ranking function...In addition, the SD-based classifier provides more flexibility than RM in modeling documents owing to a consistent generative framework.
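
    The cross-entropy ranking function mentioned in this record is standard in language-modeling IR and can be sketched directly. The sketch below uses a maximum-likelihood query model and Dirichlet-smoothed document models; it illustrates cross-entropy ranking in general, not the paper's Smoothed Dirichlet distribution, and the toy documents are invented.

```python
import math
from collections import Counter

def dirichlet_lm(doc_tokens, coll_counts, coll_len, mu=2000.0):
    """Dirichlet-smoothed unigram document model P(w|d)."""
    tf, dlen = Counter(doc_tokens), len(doc_tokens)
    def p(w):
        return (tf[w] + mu * coll_counts[w] / coll_len) / (dlen + mu)
    return p

def rank_by_cross_entropy(query_tokens, docs):
    """Rank documents by negative cross-entropy -H(q, d) between the
    maximum-likelihood query model and each smoothed document model;
    higher (less negative) scores rank first."""
    coll = [w for d in docs for w in d]
    coll_counts, coll_len = Counter(coll), len(coll)
    q_counts, q_len = Counter(query_tokens), len(query_tokens)
    scored = []
    for i, d in enumerate(docs):
        p_d = dirichlet_lm(d, coll_counts, coll_len)
        score = sum((c / q_len) * math.log(p_d(w))
                    for w, c in q_counts.items() if p_d(w) > 0)
        scored.append((score, i))
    return [i for _, i in sorted(scored, reverse=True)]
```

    With this ranking function, a document whose language model assigns higher probability to the query terms is scored above one that does not.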

  7. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  8. A modeling framework for characterizing near-road air pollutant concentration at community scales

    EPA Science Inventory

    In this study, we combine information from a transportation network, traffic emissions, and a dispersion model to develop a framework to inform exposure estimates for traffic-related air pollutants (TRAPs) with a high spatial resolution. A Research LINE source dispersion model (R-LIN...

  9. A Strategic Approach to Curriculum Design for Information Literacy in Teacher Education--Implementing an Information Literacy Conceptual Framework

    ERIC Educational Resources Information Center

    Klebansky, Anna; Fraser, Sharon P.

    2013-01-01

    This paper details a conceptual framework that situates curriculum design for information literacy and lifelong learning, through a cohesive developmental information literacy based model for learning, at the core of teacher education courses at UTAS. The implementation of the framework facilitates curriculum design that systematically,…

  10. Health level 7 development framework for medication administration.

    PubMed

    Kim, Hwa Sun; Cho, Hune

    2009-01-01

    We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture using the Health Level 7 Development Framework process, based on an object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings. A standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the Health Level 7 Development Framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. It should become a fundamental conceptual model through which healthcare professionals and nursing practitioners can understand international-standard methodology for modeling healthcare information systems.

  11. Public Acceptance and User Response to ATIS Products and Services: Modeling Framework and Data Requirements

    DOT National Transportation Integrated Search

    1993-12-01

    This report presents a comprehensive modeling framework for user responses to Advanced Traveler Information Systems (ATIS) services and identifies the data needs for the validation of such a framework. The authors present overviews of the framework b...

  12. Modelling Participatory Geographic Information System for Customary Land Conflict Resolution

    NASA Astrophysics Data System (ADS)

    Gyamera, E. A.; Arko-Adjei, A.; Duncan, E. E.; Kuma, J. S. Y.

    2017-11-01

    Since land contributes about 73 % of gross domestic product (GDP) in most countries, attention to land rights has increased tremendously around the globe. Conflicts over land have therefore become one of the major problems associated with land administration. However, the conventional mechanisms for land conflict resolution do not provide satisfactory results to disputants, due to various factors. This study sought to develop a framework for using Participatory Geographic Information System (PGIS) for customary land conflict resolution. The framework was modelled using the Unified Modelling Language (UML). The PGIS framework, called the butterfly model, consists of three units, namely a Social Unit (SU), a Technical Unit (TU) and a Decision Making Unit (DMU). The name butterfly model was adopted for the framework based on its features and properties. The framework is therefore recommended for land conflict resolution in customary areas.

  13. Field Markup Language: biological field representation in XML.

    PubMed

    Chang, David; Lovell, Nigel H; Dokos, Socrates

    2007-01-01

    With an ever increasing number of biological models available on the internet, a standardized modeling framework is required to allow information to be accessed or visualized. Based on the Physiome Modeling Framework, the Field Markup Language (FML) is being developed to describe and exchange field information for biological models. In this paper, we describe the basic features of FML, its supporting application framework and its ability to incorporate CellML models to construct tissue-scale biological models. As a typical application example, we present a spatially-heterogeneous cardiac pacemaker model which utilizes both FML and CellML to describe and solve the underlying equations of electrical activation and propagation.

  14. Toward improved calibration of watershed models: multisite many objective measures of information

    USDA-ARS?s Scientific Manuscript database

    This paper presents a computational framework for incorporation of disparate information from observed hydrologic responses at multiple locations into the calibration of watershed models. The framework consists of four components: (i) an a-priori characterization of system behavior; (ii) a formal an...

  15. NADM Conceptual Model 1.0 -- A Conceptual Model for Geologic Map Information

    USGS Publications Warehouse

    2004-01-01

    Executive Summary -- The NADM Data Model Design Team was established in 1999 by the North American Geologic Map Data Model Steering Committee (NADMSC) with the purpose of drafting a geologic map data model for consideration as a standard for developing interoperable geologic map-centered databases by state, provincial, and federal geological surveys. The model is designed to be a technology-neutral conceptual model that can form the basis for a web-based interchange format using evolving information technology (e.g., XML, RDF, OWL), and guide implementation of geoscience databases in a common conceptual framework. The intended purpose is to allow geologic information sharing between geologic map data providers and users, independent of local information system implementation. The model emphasizes geoscience concepts and relationships related to information presented on geologic maps. Design has been guided by an informal requirements analysis, documentation of existing databases, technology developments, and other standardization efforts in the geoscience and computer-science communities. A key aspect of the model is the notion that representation of the conceptual framework (ontology) that underlies geologic map data must be part of the model, because this framework changes with time and understanding, and varies between information providers. The top level of the model distinguishes geologic concepts, geologic representation concepts, and metadata. The geologic representation part of the model provides a framework for representing the ontology that underlies geologic map data through a controlled vocabulary, and for establishing the relationships between this vocabulary and a geologic map visualization or portrayal. Top-level geologic classes in the model are Earth material (substance), geologic unit (parts of the Earth), geologic age, geologic structure, fossil, geologic process, geologic relation, and geologic event.

  16. Health, Supportive Environments, and the Reasonable Person Model

    Treesearch

    Stephen Kaplan; Rachel Kaplan

    2003-01-01

    The Reasonable Person Model is a conceptual framework that links environmental factors with human behavior. People are more reasonable, cooperative, helpful, and satisfied when the environment supports their basic informational needs. The same environmental supports are important factors in enhancing human health. We use this framework to identify the informational...

  17. Design of a framework for modeling, integration and simulation of physiological models.

    PubMed

    Erson, E Zeynep; Cavuşoğlu, M Cenk

    2012-09-01

    Multiscale modeling and integration of physiological models carry challenges due to the complex nature of physiological processes. High coupling within and among scales presents a significant challenge in constructing and integrating multiscale physiological models. In order to deal with such challenges in a systematic way, there is a significant need for an information technology framework together with related analytical and computational tools that will facilitate integration of models and simulations of complex biological systems. Physiological Model Simulation, Integration and Modeling Framework (Phy-SIM) is an information technology framework providing the tools to facilitate development, integration and simulation of integrated models of human physiology. Phy-SIM brings software-level solutions to the challenges raised by the complex nature of physiological systems. The aim of Phy-SIM, and of this paper, is to lay a foundation for new approaches such as information flow and modular representation of physiological models. The ultimate goal is to enhance the development of both the models and the integration approaches for multiscale physiological processes, and thus this paper focuses on the design approaches that would achieve such a goal.

  18. A computational framework for modeling targets as complex adaptive systems

    NASA Astrophysics Data System (ADS)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or to changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and, more importantly, the various emergent behaviors displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real-world information. To overcome these challenges, we present a probabilistic reasoning network-based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well-defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, they provide unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets, with a focus on methodologies for quantifying NCO performance metrics.

  19. Modeling Nonlinear Change via Latent Change and Latent Acceleration Frameworks: Examining Velocity and Acceleration of Growth Trajectories

    ERIC Educational Resources Information Center

    Grimm, Kevin; Zhang, Zhiyong; Hamagami, Fumiaki; Mazzocco, Michele

    2013-01-01

    We propose the use of the latent change and latent acceleration frameworks for modeling nonlinear growth in structural equation models. Moving to these frameworks allows for the direct identification of "rates of change" and "acceleration" in latent growth curves--information available indirectly through traditional growth…

  20. Improve Biomedical Information Retrieval using Modified Learning to Rank Methods.

    PubMed

    Xu, Bo; Lin, Hongfei; Lin, Yuan; Ma, Yunlong; Yang, Liang; Wang, Jian; Yang, Zhihao

    2016-06-14

    In recent years, the number of biomedical articles has increased exponentially, making it difficult for biologists to capture all the needed information manually. Information retrieval technologies, as the core of search engines, can deal with the problem automatically, providing users with the needed information. However, it is a great challenge to apply these technologies directly to biomedical retrieval because of the abundance of domain-specific terminologies. To enhance biomedical retrieval, we propose a novel framework based on learning to rank. Learning to rank is a family of state-of-the-art information retrieval techniques that has proven effective in many information retrieval tasks. In the proposed framework, we attempt to tackle the problem of the abundance of terminologies by constructing ranking models, which focus not only on retrieving the most relevant documents but also on diversifying the search results to increase the completeness of the resulting list for a given query. In the model training, we propose two novel document labeling strategies, and combine several traditional retrieval models as learning features. We also investigate the usefulness of different learning to rank approaches in our framework. Experimental results on TREC Genomics datasets demonstrate the effectiveness of our framework for biomedical information retrieval.
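
    The idea of combining several traditional retrieval models as learning features can be sketched with a minimal pairwise ranker. Everything below (the feature layout, the perceptron-style update rule, and the data) is an illustrative assumption, not the authors' method:

```python
def train_pairwise_ranker(pairs, dim, lr=0.1, epochs=100):
    """Perceptron-style pairwise ranker: learn weights w so that, for each
    (relevant, non-relevant) feature pair from the same query, the relevant
    document's linear score w.f is the higher one."""
    w = [0.0] * dim
    for _ in range(epochs):
        for f_pos, f_neg in pairs:
            diff = [f_pos[j] - f_neg[j] for j in range(dim)]
            if sum(w[j] * diff[j] for j in range(dim)) <= 0:  # misranked pair
                for j in range(dim):
                    w[j] += lr * diff[j]
    return w

def score(w, features):
    """Final ranking score: a learned weighted combination of the features."""
    return sum(wj * fj for wj, fj in zip(w, features))
```

    In a setting like the one this record describes, each feature vector would hold scores from traditional retrieval models (e.g. BM25 and query-likelihood), and the training pairs would come from the document labeling strategies.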

  1. Computational Model for Ethnographically Informed Systems Design

    NASA Astrophysics Data System (ADS)

    Iqbal, Rahat; James, Anne; Shah, Nazaraf; Terken, Jacques

    This paper presents a computational model for ethnographically informed systems design that can support complex and distributed cooperative activities. The model is based on an ethnographic framework consisting of three important dimensions (distributed coordination, awareness of work, and plans and procedures) and on the BDI (Belief, Desire and Intention) model of intelligent agents. The ethnographic framework is used to conduct ethnographic analysis and to organise ethnographically driven information into the three dimensions, whereas the BDI model allows such information to be mapped onto the underlying concepts of multi-agent systems. The advantage of this model is that it is built upon an adaptation of existing, mature and well-understood techniques. By the use of this model, we also address the cognitive aspects of systems design.

  2. Information system modeling for biomedical imaging applications

    NASA Astrophysics Data System (ADS)

    Hoo, Kent S., Jr.; Wong, Stephen T. C.

    1999-07-01

    Information system modeling has historically been relegated to a low priority among the designers of information systems. Often, there is a rush to design and implement hardware and software solutions after only the briefest assessment of the domain requirements. Although this process results in a rapid development cycle, the system usually does not satisfy the needs of the users, and the developers are forced to re-program certain aspects of the system. It would be much better to create an accurate model of the system based on the domain needs so that the implementation of the solution satisfies the needs of the users immediately. It would also be advantageous to build extensibility into the model so that updates to the system could be carried out in an organized fashion. The significance of this research is the development of a new formal framework for the construction of a multimedia medical information system. This formal framework is constructed using visual modeling, which provides a way of thinking about problems using models organized around real-world ideas. These models provide an abstract way to view complex problems, making them easier to understand. The formal framework is the result of an object-oriented analysis and design process that translates the system's requirements and functionality into software models. The usefulness of this information framework is demonstrated with two different applications in epilepsy research and care: surgical planning for epilepsy and decision threshold determination.

  3. Multi-level multi-task learning for modeling cross-scale interactions in nested geospatial data

    USGS Publications Warehouse

    Yuan, Shuai; Zhou, Jiayu; Tan, Pang-Ning; Fergus, Emi; Wagner, Tyler; Sorrano, Patricia

    2017-01-01

    Predictive modeling of nested geospatial data is a challenging problem as the models must take into account potential interactions among variables defined at different spatial scales. These cross-scale interactions, as they are commonly known, are particularly important to understand relationships among ecological properties at macroscales. In this paper, we present a novel, multi-level multi-task learning framework for modeling nested geospatial data in the lake ecology domain. Specifically, we consider region-specific models to predict lake water quality from multi-scaled factors. Our framework enables distinct models to be developed for each region using both its local and regional information. The framework also allows information to be shared among the region-specific models through their common set of latent factors. Such information sharing helps to create more robust models especially for regions with limited or no training data. In addition, the framework can automatically determine cross-scale interactions between the regional variables and the local variables that are nested within them. Our experimental results show that the proposed framework outperforms all the baseline methods in at least 64% of the regions for 3 out of 4 lake water quality datasets evaluated in this study. Furthermore, the latent factors can be clustered to obtain a new set of regions that is more aligned with the response variables than the original regions that were defined a priori from the ecology domain.
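
    The information-sharing mechanism described in this record can be illustrated in a much-reduced form: give every region a model that is the sum of a shared component and a region-specific component, so that regions with little training data lean on the shared part. This is a sketch of the general idea only; the paper uses shared latent factors, and all data and hyperparameters below are hypothetical:

```python
def train_multitask(tasks, dim, lr=0.05, epochs=2000, lam=0.1):
    """Minimal multi-task sketch: region r predicts y = (w_shared + w_r) . x.
    The shared weights are updated by every region's data; the regularized
    local weights capture only region-specific deviations."""
    shared = [0.0] * dim
    local = {r: [0.0] * dim for r in tasks}
    for _ in range(epochs):
        for r, data in tasks.items():
            for x, y in data:
                pred = sum((shared[j] + local[r][j]) * x[j] for j in range(dim))
                err = pred - y
                for j in range(dim):
                    g = err * x[j]
                    shared[j] -= lr * g                       # shared across regions
                    local[r][j] -= lr * (g + lam * local[r][j])  # region-specific
    return shared, local

def predict(shared, local_r, x):
    return sum((shared[j] + local_r[j]) * x[j] for j in range(len(x)))
```

    A region with a single observation still inherits a usable model through the shared weights, which is the "borrowing strength" effect the abstract attributes to the common latent factors.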

  4. Theories, models and frameworks used in capacity building interventions relevant to public health: a systematic review.

    PubMed

    Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather

    2017-11-28

    There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify the underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice. The aim is to inform and improve the capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined; included papers focus on capacity building, learning plans or professional development plans in combination with tools, resources, processes, procedures, steps, models, frameworks or guidelines; are set in a public health or healthcare setting, or in non-government, government or community organizations as they relate to healthcare; and explicitly or implicitly mention a theory, model and/or framework that grounds the type of capacity building approach developed. Quality assessments were performed on all included articles. Data analysis included a process for synthesizing, analyzing and presenting descriptive summaries, and for categorizing theoretical foundations according to which theory, model and/or framework was used and whether it was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were identified as the most frequently cited. This review identifies specific theories, models and frameworks to support capacity building interventions relevant to public health organizations. It provides public health practitioners with a menu of potentially usable theories, models and frameworks to support capacity building efforts. The findings also support the need for the use of theories, models or frameworks to be intentional, explicitly identified and referenced, and for how they were applied to the capacity building intervention to be clearly outlined.

  5. Public Management Information Systems: Theory and Prescription.

    ERIC Educational Resources Information Center

    Bozeman, Barry; Bretschneider, Stuart

    1986-01-01

    The existing theoretical framework for research in management information systems (MIS) is criticized for its lack of attention to the external environment of organizations, and a new framework is developed which better accommodates MIS in public organizations: public management information systems. Four models of publicness that reflect external…

  6. Stochastic information transfer from cochlear implant electrodes to auditory nerve fibers

    NASA Astrophysics Data System (ADS)

    Gao, Xiao; Grayden, David B.; McDonnell, Mark D.

    2014-08-01

    Cochlear implants, also called bionic ears, are implanted neural prostheses that can restore lost human hearing function by direct electrical stimulation of auditory nerve fibers. Previously, an information-theoretic framework for numerically estimating the optimal number of electrodes in cochlear implants has been devised. This approach relies on a model of stochastic action potential generation and a discrete memoryless channel model of the interface between the array of electrodes and the auditory nerve fibers. Using these models, the stochastic information transfer from cochlear implant electrodes to auditory nerve fibers is estimated from the mutual information between channel inputs (the locations of electrodes) and channel outputs (the set of electrode-activated nerve fibers). Here we describe a revised model of the channel output in the framework that avoids the side effects caused by an "ambiguity state" in the original model and also makes fewer assumptions about perceptual processing in the brain. A detailed comparison of how different assumptions on fibers and current spread models impact on the information transfer in the original model and in the revised model is presented. We also mathematically derive an upper bound on the mutual information in the revised model, which becomes tighter as the number of electrodes increases. We found that the revised model leads to a significantly larger maximum mutual information and corresponding number of electrodes compared with the original model and conclude that the assumptions made in this part of the modeling framework are crucial to the model's overall utility.
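
    The mutual information between channel inputs and outputs that this framework estimates is straightforward to compute once a discrete memoryless channel matrix is given. In this sketch the input distribution and channel matrix are toy stand-ins, not a model of electrode-to-fiber physiology:

```python
import math

def mutual_information(p_x, channel):
    """I(X;Y) in bits for a discrete memoryless channel.
    p_x[x] is the input distribution (e.g. over stimulated electrode
    locations); channel[x][y] = P(Y = y | X = x) gives the probability of
    each output (e.g. each set of activated fiber groups)."""
    n_x, n_y = len(p_x), len(channel[0])
    # Marginal output distribution P(Y = y).
    p_y = [sum(p_x[x] * channel[x][y] for x in range(n_x)) for y in range(n_y)]
    mi = 0.0
    for x in range(n_x):
        for y in range(n_y):
            p_xy = p_x[x] * channel[x][y]
            if p_xy > 0.0:
                mi += p_xy * math.log2(p_xy / (p_x[x] * p_y[y]))
    return mi
```

    A noiseless channel with a uniform binary input yields exactly 1 bit, while a channel whose output is independent of the input yields 0 bits; in the framework above, the channel matrix would instead come from the stochastic action potential and current spread models.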

  7. Ecological Modelling of Individual and Contextual Influences: A Person-in-Environment Framework for Hypothetico-Deductive Information Behaviour Research

    ERIC Educational Resources Information Center

    Sin, Sei-Ching Joanna

    2015-01-01

    Introduction: This paper discusses the person-in-environment framework, which proposes the inclusion of environmental factors, alongside personal factors, as the explanatory factors of individual-level information behaviour and outcome. Method: The paper first introduces the principles and schematic formulas of the person-in-environment framework.…

  8. A unified structural/terminological interoperability framework based on LexEVS: application to TRANSFoRm.

    PubMed

    Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita

    2013-01-01

    Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration.

  9. Closed-Loop Lifecycle Management of Service and Product in the Internet of Things: Semantic Framework for Knowledge Integration.

    PubMed

    Yoo, Min-Jung; Grozel, Clément; Kiritsis, Dimitris

    2016-07-08

    This paper describes our conceptual framework of closed-loop lifecycle information sharing for product-service in the Internet of Things (IoT). The framework is based on the ontology model of product-service and a type of IoT message standard, Open Messaging Interface (O-MI) and Open Data Format (O-DF), which ensures data communication. (1) BACKGROUND: Based on an existing product lifecycle management (PLM) methodology, we enhanced the ontology model for the purpose of integrating efficiently the product-service ontology model that was newly developed; (2) METHODS: The IoT message transfer layer is vertically integrated into a semantic knowledge framework inside which a Semantic Info-Node Agent (SINA) uses the message format as a common protocol of product-service lifecycle data transfer; (3) RESULTS: The product-service ontology model facilitates information retrieval and knowledge extraction during the product lifecycle, while making more information available for the sake of service business creation. The vertical integration of IoT message transfer, encompassing all semantic layers, helps achieve a more flexible and modular approach to knowledge sharing in an IoT environment; (4) Contribution: A semantic data annotation applied to IoT can contribute to enhancing collected data types, which entails a richer knowledge extraction. The ontology-based PLM model enables as well the horizontal integration of heterogeneous PLM data while breaking traditional vertical information silos; (5) CONCLUSION: The framework was applied to a fictive case study with an electric car service for the purpose of demonstration. For the purpose of demonstrating the feasibility of the approach, the semantic model is implemented in Sesame APIs, which play the role of an Internet-connected Resource Description Framework (RDF) database.

  10. Closed-Loop Lifecycle Management of Service and Product in the Internet of Things: Semantic Framework for Knowledge Integration

    PubMed Central

    Yoo, Min-Jung; Grozel, Clément; Kiritsis, Dimitris

    2016-01-01

    This paper describes our conceptual framework of closed-loop lifecycle information sharing for product-service in the Internet of Things (IoT). The framework is based on the ontology model of product-service and a type of IoT message standard, Open Messaging Interface (O-MI) and Open Data Format (O-DF), which ensures data communication. (1) Background: Based on an existing product lifecycle management (PLM) methodology, we enhanced the ontology model for the purpose of integrating efficiently the product-service ontology model that was newly developed; (2) Methods: The IoT message transfer layer is vertically integrated into a semantic knowledge framework inside which a Semantic Info-Node Agent (SINA) uses the message format as a common protocol of product-service lifecycle data transfer; (3) Results: The product-service ontology model facilitates information retrieval and knowledge extraction during the product lifecycle, while making more information available for the sake of service business creation. The vertical integration of IoT message transfer, encompassing all semantic layers, helps achieve a more flexible and modular approach to knowledge sharing in an IoT environment; (4) Contribution: A semantic data annotation applied to IoT can contribute to enhancing collected data types, which entails a richer knowledge extraction. The ontology-based PLM model enables as well the horizontal integration of heterogeneous PLM data while breaking traditional vertical information silos; (5) Conclusion: The framework was applied to a fictive case study with an electric car service for the purpose of demonstration. For the purpose of demonstrating the feasibility of the approach, the semantic model is implemented in Sesame APIs, which play the role of an Internet-connected Resource Description Framework (RDF) database. PMID:27399717

  11. A unified structural/terminological interoperability framework based on LexEVS: application to TRANSFoRm

    PubMed Central

    Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita

    2013-01-01

    Objective Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. Materials and methods We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Results Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. Conclusions We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration. PMID:23571850

  12. Information of Complex Systems and Applications in Agent Based Modeling.

    PubMed

    Bao, Lei; Fritchman, Joseph C

    2018-04-18

    Information about a system's internal interactions is important to modeling the system's dynamics. This study examines the finer categories of the information definition and explores the features of a type of local information that describes the internal interactions of a system. Based on the results, a dual-space agent and information modeling framework (AIM) is developed by explicitly distinguishing an information space from the material space. The two spaces can evolve both independently and interactively. The dual-space framework can provide new analytic methods for agent based models (ABMs). Three examples are presented including money distribution, individual's economic evolution, and artificial stock market. The results are analyzed in the dual-space, which more clearly shows the interactions and evolutions within and between the information and material spaces. The outcomes demonstrate the wide-ranging applicability of using the dual-space AIMs to model and analyze a broad range of interactive and intelligent systems.

  13. FREIDA (framework of resources for modeling energy/environmental/economic impacts of development and advancements) in ports”: A portfolio of interactive information resources, and an illustrative energy sector analysis

    EPA Science Inventory

    This paper presents preliminary results from our ongoing work on the development of “FREIDA in Ports”: an interactive information resource and modeling framework for port communities, that may be used to enhance resilience to climate change and enable sustainable deve...

  14. Validation of an Information-Motivation-Behavioral Skills model of diabetes self-care (IMB-DSC).

    PubMed

    Osborn, Chandra Y; Egede, Leonard E

    2010-04-01

    Comprehensive behavior change frameworks are needed to provide guidance for the design, implementation, and evaluation of diabetes self-care programs in diverse populations. We applied the Information-Motivation-Behavioral Skills (IMB) model, a well-validated, comprehensive health behavior change framework, to diabetes self-care. Patients with diabetes were recruited from an outpatient clinic. Information gathered pertained to demographics, diabetes knowledge (information); diabetes fatalism (personal motivation); social support (social motivation); and diabetes self-care (behavior). Hemoglobin A1C values were extracted from the patient medical record. Structural equation models tested the IMB framework. More diabetes knowledge (r=0.22 p<0.05), less fatalistic attitudes (r=-0.20, p<0.05), and more social support (r=0.27, p<0.01) were independent, direct predictors of diabetes self-care behavior; and through behavior, were related to glycemic control (r=-0.20, p<0.05). Consistent with the IMB model, having more information (more diabetes knowledge), personal motivation (less fatalistic attitudes), and social motivation (more social support) was associated with behavior; and behavior was the sole predictor of glycemic control. The IMB model is an appropriate, comprehensive health behavior change framework for diabetes self-care. The findings indicate that in addition to knowledge, diabetes education programs should target personal and social motivation to effect behavior change. 2009 Elsevier Ireland Ltd. All rights reserved.

  15. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations.more » Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. 
This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes. - Highlights: • Proposed a physics–informed framework to quantify uncertainty in RANS simulations. • Framework incorporates physical prior knowledge and observation data. • Based on a rigorous Bayesian framework yet fully utilizes physical model. • Applicable for many complex physical systems beyond turbulent flows.« less

  16. Performance measurement integrated information framework in e-Manufacturing

    NASA Astrophysics Data System (ADS)

    Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José

    2014-11-01

    The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied through decision support in e-Manufacturing environment. Its application improves the necessary interoperability in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.

  17. Development of structured ICD-10 and its application to computer-assisted ICD coding.

    PubMed

    Imai, Takeshi; Kajino, Masayuki; Sato, Megumi; Ohe, Kazuhiko

    2010-01-01

    This paper presents: (1) a framework of formal representation of ICD10, which functions as a bridge between ontological information and natural language expressions; and (2) a methodology to use formally described ICD10 for computer-assisted ICD coding. First, we analyzed and structurized the meanings of categories in 15 chapters of ICD10. Then we expanded the structured ICD10 (S-ICD10) by adding subordinate concepts and labels derived from Japanese Standard Disease Names. The information model to describe formal representation was refined repeatedly. The resultant model includes 74 types of semantic links. We also developed an ICD coding module based on S-ICD10 and a 'Coding Principle,' which achieved high accuracy (>70%) for four chapters. These results not only demonstrate the basic feasibility of our coding framework but might also inform the development of the information model for formal description framework in the ICD11 revision.

  18. Predictive Models and Computational Embryology

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  19. Predictive Models and Computational Toxicology (II IBAMTOX)

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  20. Business model framework applications in health care: A systematic review.

    PubMed

    Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl

    2017-11-01

    It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.

  1. Systematizing Web Search through a Meta-Cognitive, Systems-Based, Information Structuring Model (McSIS)

    ERIC Educational Resources Information Center

    Abuhamdieh, Ayman H.; Harder, Joseph T.

    2015-01-01

    This paper proposes a meta-cognitive, systems-based, information structuring model (McSIS) to systematize online information search behavior based on literature review of information-seeking models. The General Systems Theory's (GST) prepositions serve as its framework. Factors influencing information-seekers, such as the individual learning…

  2. A Response to the Review of the Community of Inquiry Framework

    ERIC Educational Resources Information Center

    Akyol, Zehra; Arbaugh, J. Ben; Cleveland-Innes, Marti; Garrison, D. Randy; Ice, Phil; Richardson, Jennifer C.; Swan, Karen

    2009-01-01

    The Community of Inquiry (CoI) framework has become a prominent model of teaching and learning in online and blended learning environments. Considerable research has been conducted which employs the framework with promising results, resulting in wide use to inform the practice of online and blended teaching and learning. For the CoI model to…

  3. Refreshing Information Literacy: Learning from Recent British Information Literacy Models

    ERIC Educational Resources Information Center

    Martin, Justine

    2013-01-01

    Models play an important role in helping practitioners implement and promote information literacy. Over time models can lose relevance with the advances in technology, society, and learning theory. Practitioners and scholars often call for adaptations or transformations of these frameworks to articulate the learning needs in information literacy…

  4. The Perceptions of U.S.-Based IT Security Professionals about the Effectiveness of IT Security Frameworks: A Quantitative Study

    ERIC Educational Resources Information Center

    Warfield, Douglas L.

    2011-01-01

    The evolution of information technology has included new methodologies that use information technology to control and manage various industries and government activities. Information Technology has also evolved as its own industry with global networks of interconnectivity, such as the Internet, and frameworks, models, and methodologies to control…

  5. A Framework of Multi Objectives Negotiation for Dynamic Supply Chain Model

    NASA Astrophysics Data System (ADS)

    Chai, Jia Yee; Sakaguchi, Tatsuhiko; Shirase, Keiichi

    Trends of globalization and advances in Information Technology (IT) have created opportunity in collaborative manufacturing across national borders. A dynamic supply chain utilizes these advances to enable more flexibility in business cooperation. This research proposes a concurrent decision making framework for a three echelons dynamic supply chain model. The dynamic supply chain is formed by autonomous negotiation among agents based on multi agents approach. Instead of generating negotiation aspects (such as amount, price and due date) arbitrary, this framework proposes to utilize the information available at operational level of an organization in order to generate realistic negotiation aspect. The effectiveness of the proposed model is demonstrated by various case studies.

  6. Comprehensive process model of clinical information interaction in primary care: results of a "best-fit" framework synthesis.

    PubMed

    Veinot, Tiffany C; Senteio, Charles R; Hanauer, David; Lowery, Julie C

    2018-06-01

    To describe a new, comprehensive process model of clinical information interaction in primary care (Clinical Information Interaction Model, or CIIM) based on a systematic synthesis of published research. We used the "best fit" framework synthesis approach. Searches were performed in PubMed, Embase, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO, Library and Information Science Abstracts, Library, Information Science and Technology Abstracts, and Engineering Village. Two authors reviewed articles according to inclusion and exclusion criteria. Data abstraction and content analysis of 443 published papers were used to create a model in which every element was supported by empirical research. The CIIM documents how primary care clinicians interact with information as they make point-of-care clinical decisions. The model highlights 3 major process components: (1) context, (2) activity (usual and contingent), and (3) influence. Usual activities include information processing, source-user interaction, information evaluation, selection of information, information use, clinical reasoning, and clinical decisions. Clinician characteristics, patient behaviors, and other professionals influence the process. The CIIM depicts the complete process of information interaction, enabling a grasp of relationships previously difficult to discern. The CIIM suggests potentially helpful functionality for clinical decision support systems (CDSSs) to support primary care, including a greater focus on information processing and use. The CIIM also documents the role of influence in clinical information interaction; influencers may affect the success of CDSS implementations. The CIIM offers a new framework for achieving CDSS workflow integration and new directions for CDSS design that can support the work of diverse primary care clinicians.

  7. Research governance: implications for health library and information professionals.

    PubMed

    Sen, Barbara A

    2003-03-01

    The Research Governance Framework for Health and Social Care published by the Department of Health in 2001 provides a model of best practice and a framework for research in the health and social care sector. This article reviews the Department of Health Research Governance Framework, discusses the implications of research governance for library and information professionals undertaking research in the health- and social-care sector and recommends strategies for best practice within the information profession relating to research governance. The scope of the Framework document that covers both clinical and non-clinical research is outlined. Any research involving, amongst other issues, patients, NHS staff and use or access to NHS premises may require ethics committee approval. Particular reference is made to the roles, responsibilities and professional conduct and the systems needed to support effective research practice. Issues such as these combine to encourage the development of a quality research culture which supports best practice. Questions arise regarding the training and experience of researchers, and access to the necessary information and support. The use of the Framework to guide research practice complements the quality issues within the evidence-based practice movement and supports the ongoing development of a quality research culture. Recommendations are given in relation to the document's five domains of ethics, science, information, health and safety and finance and intellectual property. Practical recommendations are offered for incorporating research governance into research practice in ways which conform to the Framework's standards and which are particularly relevant for research practitioners in information science. Concluding comments support the use of the Research Governance Framework as a model for best practice.

  8. The Goddard Snow Radiance Assimilation Project: An Integrated Snow Radiance and Snow Physics Modeling Framework for Snow/cold Land Surface Modeling

    NASA Technical Reports Server (NTRS)

    Kim, E.; Tedesco, M.; Reichle, R.; Choudhury, B.; Peters-Lidard C.; Foster, J.; Hall, D.; Riggs, G.

    2006-01-01

    Microwave-based retrievals of snow parameters from satellite observations have a long heritage and have so far been generated primarily by regression-based empirical "inversion" methods based on snapshots in time. Direct assimilation of microwave radiance into physical land surface models can be used to avoid errors associated with such retrieval/inversion methods, instead utilizing more straightforward forward models and temporal information. This approach has been used for years for atmospheric parameters by the operational weather forecasting community with great success. Recent developments in forward radiative transfer modeling, physical land surface modeling, and land data assimilation are converging to allow the assembly of an integrated framework for snow/cold lands modeling and radiance assimilation. The objective of the Goddard snow radiance assimilation project is to develop such a framework and explore its capabilities. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. In fact, multiple models are available for each element enabling optimization to match the needs of a particular study. Together these form a modular and flexible framework for self-consistent, physically-based remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster. Capabilities for assimilation of snow retrieval products are already under development for LIS. We will describe plans to add radiance-based assimilation capabilities. Plans for validation activities using field measurements will also be discussed.

  9. Prediction and Informative Risk Factor Selection of Bone Diseases.

    PubMed

    Li, Hui; Li, Xiaoyi; Ramanathan, Murali; Zhang, Aidong

    2015-01-01

    With the booming of healthcare industry and the overwhelming amount of electronic health records (EHRs) shared by healthcare institutions and practitioners, we take advantage of EHR data to develop an effective disease risk management model that not only models the progression of the disease, but also predicts the risk of the disease for early disease control or prevention. Existing models for answering these questions usually fall into two categories: the expert knowledge based model or the handcrafted feature set based model. To fully utilize the whole EHR data, we will build a framework to construct an integrated representation of features from all available risk factors in the EHR data and use these integrated features to effectively predict osteoporosis and bone fractures. We will also develop a framework for informative risk factor selection of bone diseases. A pair of models for two contrast cohorts (e.g., diseased patients versus non-diseased patients) will be established to discriminate their characteristics and find the most informative risk factors. Several empirical results on a real bone disease data set show that the proposed framework can successfully predict bone diseases and select informative risk factors that are beneficial and useful to guide clinical decisions.

  10. RPD-based Hypothesis Reasoning for Cyber Situation Awareness

    NASA Astrophysics Data System (ADS)

    Yen, John; McNeese, Michael; Mullen, Tracy; Hall, David; Fan, Xiaocong; Liu, Peng

    Intelligence workers such as analysts, commanders, and soldiers often need a hypothesis reasoning framework to gain improved situation awareness of the highly dynamic cyber space. The development of such a framework requires the integration of interdisciplinary techniques, including supports for distributed cognition (human-in-the-loop hypothesis generation), supports for team collaboration (identification of information for hypothesis evaluation), and supports for resource-constrained information collection (hypotheses competing for information collection resources). We here describe a cognitively-inspired framework that is built upon Klein’s recognition-primed decision model and integrates the three components of Endsley’s situation awareness model. The framework naturally connects the logic world of tools for cyber situation awareness with the mental world of human analysts, enabling the perception, comprehension, and prediction of cyber situations for better prevention, survival, and response to cyber attacks by adapting missions at the operational, tactical, and strategic levels.

  11. Brain activity and cognition: a connection from thermodynamics and information theory.

    PubMed

    Collell, Guillem; Fauquet, Jordi

    2015-01-01

    The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point for our work is that thermodynamics provides a convenient framework for modeling brain activity, whereas cognition can be modeled in information-theoretical terms. In fact, several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory. Indeed, some well-known authors claim that the laws of thermodynamics are nothing but principles of information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework to connect physical brain models and cognitive models by means of the theoretical connections between information theory and thermodynamics. Ultimately, this article aims to provide further insight into the formal relationship between cognition and neural activity.
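
    One concrete bridge between the two formalisms discussed above is Landauer's principle, which prices information in thermodynamic currency: erasing one bit of information costs at least k_B·T·ln 2 of energy. A minimal illustration (the body-temperature figure of 310 K is our own assumption, not the authors'):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)

def landauer_limit_joules(temperature_kelvin, bits):
    """Minimum energy required to erase `bits` of information at temperature T:
    E >= k_B * T * ln(2) per bit (Landauer's principle)."""
    return K_B * temperature_kelvin * math.log(2) * bits

# Illustrative: erasing one bit at roughly human body temperature (~310 K)
e_bit = landauer_limit_joules(310.0, 1)
```

    The resulting bound (~3e-21 J per bit) is many orders of magnitude below the brain's actual energy use per synaptic event, which is one way the information/energy gap the authors point to can be made quantitative.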

  12. Model and Interoperability using Meta Data Annotations

    NASA Astrophysics Data System (ADS)

    David, O.

    2011-12-01

    Software frameworks and architectures need metadata to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information that is usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing metadata, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to metadata representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as annotations or attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to their originating code.
Since models and modeling components are not directly bound to the framework by specific APIs and/or data types, they can more easily be reused both within the framework and outside of it. While providing all of these capabilities, a significant reduction in the size of the model source code was achieved. To assess the benefit of annotations for a modeler, studies were conducted to evaluate the effectiveness of the annotation-based framework approach against other modeling frameworks and libraries; a framework-invasiveness study evaluated the effects of framework design on model code quality. A typical hydrological model was implemented across several modeling frameworks, and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. The use of annotations appears to positively impact several software quality measures. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks.
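
    The annotation idea can be illustrated in a language-neutral way. OMS3 itself uses Java annotations; the sketch below uses Python decorators as a stand-in, to show how a framework can wire components together from declarative metadata rather than API calls. All component names, units, and coefficients here are hypothetical.

```python
# Minimal sketch of annotation-driven wiring (a Python stand-in for OMS3's
# Java annotation approach; every name and number below is hypothetical).

def component(inputs, outputs):
    """Attach declarative metadata instead of requiring framework API calls."""
    def wrap(cls):
        cls.meta = {"inputs": inputs, "outputs": outputs}
        return cls
    return wrap

@component(inputs={"precip": "mm/day"}, outputs={"runoff": "mm/day"})
class SimpleRunoff:
    def execute(self, state):
        state["runoff"] = 0.3 * state["precip"]  # toy runoff coefficient

@component(inputs={"runoff": "mm/day"}, outputs={"streamflow": "m3/s"})
class Routing:
    def execute(self, state):
        state["streamflow"] = state["runoff"] * 0.011574  # toy unit conversion

def assemble_and_run(components, state):
    """The 'framework': order components so each one's inputs are available,
    using only the declared metadata (data flow analysis in miniature)."""
    pending = list(components)
    while pending:
        for c in pending:
            if all(k in state for k in c.meta["inputs"]):
                c().execute(state)
                pending.remove(c)
                break
        else:
            raise ValueError("unsatisfiable data flow")
    return state

# Components are listed in the 'wrong' order; the metadata sorts it out.
result = assemble_and_run([Routing, SimpleRunoff], {"precip": 10.0})
```

    The point of the sketch is the non-invasiveness: neither component imports or calls the framework, so each remains reusable as plain code outside of it.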

  13. Development of an "Alert Framework" Based on the Practices in the Medical Front.

    PubMed

    Sakata, Takuya; Araki, Kenji; Yamazaki, Tomoyoshi; Kawano, Koichi; Maeda, Minoru; Kushima, Muneo; Araki, Sanae

    2018-05-09

    At the University of Miyazaki Hospital (UMH), we have accumulated and semantically structured a vast amount of medical information since the activation of the electronic health record system approximately 10 years ago. With this medical information, we have decided to develop an alert system to aid in medical treatment. The purpose of this investigation is not only to integrate an alert framework into the electronic health record system, but also to formulate a modeling method for this knowledge. A trial alert framework was developed for staff in various occupational categories at the UMH. Based on the findings of subsequent interviews, a more detailed and upgraded alert framework was constructed, resulting in the final model. Based on our current findings, an alert framework was developed with four major items. Analysis of the medical practices from the trial model indicated that there are four major risk patterns that trigger an alert. Furthermore, the current alert framework contains detailed definitions that are easily substituted into the database, leading to easy implementation within the electronic health record system.

  14. [On-line processing mechanisms in text comprehension: a theoretical review on constructing situation models].

    PubMed

    Iseki, Ryuta

    2004-12-01

    This article reviewed research on the construction of situation models during reading. To position the variety of research appropriately within the overall process, a unitary framework was devised in terms of three theories of on-line processing: the resonance process, the event-indexing model, and constructionist theory. The resonance process was treated as the basic activation mechanism in the framework. The event-indexing model was regarded as a screening system that selects activated information and encodes it into situation models along situational dimensions. Constructionist theory was considered to have a supervisory role based on coherence and explanation. From the viewpoint of the unitary framework, some problems concerning each theory were examined and possible interpretations were given. Finally, it was pointed out that there has been little theoretical discussion of associative processing at the global level or of the encoding of text and inference information into long-term memory.

  15. A framework for scalable parameter estimation of gene circuit models using structural information.

    PubMed

    Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin

    2013-07-01

    Systematic and scalable parameter estimation is key to constructing complex gene regulatory models and, ultimately, to facilitating an integrative systems biology approach to quantitatively understanding the molecular mechanisms underpinning gene regulation. Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics, based on three synthetic data sets and one time-series microarray data set. We compared our framework with three state-of-the-art parameter estimation methods and found that our approach consistently generated higher-quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied to the modeling of gene circuits, our results suggest that more tailored approaches that exploit domain-specific information may be key to reverse engineering complex biological systems. Software is available at http://sfb.kaust.edu.sa/Pages/Software.aspx. Supplementary data are available at Bioinformatics online.
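
    The decomposition strategy can be sketched as follows: integrate each rate equation separately while holding the coupled species' trajectory fixed from the previous sweep, and iterate until the decoupled solutions agree. The two-gene circuit, rate constants, and forward-Euler scheme below are illustrative assumptions of ours, not the paper's models.

```python
def euler(f, x0, other_traj, dt, steps):
    """Integrate dx/dt = f(x, other) with the coupled variable's trajectory
    held fixed -- the decomposition step described in the abstract."""
    xs = [x0]
    for i in range(steps):
        xs.append(xs[-1] + dt * f(xs[-1], other_traj[i]))
    return xs

# Toy two-gene circuit with hypothetical linear rates:
def fx(x, y): return 1.0 - 0.5 * x + 0.1 * y   # production, decay, activation
def fy(y, x): return 0.8 * x - 0.4 * y          # X drives Y's production

dt, steps = 0.01, 500
xs = [0.0] * (steps + 1)
ys = [0.0] * (steps + 1)
for sweep in range(20):              # iterate until decoupled solutions agree
    xs_new = euler(fx, 0.0, ys, dt, steps)
    ys_new = euler(fy, 0.0, xs_new, dt, steps)
    drift = max(abs(a - b) for a, b in zip(ys, ys_new))
    xs, ys = xs_new, ys_new
    if drift < 1e-10:
        break
```

    Each sweep solves only one-dimensional problems, which is what makes the approach scale; the iteration converges because the coupling between the equations is contractive in this toy setting.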

  16. Probability judgments under ambiguity and conflict

    PubMed Central

    Smithson, Michael

    2015-01-01

    Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at “best” probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of “best” estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity. PMID:26042081
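
    A minimal sketch of the estimate-pooling idea (all numbers hypothetical): the linear opinion pool can return identical "best" estimates for sharply conflicting and merely ambiguous sources, which is precisely why separate models of perceived conflict are needed.

```python
def linear_pool(estimates, weights=None):
    """Weighted average of probability estimates (the linear opinion pool)."""
    if weights is None:
        weights = [1.0 / len(estimates)] * len(estimates)
    return sum(w * p for w, p in zip(weights, estimates))

def median_pool(estimates):
    """The median is more robust when sources conflict sharply."""
    s = sorted(estimates)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

# Hypothetical: two vague-but-agreeing sources vs. two sharply conflicting ones
ambiguous   = [0.45, 0.55]   # agree on roughly 0.5
conflicting = [0.10, 0.90]   # sharp disagreement, yet the same pooled mean
same_mean = abs(linear_pool(ambiguous) - linear_pool(conflicting)) < 1e-12
```

    Both pairs pool to 0.5, so the pooled point estimate alone carries no trace of the conflict; a subjective-assessment model of the kind the paper develops has to capture that separately.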

  18. Expertise, Task Complexity, and Artificial Intelligence: A Conceptual Framework.

    ERIC Educational Resources Information Center

    Buckland, Michael K.; Florian, Doris

    1991-01-01

    Examines the relationship between users' expertise, task complexity of information system use, and artificial intelligence to provide the basis for a conceptual framework for considering the role that artificial intelligence might play in information systems. Cognitive and conceptual models are discussed, and cost effectiveness is considered. (27…

  19. A data management infrastructure for bridge monitoring

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Byun, Jaewook; Kim, Daeyoung; Sohn, Hoon; Bae, In Hwan; Law, Kincho H.

    2015-04-01

    This paper discusses a data management infrastructure framework for bridge monitoring applications. As sensor technologies mature and become economically affordable, their deployment for bridge monitoring will continue to grow. Data management becomes a critical issue not only for storing the sensor data but also for integrating it with the bridge model to support other functions, such as management, maintenance and inspection. The focus of this study is on the effective management of bridge information and sensor data, which is crucial to structural health monitoring and life cycle management of bridge structures. We review the state of the art in bridge information modeling and sensor data management, and propose a data management framework for bridge monitoring based on NoSQL database technologies, which have been shown to be useful in handling high-volume time-series data and in dealing flexibly with unstructured data schemas. Specifically, Apache Cassandra and MongoDB are deployed for the prototype implementation of the framework. This paper describes the database design for an XML-based Bridge Information Modeling (BrIM) schema, and the representation of sensor data using Sensor Model Language (SensorML). The proposed prototype data management framework is validated using data collected from the Yeongjong Bridge in Incheon, Korea.
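
    The wide-row layout that makes a store like Cassandra suitable for sensor streams can be sketched with plain dictionaries: partition by (sensor, day) so that one day of one sensor's readings is a single contiguous read. This illustrates the data layout only, not the paper's implementation; all names are hypothetical.

```python
from collections import defaultdict

# Wide-row layout: partition key = (sensor_id, day), clustering key = timestamp.
# This mirrors how a column-family store keeps one sensor-day's readings
# together (plain-dict sketch only; names are hypothetical).
store = defaultdict(list)

def insert_reading(sensor_id, ts_epoch, value):
    day = ts_epoch // 86400                      # seconds-per-day bucketing
    store[(sensor_id, day)].append((ts_epoch, value))

def query_day(sensor_id, day):
    """All readings for one sensor on one day: a single-partition read."""
    return sorted(store[(sensor_id, day)])

insert_reading("accel-01", 86400 + 10, 0.12)
insert_reading("accel-01", 86400 + 5, 0.10)
insert_reading("accel-01", 2 * 86400 + 1, 0.20)
readings = query_day("accel-01", 1)
```

    Choosing the partition key to match the dominant query (per-sensor, per-period retrieval) is the central design decision the time-series pattern encodes.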

  20. Communities ready for takeoff: Integrating social assets for biofuel site-selection modeling.

    PubMed

    Rijkhoff, Sanne A M; Hoard, Season A; Gaffney, Michael J; Smith, Paul M

    2017-01-01

    Although much of the social science literature supports the importance of community assets for success in many policy areas, these assets are often overlooked when selecting communities for new infrastructure facilities. Extensive collaboration is crucial for the success of environmental and economic projects, yet it often is not adequately addressed when making siting decisions for new projects. This article develops a social asset framework that includes social, creative, and human capital to inform site-selection decisions. This framework is applied to the Northwest Advanced Renewables Alliance project to assess community suitability for biofuel-related developments. This framework is the first to take all necessary community assets into account, providing insight into successful site selection beyond current models. The framework not only serves as a model for future biorefinery projects but also guides tasks that depend on informed location selection for success.

  1. An evaluation framework for Health Information Systems: human, organization and technology-fit factors (HOT-fit).

    PubMed

    Yusof, Maryati Mohd; Kuljis, Jasna; Papazafeiropoulou, Anastasia; Stergioulas, Lampros K

    2008-06-01

    The realization of Health Information Systems (HIS) requires rigorous evaluation that addresses technology, human and organization issues. Our review indicates that current evaluation methods evaluate different aspects of HIS and that they can be improved upon. A new evaluation framework, human, organization and technology-fit (HOT-fit), was developed after conducting a critical appraisal of the findings of existing HIS evaluation studies. HOT-fit builds on previous models of IS evaluation, in particular the IS Success Model and the IT-Organization Fit Model. This paper introduces the new framework for HIS evaluation, which incorporates comprehensive dimensions and measures of HIS and provides a technological, human and organizational fit. The methods comprised a literature review of HIS and IS evaluation studies and pilot testing of the developed framework. The framework was used to evaluate a Fundus Imaging System (FIS) at a primary care organization in the UK. The case study was conducted through observation, interview and document analysis. The main findings show that having the right user attitude and skills base, together with good leadership, an IT-friendly environment and good communication, can have a positive influence on system adoption. The comprehensive, specific evaluation factors, dimensions and measures in the new framework (HOT-fit) are applicable in HIS evaluation. The use of such a framework is argued to be useful not only for comprehensive evaluation of the particular FIS under investigation, but potentially also for any Health Information System in general.

  2. A geographic data model for representing ground water systems.

    PubMed

    Strassberg, Gil; Maidment, David R; Jones, Norm L

    2007-01-01

    The Arc Hydro ground water data model is a geographic data model for representing spatial and temporal ground water information within a geographic information system (GIS). The data model is a standardized representation of ground water systems within a spatial database that provides a public domain template for GIS users to store, document, and analyze commonly used spatial and temporal ground water data sets. This paper describes the data model framework, a simplified version of the complete ground water data model that includes two-dimensional and three-dimensional (3D) object classes for representing aquifers, wells, and borehole data, and the 3D geospatial context in which these data exist. The framework data model also includes tabular objects for representing temporal information, such as water levels and water quality samples, that are related to spatial features.
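
    A much-simplified sketch of such object classes, loosely following the wells/aquifers/water-levels structure described above (all class names, identifiers, and coordinates here are hypothetical, not the actual Arc Hydro schema):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class WaterLevel:                 # tabular temporal object related to a well
    time: str                     # ISO date of the observation
    level_m: float                # water level, meters above datum

@dataclass
class Well:                       # 3D point feature (x, y, land-surface z)
    well_id: str
    location: Tuple[float, float, float]
    aquifer_code: str             # relates the well to its aquifer unit
    levels: List[WaterLevel] = field(default_factory=list)

@dataclass
class Aquifer:                    # areal/volumetric feature
    code: str
    name: str

aq = Aquifer("TRN", "Trinity")
w = Well("W-001", (571000.0, 3349000.0, 260.0), aq.code)
w.levels.append(WaterLevel("2006-03-01", 242.5))
```

    The relation between the spatial feature (the well) and its tabular temporal records (the water levels) is the pattern the framework data model standardizes within the geodatabase.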

  3. Cognitive Theory within the Framework of an Information Processing Model and Learning Hierarchy: Viable Alternative to the Bloom-Mager System.

    ERIC Educational Resources Information Center

    Stahl, Robert J.

    This review of the current status of the human information processing model presents the Stahl Perceptual Information Processing and Operations Model (SPInPrOM) as a model of how thinking, memory, and the processing of information take place within the individual learner. A related system, the Domain of Cognition, is presented as an alternative to…

  4. A climate robust integrated modelling framework for regional impact assessment of climate change

    NASA Astrophysics Data System (ADS)

    Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet

    2013-04-01

    Decision making towards climate proofing the water management of regional catchments can benefit greatly from the availability of a climate robust integrated modelling framework capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25x25 m2). Still, these models are not completely satisfactory to decision makers, because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and the development of groundwater-dependent terrestrial ecosystems. These uncertainties add to the uncertainties about the predictions of climate change itself. In order to create an integrated, climate robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at the different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e., they exchange information on a time-step basis). Thus, changes in meteorology and CO2 concentrations affect crop growth, and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision making process in the 267 km2 Baakse Beek-Veengoot catchment in the east of the Netherlands.
Computations were performed for regionalized 30-year climate change scenarios developed by KNMI for precipitation and reference evapotranspiration according to Penman-Monteith. A special focus in the project was on the role of uncertainty. How valid is the information that is generated by this modelling framework? What are the most important uncertainties in the input data, how do they affect the results of the model chain, and how can the uncertainties of the data, results, and model concepts be quantified and communicated? Besides these technical issues, an important part of the study was devoted to the perception of stakeholders. Stakeholder analysis and additional working sessions yielded insight into how the models, their results and the uncertainties are perceived, how the modelling framework and results connect to the stakeholders' information demands, and what kind of additional information is needed for adequate decision support.
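
    The online coupling described above can be caricatured in a few lines: each time step, crop growth sees soil moisture, the vadose zone sees crop water use, and groundwater sees recharge. The toy equations below are stand-ins for WOFOST/MetaSWAP/MODFLOW with entirely hypothetical coefficients, meant only to show the per-step exchange of state.

```python
# Schematic online coupling on a daily time step. Every number is hypothetical.

def crop_step(lai, soil_moisture):
    """Toy WOFOST stand-in: wetter soil -> faster leaf-area growth."""
    return min(lai + 0.02 * soil_moisture, 6.0)

def vadose_step(soil_moisture, precip, lai):
    """Toy MetaSWAP stand-in: more leaf area -> more transpiration."""
    transpiration = 0.05 * lai
    return max(soil_moisture + precip - transpiration - 0.1, 0.0)

def groundwater_step(head, recharge):
    """Toy MODFLOW stand-in: storage gain minus a small discharge term."""
    return head + 0.5 * recharge - 0.01 * head

lai, sm, head = 0.5, 1.0, 10.0
for day in range(365):
    lai = crop_step(lai, sm)                     # crop sees soil moisture
    sm_new = vadose_step(sm, precip=0.3, lai=lai)  # vadose sees crop water use
    recharge = max(sm_new - 1.0, 0.0) * 0.2        # surplus percolates down
    sm = min(sm_new, 1.0)
    head = groundwater_step(head, recharge)      # groundwater sees recharge
```

    Because the three stand-ins exchange state every step, growing vegetation gradually suppresses recharge later in the run, which is exactly the kind of feedback an offline (one-way) chain would miss.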

  5. Generalized Information Theory Meets Human Cognition: Introducing a Unified Framework to Model Uncertainty and Information Search.

    PubMed

    Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya

    2018-06-17

    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
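
    The unifying formalism is compact enough to state: the Sharma-Mittal entropy of a distribution p, with order r and degree t, recovers Tsallis entropy at t = r, Rényi entropy as t → 1, and Shannon entropy as both parameters approach 1. A small numerical check of these special cases (our own illustration, in natural-log units):

```python
import math

def sharma_mittal(p, r, t):
    """Sharma-Mittal entropy of distribution p with order r and degree t
    (r, t != 1). Special cases arise in the limits: t -> 1 gives Renyi,
    t = r gives Tsallis, and r, t -> 1 gives Shannon entropy."""
    s = sum(pi ** r for pi in p)
    return (s ** ((1 - t) / (1 - r)) - 1) / (1 - t)

def shannon(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
tsallis_2 = sharma_mittal(p, r=2, t=2)           # = (1 - sum p^2) / (2 - 1)
renyi_2   = sharma_mittal(p, r=2, t=1 + 1e-9)    # ~ -log(sum p^2)
near_shannon = sharma_mittal(p, r=1 + 1e-6, t=1 + 1e-6)
```

    Sweeping (r, t) thus moves smoothly between the entropy models the paper compares, which is what makes a single behavioral comparison across all of them possible.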

  6. Non-extensivity vs. informative moments for financial models — A unifying framework and empirical results

    NASA Astrophysics Data System (ADS)

    Herrmann, K.

    2009-11-01

    Information-theoretic approaches still play a minor role in financial market analysis. Nonetheless, there have been two very similar approaches evolving during the last years, one in the so-called econophysics and the other in econometrics. Both generalize the notion of GARCH processes in an information-theoretic sense and are able to capture kurtosis better than traditional models. In this article we present both approaches in a more general framework. The latter allows the derivation of a wide range of new models. We choose a third model using an entropy measure suggested by Kapur. In an application to financial market data, we find that all considered models - with similar flexibility in terms of skewness and kurtosis - lead to very similar results.

  7. Information Literacy for Archives and Special Collections: Defining Outcomes

    ERIC Educational Resources Information Center

    Carini, Peter

    2016-01-01

    This article provides the framework for a set of standards and outcomes that would constitute information literacy with primary sources. Based on a working model used at Dartmouth College's Rauner Special Collections Library in Hanover, New Hampshire, these concepts create a framework for teaching with primary source materials intended to produce…

  8. A security framework for nationwide health information exchange based on telehealth strategy.

    PubMed

    Zaidan, B B; Haiqi, Ahmed; Zaidan, A A; Abdulnabi, Mohamed; Kiah, M L Mat; Muzamel, Hussaen

    2015-05-01

    This study focuses on the situation of health information exchange (HIE) in the context of a nationwide network. It aims to create a security framework that can be implemented to ensure the safe transmission of health information across the boundaries of care providers in Malaysia and other countries. First, a critique of the major elements of nationwide health information networks is presented from the perspective of security, along with such topics as the importance of HIE, issues, and main approaches. Second, a systematic evaluation is conducted on the security solutions that can be utilized in the proposed nationwide network. Finally, a secure framework for health information transmission is proposed within a central cloud-based model, which is compatible with the Malaysian telehealth strategy. The outcome of this analysis indicates that a complete security framework for a global structure of HIE is yet to be defined and implemented. Our proposed framework represents such an endeavor and suggests specific techniques to achieve this goal.
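
    One building block any such framework needs is integrity and authenticity for records crossing provider boundaries. As an illustration only (a real deployment would rely on the telehealth strategy's key management and transport security; the key and record below are obviously hypothetical), an HMAC envelope lets the receiving provider detect tampering:

```python
import hashlib
import hmac
import json

SHARED_KEY = b"demo-key-not-for-production"   # hypothetical pre-shared key

def sign_record(record: dict) -> dict:
    """Attach an HMAC-SHA256 tag so the receiver can verify the record
    was not altered in transit (integrity + authenticity)."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": record, "tag": tag}

def verify_record(envelope: dict) -> bool:
    payload = json.dumps(envelope["payload"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["tag"])

env = sign_record({"patient_id": "P123", "test": "HbA1c", "value": 6.1})
ok = verify_record(env)
env["payload"]["value"] = 9.9            # simulated tampering in transit
tampered_ok = verify_record(env)
```

    Note the constant-time comparison (`hmac.compare_digest`), a standard precaution against timing attacks; confidentiality would additionally require encryption, which this sketch deliberately omits.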

  9. The Predictive Relationship among the Community of Inquiry Framework, Perceived Learning and Online, and Graduate Students' Course Grades in Online Synchronous and Asynchronous Courses

    ERIC Educational Resources Information Center

    Rockinson-Szapkiw, Amanda J.; Wendt, Jillian; Wighting, Mervyn; Nisbet, Deanna

    2016-01-01

    The Community of Inquiry framework has been widely supported by research to provide a model of online learning that informs the design and implementation of distance learning courses. However, the relationship between elements of the CoI framework and perceived learning warrants further examination as a predictive model for online graduate student…

  10. An entropic framework for modeling economies

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel; Golan, Amos

    2014-08-01

    We develop an information-theoretic framework for economic modeling. This framework is based on principles of entropic inference that are designed for reasoning on the basis of incomplete information. We take the point of view of an external observer who has access to limited information about broad macroscopic economic features. We view this framework as complementary to more traditional methods. The economy is modeled as a collection of agents about whom we make no assumptions of rationality (in the sense of maximizing utility or profit). States of statistical equilibrium are introduced as those macrostates that maximize entropy subject to the relevant information codified into constraints. The basic assumption is that this information refers to supply and demand and is expressed in the form of the expected values of certain quantities (such as inputs, resources, goods, production functions, utility functions and budgets). The notion of economic entropy is introduced. It provides a measure of the uniformity of the distribution of goods and resources. It captures both the welfare state of the economy as well as the characteristics of the market (say, monopolistic, concentrated or competitive). Prices, which turn out to be the Lagrange multipliers, are endogenously generated by the economy. Further studies include the equilibrium between two economies and the conditions for stability. As an example, the case of the nonlinear economy that arises from linear production and utility functions is treated in some detail.
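
    The core computation, maximizing entropy subject to an expected-value constraint, can be sketched directly: the solution is a Gibbs distribution whose Lagrange multiplier is found numerically, playing the role the article assigns to prices. The goods and target mean below are hypothetical.

```python
import math

goods = [0, 1, 2, 3, 4]        # hypothetical quantities of a good per agent

def gibbs(lmbda):
    """Maximum-entropy distribution subject to a mean constraint:
    p_i proportional to exp(-lambda * g_i)."""
    w = [math.exp(-lmbda * g) for g in goods]
    z = sum(w)                  # partition function
    return [wi / z for wi in w]

def mean(p):
    return sum(pi * g for pi, g in zip(p, goods))

def solve_multiplier(target_mean, lo=-50.0, hi=50.0):
    """Bisect for the Lagrange multiplier (the 'price' in the text's sense);
    the constrained mean decreases monotonically as lambda increases."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean(gibbs(mid)) > target_mean:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

lam = solve_multiplier(1.0)     # demand constraint: average holding = 1.0
p = gibbs(lam)
```

    When the constraint happens to match the unconstrained (uniform) mean, the multiplier comes out zero, i.e. the constraint is not binding; a binding constraint yields a nonzero multiplier, the analogue of a nonzero price.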

  11. Multitask TSK fuzzy system modeling by mining intertask common hidden structure.

    PubMed

    Jiang, Yizhang; Chung, Fu-Lai; Ishibuchi, Hisao; Deng, Zhaohong; Wang, Shitong

    2015-03-01

    Classical fuzzy system modeling methods implicitly assume data generated from a single task, which is not in accordance with many practical scenarios where data are acquired from the perspective of multiple tasks. Although one can build an individual fuzzy system model for each task, such an individual modeling approach suffers from poor generalization because it ignores the hidden intertask correlations. To circumvent this shortcoming, we consider a general framework that preserves the independent information of the different tasks while mining the hidden correlation information shared among all tasks in multitask fuzzy modeling. In this framework, a low-dimensional subspace (structure) is assumed to be shared among all tasks and hence to constitute the hidden correlation information among them. Under this framework, a multitask Takagi-Sugeno-Kang (TSK) fuzzy system model called MTCS-TSK-FS (TSK-FS for multiple tasks with common hidden structure), based on the classical L2-norm TSK fuzzy system, is proposed in this paper. The proposed model not only takes advantage of the independent sample information in the original space for each task, but also effectively uses the intertask common hidden structure to enhance the generalization performance of the built fuzzy systems. Experiments on synthetic and real-world datasets demonstrate the applicability and distinctive performance of the proposed multitask fuzzy system model in multitask regression learning scenarios.

  12. On domain modelling of the service system with its application to enterprise information systems

    NASA Astrophysics Data System (ADS)

    Wang, J. W.; Wang, H. F.; Ding, J. L.; Furuta, K.; Kanno, T.; Ip, W. H.; Zhang, W. J.

    2016-01-01

    Information systems are a kind of service system, and they permeate every element of a modern industrial and business system, much like blood in the body. The types of information systems are heterogeneous because of the extreme uncertainty in changes to modern industrial and business systems. To effectively manage information systems, modelling of the work domain (or domain) of information systems is necessary. In this paper, a domain modelling framework for the service system is proposed and its application to the enterprise information system is outlined. The framework is defined through the application of a general domain modelling tool called function-context-behaviour-principle-state-structure (FCBPSS). The FCBPSS is based on a set of core concepts, namely function, context, behaviour, principle, state and structure, together with system decomposition. Unlike many other applications of FCBPSS in systems engineering, here the FCBPSS is applied to both infrastructure and substance systems, which is novel and effective for modelling service systems, including enterprise information systems. It is to be noted that domain modelling of systems (e.g. enterprise information systems) is key to integrating heterogeneous systems and to coping with unanticipated situations that such systems face.

  13. Integral Nursing: An Emerging Framework for Engaging the Evolution of the Profession.

    ERIC Educational Resources Information Center

    Fiandt, Kathryn; Forman, John; Megel, Mary Erickson; Pakieser, Ruth A.; Burge, Stephanie

    2003-01-01

    Proposes the Integral Nursing framework, which combines Wilber's All-Quadrant/All-Level model, a heuristic device to organize human experience, and the Spiral Dynamics model of human development organized around value memes or cultural units of information. Includes commentary by Beth L. Rodgers. (Contains 17 references.) (JOW)

  14. A Framework for Effective Assessment of Model-based Projections of Biodiversity to Inform the Next Generation of Global Conservation Targets

    NASA Astrophysics Data System (ADS)

    Myers, B.; Beard, T. D.; Weiskopf, S. R.; Jackson, S. T.; Tittensor, D.; Harfoot, M.; Senay, G. B.; Casey, K.; Lenton, T. M.; Leidner, A. K.; Ruane, A. C.; Ferrier, S.; Serbin, S.; Matsuda, H.; Shiklomanov, A. N.; Rosa, I.

    2017-12-01

    Biodiversity and ecosystem services underpin political targets for the conservation of biodiversity; however, previous incarnations of these biodiversity-related targets have not relied on integrated model-based projections of possible outcomes under climate and land use change. Although a few global biodiversity models are available, most biodiversity models lie along a continuum in the geography and the components of biodiversity they cover. Model-based projections of the future of global biodiversity are critical to support policymakers in the development of informed global conservation targets, but the scientific community lacks a clear strategy for integrating diverse data streams in developing, and evaluating the performance of, such biodiversity models. Therefore, in this paper, we propose a framework for ongoing testing and refinement of model-based projections of biodiversity trends and change, by linking a broad variety of biodiversity models with data streams generated by advances in remote sensing, coupled with new and emerging in-situ observation technologies, to inform development of essential biodiversity variables, future global biodiversity targets, and indicators. Our two main objectives are to (1) develop a framework for testing models and refining projections of a broad range of biodiversity models, focusing on global models, through the integration of diverse data streams and (2) identify the realistic outputs that can be developed and determine coupled approaches using remote sensing and new and emerging in-situ observations (e.g., metagenomics) to better inform the next generation of global biodiversity targets.

  15. Sequentially Executed Model Evaluation Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. It has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is analyzed for anomalies.
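
    A minimal sketch of the driver/controller pattern described above, with a toy rolling-mean anomaly model standing in for a real sequential model. This is our illustration of the pattern, not the SeMe API or the CANARY-EDS algorithm:

```python
from collections import deque

class InputDriver:
    """Generic input driver: yields one sample per step, None at end."""
    def __init__(self, series):
        self._it = iter(series)
    def read(self):
        return next(self._it, None)

class RollingMeanModel:
    """Toy sequential model: flags a point as anomalous when it deviates
    from the rolling mean of prior data (illustration only)."""
    def __init__(self, window=3, thresh=2.0):
        self.buf = deque(maxlen=window)
        self.thresh = thresh
    def step(self, x):
        anomalous = bool(self.buf) and abs(x - sum(self.buf) / len(self.buf)) > self.thresh
        self.buf.append(x)
        return anomalous

class OutputDriver:
    """Generic output driver: records (step, flag) events."""
    def __init__(self):
        self.events = []
    def write(self, t, flag):
        self.events.append((t, flag))

def batch_controller(inp, model, out):
    """Batch controller: steps model and I/O through the discrete domain."""
    t = 0
    while (x := inp.read()) is not None:
        out.write(t, model.step(x))
        t += 1
    return out.events
```

    Swapping in a different input driver (e.g., a live sensor feed) or model requires no change to the controller, which is the point of the message-passing separation the record describes.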

  16. Brain activity and cognition: a connection from thermodynamics and information theory

    PubMed Central

    Collell, Guillem; Fauquet, Jordi

    2015-01-01

    The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point for our work is noticing that thermodynamics provides a convenient framework to model brain activity, whereas cognition can be modeled in information-theoretical terms. In fact, several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory. Indeed, some well-known authors claim that the laws of thermodynamics are nothing but principles of information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework to connect physical brain and cognitive models by means of the theoretical connections between information theory and thermodynamics. Ultimately, this article aims at providing further insight into the formal relationship between cognition and neural activity. PMID:26136709
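
    One concrete, well-established bridge of the kind this literature builds on is Landauer's principle, which puts a thermodynamic price on erasing information. A minimal sketch (the function name is ours):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_bound(temp_kelvin, bits=1):
    """Minimum energy (joules) required to erase `bits` of information
    at the given temperature, per Landauer's principle: E >= k_B T ln 2
    per bit."""
    return bits * K_B * temp_kelvin * math.log(2)
```

    At body temperature (about 310 K) the bound is on the order of 3e-21 J per bit, many orders of magnitude below the brain's actual energy use per operation, which is one reason a formal information-energy account of neural activity remains open.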

  17. Enterprise application architecture development based on DoDAF and TOGAF

    NASA Astrophysics Data System (ADS)

    Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng

    2017-05-01

    For the purpose of supporting the design and analysis of enterprise application architecture, here, we report a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating the metadata model based on architecture content framework (ACF), DoDAF metamodel (DM2) and Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defence Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationship among metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed by the paper.

  18. A post-Bertalanffy Systemics Healthcare Competitive Framework Proposal.

    PubMed

    Fiorini, Rodolfo A; Santacroce, Giulia F

    2014-01-01

    The health information community can take advantage of a new evolutive categorization cybernetic framework. A systemic concept of the principles organizing nature is proposed. It can be used as a multiscaling reference framework to develop successful, competitive antifragile systems and new information management strategies for advanced healthcare organizations (HOs) and high-reliability organizations (HROs). The expected impacts are varied and operate at different system scale levels; the major one is that, for the first time, biomedical engineering's ideal system categorization levels can be matched exactly to practical system modeling interaction styles, with no paradigmatic operational ambiguity and no information loss.

  19. A Security Audit Framework to Manage Information System Security

    NASA Astrophysics Data System (ADS)

    Pereira, Teresa; Santos, Henrique

    The widespread adoption of information and communication technology has increased organizations' dependency on the performance of their Information Systems. As a result, organizations must establish adequate security procedures to properly manage information security, in order to protect their valued or critical resources from accidental or intentional attacks and to ensure their normal activity. A conceptual security framework to manage and audit Information System Security is proposed and discussed. The proposed framework intends to assist organizations, first, in understanding precisely which assets they need to protect and what their weaknesses (vulnerabilities) are, enabling adequate security management; and second, by providing a security audit framework that supports the organization in assessing the efficiency of the controls and policies adopted to prevent or mitigate the attacks, threats and vulnerabilities, promoted by the advances of new technologies and new Internet-enabled services, to which organizations are exposed. The presented framework is based on a conceptual model approach containing the semantic description of the concepts defined in the information security domain, based on the ISO/IEC JTC1 standards.

  20. A framework for improving the quality of health information on the world-wide-web and bettering public (e-)health: the MedCERTAIN approach.

    PubMed

    Eysenbach, G; Köhler, C; Yihune, G; Lampe, K; Cross, P; Brickley, D

    2001-01-01

    There has been considerable debate about the variable quality of health information on the world-wide-web and its impact on public health. While central authorities to regulate, control, censor, or centrally approve information providers or websites are neither realistic nor desirable, public health professionals are interested in making systems available that direct patient streams to the best available information sources. National governments and medical societies have also recognized their responsibility to help users identify "good quality" information sources. But what constitutes good quality, and how can such a system be implemented in a decentralized and democratic manner? This paper presents a model which combines aspects of consumer education, encouragement of best practices among information providers, self-labeling and external evaluation. The model is currently being implemented and evaluated in the MedCERTAIN project, funded by the European Union under the Action Plan for Safer Use of the Internet. The aim is to develop a technical and organisational infrastructure for a pilot system that allows consumers to access metainformation about websites and health information providers, including disclosure information from health providers and the opinions of external evaluators. The paper explains the general conceptual framework of the model and presents preliminary experiences, including results from an expert consensus meeting where the framework was discussed.

  1. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016
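
    The publish-then-integrate idea can be sketched in a few lines: a "black box" wrapper exposes any model behind a uniform service interface, and a toy workflow chains services by feeding each result to the next. The names (`ModelService`, `execute`, `chain`) and models are our own illustration, not the paper's API; a real deployment would expose `execute` over a Web service protocol:

```python
from typing import Any, Callable, Dict

class ModelService:
    """'Black box' wrapper: publishes any callable model behind a
    uniform request/response interface (illustrative sketch)."""
    def __init__(self, name: str, model: Callable[..., Any]):
        self.name, self._model = name, model
    def execute(self, request: Dict[str, Any]) -> Dict[str, Any]:
        return {"service": self.name, "result": self._model(**request)}

def chain(services, request):
    """Toy workflow integration: pipe each service's result into the next."""
    out = request
    for svc in services:
        out = {"value": svc.execute(out)["result"]}
    return out["value"]

# Hypothetical RS model followed by a hypothetical GIS classification model.
ndvi = ModelService("ndvi", lambda value: value * 2)
classify = ModelService("classify", lambda value: "water" if value > 3 else "land")
```

    The wrapper is what lets heterogeneous RS and GIS models interoperate: the workflow only ever sees the uniform `execute` interface.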

  3. Airborne electromagnetic mapping of the base of aquifer in areas of western Nebraska

    USGS Publications Warehouse

    Abraham, Jared D.; Cannia, James C.; Bedrosian, Paul A.; Johnson, Michaela R.; Ball, Lyndsay B.; Sibray, Steven S.

    2012-01-01

    Airborne geophysical surveys of selected areas of the North and South Platte River valleys of Nebraska, including Lodgepole Creek valley, collected data to map aquifers and bedrock topography and thus improve the understanding of groundwater and surface-water relationships used in water-management decisions. Frequency-domain helicopter electromagnetic surveys, using a unique survey flight-line design, collected resistivity data that can be related to lithologic information for refinement of groundwater model inputs. To make the geophysical data useful to multidimensional groundwater models, numerical inversion converted the measured data into a depth-dependent subsurface resistivity model. The inverted resistivity model, along with sensitivity analyses and test-hole information, is used to identify hydrogeologic features such as bedrock highs and paleochannels, and to improve estimates of groundwater storage. The two- and three-dimensional interpretations provide groundwater modelers and resource managers with high-resolution hydrogeologic frameworks and quantitative estimates of framework uncertainty. The new hydrogeologic frameworks improve understanding of flow-path orientation by refining the location of paleochannels and associated base-of-aquifer highs. The improved base-of-aquifer configuration represents the hydrogeology at a level of detail not achievable with previously available data.

  4. The Chain-Link Fence Model: A Framework for Creating Security Procedures

    ERIC Educational Resources Information Center

    Houghton, Robert F.

    2013-01-01

    A long standing problem in information technology security is how to help reduce the security footprint. Many specific proposals exist to address specific problems in information technology security. Most information technology solutions need to be repeatable throughout the course of an information systems lifecycle. The Chain-Link Fence Model is…

  5. Coupling a distributed hydrological model with detailed forest structural information for large-scale global change impact assessment

    NASA Astrophysics Data System (ADS)

    Eisner, Stephanie; Huang, Shaochun; Majasalmi, Titta; Bright, Ryan; Astrup, Rasmus; Beldring, Stein

    2017-04-01

    Forests are recognized for their decisive effect on landscape water balance, with structural forest characteristics such as stand density or species composition determining energy partitioning and dominant flow paths. However, spatial and temporal variability in forest structure is often poorly represented in hydrological modeling frameworks, in particular in regional to large-scale hydrological modeling and impact analysis. As a common practice, prescribed land cover classes (including different generic forest types) are linked to parameter values derived from the literature, or parameters are determined by calibration. While national forest inventory (NFI) data provide comprehensive, detailed information on hydrologically relevant forest characteristics, their potential to inform hydrological simulation over larger spatial domains is rarely exploited. In this study we present a modeling framework that couples the distributed hydrological model HBV with forest structural information derived from the Norwegian NFI and multi-source remote sensing data. The modeling framework, set up for the entirety of continental Norway at 1 km spatial resolution, is explicitly designed to study the combined and isolated impacts of climate change, forest management and land use change on hydrological fluxes. We use a forest classification system based on forest structure rather than biomes, which allows us to implicitly account for impacts of forest management on forest structural attributes. In the hydrological model, different forest classes are represented by three parameters: leaf area index (LAI), mean tree height and surface albedo. Seasonal cycles of LAI and surface albedo are dynamically simulated to make the framework applicable under climate change conditions. Based on a hindcast for the pilot regions Nord-Trøndelag and Sør-Trøndelag, we show how forest management has affected regional hydrological fluxes during the second half of the 20th century, contrasted with climate variability.
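
    The abstract does not specify how the seasonal LAI cycle is parameterized; a generic sinusoidal sketch between class-specific minimum and maximum values illustrates the idea (the function and its peak-day default are our own assumptions, not the paper's scheme):

```python
import math

def seasonal_lai(doy, lai_min, lai_max, peak_doy=196):
    """Sinusoidal seasonal LAI cycle for one forest class: oscillates
    between lai_min and lai_max over the year, peaking at peak_doy
    (mid-July by default; illustrative parameterization only)."""
    phase = math.cos(2 * math.pi * (doy - peak_doy) / 365.0)
    return lai_min + 0.5 * (lai_max - lai_min) * (1 + phase)
```

    A forest class would then be fully described for the hydrological model by its (LAI cycle, mean tree height, albedo cycle) triple, as the abstract outlines.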

  6. The HTA Core Model®-10 Years of Developing an International Framework to Share Multidimensional Value Assessment.

    PubMed

    Kristensen, Finn Børlum; Lampe, Kristian; Wild, Claudia; Cerbo, Marina; Goettsch, Wim; Becla, Lidia

    2017-02-01

    The HTA Core Model®, a science-based framework for assessing dimensions of value, was developed as part of the European network for Health Technology Assessment project in the period 2006 to 2008 to facilitate production and sharing of health technology assessment (HTA) information, such as evidence on efficacy and effectiveness and patient aspects, to inform decisions. It covers clinical value as well as organizational, economic, and patient aspects of technologies and has been field-tested in two consecutive joint actions in the period 2010 to 2016. A large number of HTA institutions were involved in the work. The model has undergone revisions and improvements after iterations of piloting and can be used in a local, national, or international context to produce structured HTA information that can be taken forward by users into their own frameworks to fit their specific needs when informing decisions on technology. The model has a broad scope and offers a common ground to various stakeholders through offering a standard structure and a transparent set of proposed HTA questions. It consists of three main components: 1) the HTA ontology, 2) methodological guidance, and 3) a common reporting structure. It covers domains such as effectiveness, safety, and economics, and also includes domains covering organizational, patient, social, and legal aspects. There is a full model and a focused rapid relative effectiveness assessment model, and a third joint action is to continue till 2020. The HTA Core Model is now available for everyone around the world as a framework for assessing value. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  7. A deliberative framework to identify the need for real-life evidence building of new cancer drugs after interim funding decision.

    PubMed

    Leung, Leanne; de Lemos, Mário L; Kovacic, Laurel

    2017-01-01

    Background: With the rising cost of new oncology treatments, it is no longer sustainable to base initial drug funding decisions primarily on prospective clinical trials, as their performance in real-life populations is often difficult to determine. In British Columbia, one approach to evidence building is to retrospectively analyse patient outcomes using observational research on an ad hoc basis. Methods: The deliberative framework was constructed in three stages: framework design, framework validation and treatment programme characterization, and key informant interviews. Framework design was informed through a literature review and analyses of provincial and national decision-making processes. Treatment programmes funded between 2010 and 2013 were used for framework validation. A selection concordance rate of 80% amongst three reviewers was considered to validate the framework. Key informant interviews were conducted to determine the utility of this deliberative framework. Results: A multi-domain deliberative framework with 15 assessment parameters was developed. A selection concordance rate of 84.2% was achieved for content validation of the framework. Nine treatment programmes from five different tumour groups were selected for retrospective outcomes analysis. Five contributory factors to funding uncertainties were identified. Key informants agreed that the framework is a comprehensive tool that targets the key areas involved in the funding decision-making process. Conclusions: The oncology-based deliberative framework can be routinely used to assess treatment programmes from the major tumour sites for retrospective outcomes analysis. Key informants indicate this is a value-added tool and will provide insight into the current prospective funding model.

  8. A framework for modelling the complexities of food and water security under globalisation

    NASA Astrophysics Data System (ADS)

    Dermody, Brian J.; Sivapalan, Murugesu; Stehfest, Elke; van Vuuren, Detlef P.; Wassen, Martin J.; Bierkens, Marc F. P.; Dekker, Stefan C.

    2018-01-01

    We present a new framework for modelling the complexities of food and water security under globalisation. The framework sets out a method to capture regional and sectoral interdependencies and cross-scale feedbacks within the global food system that contribute to emergent water use patterns. The framework integrates aspects of existing models and approaches in the fields of hydrology and integrated assessment modelling. The core of the framework is a multi-agent network of city agents connected by infrastructural trade networks. Agents receive socio-economic and environmental constraint information from integrated assessment models and hydrological models respectively and simulate complex, socio-environmental dynamics that operate within those constraints. The emergent changes in food and water resources are aggregated and fed back to the original models with minimal modification of the structure of those models. It is our conviction that the framework presented can form the basis for a new wave of decision tools that capture complex socio-environmental change within our globalised world. In doing so they will contribute to illuminating pathways towards a sustainable future for humans, ecosystems and the water they share.
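
    The core of the framework, city agents on infrastructural trade networks operating within external constraints, can be caricatured in a few lines. Every name and number here is illustrative, not from the paper:

```python
class CityAgent:
    """Toy city agent: consumes local supply and imports its deficit
    from trade partners' surpluses (structural illustration only)."""
    def __init__(self, name, demand, supply):
        self.name, self.demand, self.supply = name, demand, supply
        self.partners = []  # infrastructural trade network links

    def step(self):
        """One simulation step: compute deficit, then import what
        partners can spare. Returns an aggregate for feedback to the
        constraining models."""
        deficit = max(0, self.demand - self.supply)
        imported = 0
        for p in self.partners:
            surplus = max(0, p.supply - p.demand)
            imported += min(deficit - imported, surplus)
        return {"city": self.name, "deficit": deficit, "imported": imported}
```

    In the full framework, the `demand`/`supply` constraints would come from integrated assessment and hydrological models, and the aggregated step outputs would be fed back to those models, closing the cross-scale loop the abstract describes.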

  9. Validating and Modelling Technological Pedagogical Content Knowledge Framework among Asian Preservice Teachers

    ERIC Educational Resources Information Center

    Chai, Ching Shing; Ng, Eugenia M. W.; Li, Wenhao; Hong, Huang-Yao; Koh, Joyce H. L.

    2013-01-01

    The Technological Pedagogical Content Knowledge (TPCK) framework has been adopted by many educational technologists and teacher educators for the research and development of knowledge about the pedagogical uses of Information and Communication Technologies (ICT) in classrooms. While the framework is potentially very important, efforts to survey…

  10. A Graphics Design Framework to Visualize Multi-Dimensional Economic Datasets

    ERIC Educational Resources Information Center

    Chandramouli, Magesh; Narayanan, Badri; Bertoline, Gary R.

    2013-01-01

    This study implements a prototype graphics visualization framework to visualize multidimensional data. This graphics design framework serves as a "visual analytical database" for visualization and simulation of economic models. One of the primary goals of any kind of visualization is to extract useful information from colossal volumes of…

  11. Engaging Youth Through Spatial Socio-Technical Storytelling, Participatory GIS, Agent-Based Modeling, Online Geogames and Action Projects

    NASA Astrophysics Data System (ADS)

    Poplin, A.; Shenk, L.; Krejci, C.; Passe, U.

    2017-09-01

    The main goal of this paper is to present the conceptual framework for engaging youth in urban planning activities that simultaneously create locally meaningful positive change. The framework for engaging youth interlinks the use of IT tools such as geographic information systems (GIS), agent-based modelling (ABM), online serious games, and mobile participatory geographic information systems with map-based storytelling and action projects. We summarize the elements of our framework and the first results gained in the program Community Growers established in a neighbourhood community of Des Moines, the capital of Iowa, USA. We conclude the paper with a discussion and future research directions.

  12. Simultaneous Semi-Distributed Model Calibration Guided by ...

    EPA Pesticide Factsheets

    Modelling approaches to transfer hydrologically relevant information from locations with streamflow measurements to locations without such measurements continue to be an active field of research for hydrologists. The Pacific Northwest Hydrologic Landscapes (PNW HL) provide a solid conceptual classification framework based on our understanding of dominant processes. A Hydrologic Landscape code (a 5-letter descriptor based on physical and climatic properties) describes each assessment unit area, and these units average 60 km2 in area. The core function of these HL codes is to relate and transfer hydrologically meaningful information between watersheds without the need for streamflow time series. We present a novel approach based on the HL framework to answer the question "How can we calibrate models across separate watersheds simultaneously, guided by our understanding of dominant processes?" We should be able to apply the same parameterizations to assessment units with common HL codes if 1) the Hydrologic Landscapes contain hydrologic information transferable between watersheds at a sub-watershed scale and 2) we use a conceptual hydrologic model and parameters that reflect the hydrologic behavior of a watershed. This work specifically tests whether HL codes can be used to inform and share model parameters across watersheds in the Pacific Northwest. EPA's Western Ecology Division has published and is refining a framework for defining la

  13. The development of information processing biases in childhood anxiety: a review and exploration of its origins in parenting.

    PubMed

    Hadwin, Julie A; Garner, Matthew; Perez-Olivas, Gisela

    2006-11-01

    The aim of this paper is to explore parenting as one potential route through which information processing biases for threat develop in children. It reviews information processing biases in childhood anxiety in the context of theoretical models and empirical research in the adult anxiety literature. Specifically, it considers how adult models have been used and adapted to develop a theoretical framework with which to investigate information processing biases in children. The paper then considers research which specifically aims to understand the relationship between parenting and the development of information processing biases in children. It concludes that a clearer theoretical framework is required to understand the significance of information biases in childhood anxiety, as well as their origins in parenting.

  14. Incident Management in Academic Information System using ITIL Framework

    NASA Astrophysics Data System (ADS)

    Palilingan, V. R.; Batmetan, J. R.

    2018-02-01

    Incident management is very important to ensure the continuity of a system. Information systems require incident management to ensure they can provide maximum service according to the service level offered. Many of the problems that arise in academic information systems come from incidents that are not properly handled. This study aims to find an appropriate way of managing incidents, so that an incident does not grow into a bigger problem. The research uses the ITIL framework to solve incident problems, with a technique adopted and developed from the service operation section of the ITIL framework. The results show that 84.5% of incidents appearing in academic information systems could be handled quickly and appropriately, and the remaining 15.5% could be escalated without causing new problems. The incident management model applied enables the academic information system to deliver academic services quickly, well and efficiently, and to manage resources appropriately so that incidents are handled quickly and easily.
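
    As a hedged illustration of the kind of rule the ITIL service operation practice prescribes, a toy impact/urgency priority matrix with an escalation condition might look like this (the categories, thresholds, and function name are our own, not the paper's or ITIL's normative values):

```python
def handle_incident(impact, urgency, attempts, max_attempts=2):
    """Toy incident triage: map (impact, urgency) to a priority, then
    escalate critical incidents or those not resolved within
    max_attempts handling attempts (illustrative thresholds only).
    impact/urgency: 1 = high, larger numbers = lower."""
    priority = {(1, 1): "critical", (1, 2): "high", (2, 1): "high"}.get(
        (impact, urgency), "normal")
    escalate = priority == "critical" or attempts >= max_attempts
    return priority, escalate
```

    The escalation branch corresponds to the 15.5% of incidents in the study that were passed up rather than resolved at first line.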

  15. Sequence modelling and an extensible data model for genomic database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Peter Wei-Der

    1992-01-01

    The Human Genome Project (HGP) plans to sequence the human genome by the beginning of the next century. It will generate DNA sequences of more than 10 billion bases and complex marker sequences (maps) of more than 100 million markers. All of this information will be stored in database management systems (DBMSs). However, existing data models do not have the abstraction mechanism for modelling sequences, and existing DBMSs do not have operations for complex sequences. This work addresses the problem of sequence modelling in the context of the HGP, and the more general problem of an extensible object data model that can incorporate the sequence model as well as existing and future data constructs and operators. First, we proposed a general sequence model that is application and implementation independent. This model is used to capture the sequence information found in the HGP at the conceptual level. In addition, abstract and biological sequence operators are defined for manipulating the modelled sequences. Second, we combined many features of semantic and object-oriented data models into an extensible framework, which we called the "Extensible Object Model", to address the need for a modelling framework that incorporates the sequence data model with other types of data constructs and operators. This framework is based on the conceptual separation between constructors and constraints. We then used this modelling framework to integrate the constructs for the conceptual sequence model. The Extensible Object Model is also defined with a graphical representation, which is useful as a tool for database designers. Finally, we defined a query language to support this model and implemented the query processor to demonstrate the feasibility of the extensible framework and the usefulness of the conceptual sequence model.
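
    The distinction between abstract sequence operators and biological ones can be sketched as a small abstract data type; the class and method names are our own illustration, not the model's actual operator set:

```python
# Base-pairing table for the biological complement operator.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

class Seq:
    """Sketch of an application-independent sequence type: subseq and
    concat are abstract operators; reverse_complement is a biological
    one layered on top (illustrative names only)."""
    def __init__(self, bases):
        self.bases = bases.upper()
    def subseq(self, start, end):
        """Abstract operator: extract a subsequence."""
        return Seq(self.bases[start:end])
    def concat(self, other):
        """Abstract operator: join two sequences."""
        return Seq(self.bases + other.bases)
    def reverse_complement(self):
        """Biological operator: Watson-Crick complement, reversed."""
        return Seq(self.bases.translate(COMPLEMENT)[::-1])
```

    In the paper's framework, a type like this would be integrated into the extensible object model as a new constructor with its own operators, rather than hard-coded into the DBMS.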

  17. Information-Theoretic Approach May Shed a Light to a Better Understanding and Sustaining the Integrity of Ecological-Societal Systems under Changing Climate

    NASA Astrophysics Data System (ADS)

    Kim, J.

    2016-12-01

    Considering high levels of uncertainty, epistemological conflicts over facts and values, and a sense of urgency, normal paradigm-driven science will be insufficient to mobilize people and nations toward sustainability. The conceptual framework for bridging societal system dynamics with those of the natural ecosystems in which humanity operates remains deficient. The key to understanding their coevolution is to understand 'self-organization.' An information-theoretic approach may provide a framework that not only bridges humanity and nature but also generates useful knowledge for understanding and sustaining the integrity of ecological-societal systems. How can information theory help us understand the interface between ecological systems and social systems? How can self-organizing processes be delineated and guided to fulfil sustainability? How should the flow of information from data through models to decision-makers be evaluated? These are the core questions posed by sustainability science, in which visioneering (i.e., the engineering of vision) is an essential framework. Yet visioneering has neither a quantitative measure nor an information-theoretic framework to work with and teach. This presentation is an attempt to accommodate the framework of self-organizing hierarchical open systems, together with visioneering, within a common information-theoretic framework. A case study is presented with the UN/FAO's communal vision of climate-smart agriculture (CSA), which pursues a trilemma of efficiency, mitigation, and resilience. Challenges of delineating and facilitating self-organizing systems are discussed using transdisciplinary tools such as complex systems thinking, dynamic process network analysis, and multi-agent systems modeling. Acknowledgments: This study was supported by the Korea Meteorological Administration Research and Development Program under Grant KMA-2012-0001-A (WISE project).

  18. DAM_Explorer: A Modeling Framework for Assessing the Physical Response of Streams to Dam Removal

    DTIC Science & Technology

    2009-06-01

    ERDC TN-EMRRP-SR-65. DAM_Explorer: A Modeling Framework for Assessing the Physical Response of Streams to Dam Removal. By Jock Conyngham and ... physical and temporal increments for staged removals) is anticipated. POINTS OF CONTACT: For additional information contact Jock N. Conyngham (406-541

  19. Design and Implementation of an Architectural Framework for Web Portals in a Ubiquitous Pervasive Environment

    PubMed Central

    Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol

    2009-01-01

    Web Portals function as a single point of access to information on the World Wide Web (WWW). A web portal always contacts the portal's gateway for its information flow, which causes network traffic over the Internet. Moreover, it provides real-time/dynamic access to stored information, but not access to real-time information itself. This inherent functionality of web portals limits their role for resource-constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We introduce the concept of Local Regions in the proposed framework, so that local queries can be resolved locally rather than routed over the Internet. Moreover, our framework enables one-to-one device communication for real-time information flow. To provide an in-depth analysis, we first give an analytical model for query processing at the servers of our framework-oriented web portal. Finally, we deployed a testbed, one of the world's largest IP-based wireless sensor network testbeds, and observed real-time measurements that demonstrate the efficacy and workability of the proposed framework. PMID:22346693

  20. Design and implementation of an architectural framework for web portals in a ubiquitous pervasive environment.

    PubMed

    Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol

    2009-01-01

    Web Portals function as a single point of access to information on the World Wide Web (WWW). A web portal always contacts the portal's gateway for its information flow, which causes network traffic over the Internet. Moreover, it provides real-time/dynamic access to stored information, but not access to real-time information itself. This inherent functionality of web portals limits their role for resource-constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We introduce the concept of Local Regions in the proposed framework, so that local queries can be resolved locally rather than routed over the Internet. Moreover, our framework enables one-to-one device communication for real-time information flow. To provide an in-depth analysis, we first give an analytical model for query processing at the servers of our framework-oriented web portal. Finally, we deployed a testbed, one of the world's largest IP-based wireless sensor network testbeds, and observed real-time measurements that demonstrate the efficacy and workability of the proposed framework.
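    The Local Region idea described in this abstract (entries 19 and 20) can be sketched in a few lines: answer a query from devices in the local region when possible, and fall back to the portal gateway only when necessary. The function and dictionary names below are illustrative assumptions, not the paper's API.

```python
# Sketch of "Local Region" query routing: resolve locally if possible,
# otherwise route to the portal gateway over the Internet.
# Names and data are invented for illustration.

def resolve_query(query: str, local_region: dict, gateway: dict) -> tuple:
    """Return (answer, source), where source is 'local' or 'gateway'."""
    if query in local_region:                # solved locally: no Internet traffic
        return local_region[query], "local"
    return gateway.get(query), "gateway"     # routed over the Internet

local = {"room_temp": 21.5}
remote = {"room_temp": 21.4, "city_forecast": "rain"}

assert resolve_query("room_temp", local, remote) == (21.5, "local")
assert resolve_query("city_forecast", local, remote) == ("rain", "gateway")
```

    The point of the design is visible in the first assertion: a locally answerable query never generates gateway traffic.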

  1. Clinical Knowledge Governance Framework for Nationwide Data Infrastructure Projects.

    PubMed

    Wulff, Antje; Haarbrandt, Birger; Marschollek, Michael

    2018-01-01

    The availability of semantically enriched and interoperable clinical information models is crucial for reusing data collected once across institutions, as aspired to in the German HiGHmed project. Funded by the Federal Ministry of Education and Research, this nationwide data infrastructure project adopts the openEHR approach for semantic modelling. Here, strong governance is required to define high-quality and reusable models. We set out to design a clinical knowledge governance framework for openEHR modelling in cross-institutional settings such as HiGHmed, drawing on analysis of successful practices from international projects, published ideas on archetype governance, our own modelling experiences, and modelling of BPMN processes. The resulting framework covers archetype variations, roles and responsibilities, IT support and modelling workflows. It has great potential to make openEHR modelling efforts manageable. Because practical experience is still rare, our work is well placed to evaluate the benefits of such structured governance approaches.

  2. Stochastic Radiative Transfer Model for Contaminated Rough Surfaces: A Framework for Detection System Design

    DTIC Science & Technology

    2013-11-01

    STOCHASTIC RADIATIVE TRANSFER MODEL FOR CONTAMINATED ROUGH SURFACES: A FRAMEWORK FOR DETECTION SYSTEM DESIGN. Report period: Jan 2013 - Sep 2013.

  3. The Development of a Contextual Information Framework Model as a Potential IAEA Strategy to Maintain Radioactive Waste Knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upshall, I.R.; McCarthy, G.J.

    A contextual framework comprises 'entities' that exhibit one or more definable relationships with a particular 'event'. People, organisations, concepts, ideas, places, natural phenomena, events themselves, and cultural artefacts including records, books and works of art can all be conceptualised as entities. If these entities are registered in an information management system in which the relationships between them can be defined and systematically managed, then it is possible to create a contextual information framework that represents a particular view of what occurs in real life. Careful identification and mapping of the relationships between these entities and the selected event can lead rapidly to the creation of an information network that closely reflects the human approach to knowledge acquisition and application. The 'event' referred to in this paper is the safe management of radioactive waste. It is widely accepted that society will expect knowledge about the waste to be maintained for many decades, if not centuries. Delivering on this expectation will demand the application of management approaches that are both innovative and sustainable. Effective inter-generational transfer of information using many 'conventional' techniques will be highly dependent on societal stability - something that cannot be guaranteed over such long periods of time. Consequently, alternative approaches should be explored and, where appropriate, implemented to give reasonable assurance that future generations of waste custodians will not be unduly burdened by the need to recreate information about the waste long after its disposal. In fact, the contextual information framework model is not 'new technology' but simply a means of rationalising and representing the way humans naturally tend to use information in the pursuit of knowledge enhancement.
By making use of multiple information entities and their relationships, it is often possible to convert otherwise impossibly complex socio-technical environments into information architectures or networks with remarkable and useful properties. The International Atomic Energy Agency, in its ongoing work to encourage the application of systems to manage radioactive waste information over the long term, has embraced the contextual information framework as a potentially viable approach to this particular challenge. To this end, it invited Member States to contribute to the production of a Safety Report that used the contextual information framework model, building on the wealth of existing IAEA guidance. The report focuses not on the important area of records management, but on the benefits that can arise from the development of an information management approach that increases the likelihood that future generations will recognise the significance and value of the information contained in these records. Our understanding of 'inter-generational transfer' should extend beyond the simple physical transfer of records into an archival repository towards the establishment of a working culture that places sufficient contemporary information into a form that ensures it remains accessible and ultimately enhances the knowledge of future generations. Making information accessible is therefore the key, and whilst the use of stable records media, storage environments and quality assurance are important elements, they cannot be considered solutions in themselves. This paper articulates some of the lessons that have been learned about using the contextual information framework model when applied to the long-term management of radioactive waste. The draft IAEA Safety Report entitled 'Preservation and Transfer to Future Generations of Information Important to the Safety of Waste Disposal Facilities', on which this paper is based, is expected to be published in 2007. (authors)
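    The entity-and-relationship network this entry describes can be sketched as a small graph structure: entities are registered, typed relationships link them to an event, and traversal recovers the context of any entity. The class name, relationship labels, and example entities below are invented for illustration.

```python
# Minimal sketch of a contextual information framework: entities with
# typed relationships, traversed to recover the context around an event.
# All names, labels, and data here are assumptions for illustration.

from collections import defaultdict

class ContextFramework:
    def __init__(self):
        # entity -> list of (relationship, related entity)
        self.relations = defaultdict(list)

    def relate(self, a: str, relationship: str, b: str):
        # Relationships are navigable in both directions.
        self.relations[a].append((relationship, b))
        self.relations[b].append((relationship, a))

    def context_of(self, entity: str):
        return sorted(self.relations[entity])

fw = ContextFramework()
fw.relate("waste package 42", "produced_by", "reactor unit 1")
fw.relate("waste package 42", "documented_in", "disposal record 1987-03")
assert ("produced_by", "reactor unit 1") in fw.context_of("waste package 42")
```

    Even this toy version shows the property the paper emphasises: starting from any registered entity, a future custodian can navigate outward to the records, organisations and places that give it meaning.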

  4. Research recruitment: A marketing framework to improve sample representativeness in health research.

    PubMed

    Howcutt, Sarah J; Barnett, Anna L; Barbosa-Boucas, Sofia; Smith, Lesley A

    2018-04-01

    This discussion paper proposes a five-part theoretical framework to inform recruitment strategies, based on a marketing model of consumer decision-making. Respondents in surveys are typically healthier than non-respondents, which limits the availability of information about those most in need. Previous research has identified response patterns, provided theories about why people participate in research, and evaluated different recruitment strategies. Social marketing has been applied successfully to recruitment and promotes a focus on the needs of the participant, but little attention has been paid to the periods before and after participant-researcher contact (during advertising and following completion of studies). We propose a new model that conceptualises participation as a decision involving motivation, perception of information, attitude formation, integration of intention and action, and finally evaluation and sharing of experience. This discussion paper presents a critical review; no literature was excluded on the basis of date, and the included citations span the years 1981-2017. The proposed framework suggests that researchers could engage a broader demographic if they shape research design and advertising to perform functions that participants are seeking to achieve. The framework provides a novel and useful conceptualisation of recruitment that could help to inform public engagement in research design, researcher training and research policy. It challenges researchers to investigate the goals of potential participants when designing a study's advertising and procedures. © 2017 John Wiley & Sons Ltd.

  5. Measures and Metrics of Information Processing in Complex Systems: A Rope of Sand

    ERIC Educational Resources Information Center

    James, Ryan Gregory

    2013-01-01

    How much information do natural systems store and process? In this work we attempt to answer this question in multiple ways. We first establish a mathematical framework where natural systems are represented by a canonical form of edge-labeled hidden Markov models called ε-machines. Then, utilizing this framework, a variety of measures are defined and…

  6. A Theoretically Grounded Framework for Integrating the Scholarship of Teaching and Learning

    ERIC Educational Resources Information Center

    Walls, Jill K.

    2016-01-01

    SoTL scholars have written about the importance and utility of teaching from a guiding theoretical framework. In this paper, ecological theory and specifically Bronfenbrenner's bioecological model, is examined as a potential framework for synthesizing SoTL research findings to inform teaching and learning scholarship at the college level. A…

  7. An object-oriented software approach for a distributed human tracking motion system

    NASA Astrophysics Data System (ADS)

    Micucci, Daniela L.

    2003-06-01

    Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and inter-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.
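    The layered idea in this abstract, classes modelling entities at each abstraction level with inter-level associations modelling the abstraction step, can be sketched with two levels: raw sensor detections below and tracked persons above. The class and field names are invented for illustration, not the paper's design.

```python
# Sketch of a two-level tracking model: Detection (lower abstraction
# level, what a sensor saw) and Track (higher level, a person inferred
# over time). The Track->Detection link is an inter-level association.
# All names are assumptions for illustration.

class Detection:
    def __init__(self, x: float, y: float, t: float):
        self.x, self.y, self.t = x, y, t

class Track:
    def __init__(self, ident: str):
        self.ident = ident
        self.detections = []          # inter-level association to the lower level

    def absorb(self, d: Detection):
        self.detections.append(d)

    def last_position(self):
        d = max(self.detections, key=lambda d: d.t)   # temporal dimension
        return (d.x, d.y)

track = Track("person-1")
track.absorb(Detection(0.0, 0.0, t=1.0))
track.absorb(Detection(1.0, 0.5, t=2.0))
assert track.last_position() == (1.0, 0.5)
```

    In the paper's terms, `absorb` performs inter-level information conversion, while `last_position` is intra-level processing at the track level.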

  8. The role of data fusion in predictive maintenance using digital twin

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Meyendorf, Norbert; Mrad, Nezih

    2018-04-01

    Modern aerospace industry is migrating from reactive to proactive and predictive maintenance to increase platform operational availability and efficiency, extend its useful life cycle and reduce its life cycle cost. Multiphysics modeling together with data-driven analytics generate a new paradigm called "Digital Twin." The digital twin is actually a living model of the physical asset or system, which continually adapts to operational changes based on the collected online data and information, and can forecast the future of the corresponding physical counterpart. This paper reviews the overall framework to develop a digital twin coupled with the industrial Internet of Things technology to advance aerospace platforms autonomy. Data fusion techniques particularly play a significant role in the digital twin framework. The flow of information from raw data to high-level decision making is propelled by sensor-to-sensor, sensor-to-model, and model-to-model fusion. This paper further discusses and identifies the role of data fusion in the digital twin framework for aircraft predictive maintenance.
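    The sensor-to-model fusion step the paper describes can be illustrated with the simplest standard technique: inverse-variance weighting of a sensor reading against a physics-model prediction. This is a generic fusion sketch, not the paper's method; the function name and the numbers are invented.

```python
# Sketch of sensor-to-model data fusion in a digital-twin setting:
# combine a sensor reading and a model prediction of the same quantity,
# weighting each by the inverse of its variance. Purely illustrative.

def fuse(sensor: float, sensor_var: float, model: float, model_var: float) -> float:
    """Inverse-variance weighted fusion of two estimates of one quantity."""
    w_s = 1.0 / sensor_var
    w_m = 1.0 / model_var
    return (w_s * sensor + w_m * model) / (w_s + w_m)

# A noisy sensor reading fused with a confident model prediction lands
# closer to the model's estimate.
fused = fuse(sensor=10.0, sensor_var=4.0, model=8.0, model_var=1.0)
assert abs(fused - 8.4) < 1e-9
```

    Sensor-to-sensor and model-to-model fusion follow the same pattern, with two sensor readings or two model outputs as the inputs.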

  9. Model performance evaluation (validation and calibration) in model-based studies of therapeutic interventions for cardiovascular diseases : a review and suggested reporting framework.

    PubMed

    Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan

    2013-04-01

    Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55 % of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6 % of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. 
Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.

  10. Examining the functionality of the DeLone and McLean information system success model as a framework for synthesis in nursing information and communication technology research.

    PubMed

    Booth, Richard G

    2012-06-01

    In this review, studies examining information and communication technology used by nurses in clinical practice were examined. Overall, a total of 39 studies were assessed, spanning the period from 1995 to 2008. The impacts of the various health information and communication technologies evaluated by individual studies were synthesized using DeLone and McLean's six-dimensional framework for evaluating information systems success (i.e., System Quality, Information Quality, Service Quality, Use, User Satisfaction, and Net Benefits). The majority of researchers reported results related to the overall Net Benefits (positive, negative, and indifferent) of the health information and communication technology used by nurses. Attitudes and user satisfaction with technology were also commonly measured attributes. The current iteration of the DeLone and McLean model is effective at synthesizing basic elements of health information and communication technology use by nurses. Nevertheless, the model lacks the sociotechnical sensitivity to capture deeper nurse-technology relationships. Limitations and recommendations are provided for researchers considering using the DeLone and McLean model for evaluating health information and communication technology used by nurses.
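    Using the six DeLone and McLean dimensions as a synthesis frame, as this review does, amounts to tallying which dimensions each study reports on. The sketch below illustrates that bookkeeping; the study identifiers and their mapped dimensions are invented for the example.

```python
# Sketch of synthesis against the DeLone and McLean success dimensions:
# tally which of the six dimensions each reviewed study reports on.
# The study records below are invented for illustration.

from collections import Counter

DIMENSIONS = {"System Quality", "Information Quality", "Service Quality",
              "Use", "User Satisfaction", "Net Benefits"}

studies = [
    {"id": "study-a", "reports": {"Net Benefits", "User Satisfaction"}},
    {"id": "study-b", "reports": {"Net Benefits"}},
    {"id": "study-c", "reports": {"Use", "System Quality"}},
]

tally = Counter(dim for s in studies for dim in s["reports"] & DIMENSIONS)
assert tally["Net Benefits"] == 2       # most commonly reported, as in the review
assert tally["Service Quality"] == 0    # a dimension no study addressed
```

    A zero count, as for Service Quality here, is exactly the kind of gap such a synthesis is meant to surface.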

  11. A Framework for Developing the Structure of Public Health Economic Models.

    PubMed

    Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P

    2016-01-01

    A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of Public Health economic models, supporting efficient allocation of scarce resources. 
Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  12. Development of an optimised key worker framework for people with dementia, their family and caring unit living in the community.

    PubMed

    Renehan, Emma; Goeman, Dianne; Koch, Susan

    2017-07-20

    In Australia, dementia is a national health priority. With the rising number of people living with dementia and shortage of formal and informal carers predicted in the near future, developing approaches to coordinating services in quality-focused ways is considered an urgent priority. Key worker support models are one approach that have been used to assist people living with dementia and their caring unit coordinate services and navigate service systems; however, there is limited literature outlining comprehensive frameworks for the implementation of community dementia key worker roles in practice. In this paper an optimised key worker framework for people with dementia, their family and caring unit living in the community is developed and presented. A number of processes were undertaken to inform the development of a co-designed optimised key worker framework: an expert working and reference group; a systematic review of the literature; and a qualitative evaluation of 14 dementia key worker models operating in Australia involving 14 interviews with organisation managers, 19 with key workers and 15 with people living with dementia and/or their caring unit. Data from the systematic review and evaluation of dementia key worker models were analysed by the researchers and the expert working and reference group using a constant comparative approach to define the essential components of the optimised framework. The developed framework consisted of four main components: overarching philosophies; organisational context; role definition; and key worker competencies. A number of more clearly defined sub-themes sat under each component. Reflected in the framework is the complexity of the dementia journey and the difficulty in trying to develop a 'one size fits all' approach. 
This co-designed study led to the development of an evidence based framework which outlines a comprehensive synthesis of components viewed as being essential to the implementation of a dementia key worker model of care in the community. The framework was informed and endorsed by people living with dementia and their caring unit, key workers, managers, Australian industry experts, policy makers and researchers. An evaluation of its effectiveness and relevance for practice within the dementia care space is required.

  13. Critical social theory as a model for the informatics curriculum for nursing.

    PubMed

    Wainwright, P; Jones, P G

    2000-01-01

    It is widely acknowledged that the education and training of nurses in information management and technology is problematic. Drawing from recent research this paper presents a theoretical framework within which the nature of the problems faced by nurses in the use of information may be analyzed. This framework, based on the critical social theory of Habermas, also provides a model for the informatics curriculum. The advantages of problem based learning and multi-media web-based technologies for the delivery of learning materials within this area are also discussed.

  14. Planning and Implementation Framework for a Hybrid E-Learning Model: The Context of a Part-Time LIS Postgraduate Programme

    ERIC Educational Resources Information Center

    Huang, Leelien Ken

    2010-01-01

    E-learning and traditional classroom learning have been combined to deliver library and information science (LIS) education. However, the framework for planning and implementing a hybrid e-learning model is unclear in the literature. Using a routines-based perspective, e-learning opportunities were explored through identifying the internal…

  15. Information Interaction: Providing a Framework for Information Architecture.

    ERIC Educational Resources Information Center

    Toms, Elaine G.

    2002-01-01

    Discussion of information architecture focuses on a model of information interaction that bridges the gap between human and computer and between information behavior and information retrieval. Illustrates how the process of information interaction is affected by the user, the system, and the content. (Contains 93 references.) (LRW)

  16. Collaborative Information Retrieval Method among Personal Repositories

    NASA Astrophysics Data System (ADS)

    Kamei, Koji; Yukawa, Takashi; Yoshida, Sen; Kuwabara, Kazuhiro

    In this paper, we describe a collaborative information retrieval method among personal repositories and an implementation of the method on a personal agent framework. We propose a framework for personal agents that aims to enable the sharing and exchange of information resources that are distributed unevenly among individuals. The kernel of the personal agent framework is an RDF (Resource Description Framework)-based information repository for storing, retrieving and manipulating privately collected information, such as documents the user read and/or wrote, email he/she exchanged, web pages he/she browsed, etc. The repository also collects annotations to information resources that describe relationships among them, as well as records of interaction between the user and information resources. Since the information resources in a personal repository and their structure are personalized, information retrieval from other users' repositories is an important application of the personal agent. A vector space model with a personalized concept-base is employed as the information retrieval mechanism in a personal repository. Since a personalized concept-base is constructed from the information resources in a personal repository, it reflects its user's knowledge and interests. On the other hand, this leads to a problem when querying other users' personal repositories: simply transferring query requests does not produce desirable results. To solve this problem, we propose a query equalization scheme based on a relevance feedback method for collaborative information retrieval between personalized concept-bases. In this paper, we describe an implementation of the collaborative information retrieval method and its user interface on the personal agent framework.
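    The query-equalization idea, adapting a query via relevance feedback before sending it to another user's personalized concept-base, can be sketched with a vector space model and a Rocchio-style feedback step. This is a generic illustration under assumed term weights, not the paper's actual scheme.

```python
# Sketch of vector-space retrieval with a Rocchio-style relevance-feedback
# step, as one way to "equalize" a query before it is sent to another
# user's personalized repository. Weights and terms are illustrative.

import math

def cosine(a: dict, b: dict) -> float:
    dot = sum(a.get(t, 0.0) * w for t, w in b.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def feedback(query: dict, relevant: dict, alpha=1.0, beta=0.5) -> dict:
    """Shift the query vector toward a known-relevant document."""
    terms = set(query) | set(relevant)
    return {t: alpha * query.get(t, 0.0) + beta * relevant.get(t, 0.0)
            for t in terms}

query = {"agent": 1.0}
doc = {"agent": 0.5, "repository": 0.8}
adapted = feedback(query, doc)
assert adapted["repository"] > 0                       # query gains terms from the doc
assert cosine(adapted, doc) > cosine(query, doc)       # and moves toward it
```

    In the collaborative setting, the feedback document would come from the querying user's own repository, so the adapted query carries that user's vocabulary into the remote concept-base.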

  17. ImTK: an open source multi-center information management toolkit

    NASA Astrophysics Data System (ADS)

    Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.

    2008-03-01

    The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development, while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7), and with the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.

  18. Framework for Modeling High-Impact, Low-Frequency Power Grid Events to Support Risk-Informed Decisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.

    2015-12-03

    Natural and man-made hazardous events resulting in loss of grid infrastructure assets challenge the electric power grid's security and resilience. However, the planning and allocation of appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be problematic, as there exists an insufficient statistical basis to directly estimate the probabilities and consequences of their occurrence. Since risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact, low-frequency events (HILFs) is essential. Insights from such a model can inform where resources are most rationally and effectively expended. The present effort is focused on development of a HILF risk assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision makers across numerous stakeholder sectors. The North American Electric Reliability Corporation (NERC) 2014 Standard TPL-001-4 considers severe events for transmission reliability planning, but does not address events of such severity that they have the potential to fail a substantial fraction of grid assets over a region, such as geomagnetic disturbances (GMDs), extreme seismic events, and coordinated cyber-physical attacks. These are beyond current planning guidelines. As noted, the risks associated with such events cannot be statistically estimated from historic experience; however, there exists a stable of risk modeling techniques for rare events that have proven valuable across a wide range of engineering application domains.
There is an active and growing interest in evaluating the value of risk management techniques in the State transmission planning and emergency response communities, some of it in the context of grid modernization activities. The availability of a grid HILF risk model, integrated across multi-hazard domains, which, when interrogated, can support transparent, defensible, and effective decisions, is an attractive prospect among these communities. In this report, we document an integrated HILF risk framework intended to inform the development of risk models. These models would be based on the systematic and comprehensive (to within scope) characterization of hazards to the level of detail required for modeling risk, identification of the stressors associated with the hazards (i.e., the means of impacting grid and supporting infrastructure), characterization of the vulnerability of assets to these stressors and the probabilities of asset compromise, the grid’s dynamic response to the asset failures, and assessment of subsequent severities of consequence with respect to selected impact metrics, such as power outage duration and geographic reach. Specifically, the current framework is being developed to: (1) provide the conceptual and overarching technical paradigms for the development of risk models; (2) identify the classes of models required to implement the framework, providing examples of existing models and identifying where modeling gaps exist; (3) identify the types of data required, addressing circumstances under which data are sparse and the formal elicitation of informed judgment might be required; and (4) identify means by which the resultant risk models might be interrogated to form the necessary basis for risk management.

  19. Comparing Information Access Approaches.

    ERIC Educational Resources Information Center

    Chalmers, Matthew

    1999-01-01

Presents a broad view of information access, drawing from philosophy and semiology in constructing a framework for comparative discussion that is used to examine the information representations that underlie four approaches to information access--information retrieval, workflow, collaborative filtering, and the path model. Contains 32 references.

  20. Using SCOR as a Supply Chain Management Framework for Government Agency Contract Requirements

    NASA Technical Reports Server (NTRS)

    Paxton, Joseph; Tucker, Brian

    2010-01-01

    This paper will present a model that uses the Supply-Chain Operations Reference (SCOR) model as a foundation for a framework to illustrate the information needed throughout a product lifecycle to support a healthy supply chain management function and the subsequent contract requirements to enable it. It will also show where in the supply chain the information must be extracted. The ongoing case study used to exemplify the model is NASA's (National Aeronautics and Space Administration) Ares I program for human spaceflight. Effective supply chain management and contract requirements are ongoing opportunities for continuous improvement within government agencies, specifically development of systems for human spaceflight operations. Multiple reports from the Government Accountability Office (GAO) reinforce this importance. The SCOR model is a framework for describing a supply chain with process building blocks and business activities. It provides a set of metrics for measuring supply chain performance and best practices for continuously improving. This paper expands the application of the SCOR to also provide the framework for defining information needed from different levels of the supply chain and at different phases of the lifecycle. These needs can be incorporated into contracts to enable more effective supply chain management. Depending on the phase of the lifecycle, effective supply chain management will require involvement from different levels of the organization and different levels of the supply chain.

  1. The Integrated Behavioural Model for Water, Sanitation, and Hygiene: a systematic review of behavioural models and a framework for designing and evaluating behaviour change interventions in infrastructure-restricted settings.

    PubMed

    Dreibelbis, Robert; Winch, Peter J; Leontsini, Elli; Hulland, Kristyna R S; Ram, Pavani K; Unicomb, Leanne; Luby, Stephen P

    2013-10-26

Promotion and provision of low-cost technologies that enable improved water, sanitation, and hygiene (WASH) practices are seen as viable solutions for reducing high rates of morbidity and mortality due to enteric illnesses in low-income countries. A number of theoretical models, explanatory frameworks, and decision-making models have emerged which attempt to guide behaviour change interventions related to WASH. The design and evaluation of such interventions would benefit from a synthesis of this body of theory informing WASH behaviour change and maintenance. We completed a systematic review of existing models and frameworks through a search of related articles available in PubMed and in the grey literature. Information on the organization of behavioural determinants was extracted from the references that fulfilled the selection criteria and synthesized. Results from this synthesis were combined with other relevant literature and with feedback from concurrent formative and pilot research conducted in the context of two cluster-randomized trials on the efficacy of WASH behaviour change interventions to inform the development of a framework to guide the development and evaluation of WASH interventions: the Integrated Behavioural Model for Water, Sanitation, and Hygiene (IBM-WASH). We identified 15 WASH-specific theoretical models, behaviour change frameworks, or programmatic models, of which 9 addressed our review questions. Existing models under-represented the potential role of technology in influencing behavioural outcomes, focused on individual-level behavioural determinants, and largely ignored the role of the physical and natural environment. IBM-WASH attempts to correct this by acknowledging three dimensions (Contextual Factors, Psychosocial Factors, and Technology Factors) that operate on five levels (structural, community, household, individual, and habitual). A number of WASH-specific models and frameworks exist, yet with some limitations.
The IBM-WASH model aims to provide both a conceptual and practical tool for improving our understanding and evaluation of the multi-level, multi-dimensional factors that influence water, sanitation, and hygiene practices in infrastructure-constrained settings. We outline future applications of our proposed model as well as future research priorities needed to advance our understanding of the sustained adoption of water, sanitation, and hygiene technologies and practices.

  2. The Integrated Behavioural Model for Water, Sanitation, and Hygiene: a systematic review of behavioural models and a framework for designing and evaluating behaviour change interventions in infrastructure-restricted settings

    PubMed Central

    2013-01-01

Background Promotion and provision of low-cost technologies that enable improved water, sanitation, and hygiene (WASH) practices are seen as viable solutions for reducing high rates of morbidity and mortality due to enteric illnesses in low-income countries. A number of theoretical models, explanatory frameworks, and decision-making models have emerged which attempt to guide behaviour change interventions related to WASH. The design and evaluation of such interventions would benefit from a synthesis of this body of theory informing WASH behaviour change and maintenance. Methods We completed a systematic review of existing models and frameworks through a search of related articles available in PubMed and in the grey literature. Information on the organization of behavioural determinants was extracted from the references that fulfilled the selection criteria and synthesized. Results from this synthesis were combined with other relevant literature and with feedback from concurrent formative and pilot research conducted in the context of two cluster-randomized trials on the efficacy of WASH behaviour change interventions to inform the development of a framework to guide the development and evaluation of WASH interventions: the Integrated Behavioural Model for Water, Sanitation, and Hygiene (IBM-WASH). Results We identified 15 WASH-specific theoretical models, behaviour change frameworks, or programmatic models, of which 9 addressed our review questions. Existing models under-represented the potential role of technology in influencing behavioural outcomes, focused on individual-level behavioural determinants, and largely ignored the role of the physical and natural environment. IBM-WASH attempts to correct this by acknowledging three dimensions (Contextual Factors, Psychosocial Factors, and Technology Factors) that operate on five levels (structural, community, household, individual, and habitual).
Conclusions A number of WASH-specific models and frameworks exist, yet with some limitations. The IBM-WASH model aims to provide both a conceptual and practical tool for improving our understanding and evaluation of the multi-level, multi-dimensional factors that influence water, sanitation, and hygiene practices in infrastructure-constrained settings. We outline future applications of our proposed model as well as future research priorities needed to advance our understanding of the sustained adoption of water, sanitation, and hygiene technologies and practices. PMID:24160869

  3. Object-oriented models of cognitive processing.

    PubMed

    Mather, G

    2001-05-01

    Information-processing models of vision and cognition are inspired by procedural programming languages. Models that emphasize object-based representations are closely related to object-oriented programming languages. The concepts underlying object-oriented languages provide a theoretical framework for cognitive processing that differs markedly from that offered by procedural languages. This framework is well-suited to a system designed to deal flexibly with discrete objects and unpredictable events in the world.

  4. Building a diabetes screening population data repository using electronic medical records.

    PubMed

    Tuan, Wen-Jan; Sheehy, Ann M; Smith, Maureen A

    2011-05-01

    There has been a rapid advancement of information technology in the area of clinical and population health data management since 2000. However, with the fast growth of electronic medical records (EMRs) and the increasing complexity of information systems, it has become challenging for researchers to effectively access, locate, extract, and analyze information critical to their research. This article introduces an outpatient encounter data framework designed to construct an EMR-based population data repository for diabetes screening research. The outpatient encounter data framework is developed on a hybrid data structure of entity-attribute-value models, dimensional models, and relational models. This design preserves a small number of subject-specific tables essential to key clinical constructs in the data repository. It enables atomic information to be maintained in a transparent and meaningful way to researchers and health care practitioners who need to access data and still achieve the same performance level as conventional data warehouse models. A six-layer information processing strategy is developed to extract and transform EMRs to the research data repository. The data structure also complies with both Health Insurance Portability and Accountability Act regulations and the institutional review board's requirements. Although developed for diabetes screening research, the design of the outpatient encounter data framework is suitable for other types of health service research. It may also provide organizations a tool to improve health care quality and efficiency, consistent with the "meaningful use" objectives of the Health Information Technology for Economic and Clinical Health Act. © 2011 Diabetes Technology Society.
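    The hybrid structure described above can be sketched in miniature. In this illustration the table and attribute names are invented (they are not the paper's actual schema): a narrow entity-attribute-value table holds heterogeneous clinical observations alongside a conventional relational table for the patient entity, and a query pivots the EAV rows back into a wide, researcher-friendly form on demand.

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE patient (              -- conventional relational table
            patient_id INTEGER PRIMARY KEY,
            birth_year INTEGER
        );
        CREATE TABLE observation (          -- EAV table: one row per fact
            patient_id INTEGER REFERENCES patient(patient_id),
            attribute  TEXT,                -- e.g. 'hba1c', 'fasting_glucose'
            value      REAL,
            obs_date   TEXT
        );
    """)
    conn.execute("INSERT INTO patient VALUES (1, 1960)")
    conn.executemany(
        "INSERT INTO observation VALUES (?, ?, ?, ?)",
        [(1, "hba1c", 6.9, "2010-03-01"),
         (1, "fasting_glucose", 118.0, "2010-03-01")],
    )

    # Pivot the narrow EAV rows into a wide view for analysis.
    row = conn.execute("""
        SELECT p.patient_id,
               MAX(CASE WHEN o.attribute = 'hba1c' THEN o.value END)           AS hba1c,
               MAX(CASE WHEN o.attribute = 'fasting_glucose' THEN o.value END) AS glucose
        FROM patient p JOIN observation o USING (patient_id)
        GROUP BY p.patient_id
    """).fetchone()
    print(row)  # (1, 6.9, 118.0)
    ```

    The design choice mirrors the trade-off the abstract describes: the EAV side absorbs new clinical attributes without schema changes, while pivoted views keep analysis queries close to conventional warehouse performance.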

  5. The experiential health information processing model: supporting collaborative web-based patient education

    PubMed Central

    O'Grady, Laura A; Witteman, Holly; Wathen, C Nadine

    2008-01-01

Background First-generation Internet technologies such as mailing lists or newsgroups afforded unprecedented levels of information exchange within a variety of interest groups, including those who seek health information. With the emergence of the World Wide Web, many communication applications were ported to web browsers. One of the driving factors in this phenomenon has been the exchange of experiential or anecdotal knowledge that patients share online, and there is emerging evidence that participation in these forums may be having an impact on people's health decision making. Theoretical frameworks supporting this form of information seeking and learning have yet to be proposed. Results In this article, we propose an adaptation of Kolb's experiential learning theory to begin to formulate an experiential health information processing model that may contribute to our understanding of online health information seeking behaviour in this context. Conclusion An experiential health information processing model is proposed that can be used as a research framework. Future research directions include investigating the utility of this model in the online health information seeking context and studying the impact of collaborating in these online environments on patient decision making and on health outcomes. PMID:19087353

  6. The experiential health information processing model: supporting collaborative web-based patient education.

    PubMed

    O'Grady, Laura A; Witteman, Holly; Wathen, C Nadine

    2008-12-16

First-generation Internet technologies such as mailing lists or newsgroups afforded unprecedented levels of information exchange within a variety of interest groups, including those who seek health information. With the emergence of the World Wide Web, many communication applications were ported to web browsers. One of the driving factors in this phenomenon has been the exchange of experiential or anecdotal knowledge that patients share online, and there is emerging evidence that participation in these forums may be having an impact on people's health decision making. Theoretical frameworks supporting this form of information seeking and learning have yet to be proposed. In this article, we propose an adaptation of Kolb's experiential learning theory to begin to formulate an experiential health information processing model that may contribute to our understanding of online health information seeking behaviour in this context. An experiential health information processing model is proposed that can be used as a research framework. Future research directions include investigating the utility of this model in the online health information seeking context and studying the impact of collaborating in these online environments on patient decision making and on health outcomes.

  7. Framework of distributed coupled atmosphere-ocean-wave modeling system

    NASA Astrophysics Data System (ADS)

    Wen, Yuanqiao; Huang, Liwen; Deng, Jian; Zhang, Jinfeng; Wang, Sisi; Wang, Lijun

    2006-05-01

    In order to research the interactions between the atmosphere and ocean as well as their important role in the intensive weather systems of coastal areas, and to improve the forecasting ability of the hazardous weather processes of coastal areas, a coupled atmosphere-ocean-wave modeling system has been developed. The agent-based environment framework for linking models allows flexible and dynamic information exchange between models. For the purpose of flexibility, portability and scalability, the framework of the whole system takes a multi-layer architecture that includes a user interface layer, computational layer and service-enabling layer. The numerical experiment presented in this paper demonstrates the performance of the distributed coupled modeling system.

  8. Rate-distortion theory and human perception.

    PubMed

    Sims, Chris R

    2016-07-01

    The fundamental goal of perception is to aid in the achievement of behavioral objectives. This requires extracting and communicating useful information from noisy and uncertain sensory signals. At the same time, given the complexity of sensory information and the limitations of biological information processing, it is necessary that some information must be lost or discarded in the act of perception. Under these circumstances, what constitutes an 'optimal' perceptual system? This paper describes the mathematical framework of rate-distortion theory as the optimal solution to the problem of minimizing the costs of perceptual error subject to strong constraints on the ability to communicate or transmit information. Rate-distortion theory offers a general and principled theoretical framework for developing computational-level models of human perception (Marr, 1982). Models developed in this framework are capable of producing quantitatively precise explanations for human perceptual performance, while yielding new insights regarding the nature and goals of perception. This paper demonstrates the application of rate-distortion theory to two benchmark domains where capacity limits are especially salient in human perception: discrete categorization of stimuli (also known as absolute identification) and visual working memory. A software package written for the R statistical programming language is described that aids in the development of models based on rate-distortion theory. Copyright © 2016 The Author. Published by Elsevier B.V. All rights reserved.
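    The rate-distortion optimization underlying this framework can be illustrated with the classical Blahut-Arimoto algorithm (a standard textbook method, not code from the paper or its R package). This sketch computes one point on the R(D) curve for a discrete source at a given Lagrange multiplier beta:

    ```python
    import math

    def blahut_arimoto(p_x, d, beta, iters=200):
        """Classical Blahut-Arimoto iteration for a discrete source p_x
        and distortion matrix d[x][xhat] at Lagrange multiplier beta.
        Returns (rate in bits, expected distortion)."""
        n, m = len(p_x), len(d[0])
        q = [1.0 / m] * m                       # output marginal q(xhat)
        for _ in range(iters):
            # Optimal test channel Q(xhat|x) given the current marginal q
            Q = []
            for x in range(n):
                w = [q[xh] * math.exp(-beta * d[x][xh]) for xh in range(m)]
                z = sum(w)
                Q.append([wi / z for wi in w])
            # Re-estimate the output marginal from the channel
            q = [sum(p_x[x] * Q[x][xh] for x in range(n)) for xh in range(m)]
        rate = sum(p_x[x] * Q[x][xh] * math.log2(Q[x][xh] / q[xh])
                   for x in range(n) for xh in range(m) if Q[x][xh] > 0)
        dist = sum(p_x[x] * Q[x][xh] * d[x][xh]
                   for x in range(n) for xh in range(m))
        return rate, dist

    # Uniform binary source with Hamming distortion, where R(D) = 1 - H(D).
    rate, dist = blahut_arimoto([0.5, 0.5], [[0, 1], [1, 0]], beta=3.0)
    ```

    For this symmetric example the iteration converges to the known closed form, so the returned rate equals 1 - H(dist) bits, which is a convenient sanity check for the implementation.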

  9. Understanding consumer health information-seeking behavior from the perspective of the risk perception attitude framework and social support in mobile social media websites.

    PubMed

    Deng, Zhaohua; Liu, Shan

    2017-09-01

    This study integrates the risk perception attitude framework and social support to examine factors influencing consumers' intentions to seek health information in mobile social media websites. We develop a research model consisting of four social support dimensions, perceived health risk, health self-efficacy, and health information-seeking intention. A survey is conducted among patients with non-serious conditions. A two-step approach of structural equation modeling is used to test the research model. Among the four dimensions of social support, tangible support and appraisal support significantly influence perceived risk, whereas emotional support and esteem support significantly influence health self-efficacy. Perceived health risk and health self-efficacy significantly influence the health information-seeking behavior intention of consumers. Specifically, health self-efficacy significantly moderates the relationship between perceived risk and behavior intention. This study highlights the integrated effects of social capital and risk perception attitude framework on health information-seeking intention. It examines relationships among perceived health risk, health self-efficacy, and behavior intention in the mobile social media context. The findings help understand effects of social capital factors on perceived health risk and health self-efficacy. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Information spreading in Delay Tolerant Networks based on nodes' behaviors

    NASA Astrophysics Data System (ADS)

    Wu, Yahui; Deng, Su; Huang, Hongbin

    2014-07-01

Information spreading in DTNs (Delay Tolerant Networks) adopts a store-carry-forward method, and nodes receive the message from others directly. However, it is hard to judge whether the information is safe in this communication mode. In this case, a node may observe other nodes' behaviors. At present, there is no theoretical model to describe the varying rule of the nodes' trusting level. In addition, due to the uncertainty of the connectivity in DTNs, it is hard for a node to get the global state of the network. Therefore, a rational model of the node's trusting level should be a function of the node's own observing result. For example, if a node finds k nodes carrying a message, it may trust the information with probability p(k). This paper does not explore the real distribution of p(k), but instead presents a unifying theoretical framework to evaluate the performance of information spreading in the above case. This framework is an extension of the traditional SI (susceptible-infected) model, and is useful when p(k) conforms to any distribution. Simulations based on both synthetic and real motion traces show the accuracy of the framework. Finally, we explore the impact of the nodes' behaviors under certain special distributions through numerical results.
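    The store-carry-forward setting with an observation-dependent trust probability p(k) can be sketched with a toy Monte Carlo simulation. Everything below (the random-contact model, the rates, and the example p(k)) is an invented illustration of the setup, not the paper's analytical framework:

    ```python
    import random

    def simulate(n=200, beta=0.02, steps=400, p=lambda k: 1 - 0.5 ** k, seed=1):
        """Toy store-carry-forward spread: carriers infect on contact with
        probability beta (SI dynamics, no recovery), and a non-trusting node
        that meets a carrier starts trusting with probability p(k), where k
        counts the carrier contacts it has seen so far."""
        rng = random.Random(seed)
        carrier = [False] * n
        carrier[0] = True                 # message source
        observed = [0] * n                # carrier contacts seen per node
        trusting = [False] * n
        for _ in range(steps):
            for _ in range(n):            # ~n random pairwise contacts per step
                a, b = rng.randrange(n), rng.randrange(n)
                if a == b:
                    continue
                for u, v in ((a, b), (b, a)):
                    if carrier[v]:
                        observed[u] += 1
                        if rng.random() < beta:
                            carrier[u] = True          # SI infection step
                        if not trusting[u] and rng.random() < p(observed[u]):
                            trusting[u] = True
        return sum(carrier) / n, sum(trusting) / n

    carried, trusted = simulate()         # final carrying / trusting fractions
    ```

    Varying the p(k) argument reproduces the paper's point that the framework should hold for any distribution of trusting behavior; with p(k) = 0 no node ever trusts, however widely the message spreads.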

  11. Working Group 1: Software System Design and Implementation for Environmental Modeling

    EPA Science Inventory

    ISCMEM Working Group One Presentation, presentation with the purpose of fostering the exchange of information about environmental modeling tools, modeling frameworks, and environmental monitoring databases.

  12. A Bayesian Uncertainty Framework for Conceptual Snowmelt and Hydrologic Models Applied to the Tenderfoot Creek Experimental Forest

    NASA Astrophysics Data System (ADS)

    Smith, T.; Marshall, L.

    2007-12-01

In many mountainous regions, the single most important parameter in forecasting the controls on regional water resources is snowpack (Williams et al., 1999). In an effort to bridge the gap between theoretical understanding and functional modeling of snow-driven watersheds, a flexible hydrologic modeling framework is being developed. The aim is to create a suite of models that move from parsimonious structures, concentrated on aggregated watershed response, to those focused on representing finer scale processes and distributed response. This framework will operate as a tool to investigate the link between hydrologic model predictive performance, uncertainty, model complexity, and observable hydrologic processes. Bayesian methods, and particularly Markov chain Monte Carlo (MCMC) techniques, are extremely useful in uncertainty assessment and parameter estimation of hydrologic models. However, these methods have some difficulties in implementation. In a traditional Bayesian setting, it can be difficult to reconcile multiple data types, particularly those offering different spatial and temporal coverage, depending on the model type. These difficulties are also exacerbated by the sensitivity of MCMC algorithms to model initialization and by complex parameter interdependencies. As a way of circumventing some of these computational complications, adaptive MCMC algorithms have been developed to take advantage of the information gained from each successive iteration. Two adaptive algorithms are compared in this study: the Adaptive Metropolis (AM) algorithm, developed by Haario et al. (2001), and the Delayed Rejection Adaptive Metropolis (DRAM) algorithm, developed by Haario et al. (2006). While neither algorithm is truly Markovian, it has been proven that each satisfies the desired ergodicity and stationarity properties of Markov chains.
Both algorithms were implemented as the uncertainty and parameter estimation framework for a conceptual rainfall-runoff model based on the Probability Distributed Model (PDM), developed by Moore (1985). We implement the modeling framework in the Stringer Creek watershed in the Tenderfoot Creek Experimental Forest (TCEF), Montana. The snowmelt-driven watershed offers the additional challenge of modeling snow accumulation and melt, and current efforts are aimed at developing a temperature- and radiation-index snowmelt model. Auxiliary data available from within TCEF's watersheds are used to support the understanding of information value as it relates to predictive performance. Because the model is based on lumped parameters, auxiliary data are hard to incorporate directly. However, these additional data offer benefits through the ability to inform prior distributions of the lumped model parameters. By incorporating data offering different information into the uncertainty assessment process, a cross-validation technique is employed to better ensure that modeled results reflect real process complexity.
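    A minimal one-dimensional sketch of the Adaptive Metropolis idea (the proposal scale tuned from the chain's own history, after Haario et al., 2001) may help fix the concept; this toy targets a standard normal posterior and is in no way the authors' hydrologic code:

    ```python
    import math
    import random

    def adaptive_metropolis(log_post, n=20000, x0=0.0, s0=1.0, seed=42):
        """Toy 1-D Adaptive Metropolis: after a short initial phase the
        proposal standard deviation is scaled by the empirical spread of
        the chain so far (running moments via Welford's method)."""
        rng = random.Random(seed)
        x = x0
        chain = [x0]
        mean, m2 = x0, 0.0
        for i in range(1, n):
            var = m2 / i                       # chain variance so far
            s = 2.4 * math.sqrt(var + 1e-6) if i > 100 else s0
            prop = x + rng.gauss(0.0, s)
            delta = log_post(prop) - log_post(x)
            if delta >= 0 or rng.random() < math.exp(delta):
                x = prop                       # Metropolis accept
            chain.append(x)
            d = x - mean                       # Welford moment update
            mean += d / (i + 1)
            m2 += d * (x - mean)
        return chain

    # Target: standard normal, log-density up to an additive constant.
    chain = adaptive_metropolis(lambda t: -0.5 * t * t)
    ```

    The 2.4 scaling factor follows the common d = 1 tuning heuristic from the adaptive MCMC literature; DRAM additionally retries rejected proposals with a shrunken scale, which this sketch omits.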

  13. Modeling patient safety incidents knowledge with the Categorial Structure method.

    PubMed

    Souvignet, Julien; Bousquet, Cédric; Lewalle, Pierre; Trombert-Paviot, Béatrice; Rodrigues, Jean Marie

    2011-01-01

Following the WHO initiative named World Alliance for Patient Safety (PS) launched in 2004, a conceptual framework developed by PS national reporting experts has summarized the knowledge available. As a second step, the Department of Public Health team at the University of Saint Etienne elaborated a Categorial Structure (a semi-formal structure not related to an upper-level ontology) identifying the elements of the semantic structure underpinning the broad concepts contained in the framework for patient safety. This knowledge engineering method has been developed to enable modeling of patient safety information as a prerequisite for subsequent full ontology development. The present article describes the semantic dissection of the concepts, the elicitation of the ontology requirements, and the domain constraints of the conceptual framework. This ontology includes 134 concepts and 25 distinct relations and will serve as a basis for an Information Model for Patient Safety.

  14. Authentic scientific data collection in support of an integrative model-based class: A framework for student engagement in the classroom

    NASA Astrophysics Data System (ADS)

    Sorensen, A. E.; Dauer, J. M.; Corral, L.; Fontaine, J. J.

    2017-12-01

A core component of public scientific literacy, and thereby informed decision-making, is the ability of individuals to reason about complex systems. In response to students having difficulty learning about complex systems, educational research suggests that conceptual representations, or mental models, may help orient student thinking. Mental models provide a framework to support students in organizing and developing ideas. The PMC-2E model is a productive tool for teaching ideas of modeling complex systems in the classroom because the conceptual representation framework allows for self-directed learning in which students can externalize systems thinking. Beyond mental models, recent work emphasizes the importance of facilitating the integration of authentic science into the formal classroom. To align these ideas, a university class was developed around the theme of carnivore ecology, founded on the PMC-2E framework and authentic scientific data collection. Students were asked to develop a protocol, collect and analyze data around a scientific question in partnership with a scientist, and then use the data to inform their own learning about the system through the mental model process. We identified two beneficial outcomes: (1) scientific data are collected to address real scientific questions at a larger scale, and (2) positive outcomes emerge for student learning and views of science. After participating in the class, students reported enjoying the class structure and increased support for public understanding of science, and showed shifts in nature-of-science and interest-in-pursuing-science metrics on post-assessments. Work is ongoing to investigate the linkages between engaging in authentic scientific practices that inform student mental models and how this might promote students' systems-thinking skills, with implications for student views of the nature of science and the development of student epistemic practices.

  15. A Biophysical Modeling Framework for Assessing the Environmental Impact of Biofuel Production

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Izaurradle, C.; Manowitz, D.; West, T. O.; Post, W. M.; Thomson, A. M.; Nichols, J.; Bandaru, V.; Williams, J. R.

    2009-12-01

Long-term sustainability of a biofuel economy necessitates environmentally friendly biofuel production systems. We describe a biophysical modeling framework developed to understand and quantify the environmental value and impact (e.g., water balance, nutrient balance, carbon balance, and soil quality) of different biomass cropping systems. This modeling framework consists of three major components: 1) a Geographic Information System (GIS)-based data processing system, 2) a spatially-explicit biophysical modeling approach, and 3) a user-friendly information distribution system. First, we developed a GIS to manage the large amount of geospatial data (e.g., climate, land use, soil, and hydrography) and extract input information for the biophysical model. Second, the Environmental Policy Integrated Climate (EPIC) biophysical model is used to predict the impact of various cropping systems and management intensities on productivity, water balance, and biogeochemical variables. Finally, a geo-database is developed to distribute the results of ecosystem service variables (e.g., net primary productivity, soil carbon balance, soil erosion, nitrogen and phosphorus losses, and N2O fluxes) simulated by EPIC for each spatial modeling unit online using PostgreSQL. We applied this framework in a Regional Intensive Management Area (RIMA) of 9 counties in Michigan. A total of 4,833 spatial units with relatively homogeneous biophysical properties were derived using SSURGO, Crop Data Layer, County, and 10-digit watershed boundaries. For each unit, EPIC was executed from 1980 to 2003 under 54 cropping scenarios (e.g., corn, switchgrass, and hybrid poplar). The simulation results were compared with historical crop yields from USDA NASS. Spatial mapping of the results shows high variability among different cropping scenarios in terms of the simulated ecosystem services variables.
Overall, the framework developed in this study enables the incorporation of environmental factors into economic and life-cycle analysis in order to optimize biomass cropping production scenarios.

  16. NoSQL Based 3D City Model Management System

    NASA Astrophysics Data System (ADS)

    Mao, B.; Harrie, L.; Cao, J.; Wu, Z.; Shen, J.

    2014-04-01

To manage increasingly complicated 3D city models, a framework based on a NoSQL database is proposed in this paper. The framework supports import and export of 3D city models according to international standards such as CityGML, KML/COLLADA, and X3D. We also suggest and implement 3D model analysis and visualization in the framework. For city model analysis, 3D geometry data and semantic information (such as name, height, area, price, and so on) are stored and processed separately. We use a Map-Reduce method to deal with the 3D geometry data since it is more complex, while the semantic analysis is mainly based on database query operations. For visualization, a multiple-representation 3D city structure, CityTree, is implemented within the framework to support dynamic LODs based on the user viewpoint. Also, the proposed framework is easily extensible and supports geoindexes to speed up querying. Our experimental results show that the proposed 3D city management system can efficiently fulfil the analysis and visualization requirements.
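    The map-then-reduce-by-key pattern mentioned for the geometry analysis can be sketched in a few lines; the building records, field names, and the crude surface-area formula below are invented for illustration and are not the paper's data model:

    ```python
    from functools import reduce

    buildings = [
        {"district": "north", "footprint": 120.0, "height": 9.0},
        {"district": "north", "footprint": 80.0,  "height": 6.0},
        {"district": "south", "footprint": 200.0, "height": 15.0},
    ]

    def mapper(b):
        # Map each building to a (district, surface-area) pair, using a
        # crude wall+roof proxy that assumes a square footprint.
        perimeter = 4 * b["footprint"] ** 0.5
        return b["district"], perimeter * b["height"] + b["footprint"]

    def reducer(acc, kv):
        # Reduce by key: accumulate surface area per district.
        key, area = kv
        acc[key] = acc.get(key, 0.0) + area
        return acc

    surface_by_district = reduce(reducer, map(mapper, buildings), {})
    ```

    In a real deployment the mapper would run in parallel over geometry shards in the NoSQL store, while the semantic attributes (name, height, price) would be answered by ordinary database queries, as the abstract describes.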

  17. Semantic Image Segmentation with Contextual Hierarchical Models.

    PubMed

    Seyedhosseini, Mojtaba; Tasdizen, Tolga

    2016-05-01

Semantic segmentation is the problem of assigning an object label to each pixel. It unifies the image segmentation and object recognition problems. The importance of using contextual information in semantic segmentation frameworks has been widely realized in the field. We propose a contextual framework, called the contextual hierarchical model (CHM), which learns contextual information in a hierarchical framework for semantic segmentation. At each level of the hierarchy, a classifier is trained based on downsampled input images and outputs of previous levels. Our model then incorporates the resulting multi-resolution contextual information into a classifier to segment the input image at original resolution. This training strategy allows for optimization of a joint posterior probability at multiple resolutions through the hierarchy. The contextual hierarchical model is based purely on input image patches and does not make use of any fragments or shape examples. Hence, it is applicable to a variety of problems such as object segmentation and edge detection. We demonstrate that CHM performs on par with the state of the art on the Stanford background and Weizmann horse datasets. It also outperforms state-of-the-art edge detection methods on the NYU depth dataset and achieves state-of-the-art results on the Berkeley segmentation dataset (BSDS 500).

  18. A Framework for Context Sensitive Risk-Based Access Control in Medical Information Systems

    PubMed Central

    Choi, Donghee; Kim, Dohoon; Park, Seog

    2015-01-01

    Since the access control environment has changed and the threat of insider information leakage has come to the fore, studies on risk-based access control models that decide access permissions dynamically have been conducted vigorously. Medical information systems should protect sensitive data such as medical information from insider threats while enabling dynamic access control depending on the context, such as life-threatening emergencies. In this paper, we suggest an approach and framework for context-sensitive risk-based access control suitable for medical information systems. This approach categorizes context information and estimates and applies risk through context- and treatment-based permission profiling and specifications, extending the eXtensible Access Control Markup Language (XACML) to express risk. The proposed framework supports quick responses to medical situations and prevents unnecessary insider data access through dynamic access authorization decisions in accordance with the severity of the context and treatment. PMID:26075013
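A minimal sketch of a dynamic, risk-based decision of this kind. The risk tables, roles and threshold are invented for illustration and are not the paper's XACML profiles; the point is only that the same role can be denied in a routine context but permitted in an emergency.

```python
# Hypothetical risk factors per context and per role (lower = safer).
CONTEXT_RISK = {"routine": 0.6, "after_hours": 0.8, "emergency": 0.2}
TREATMENT_RISK = {"attending": 0.2, "nurse": 0.4, "clerk": 0.9}

def decide(context, role, threshold=0.5):
    """Permit only while the combined risk stays below the policy threshold."""
    risk = (CONTEXT_RISK[context] + TREATMENT_RISK[role]) / 2.0
    return "Permit" if risk < threshold else "Deny"

# Access widens dynamically in an emergency for the same role.
for ctx in ("routine", "emergency"):
    print(ctx, "nurse", decide(ctx, "nurse"))
```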

  19. Strengthening stakeholder-engaged research and research on stakeholder engagement.

    PubMed

    Ray, Kristin N; Miller, Elizabeth

    2017-06-01

    Stakeholder engagement is an emerging field with little evidence to inform best practices. Guidelines are needed to improve the quality of research on stakeholder engagement through more intentional planning, evaluation and reporting. We developed a preliminary framework for planning, evaluating and reporting stakeholder engagement, informed by published conceptual models and recommendations and then refined through our own stakeholder engagement experience. Our proposed exploratory framework highlights contexts and processes to be addressed in planning stakeholder engagement, and potential immediate, intermediate and long-term outcomes that warrant evaluation. We use this framework to illustrate both the minimum information needed for reporting stakeholder-engaged research and the comprehensive detail needed for reporting research on stakeholder engagement.

  1. HOW CAN BIOLOGICALLY-BASED MODELING OF ARSENIC KINETICS AND DYNAMICS INFORM THE RISK ASSESSMENT PROCESS?

    EPA Science Inventory

    Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic met...

  2. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    PubMed

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  3. Calibration of skill and judgment in driving: development of a conceptual framework and the implications for road safety.

    PubMed

    Horrey, William J; Lesch, Mary F; Mitsopoulos-Rubens, Eve; Lee, John D

    2015-03-01

    Humans often make inflated or erroneous estimates of their own ability or performance. Such errors in calibration can be due to incomplete processing, neglect of available information, or improper weighting or integration of the information, and can impact our decision-making, risk tolerance, and behaviors. In the driving context, these outcomes can have important implications for safety. The current paper discusses the notion of calibration in the context of self-appraisals and self-competence as well as in models of self-regulation in driving. We further develop a conceptual framework for calibration in the driving context, borrowing from earlier models of momentary demand regulation, information processing, and lens models for information selection and utilization. Finally, using the model we describe the implications of calibration (or, more specifically, errors in calibration) for our understanding of driver distraction, in-vehicle automation and autonomous vehicles, and the training of novice and inexperienced drivers. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  4. A Method for Evaluating Information Security Governance (ISG) Components in Banking Environment

    NASA Astrophysics Data System (ADS)

    Ula, M.; Ula, M.; Fuadi, W.

    2017-02-01

    As modern banking increasingly relies on the internet and computer technologies to operate businesses and market interactions, threats and security breaches have risen sharply in recent years. Insider and outsider attacks have cost global businesses trillions of dollars a year. There is therefore a need for a proper framework to govern information security in banking systems. The aim of this research is to propose and design an enhanced method to evaluate information security governance (ISG) implementation in a banking environment. This research examines and compares elements from commonly used information security governance frameworks, standards and best practices, considering the strengths and weaknesses of their approaches. An initial framework for governing information security in banking systems was constructed from a document review and categorized into three levels: the governance level, the managerial level, and the technical level. The study further conducted an online survey of banking security professionals to obtain their professional judgment about the most critical ISG components and the importance of each ISG component that should be implemented in a banking environment. Data from the survey were used to construct a mathematical model for ISG evaluation, with component importance used as the weighting coefficient for the related component in the model. The research then developed a method for evaluating ISG implementation in banking based on this mathematical model. The proposed method was tested through a case study at a local Indonesian bank. The study shows that the proposed method has sufficient coverage of ISG in the banking environment and effectively evaluates ISG implementation.
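A weighted evaluation model of this general shape can be sketched as follows. The component names, weights and maturity scores below are hypothetical; in the study, the weights come from the practitioner survey, not from guesses.

```python
# Hypothetical survey-derived weights (sum to 1 within each level) and
# assessed maturity scores (0-5) for each ISG component.
weights = {
    "Governance": {"strategy": 0.6, "policy": 0.4},
    "Managerial": {"risk_mgmt": 0.5, "awareness": 0.5},
    "Technical":  {"access_control": 0.7, "monitoring": 0.3},
}
scores = {
    "Governance": {"strategy": 4, "policy": 3},
    "Managerial": {"risk_mgmt": 2, "awareness": 3},
    "Technical":  {"access_control": 4, "monitoring": 2},
}

def evaluate(weights, scores):
    """Weighted score per level, plus an overall mean across levels."""
    per_level = {
        level: sum(w * scores[level][comp] for comp, w in comps.items())
        for level, comps in weights.items()
    }
    overall = sum(per_level.values()) / len(per_level)
    return per_level, overall

per_level, overall = evaluate(weights, scores)
print(per_level, round(overall, 2))
```

Weighting the levels equally in the overall score is itself an assumption; the survey data could equally supply level weights.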

  5. An Integrated Ensemble-Based Operational Framework to Predict Urban Flooding: A Case Study of Hurricane Sandy in the Passaic and Hackensack River Basins

    NASA Astrophysics Data System (ADS)

    Saleh, F.; Ramaswamy, V.; Georgas, N.; Blumberg, A. F.; Wang, Y.

    2016-12-01

    Advances in computational resources and modeling techniques are opening the path to effectively integrating existing complex models. In the context of flood prediction, recent extreme events have demonstrated the importance of integrating components of the hydrosystem to better represent the interactions amongst different physical processes and phenomena. As such, there is a pressing need to develop holistic and cross-disciplinary modeling frameworks that effectively integrate existing models and better represent the operative dynamics. This work presents a novel Hydrologic-Hydraulic-Hydrodynamic Ensemble (H3E) flood prediction framework that operationally integrates existing predictive models representing coastal (New York Harbor Observing and Prediction System, NYHOPS), hydrologic (US Army Corps of Engineers Hydrologic Modeling System, HEC-HMS) and hydraulic (2-dimensional River Analysis System, HEC-RAS) components. The state-of-the-art framework is forced with 125 ensemble meteorological inputs from numerical weather prediction models including the Global Ensemble Forecast System, the European Centre for Medium-Range Weather Forecasts (ECMWF), the Canadian Meteorological Centre (CMC), the Short Range Ensemble Forecast (SREF) and the North American Mesoscale Forecast System (NAM). The framework produces, within a 96-hour forecast horizon, on-the-fly Google Earth flood maps that provide critical information for decision makers and emergency preparedness managers. The utility of the framework was demonstrated by retrospectively forecasting an extreme flood event, Hurricane Sandy, in the Passaic and Hackensack watersheds (New Jersey, USA). Hurricane Sandy caused significant damage to a number of critical facilities in this area, including the New Jersey Transit's main storage and maintenance facility. The results of this work demonstrate that ensemble-based frameworks provide improved flood predictions and useful information about the associated uncertainties, thus improving the assessment of risks compared with a deterministic forecast. The work offers perspectives for short-term flood forecasts, flood mitigation strategies and best management practices under climate change scenarios.

  6. eClims: An Extensible and Dynamic Integration Framework for Biomedical Information Systems.

    PubMed

    Savonnet, Marinette; Leclercq, Eric; Naubourg, Pierre

    2016-11-01

    Biomedical information systems (BIS) require consideration of three types of variability: data variability induced by new high-throughput technologies, schema or model variability induced by large-scale studies or new fields of research, and knowledge variability resulting from new discoveries. Beyond data heterogeneity, managing variability in the context of BIS requires an extensible and dynamic integration process. In this paper, we focus on data and schema variability and propose an integration framework based on ontologies, master data, and semantic annotations. The framework addresses issues related to: 1) collaborative work through a dynamic integration process; 2) variability among studies through an annotation mechanism; and 3) quality control over data and semantic annotations. Our approach relies on two levels of knowledge: BIS-related knowledge is modeled using an application ontology coupled with UML models that allow controlling data completeness and consistency, and domain knowledge is described by a domain ontology, which ensures data coherence. A system built with the eClims framework has been implemented and evaluated in the context of a proteomics platform.

  7. A tiered, integrated biological and chemical monitoring framework for contaminants of emerging concern in aquatic ecosystems.

    PubMed

    Maruya, Keith A; Dodder, Nathan G; Mehinto, Alvine C; Denslow, Nancy D; Schlenk, Daniel; Snyder, Shane A; Weisberg, Stephen B

    2016-07-01

    The chemical-specific risk-based paradigm that informs monitoring and assessment of environmental contaminants does not apply well to the many thousands of new chemicals that are being introduced into ambient receiving waters. We propose a tiered framework that incorporates bioanalytical screening tools and diagnostic nontargeted chemical analysis to more effectively monitor for contaminants of emerging concern (CECs). The framework is based on a comprehensive battery of in vitro bioassays to first screen for a broad spectrum of CECs and nontargeted analytical methods to identify bioactive contaminants missed by the currently favored targeted analyses. Water quality managers in California have embraced this strategy with plans to further develop and test this framework in regional and statewide pilot studies on waterbodies that receive discharge from municipal wastewater treatment plants and stormwater runoff. In addition to directly informing decisions, the data obtained using this framework can be used to construct and validate models that better predict CEC occurrence and toxicity. The adaptive interplay among screening results, diagnostic assessment and predictive modeling will allow managers to make decisions based on the most current and relevant information, instead of extrapolating from parameters with questionable linkage to CEC impacts. Integr Environ Assess Manag 2016;12:540-547. © 2015 SETAC.

  8. "My Understanding Has Grown, My Perspective Has Switched": Linking Informal Writing to Learning Goals

    ERIC Educational Resources Information Center

    Hudd, Suzanne S.; Smart, Robert A.; Delohery, Andrew W.

    2011-01-01

    The use of informal writing is common in sociology. This article presents one model for integrating informal written work with learning goals through a theoretical framework known as concentric thinking. More commonly referred to as "the PTA model" because of the series of cognitive tasks it promotes--prioritization, translation, and analogy…

  9. Design of a component-based integrated environmental modeling framework

    EPA Science Inventory

    Integrated environmental modeling (IEM) includes interdependent science-based components (e.g., models, databases, viewers, assessment protocols) that comprise an appropriate software modeling system. The science-based components are responsible for consuming and producing inform...

  10. Assessing and forecasting population health: integrating knowledge and beliefs in a comprehensive framework.

    PubMed

    Van Meijgaard, Jeroen; Fielding, Jonathan E; Kominski, Gerald F

    2009-01-01

    A comprehensive population health-forecasting model has the potential to interject new and valuable information about the future health status of the population based on current conditions, socioeconomic and demographic trends, and potential changes in policies and programs. Our Health Forecasting Model uses a continuous-time microsimulation framework to simulate individuals' lifetime histories by using birth, risk exposures, disease incidence, and death rates to mark changes in the state of the individual. The model generates a reference forecast of future health in California, including details on physical activity, obesity, coronary heart disease, all-cause mortality, and medical expenditures. We use the model to answer specific research questions, inform debate on important policy issues in public health, support community advocacy, and provide analysis on the long-term impact of proposed changes in policies and programs, thus informing stakeholders at all levels and supporting decisions that can improve the health of populations.
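A continuous-time microsimulation of the kind described can be sketched with competing exponential event times: each individual waits for the earliest of several possible transitions. The states and rates below are invented for illustration and are not the Health Forecasting Model's calibrated inputs.

```python
import random

random.seed(7)

# Illustrative annual transition rates between health states.
RATES = {
    ("healthy", "chd"): 0.01,    # coronary heart disease incidence
    ("healthy", "dead"): 0.005,  # background mortality
    ("chd", "dead"): 0.04,       # case mortality
}

def simulate_life(max_age=100.0):
    """Simulate one lifetime history as a sequence of (age, state) events."""
    age, state, history = 0.0, "healthy", []
    while state != "dead" and age < max_age:
        candidates = [(dst, r) for (src, dst), r in RATES.items() if src == state]
        # Competing risks: the earliest exponential waiting time wins.
        waits = [(random.expovariate(r), dst) for dst, r in candidates]
        dt, nxt = min(waits)
        age += dt
        if age >= max_age:
            break
        state = nxt
        history.append((round(age, 1), state))
    return min(age, max_age), history

lives = [simulate_life() for _ in range(200)]
mean_lifespan = sum(age for age, _ in lives) / len(lives)
print(round(mean_lifespan, 1))
```

Aggregating many simulated histories yields the forecast quantities (incidence, mortality, and by extension costs) the abstract mentions.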

  11. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    NASA Astrophysics Data System (ADS)

    Liang, D.; Liu, X.

    2017-12-01

    A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Although great progress has been made over the years, many studies decompose the uncertainties of a geological model and analyze them item by item, ignoring the combined impact of multi-source uncertainties. To evaluate this combined uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the combined uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty simulates this propagation, and Bayesian inference accomplishes uncertainty updating during modeling. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior satisfies the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that represents the combined impact of all uncertain factors on the spatial structure of the geological model. The framework provides a way to evaluate the combined impact of multi-source uncertainties on a geological model and a starting point for studying uncertainty propagation mechanisms in geological modeling.
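A one-dimensional sketch of the update described here, assuming a maximum-entropy (uniform) prior over known bounds and Gaussian data errors. The quantity (a horizon depth), the bounds and the observation values are all illustrative.

```python
import numpy as np

# Grid over one uncertain quantity, e.g. the depth of a horizon (m).
depths = np.linspace(90.0, 110.0, 401)
# Max-entropy prior given only bounds [90, 110]: uniform.
prior = np.ones_like(depths) / depths.size

def gauss(x, mu, sigma):
    """Unnormalized Gaussian likelihood for an observation with error sigma."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Two noisy observations with different error levels (e.g. borehole vs seismic);
# multiplying likelihoods into the prior integrates both data sources.
posterior = prior * gauss(depths, 101.0, 2.0) * gauss(depths, 98.5, 4.0)
posterior /= posterior.sum()

mean = float(np.sum(depths * posterior))
sd = float(np.sqrt(np.sum((depths - mean) ** 2 * posterior)))
print(round(mean, 2), round(sd, 2))
```

The posterior mean sits between the two observations, weighted by their precisions, and the posterior spread is narrower than either data error alone, which is the "uncertainty updating" the abstract refers to.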

  12. Synergy of the Developed 6D BIM Framework and Conception of the nD BIM Framework and nD BIM Process Ontology

    ERIC Educational Resources Information Center

    O'Keeffe, Shawn Edward

    2013-01-01

    The author developed a unified nD framework and process ontology for Building Information Modeling (BIM). The research includes a framework developed for 6D BIM, nD BIM, and nD ontology that defines the domain and sub-domain constructs for future nD BIM dimensions. The nD ontology defines the relationships of kinds within any new proposed…

  13. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier-Stokes simulations: A data-driven, physics-informed Bayesian approach

    NASA Astrophysics Data System (ADS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C. J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications for enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly into the Reynolds stresses and are represented with a compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model-form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes.
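A single analysis step of a stochastic ensemble Kalman update, the core operation behind the assimilation described above, can be sketched as follows. A toy linear observation operator stands in for the RANS solver, and the state, observation and error values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

N, m = 50, 2                                    # ensemble size, state dimension
X = rng.normal([1.0, 0.0], 0.5, size=(N, m)).T  # prior ensemble (m x N)
H = np.array([[1.0, 1.0]])                      # toy observation operator
R = np.array([[0.05]])                          # observation error covariance
y = np.array([2.0])                             # observed value

Y = H @ X                                       # predicted observations
X_mean = X.mean(axis=1, keepdims=True)
Y_mean = Y.mean(axis=1, keepdims=True)
Cxy = (X - X_mean) @ (Y - Y_mean).T / (N - 1)   # state-observation covariance
Cyy = (Y - Y_mean) @ (Y - Y_mean).T / (N - 1) + R
K = Cxy @ np.linalg.inv(Cyy)                    # Kalman gain

# Perturbed observations keep the posterior ensemble spread consistent.
perturbed = y[:, None] + rng.normal(0.0, np.sqrt(R[0, 0]), size=(1, N))
X_post = X + K @ (perturbed - Y)                # updated (posterior) ensemble

print(X.mean(axis=1).sum(), X_post.mean(axis=1).sum())
```

The posterior ensemble mean is pulled toward the observation; the actual framework iterates this step with the forward model between updates.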

  14. Common and Innovative Visuals: A sparsity modeling framework for video.

    PubMed

    Abdolhosseini Moghadam, Abdolreza; Kumar, Mrityunjay; Radha, Hayder

    2014-05-02

    Efficient video representation models are critical for many video analysis and processing tasks. In this paper, we present a framework based on the concept of finding the sparsest solution to model video frames. To model the spatio-temporal information, frames from one scene are decomposed into two components: (i) a common frame, which describes the visual information common to all the frames in the scene/segment, and (ii) a set of innovative frames, which depict the dynamic behaviour of the scene. The proposed approach exploits and builds on recent results in the field of compressed sensing to jointly estimate the common frame and the innovative frames for each video segment. We refer to the proposed modeling framework as CIV (Common and Innovative Visuals). We show how the proposed model can be utilized to find scene change boundaries and extend CIV to videos from multiple scenes. Furthermore, the proposed model is robust to noise and can be used for various video processing applications without relying on motion estimation and detection or image segmentation. Results for object tracking, video editing (object removal, inpainting) and scene change detection are presented to demonstrate the efficiency and the performance of the proposed model.
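As a much-simplified stand-in for the paper's compressed-sensing solver, the common/innovative split can be illustrated with a per-pixel median over a segment: the median recovers the static background, and the residuals are the sparse, dynamic "innovations". The synthetic frames below are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Build a synthetic segment: a static background plus a small moving patch.
background = rng.random((16, 16))
frames = []
for _ in range(7):
    f = background.copy()
    r, c = rng.integers(0, 15, size=2)
    f[r:r + 2, c:c + 2] += 0.5        # a 2x2 "object" at a random position
    frames.append(f)
video = np.stack(frames)

common = np.median(video, axis=0)      # visual content shared by the segment
innovations = video - common           # per-frame dynamic content (sparse)

sparsity = float(np.mean(np.abs(innovations) > 1e-6))
print(common.shape, round(sparsity, 3))
```

Because the object occupies few pixels in any frame, the innovation frames are mostly zero, which is the sparsity the CIV model exploits (there via an explicit sparse solve rather than a median).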

  15. Incorporating Resilience into Dynamic Social Models

    DTIC Science & Technology

    2016-07-20

    solved by simply using the information provided by the scenario. Instead, additional knowledge is required from relevant fields that study these...resilience function by leveraging Bayesian Knowledge Bases (BKBs), a probabilistic reasoning network framework[5],[6]. BKBs allow for inferencing...reasoning network framework based on Bayesian Knowledge Bases (BKBs). BKBs are central to our social resilience framework as they are used to

  16. The Myth of the Rational Decision Maker: A Framework for Applying and Enhancing Heuristic and Intuitive Decision Making by School Leaders

    ERIC Educational Resources Information Center

    Davis, Stephen H.

    2004-01-01

    This article takes a critical look at administrative decision making in schools and the extent to which complex decisions conform to normative models and common expectations of rationality. An alternative framework for administrative decision making is presented that is informed, but not driven, by theories of rationality. The framework assumes…

  17. Information measures for terrain visualization

    NASA Astrophysics Data System (ADS)

    Bonaventura, Xavier; Sima, Aleksandra A.; Feixas, Miquel; Buckley, Simon J.; Sbert, Mateu; Howell, John A.

    2017-02-01

    Many quantitative and qualitative studies in geoscience research are based on digital elevation models (DEMs) and 3D surfaces to aid understanding of natural and anthropogenically-influenced topography. As well as their quantitative uses, the visual representation of DEMs can add valuable information for identifying and interpreting topographic features. However, choice of viewpoints and rendering styles may not always be intuitive, especially when terrain data are augmented with digital image texture. In this paper, an information-theoretic framework for object understanding is applied to terrain visualization and terrain view selection. From a visibility channel between a set of viewpoints and the component polygons of a 3D terrain model, we obtain three polygonal information measures. These measures are used to visualize the information associated with each polygon of the terrain model. In order to enhance the perception of the terrain's shape, we explore the effect of combining the calculated information measures with the supplementary digital image texture. From polygonal information, we also introduce a method to select a set of representative views of the terrain model. Finally, we evaluate the behaviour of the proposed techniques using example datasets. A publicly available framework for both the visualization and the view selection of a terrain has been created in order to provide the possibility to analyse any terrain model.
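One of the simplest polygonal information measures, viewpoint entropy over the visible (projected) polygon areas, can be sketched as follows. The candidate viewpoints and area values are invented; a real pipeline would measure them from the renderer's visibility channel.

```python
import math

# Projected visible area of each terrain polygon from three candidate views.
visible_areas = {
    "north":   [40.0, 1.0, 1.0, 1.0],    # one polygon dominates the view
    "oblique": [12.0, 10.0, 11.0, 9.0],  # balanced coverage of the terrain
    "south":   [0.0, 25.0, 20.0, 0.0],   # two polygons hidden entirely
}

def viewpoint_entropy(areas):
    """Shannon entropy of the normalized visible-area distribution (bits)."""
    total = sum(areas)
    probs = [a / total for a in areas if a > 0]
    return -sum(p * math.log2(p) for p in probs)

# A representative view spreads visibility evenly over many polygons.
best = max(visible_areas, key=lambda v: viewpoint_entropy(visible_areas[v]))
print(best)
```

Selecting a set of representative views, as in the paper, extends this by greedily adding views that contribute the most new information.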

  18. How Can Biologically-Based Modeling of Arsenic Kinetics and Dynamics Inform the Risk Assessment Process? -- ETD

    EPA Science Inventory

    Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic me...

  19. SQL Collaborative Learning Framework Based on SOA

    NASA Astrophysics Data System (ADS)

    Armiati, S.; Awangga, RM

    2018-04-01

    This research focuses on designing a collaborative-learning-oriented framework for fulfilment services in teaching SQL on Oracle 10g. The framework builds a foundation for academic fulfilment services performed by working units in collaboration with Program Studi Manajemen Informatika. In the design phase, we define the forms of collaboration models and information technology proposed for Program Studi Manajemen Informatika, using a collaboration framework inspired by the modelling stages of a Service Oriented Architecture (SOA). The first stage analyzes subsystems; this activity determines the subsystems involved, their dependencies, and the workflow between them. Once services have been identified, the second stage designs the component specifications, detailing the components implemented in each service, including data, rules, services, configurable profiles, and variations. The third stage allocates services, assigning each service and its components to the identified subsystems. The implemented framework contributes teaching guides and an application architecture that can serve as a foundation for improving services through the application of information technology.

  20. Modeling socio-cultural processes in network-centric environments

    NASA Astrophysics Data System (ADS)

    Santos, Eunice E.; Santos, Eugene, Jr.; Korah, John; George, Riya; Gu, Qi; Kim, Keumjoo; Li, Deqing; Russell, Jacob; Subramanian, Suresh

    2012-05-01

    The major focus in the field of modeling & simulation for network-centric environments has been on the physical layer, with simplifications made for the human-in-the-loop. However, the human element has a big impact on the capabilities of network-centric systems. Taking into account the socio-behavioral aspects of processes such as team building and group decision-making is critical to realistically modeling and analyzing system performance. Modeling socio-cultural processes is a challenge because of the complexity of the networks, dynamism in the physical and social layers, feedback loops and uncertainty in the modeling data. We propose an overarching framework to represent, model and analyze various socio-cultural processes within network-centric environments. The key innovation in our methodology is to simultaneously model the dynamism in both the physical and social layers while providing functional mappings between them. We represent socio-cultural information such as friendships, professional relationships and temperament by leveraging the Culturally Infused Social Network (CISN) framework. The notion of intent is used to relate the underlying socio-cultural factors to observed behavior. We model intent using Bayesian Knowledge Bases (BKBs), a probabilistic reasoning network that can represent incomplete and uncertain socio-cultural information. We leverage previous work on a network performance modeling framework called Network-Centric Operations Performance and Prediction (N-COPP) to incorporate dynamism in various aspects of the physical layer, such as node mobility and transmission parameters. We validate our framework by simulating a suitable scenario, incorporating relevant factors and providing analyses of the results.

  1. Distributed software framework and continuous integration in hydroinformatics systems

    NASA Astrophysics Data System (ADS)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When hydroinformatics systems involve multiple complicated models, multisource structured and unstructured data, and complex requirements analysis, platform design and integration become a challenge. To address these problems, we describe a distributed software framework and its continuous integration process for hydroinformatics systems. The distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for jointly regulating the water quantity and water quality of a group of lakes in Wuhan, China, has been established.

  2. Development of an Electronic Portfolio System Success Model: An Information Systems Approach

    ERIC Educational Resources Information Center

    Balaban, Igor; Mu, Enrique; Divjak, Blazenka

    2013-01-01

    This research has two main goals: to develop an instrument for assessing Electronic Portfolio (ePortfolio) success and to build a corresponding ePortfolio success model using DeLone and McLean's information systems success model as the theoretical framework. For this purpose, we developed an ePortfolio success measurement instrument and structural…

  3. An active monitoring method for flood events

    NASA Astrophysics Data System (ADS)

    Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya

    2018-07-01

    Timely and active detection and monitoring of a flood event are critical for quick response, effective decision-making and disaster reduction. To this end, this paper proposes an active service framework for flood monitoring based on Sensor Web services, together with an active model for the framework's concrete implementation. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and to plan the model input data. The active model, called SMDSA, defines a quantitative calculation method for five elements (scheme, model, data, sensor, and auxiliary information) and their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 was conducted to test the proposed framework and model. The results show that 1) the proposed active service framework is efficient for timely and automated flood monitoring; 2) the active model, SMDSA, provides a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and 3) as much preliminary work as possible should be done in advance to take full advantage of the active service framework and the active model.
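The publish-subscribe active-warning component can be sketched as follows. The broker class, station name and the 8.0 m threshold are illustrative stand-ins for the Sensor Event Service and its subscription filters.

```python
class EventBroker:
    """Minimal broker: subscribers register a filter predicate and a callback."""

    def __init__(self):
        self.subscriptions = []

    def subscribe(self, predicate, callback):
        self.subscriptions.append((predicate, callback))

    def publish(self, event):
        # Push the event to every subscriber whose filter matches.
        for predicate, callback in self.subscriptions:
            if predicate(event):
                callback(event)

alerts = []
broker = EventBroker()
# Subscription: warn when the observed water level exceeds the threshold.
broker.subscribe(lambda e: e["water_level"] > 8.0,
                 lambda e: alerts.append(f"FLOOD WARNING at {e['station']}"))

# Simulated sensor observations streaming into the broker.
for level in (6.2, 7.9, 8.4):
    broker.publish({"station": "Liangzi Lake", "water_level": level})

print(alerts)
```

In the actual framework the matching warning would then trigger the active planning component, rather than just appending to a list.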

  4. Evaluating the Relevance, Reliability, and Applicability of CMIP5 Climate Projections for Water Resources and Environmental Planning

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Scott, J.; Ferguson, I. M.; Arnold, J.; Raff, D. A.; Webb, R. S.

    2012-12-01

    Water managers need to understand the applicability of climate projection information available for decision-support at the scale of their applications. Applicability depends on information reliability and relevance. This need to understand applicability stems from expectations that entities rationalize adaptation investments or decisions to delay investment. It is also occurring at a time when new global climate projections are being released through the World Climate Research Programme Coupled Model Intercomparison Project phase 5 (CMIP5), which introduces new information opportunities and interpretation challenges. This project involves an interagency collaboration to evaluate the applicability of CMIP5 projections for use in water and environmental resources planning. The overarching goal is to develop and demonstrate a framework that involves dual evaluations of relevance and reliability informing an ultimate discussion and judgment of applicability, which is expected to vary with decision-making context. The framework is being developed and demonstrated within the context of reservoir systems management in California's Sacramento and San Joaquin River basins. The relevance evaluation focuses on identifying the climate variables and statistical measures relevant to long-term management questions, which may depend on satisfying multiple objectives. Past studies' results are being considered in this evaluation, along with new results from system sensitivity analyses conducted through this effort. The reliability evaluation focuses on the CMIP5 climate models' ability to simulate past conditions relative to observed references. The evaluation is being conducted across the global domain using a large menu of climate variables and statistical measures, leveraging lessons learned from similar evaluations of CMIP3 climate models. 
The global focus addresses a broader project goal of producing a web resource that can serve reliability information to applicability discussions around the world, with evaluation results being served through a web-portal similar to that developed by NOAA/CIRES to serve CMIP3 information on future climate extremes (http://www.esrl.noaa.gov/psd/ipcc/extremes/). The framework concludes with an applicability discussion informed by relevance and reliability results. The goal is to observe the discussion process and identify features, choice points, and challenges that might be summarized and shared with other resource management groups facing applicability questions. This presentation will discuss the project framework and preliminary results. In addition to considering CMIP5 21st century projection information, the framework is being developed to support evaluation of CMIP5 decadal predictability experiment simulations and reconcile those simulations with 21st century projections. The presentation will also discuss implications of considering the applicability of bias-corrected and downscaled information within this framework.
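
    A reliability evaluation of the kind described, scoring each climate model's simulation of past conditions against an observed reference, might look like the following sketch. The metric choice (RMSE), the variable and all values are illustrative, not taken from the project.

```python
# Sketch: score each model hindcast against an observed reference series
# with root-mean-square error; lower RMSE = more reliable on this measure.
# Model names and numbers are made up for illustration.
import math

def rmse(simulated, observed):
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed))
                     / len(observed))

observed_precip = [3.1, 2.8, 2.2, 1.9, 2.5, 3.0]   # mm/day, reference climatology
hindcasts = {
    "model_A": [3.0, 2.9, 2.1, 2.0, 2.4, 3.2],
    "model_B": [2.2, 2.1, 1.5, 1.3, 1.8, 2.2],
}
scores = {name: rmse(sim, observed_precip) for name, sim in hindcasts.items()}
ranked = sorted(scores, key=scores.get)   # most reliable (lowest error) first
```

    In practice such scores would be computed per variable and per statistical measure, then fed into the applicability discussion alongside the relevance results.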

  5. Colaborated Architechture Framework for Composition UML 2.0 in Zachman Framework

    NASA Astrophysics Data System (ADS)

    Hermawan; Hastarista, Fika

    2016-01-01

    The Zachman Framework (ZF) is the enterprise architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, a Collaborated Architecture Framework (CAF) was developed to collaborate ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of a Model Driven Architecture (MDA) built from various UML models and Software Requirement Specification (SRS) documents. This modeling approach was applied to the development of an Enterprise Resource Planning (ERP) system. Because ERP covers a large number of applications with complex relations, an Agile Model Driven Design (AMDD) approach is used to transform the MDA into application-module components efficiently and accurately. Using the CAF satisfied the needs of all stakeholders involved across the stages of the Rational Unified Process (RUP), and achieved high satisfaction with the functionality of the ERP software at PT. Iglas (Persero) Gresik.
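
    The ZF-matrix composition at the heart of the CAF can be pictured as a simple map from (perspective, aspect) cells to modeling artifacts. The cell contents below are illustrative placeholders, not taken from the study.

```python
# Sketch: a Zachman Framework matrix whose cells hold UML/SRS artifacts,
# keyed by (perspective row, aspect column). Contents are hypothetical.
zf_matrix = {
    ("Planner", "Data"):      ["business glossary"],
    ("Owner", "Function"):    ["use case diagram"],
    ("Designer", "Data"):     ["class diagram"],
    ("Designer", "Function"): ["sequence diagram", "activity diagram"],
}

def artifacts_for(perspective):
    """Collect every modeling artifact defined for one ZF row."""
    return sorted(a
                  for (p, _), arts in zf_matrix.items() if p == perspective
                  for a in arts)
```

    A composition framework like the CAF essentially prescribes which UML 2.0 model types populate which cells of this matrix.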

  6. TLS and photogrammetry for the modeling of a historic wooden framework

    NASA Astrophysics Data System (ADS)

    Koehl, M.; Viale, M.

    2012-04-01

    The building studied is located in the center of Andlau, France. This mansion, built in 1582, was the residence of the Lords of Andlau from the sixteenth century until the French Revolution. Its architecture represents the Renaissance style of the sixteenth century, in particular through its volutes and the spiral staircase inside the polygonal turret. In January 2005 the municipality of Andlau became the owner of this Seigneury, which is intended to house the future Heritage Interpretation Center (HIC); a museum will also be created there. Three attic levels of the building are to be restored and insulated, so the historic framework will be masked and the last three levels will no longer be accessible. In this context, our lab was asked to model the framework in order to support diagnoses and to deepen and consolidate knowledge of this type of historic framework. Beyond virtual visualization, we provided further applications, in particular an accurate 3D model of the framework for animations, as the foundation of a historical information system, and as digital data for the future museum and HIC. The project comprises several phases: data acquisition, model creation and data structuring, creation of an interactive model, and integration into a historical information system. All attic levels were acquired: a Trimble GX 3D scanner was used, and partially a Trimble CX scanner, in particular for data acquisition in the highest part of the framework. The various scans were directly georeferenced in the field using control points, then merged into a single point cloud covering the whole structure. Several panoramic photos were also taken to create a virtual tour of the framework and the surroundings of the Seigneury.
    The purpose of the project was to supply a 3D model enabling the creation of scenographies and interactive content to be integrated into an informative device. In this way the public can easily visualize the framework, manipulate the 3D model, and discover the construction and the various parts of the historic wooden structure. The raw point cloud cannot be used directly for such applications; an exploitable model must therefore be created from the acquired data. Several parameters must be taken into account: the level of detail of the 3D model, the time needed to model all the beams, the size of the final files, and the type of texture applied. The idea was to implement a workflow reconciling these criteria, and several methods were tested. The project produced a range of deliverables (3D models of the complete framework, a virtual tour, interactive 3D models, video animations) that allow an uninitiated public to take advantage of 3D material and software often reserved for professionals. The work was completed by a comparison between a theoretical model of the framework and a more detailed model of its current state, which supported diagnoses, allowed the movements of the structure over time to be studied, and supplied important data for rehabilitation and renovation operations.
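
    The field-georeferencing step, transforming each scan from its local scanner frame into a common coordinate system before merging, can be sketched as a rigid transform per scan. The rotation angles, translations and point coordinates below are illustrative, not survey data.

```python
# Sketch: merge two scans into one georeferenced point cloud by applying a
# rigid transform (rotation about z + translation) to each scan, as would be
# estimated from control points. All numbers are made up for illustration.
import math

def apply_transform(points, theta, tx, ty, tz):
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz)
            for x, y, z in points]

scan_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]     # scanner A local frame
scan_b = [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0)]     # scanner B local frame

merged = (apply_transform(scan_a, 0.0, 10.0, 5.0, 0.0)
          + apply_transform(scan_b, math.pi / 2, 12.0, 5.0, 0.0))
```

    Once all scans live in one frame, the merged cloud is the starting point for beam-by-beam modeling.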

  7. A framework for evaluating forest restoration alternatives and their outcomes, over time, to inform monitoring: Bioregional inventory originated simulation under management

    Treesearch

    Jeremy S. Fried; Theresa B. Jain; Sara Loreno; Robert F. Keefe; Conor K. Bell

    2017-01-01

    The BioSum modeling framework summarizes current and prospective future forest conditions under alternative management regimes along with their costs, revenues and product yields. BioSum translates Forest Inventory and Analysis (FIA) data for input to the Forest Vegetation Simulator (FVS), summarizes FVS outputs for input to the treatment operations cost model (OpCost...

  8. Cross-scale integration of knowledge for predicting species ranges: a metamodeling framework

    PubMed Central

    Talluto, Matthew V.; Boulangeat, Isabelle; Ameztegui, Aitor; Aubin, Isabelle; Berteaux, Dominique; Butler, Alyssa; Doyon, Frédérik; Drever, C. Ronnie; Fortin, Marie-Josée; Franceschini, Tony; Liénard, Jean; McKenney, Dan; Solarik, Kevin A.; Strigul, Nikolay; Thuiller, Wilfried; Gravel, Dominique

    2016-01-01

    Aim: Current interest in forecasting changes to species ranges has resulted in a multitude of approaches to species distribution models (SDMs). However, most approaches include only a small subset of the available information, and many ignore smaller-scale processes such as growth, fecundity, and dispersal. Furthermore, different approaches often produce divergent predictions with no simple method to reconcile them. Here, we present a flexible framework for integrating models at multiple scales using hierarchical Bayesian methods. Location: Eastern North America (as an example). Methods: Our framework builds a metamodel that is constrained by the results of multiple sub-models and provides probabilistic estimates of species presence. We applied our approach to a simulated dataset to demonstrate the integration of a correlative SDM with a theoretical model. In a second example, we built an integrated model combining the results of a physiological model with presence-absence data for sugar maple (Acer saccharum), an abundant tree native to eastern North America. Results: For both examples, the integrated models successfully included information from all data sources and substantially improved the characterization of uncertainty. For the second example, the integrated model outperformed the source models with respect to uncertainty when modelling the present range of the species. When projecting into the future, the model provided a consensus view of two models that differed substantially in their predictions. Uncertainty was reduced where the models agreed and was greater where they diverged, providing a more realistic view of the state of knowledge than either source model. Main conclusions: We conclude by discussing the potential applications of our method and its accessibility to applied ecologists. In ideal cases, our framework can be easily implemented using off-the-shelf software. 
The framework has wide potential for use in species distribution modelling and can drive better integration of multi-source and multi-scale data into ecological decision-making. PMID:27499698
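
    The way an integrated model shrinks uncertainty where sub-models agree can be illustrated with the simplest possible stand-in for the hierarchical Bayesian machinery: inverse-variance ("precision") weighting of two sub-model estimates. The numbers are illustrative, not from the paper.

```python
# Sketch: precision-weighted combination of two sub-model estimates.
# The combined variance is always smaller than either source variance,
# mirroring the uncertainty reduction reported for the metamodel.
def combine(mu1, var1, mu2, var2):
    w1, w2 = 1.0 / var1, 1.0 / var2          # precisions
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)   # weighted consensus estimate
    var = 1.0 / (w1 + w2)                    # combined uncertainty
    return mu, var

# E.g. a correlative SDM and a physiological model that roughly agree:
mu, var = combine(0.70, 0.04, 0.60, 0.09)
```

    Where the sub-models disagree strongly, the full hierarchical treatment would instead inflate the consensus uncertainty, which this scalar sketch does not capture.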

  9. Cross-scale integration of knowledge for predicting species ranges: a metamodeling framework.

    PubMed

    Talluto, Matthew V; Boulangeat, Isabelle; Ameztegui, Aitor; Aubin, Isabelle; Berteaux, Dominique; Butler, Alyssa; Doyon, Frédérik; Drever, C Ronnie; Fortin, Marie-Josée; Franceschini, Tony; Liénard, Jean; McKenney, Dan; Solarik, Kevin A; Strigul, Nikolay; Thuiller, Wilfried; Gravel, Dominique

    2016-02-01

    Current interest in forecasting changes to species ranges has resulted in a multitude of approaches to species distribution models (SDMs). However, most approaches include only a small subset of the available information, and many ignore smaller-scale processes such as growth, fecundity, and dispersal. Furthermore, different approaches often produce divergent predictions with no simple method to reconcile them. Here, we present a flexible framework for integrating models at multiple scales using hierarchical Bayesian methods, using Eastern North America as an example. Our framework builds a metamodel that is constrained by the results of multiple sub-models and provides probabilistic estimates of species presence. We applied our approach to a simulated dataset to demonstrate the integration of a correlative SDM with a theoretical model. In a second example, we built an integrated model combining the results of a physiological model with presence-absence data for sugar maple (Acer saccharum), an abundant tree native to eastern North America. For both examples, the integrated models successfully included information from all data sources and substantially improved the characterization of uncertainty. For the second example, the integrated model outperformed the source models with respect to uncertainty when modelling the present range of the species. When projecting into the future, the model provided a consensus view of two models that differed substantially in their predictions. Uncertainty was reduced where the models agreed and was greater where they diverged, providing a more realistic view of the state of knowledge than either source model. We conclude by discussing the potential applications of our method and its accessibility to applied ecologists. In ideal cases, our framework can be easily implemented using off-the-shelf software. 
The framework has wide potential for use in species distribution modelling and can drive better integration of multi-source and multi-scale data into ecological decision-making.

  10. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner that is entirely compatible with, and integrated into, the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via traditional PRA techniques (i.e., event trees and fault trees) in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.
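
    The conditional-logic quantification of a risk scenario, as in a PRA event tree, reduces to multiplying conditional branch probabilities along a path to an end state. The branch probabilities below are illustrative placeholders, not figures from the CSRM report.

```python
# Sketch: quantify one context-dependent risk scenario as an event-tree path:
# P(end state) = product of conditional probabilities along the path.
# All probabilities are hypothetical, for illustration only.
def path_probability(branches):
    p = 1.0
    for prob in branches:
        p *= prob
    return p

p_demand = 0.10        # triggering context occurs
p_sw_fails = 0.01      # software fails, given that context
p_backup_fails = 0.05  # hardware backup also fails, given software failure

p_loss = path_probability([p_demand, p_sw_fails, p_backup_fails])
```

    In a full PRA each branch probability would itself come from a fault tree or, as CSRM proposes for software, from context-dependent testing or dynamic models such as DFM.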

  11. Development of a flexible higher education curriculum framework for geographic information science

    NASA Astrophysics Data System (ADS)

    Veenendaal, B.

    2014-04-01

    A wide range of geographic information science (GIScience) educational programs currently exists, the oldest now more than 25 years old. Offerings vary from those specifically focussed on geographic information science to those that utilise geographic information systems in various applications and disciplines. Over the past two decades there have been a number of initiatives to design curricula for GIScience, including the NCGIA Core Curriculum, the GIS&T Body of Knowledge and the Geospatial Technology Competency Model. The rapid developments in geospatial technology, applications and organisations mean that curricula need to be constantly updated and developed to maintain currency and relevance. This paper reviews these curriculum initiatives and outlines a new, flexible GIScience higher education curriculum framework which complements and utilises existing curricula. The new framework was applied to the GIScience programs at Curtin University in Perth, Australia, which has surpassed 25 years of GIScience education. Some results of applying this framework are outlined and discussed.

  12. Conceptualising paediatric health disparities: a metanarrative systematic review and unified conceptual framework.

    PubMed

    Ridgeway, Jennifer L; Wang, Zhen; Finney Rutten, Lila J; van Ryn, Michelle; Griffin, Joan M; Murad, M Hassan; Asiedu, Gladys B; Egginton, Jason S; Beebe, Timothy J

    2017-08-04

    There exists a paucity of work on the development and testing of theoretical models specific to childhood health disparities, even though childhood disparities have been linked to the prevalence of adult health disparities, including high rates of chronic disease. We conducted a systematic review and thematic analysis of existing models of health disparities specific to children to inform the development of a unified conceptual framework. We systematically reviewed articles reporting theoretical or explanatory models of disparities on a range of outcomes related to child health. We searched Ovid Medline In-Process & Other Non-Indexed Citations, Ovid MEDLINE, Ovid Embase, Ovid Cochrane Central Register of Controlled Trials, Ovid Cochrane Database of Systematic Reviews, and Scopus (database inception to 9 July 2015). A metanarrative approach guided the analysis process. A total of 48 studies presenting 48 models were included. This systematic review found multiple models but no consensus on one approach. However, we discovered substantial overlap, such that the 48 models reviewed converged into a unified conceptual framework. The majority of models included factors in three domains: individual characteristics and behaviours (88%), healthcare providers and systems (63%), and environment/community (56%). Only 38% of models included factors in the health and public policies domain. A disease-agnostic unified conceptual framework may inform integration of existing knowledge of child health disparities and guide future research. This multilevel framework can focus clinical, basic and social science research on the relationships between policy, social factors, health systems and the physical environment that shape children's health outcomes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Multilevel analysis of sports video sequences

    NASA Astrophysics Data System (ADS)

    Han, Jungong; Farin, Dirk; de With, Peter H. N.

    2006-01-01

    We propose a fully automatic and flexible framework for the analysis and summarization of tennis broadcast video sequences, using visual features and specific game-context knowledge. The framework can analyze a tennis video sequence at three levels, providing a broad range of analysis results. It includes novel pixel-level and object-level tennis video processing algorithms, such as moving-player detection that takes both color and court (playing-field) information into account, and a player-position tracking algorithm based on a 3-D camera model. Additionally, we employ scene-level models for detecting events such as service, baseline rally and net approach, based on a number of real-world visual features. The system can summarize three forms of information: (1) all court-view playing frames in a game; (2) the moving trajectory and real speed of each player, as well as the relative position between player and court; (3) the semantic event segments in a game. The proposed framework is flexible in choosing the desired level of analysis. It is effective because it uses several visual cues obtained from the real-world domain to model important events such as service, thereby increasing the accuracy of the scene-level analysis. The paper presents experimental results highlighting the system's efficiency and analysis capabilities.
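
    The geometric core of camera-model-based player tracking is mapping an image point (e.g. a player's feet) to court coordinates, commonly done with a 3x3 planar homography. The matrix below is a made-up example (pure scaling plus translation), not a calibrated broadcast camera.

```python
# Sketch: map an image point (u, v) to court coordinates with a homography.
# H here is hypothetical; a real H would be estimated from court-line
# correspondences in the broadcast frame.
def apply_homography(H, u, v):
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w          # homogeneous divide

H = [[0.02, 0.0, -5.0],
     [0.0, 0.03, -2.0],
     [0.0,  0.0,  1.0]]

court_x, court_y = apply_homography(H, 400, 300)   # player's feet in the image
```

    Tracking the court position frame by frame then yields the trajectory and real-speed summaries described above.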

  14. Annotation of rule-based models with formal semantics to enable creation, analysis, reuse and visualization.

    PubMed

    Misirli, Goksel; Cavaliere, Matteo; Waites, William; Pocock, Matthew; Madsen, Curtis; Gilfellon, Owen; Honorato-Zimmer, Ricardo; Zuliani, Paolo; Danos, Vincent; Wipat, Anil

    2016-03-15

    Biological systems are complex and challenging to model and therefore model reuse is highly desirable. To promote model reuse, models should include both information about the specifics of simulations and the underlying biology in the form of metadata. The availability of computationally tractable metadata is especially important for the effective automated interpretation and processing of models. Metadata are typically represented as machine-readable annotations which enhance programmatic access to information about models. Rule-based languages have emerged as a modelling framework to represent the complexity of biological systems. Annotation approaches have been widely used for reaction-based formalisms such as SBML. However, rule-based languages still lack a rich annotation framework to add semantic information, such as machine-readable descriptions, to the components of a model. We present an annotation framework and guidelines for annotating rule-based models, encoded in the commonly used Kappa and BioNetGen languages. We adapt widely adopted annotation approaches to rule-based models. We initially propose a syntax to store machine-readable annotations and describe a mapping between rule-based modelling entities, such as agents and rules, and their annotations. We then describe an ontology to both annotate these models and capture the information contained therein, and demonstrate annotating these models using examples. Finally, we present a proof of concept tool for extracting annotations from a model that can be queried and analyzed in a uniform way. The uniform representation of the annotations can be used to facilitate the creation, analysis, reuse and visualization of rule-based models. Although examples are given, using specific implementations the proposed techniques can be applied to rule-based models in general. 
The annotation ontology for rule-based models can be found at http://purl.org/rbm/rbmo. The krdf tool and associated executable examples are available at http://purl.org/rbm/rbmo/krdf. Contact: anil.wipat@newcastle.ac.uk or vdanos@inf.ed.ac.uk. © The Author 2015. Published by Oxford University Press.
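
    The mapping between rule-based modelling entities and machine-readable annotations can be sketched as subject-predicate-object triples, in the spirit of the approach above. The agent, rule and ontology identifiers below are illustrative, not taken from the rbmo ontology.

```python
# Sketch: attach machine-readable annotations to rule-based model entities
# as (entity, predicate, value) triples. All identifiers are hypothetical.
annotations = []

def annotate(entity, predicate, value):
    annotations.append((entity, predicate, value))

# Annotate a hypothetical Kappa agent and rule with external identifiers.
annotate("agent:LacI", "biomodels:isVersionOf", "uniprot:P03023")
annotate("rule:lacI_binds_operator", "biomodels:occursIn", "go:GO:0005737")

def query(predicate):
    """Return all (entity, value) pairs carrying the given predicate."""
    return [(e, v) for e, p, v in annotations if p == predicate]
```

    A uniform triple representation like this is what lets annotations from different models be queried and analyzed in the same way.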

  15. Advanced Computational Framework for Environmental Management ZEM, Version 1.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin

    2016-11-04

    Environmental management problems typically require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models of differing complexity for various types of model analyses, in which the data are applied as soft and hard model constraints. This is needed to provide fast, iterative model analyses based on the latest available data to guide decision-making. Furthermore, both the data and the models carry uncertainties, which are probabilistic (e.g. measurement errors) and non-probabilistic (unknowns, e.g. alternative conceptual models characterizing site conditions). To address all of these issues, we have developed ZEM, an integrated framework for real-time data and model analyses for environmental decision-making. The framework allows seamless, on-the-fly integration of data and modeling results for robust and scientifically defensible decision-making, applying advanced decision-analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced optimization methods capable of dealing with a large number of unknown model parameters, and surrogate (reduced-order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). ZEM is open-source, released under the GPL v3 license, and can be applied to any environmental management site.
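
    An info-gap-style robustness calculation of the kind BIG-DT builds on asks: how far can a parameter deviate from its nominal value before a performance requirement is violated? The following sketch uses a toy performance model and made-up numbers; it is not ZEM code.

```python
# Sketch of an info-gap robustness calculation: find the largest uncertainty
# horizon h such that the performance requirement holds for every deviation
# up to h from the nominal parameter. Model and numbers are hypothetical.
def robustness(nominal, performance, requirement, step=0.01, max_steps=1000):
    h = 0.0
    for i in range(1, max_steps + 1):
        trial = i * step
        # worst-case performance over the interval [nominal-trial, nominal+trial]
        worst = min(performance(nominal - trial), performance(nominal + trial))
        if worst < requirement:
            break
        h = trial
    return h

# Toy model: performance degrades linearly away from the nominal rate k = 2.
performance = lambda k: 1.0 - 0.1 * abs(k - 2.0)
h_star = robustness(nominal=2.0, performance=performance, requirement=0.9)
```

    A decision that tolerates a larger horizon h is more robust to non-probabilistic uncertainty, which is the sense in which BIG-DT ranks alternatives.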

  16. Modelling Method of Recursive Entity

    ERIC Educational Resources Information Center

    Amal, Rifai; Messoussi, Rochdi

    2012-01-01

    With the development of the Information and Communication Technologies, great masses of information are published in the Web. In order to reuse, to share and to organise them in distance formation and e-learning frameworks, several research projects have been achieved and various standards and modelling languages developed. In our previous…

  17. Physiome-model-based state-space framework for cardiac deformation recovery.

    PubMed

    Wong, Ken C L; Zhang, Heye; Liu, Huafeng; Shi, Pengcheng

    2007-11-01

    To more reliably recover cardiac information from noise-corrupted, patient-specific measurements, it is essential to employ meaningful constraining models and to adopt appropriate optimization criteria to couple the models with the measurements. Although biomechanical models have been extensively used for myocardial motion recovery with encouraging results, the passive nature of such constraints limits their ability to fully account for the deformation caused by the active forces of the myocytes. To overcome such limitations, we propose to adopt a cardiac physiome model as the prior constraint for cardiac motion analysis. The cardiac physiome model comprises an electric wave propagation model, an electromechanical coupling model, and a biomechanical model, connected through cardiac system dynamics for a more complete description of macroscopic cardiac physiology. Embedded within a multiframe state-space framework, the uncertainties of the model and the patient's measurements are systematically dealt with to arrive at optimal cardiac kinematic estimates and possibly beyond. Experiments have been conducted to compare the proposed cardiac-physiome-model-based framework with a solely biomechanical-model-based framework. The results show that the proposed framework recovers more accurate cardiac deformation from synthetic data and obtains more sensible estimates from real magnetic resonance image sequences. With the active components introduced by the cardiac physiome model, cardiac deformations recovered from patients' medical images are more physiologically plausible.
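
    The state-space coupling of a constraining model with noisy measurements is, at its simplest, a Kalman-style measurement update. The scalar sketch below illustrates the mechanism only; it is not the multiframe cardiac framework, and the numbers are illustrative.

```python
# Sketch: the measurement-update step of a linear state-space filter, showing
# how a model prediction and a noisy measurement are blended according to
# their uncertainties. Scalar case; values are hypothetical.
def kalman_update(x_pred, p_pred, z, r):
    """Blend model prediction (x_pred, variance p_pred) with measurement z (variance r)."""
    k = p_pred / (p_pred + r)        # Kalman gain: trust in the measurement
    x = x_pred + k * (z - x_pred)    # updated state estimate
    p = (1.0 - k) * p_pred           # updated (reduced) uncertainty
    return x, p

# Equal confidence in model and measurement -> estimate lands halfway.
x, p = kalman_update(x_pred=1.0, p_pred=0.5, z=2.0, r=0.5)
```

    In the cardiac setting the state is a high-dimensional kinematic field and the "model prediction" comes from the physiome dynamics, but the blend-by-uncertainty principle is the same.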

  18. Framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.

    2016-06-25

    Natural and man-made hazardous events resulting in loss of grid infrastructure assets challenge the security and resilience of the electric power grid. However, the planning and allocation of appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be difficult, as the statistical basis needed to directly estimate the probabilities and consequences of their occurrence does not exist. Because risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact, low-frequency events (HILFs) is essential. Insights from such a model indicate where resources are most rationally and effectively expended. A risk-informed realization of designing and maintaining a grid resilient to HILFs will demand consideration of a spectrum of hazards/threats to infrastructure integrity, an understanding of their likelihoods of occurrence, treatment of the fragilities of critical assets to the stressors induced by such events, and through modeling grid network topology, the extent of damage associated with these scenarios. The model resulting from integration of these elements will allow sensitivity assessments based on optional risk management strategies, such as alternative pooling, staging and logistic strategies, and emergency contingency planning. This study is focused on the development of an end-to-end HILF risk-assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision-makers across numerous stakeholder groups in directing resources optimally towards the management of risks to operational continuity.
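
    Combining event likelihood with asset fragilities to estimate expected damage, the integration step described above, can be sketched as a small Monte Carlo simulation. All probabilities and fragilities below are illustrative placeholders, not calibrated HILF data.

```python
# Sketch: Monte Carlo estimate of expected grid-asset losses per year from a
# rare hazard, combining the event likelihood with per-asset fragilities
# (failure probability given the event). Numbers are hypothetical.
import random

def simulate_damage(p_event, fragilities, trials, seed=1):
    rng = random.Random(seed)                        # fixed seed: reproducible
    total_failed = 0
    for _ in range(trials):
        if rng.random() < p_event:                   # does the hazard occur?
            total_failed += sum(rng.random() < f for f in fragilities)
    return total_failed / trials                     # expected assets lost/year

fragilities = [0.8, 0.5, 0.3]        # per-asset failure prob given the event
expected_loss = simulate_damage(p_event=0.02, fragilities=fragilities,
                                trials=200_000)      # analytic mean: 0.032
```

    For low-frequency events this sampling approach substitutes for the missing statistical record, which is exactly why a HILF framework must model likelihoods and fragilities rather than estimate them directly from data.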

  19. Narratives of community engagement: a systematic review-derived conceptual framework for public health interventions.

    PubMed

    Brunton, Ginny; Thomas, James; O'Mara-Eves, Alison; Jamal, Farah; Oliver, Sandy; Kavanagh, Josephine

    2017-12-11

    Government policy increasingly supports engaging communities to promote health. It is critical to consider whether such strategies are effective, for whom, and under what circumstances. However, 'community engagement' is defined in diverse ways and employed for different reasons. Considering theory and context, we developed a conceptual framework that informs understanding of what makes a community engagement intervention effective (or ineffective). We conducted a systematic review of community engagement in public health interventions using: stakeholder involvement; searching, screening, appraisal and coding of the research literature; and iterative thematic syntheses and meta-analysis. A conceptual framework of community engagement was refined following interactions between the framework and each review stage. From 335 included reports, three products emerged: (1) two strong theoretical 'meta-narratives', one concerning the theory and practice of empowerment/engagement as an independent objective, the other a more utilitarian perspective on optimally configuring health services to achieve defined outcomes. These informed (2) models that were operationalized in a subsequent meta-analysis. Both refined (3) the final conceptual framework, which identified multiple dimensions by which community engagement interventions may differ. Diverse combinations of intervention purpose, theory and implementation were noted, including: ways of defining communities and health needs; initial motivations for community engagement; types of participation; conditions and actions necessary for engagement; and potential issues influencing impact. Some dimensions consistently co-occurred, leading to three overarching models of effective engagement which either: utilised peer-led delivery; employed varying degrees of collaboration between communities and health services; or built on empowerment philosophies. 
Our conceptual framework and models are useful tools for considering appropriate and effective approaches to community engagement. These should be tested and adapted to facilitate intervention design and evaluation. Using this framework may disentangle the relative effectiveness of different models of community engagement, promoting effective, sustainable and appropriate initiatives.

  20. Parametric models to relate spike train and LFP dynamics with neural information processing.

    PubMed

    Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan

    2012-01-01

    Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task- or stimulus-specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework to decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set in which strong correlation had previously been obtained only through trial averaging.
We also found that unified models extracted a stronger relationship between neural response latency and trial-by-trial behavioral performance than existing models of neural information processing. Our results highlight the utility of the unified modeling framework for characterizing spike-LFP recordings obtained during behavioral performance.
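    The core idea of the unified model, a constant background rate plus a stimulus-driven component whose onset (latency) is estimated by maximum likelihood, can be sketched in a toy form. This is not the authors' model: the rates, bin width, and onset grid below are invented for illustration, and a Bernoulli approximation to an inhomogeneous Poisson process stands in for the full spike-field framework.

```python
import math
import random

random.seed(0)

DT = 0.001      # bin width (s)
DURATION = 1.0  # trial length (s)

def rate(t, onset, background=5.0, gain=30.0):
    # Unified rate: ongoing background activity plus a stimulus-driven
    # component that switches on at the (unknown) onset time.
    return background + (gain if t >= onset else 0.0)

def simulate_spikes(onset):
    # Bernoulli approximation to an inhomogeneous Poisson process;
    # returns the indices of bins containing a spike.
    return [i for i in range(int(DURATION / DT))
            if random.random() < rate(i * DT, onset) * DT]

def log_likelihood(spike_bins, onset):
    # Discrete-time (Bernoulli) log-likelihood of the spike train.
    spikes = set(spike_bins)
    ll = 0.0
    for i in range(int(DURATION / DT)):
        p = rate(i * DT, onset) * DT
        ll += math.log(p) if i in spikes else math.log(1.0 - p)
    return ll

trial = simulate_spikes(onset=0.4)
candidates = [c / 100.0 for c in range(10, 90)]  # 0.10 .. 0.89 s
latency = max(candidates, key=lambda c: log_likelihood(trial, c))
```

On simulated data of this kind the likelihood peaks near the true onset, which is the single-trial latency estimate the abstract refers to.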

  1. Informing the Gestalt: An Ethical Framework for Allocating Scarce Federal Public Health and Medical Resources to States During Disasters

    PubMed Central

    Knebel, Ann R.; Sharpe, Virginia A.; Danis, Marion; Toomey, Lauren M.; Knickerbocker, Deborah K.

    2017-01-01

    During catastrophic disasters, government leaders must decide how to efficiently and effectively allocate scarce public health and medical resources. The literature about triage decision making at the individual patient level is substantial, and the National Response Framework provides guidance about the distribution of responsibilities between federal and state governments. However, little has been written about the decision-making process of federal leaders in disaster situations when resources are not sufficient to meet the needs of several states simultaneously. We offer an ethical framework and logic model for decision making in such circumstances. We adapted medical triage and the federalism principle to the decision-making process for allocating scarce federal public health and medical resources. We believe that the logic model provides a values-based framework that can inform the gestalt during the iterative decision process used by federal leaders as they allocate scarce resources to states during catastrophic disasters. PMID:24612854

  2. An Optimization-Based State Estimation Framework for Large-Scale Natural Gas Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jalving, Jordan; Zavala, Victor M.

    We propose an optimization-based state estimation framework to track internal space-time flow and pressure profiles of natural gas networks during dynamic transients. We find that the estimation problem is ill-posed (because of the infinite-dimensional nature of the states) and that this leads to instability of the estimator when short estimation horizons are used. To circumvent this issue, we propose moving horizon strategies that incorporate prior information. In particular, we propose a strategy that initializes the prior using steady-state information and compare its performance against a strategy that does not initialize the prior. We find that both strategies are capable of tracking the state profiles but we also find that superior performance is obtained with steady-state prior initialization. We also find that, under the proposed framework, pressure sensor information at junctions is sufficient to track the state profiles. We also derive approximate transport models and show that some of these can be used to achieve significant computational speed-ups without sacrificing estimation performance. We show that the estimator can be easily implemented in the graph-based modeling framework Plasmo.jl and use a multipipeline network study to demonstrate the developments.
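    The effect of prior initialization on a short-horizon estimate can be illustrated with a scalar toy problem. This is only a sketch under strong simplifying assumptions (a locally constant state, quadratic costs), not the paper's PDE-based gas-network model; the weights and readings are invented.

```python
def horizon_estimate(measurements, prior, prior_weight):
    # Weighted least-squares estimate of a (locally constant) state over
    # a short horizon, regularised toward a steady-state prior value:
    #   minimise  prior_weight*(x - prior)^2 + sum_k (y_k - x)^2
    # The closed-form minimiser is a weighted average.
    n = len(measurements)
    return (prior_weight * prior + sum(measurements)) / (prior_weight + n)

# Short horizon of noisy junction-pressure readings (bar):
readings = [49.0, 52.0, 60.0]           # last reading is an outlier
no_prior   = horizon_estimate(readings, prior=50.0, prior_weight=0.0)
with_prior = horizon_estimate(readings, prior=50.0, prior_weight=3.0)
```

With a steady-state prior the estimate is pulled back toward the nominal operating point, mirroring the stabilising role the prior plays over short horizons.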

  3. A KPI framework for process-based benchmarking of hospital information systems.

    PubMed

    Jahn, Franziska; Winter, Alfred

    2011-01-01

    Benchmarking is a major topic for monitoring, directing and elucidating the performance of hospital information systems (HIS). Current approaches neglect the outcome of the processes that are supported by the HIS and their contribution to the hospital's strategic goals. We suggest benchmarking HIS based on clinical documentation processes and their outcome. A framework consisting of a general process model and outcome criteria for clinical documentation processes is introduced.

  4. Applying Task-Technology Fit Model to the Healthcare Sector: a Case Study of Hospitals' Computed Tomography Patient-Referral Mechanism.

    PubMed

    Chen, Ping-Shun; Yu, Chun-Jen; Chen, Gary Yu-Hsin

    2015-08-01

    With the growth in the number of elderly and people with chronic diseases, the number of hospital services will need to increase in the near future. With a myriad of information technologies utilized daily and crucial information-sharing tasks performed at hospitals, understanding the relationship between task performance and information systems has become a critical topic. This research explored the resource pooling of hospital management and considered a computed tomography (CT) patient-referral mechanism between two hospitals using the Task-Technology Fit (TTF) model, an information systems theory framework. The TTF model could be used to assess the 'match' between the task and technology characteristics. The patient-referral process involved an integrated information framework consisting of a hospital information system (HIS), radiology information system (RIS), and picture archiving and communication system (PACS). A formal interview was conducted with the director of the case image center on the applicable characteristics of the TTF model. Next, the Icam DEFinition (IDEF0) method was utilized to depict the As-Is and To-Be models for CT patient-referral medical operational processes. Further, the study used the 'leagility' concept to remove non-value-added activities and increase the agility of hospitals. The results indicated that hospital information systems could support the CT patient-referral mechanism, increase hospital performance, reduce patient wait time, and enhance the quality of care for patients.

  5. Modeling abundance using multinomial N-mixture models

    USGS Publications Warehouse

    Royle, Andy

    2016-01-01

    Multinomial N-mixture models are a generalization of the binomial N-mixture models described in Chapter 6 to allow for more complex and informative sampling protocols beyond simple counts. Many commonly used protocols such as multiple observer sampling, removal sampling, and capture-recapture produce a multivariate count frequency that has a multinomial distribution and for which multinomial N-mixture models can be developed. Such protocols typically result in more precise estimates than binomial mixture models because they provide direct information about parameters of the observation process. We demonstrate the analysis of these models in BUGS using several distinct formulations that afford great flexibility in the types of models that can be developed, and we demonstrate likelihood analysis using the unmarked package. Spatially stratified capture-recapture models are one class of models that fall into the multinomial N-mixture framework, and we discuss analysis of stratified versions of classical models such as model Mb, Mh and other classes of models that are only possible to describe within the multinomial N-mixture framework.
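    The removal-sampling protocol mentioned above induces a multinomial observation model that is easy to sketch. This hypothetical simulation (not the BUGS or unmarked code the chapter demonstrates) shows the multinomial cell probabilities implied by removal sampling with per-occasion capture probability p.

```python
import random

random.seed(1)

def cell_probs(p, occasions=3):
    # Multinomial cell probabilities implied by a removal protocol: an
    # individual is first caught on occasion j+1 with prob p*(1-p)^j;
    # the remaining mass is the probability of never being caught.
    return [p * (1.0 - p) ** j for j in range(occasions)]

def removal_counts(N, p, occasions=3):
    # Simulate removal sampling: each remaining individual is captured
    # with probability p on each occasion and then removed.
    counts, remaining = [], N
    for _ in range(occasions):
        caught = sum(random.random() < p for _ in range(remaining))
        counts.append(caught)
        remaining -= caught
    return counts

probs = cell_probs(0.5)            # [0.5, 0.25, 0.125]; rest never caught
counts = removal_counts(200, 0.5)  # multivariate count frequency
```

Because the cell probabilities carry direct information about the observation process, such counts typically yield more precise abundance estimates than a single binomial count, as the abstract notes.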

  6. Technology-induced errors. The current use of frameworks and models from the biomedical and life sciences literatures.

    PubMed

    Borycki, E M; Kushniruk, A W; Bellwood, P; Brender, J

    2012-01-01

    The objective of this paper is to examine the extent, range and scope to which frameworks, models and theories dealing with technology-induced error have arisen in the biomedical and life sciences literature as indexed by Medline®. To better understand the state of work in the area of technology-induced error involving frameworks, models and theories, the authors conducted a search of Medline® using selected key words identified from seminal articles in this research area. Articles were reviewed, and those pertaining to frameworks, models or theories dealing with technology-induced error were further reviewed by two researchers. All articles from Medline® from its inception to April of 2011 were searched using the above-outlined strategy. The search returned 239 citations. Each of the abstracts for the 239 citations was reviewed by two researchers. Eleven articles met the criteria based on abstract review. These 11 articles were downloaded for further in-depth review. The majority of the articles obtained describe frameworks and models with reference to theories developed in literatures outside of healthcare. The papers were grouped into several areas. It was found that articles drew mainly from three literatures: 1) the human factors literature (including human-computer interaction and cognition), 2) the organizational behavior/sociotechnical literature, and 3) the software engineering literature. A variety of frameworks and models were found in the biomedical and life sciences literatures. These frameworks and models drew upon and extended frameworks, models and theoretical perspectives that have emerged in other literatures. These frameworks and models are informing an emerging line of research in health and biomedical informatics involving technology-induced errors in healthcare.

  7. Surgical wound dehiscence: a conceptual framework for patient assessment.

    PubMed

    Sandy-Hodgetts, Kylie; Carville, Keryln; Leslie, Gavin D

    2018-03-02

    This paper presents a conceptual framework which outlines the risk factors associated with surgical wound dehiscence (SWD) as identified in the literature. The purpose of developing the conceptual framework was to derive an evidence-based, informed understanding of factors associated with SWD, in order to inform a programme of research on the aetiology and potential risk factors of SWD. Incorporated within the patient-centric conceptual framework are patient-related comorbidities, intraoperative and postoperative risk factors related to SWD. These are categorised as either 'mechanical' or 'physiological' mechanisms posited to influence these relationships. The use of the conceptual model for assessment of patients has particular clinical relevance for identification of risk and the management of patients in the pre-, intra- and postoperative period.

  8. An RFID-Based Manufacturing Control Framework for Loosely Coupled Distributed Manufacturing System Supporting Mass Customization

    NASA Astrophysics Data System (ADS)

    Chen, Ruey-Shun; Tsai, Yung-Shun; Tu, Arthur

    In this study we propose a manufacturing control framework based on radio-frequency identification (RFID) technology and a distributed information system to construct a mass-customization production process in a loosely coupled shop-floor control environment. On the basis of this framework, we developed RFID middleware and an integrated information system for tracking and controlling the manufacturing process flow. A bicycle manufacturer was used to demonstrate the prototype system. The findings of this study were that the proposed framework can improve the visibility and traceability of the manufacturing process as well as enhance process quality control and real-time production pedigree access. Using this framework, an enterprise can easily integrate an RFID-based system into its manufacturing environment to facilitate mass customization and a just-in-time production model.

  9. Tailored and Integrated Web-Based Tools for Improving Psychosocial Outcomes of Cancer Patients: The DoTTI Development Framework

    PubMed Central

    Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William

    2014-01-01

    Background Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. Objective The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. Methods The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. Results The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. 
Conclusions This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases. PMID:24641991

  10. Tailored and integrated Web-based tools for improving psychosocial outcomes of cancer patients: the DoTTI development framework.

    PubMed

    Smits, Rochelle; Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William

    2014-03-14

    Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases.

  11. Advanced Performance Modeling with Combined Passive and Active Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dovrolis, Constantine; Sim, Alex

    2015-04-15

    To improve the efficiency of resource utilization and scheduling of scientific data transfers on high-speed networks, the "Advanced Performance Modeling with combined passive and active monitoring" (APM) project investigates and models a general-purpose, reusable and expandable network performance estimation framework. The predictive estimation model and the framework will be helpful in optimizing the performance and utilization of networks as well as sharing resources with predictable performance for scientific collaborations, especially in data intensive applications. Our prediction model utilizes historical network performance information from various network activity logs as well as live streaming measurements from network peering devices. Historical network performance information is used without putting extra load on the resources by active measurement collection. Performance measurements collected by active probing are used judiciously for improving the accuracy of predictions.
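    The passive-plus-active blending described above can be caricatured with an exponentially weighted update: a baseline from historical logs, refined by live probe measurements. This is a hypothetical sketch, not the APM project's estimator; the throughput figures and smoothing factor are invented.

```python
def predict_throughput(history, live, alpha=0.3):
    # Start from the passive, historical baseline (cheap: no extra
    # network load), then fold in active/live measurements with an
    # exponentially weighted moving-average update.
    estimate = sum(history) / len(history)
    for m in live:
        estimate = alpha * m + (1.0 - alpha) * estimate
    return estimate

# Historical log throughputs (Mb/s), then one live probe result:
baseline_only = predict_throughput([100.0, 110.0, 90.0], [])
updated = predict_throughput([100.0, 110.0, 90.0], [200.0], alpha=0.5)
```

The design choice mirrors the abstract: historical data provides a load-free prior, while sparing, judicious active probes correct it toward current conditions.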

  12. Selective 4D modelling framework for spatial-temporal land information management system

    NASA Astrophysics Data System (ADS)

    Doulamis, Anastasios; Soile, Sofia; Doulamis, Nikolaos; Chrisouli, Christina; Grammalidis, Nikos; Dimitropoulos, Kosmas; Manesis, Charalambos; Potsiou, Chryssy; Ioannidis, Charalabos

    2015-06-01

    This paper introduces a predictive (selective) 4D modelling framework in which only the spatial 3D differences are modelled at forthcoming time instances, while regions with no significant spatial-temporal alterations remain intact. To accomplish this, spatial-temporal analysis is first applied between 3D digital models captured at different time instances, yielding dynamic change history maps. Change history maps indicate the spatial probability that a region will need further 3D modelling at forthcoming instances. They therefore support a predictive assessment, localizing the surfaces within objects where a high-accuracy reconstruction process needs to be activated at forthcoming time instances. The proposed 4D Land Information Management System (LIMS) is implemented using open interoperable standards based on the CityGML framework. CityGML allows the description of semantic metadata information and the rights of the land resources. Visualization aspects are also supported to allow easy manipulation, interaction and representation of the 4D LIMS digital parcels and the respective semantic information. The open-source 3DCityDB, incorporating a PostgreSQL geo-database, is used to manage and manipulate 3D data and their semantics. An application detects change through time in a 3D block of plots in an urban area of Athens, Greece. Starting with an accurate 3D model of the buildings in 1983, a change history map is created using automated dense image matching on aerial photos from 2010. Meshes are created for both time instances, and the changes are detected through their comparison.
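    The kernel of a change history map, per-cell differences between two epochs thresholded into a binary "needs re-modelling" mask, can be sketched on a toy height grid. The grid, heights, and threshold are invented; the paper's actual pipeline works on 3D meshes via CityGML/3DCityDB.

```python
def change_map(heights_t0, heights_t1, threshold=0.5):
    # Per-cell absolute height difference between two epochs,
    # thresholded into a binary map of cells needing re-modelling.
    return [[abs(a - b) > threshold for a, b in zip(row0, row1)]
            for row0, row1 in zip(heights_t0, heights_t1)]

# Toy 2x2 digital surface models (heights in metres) for two epochs:
t0 = [[10.0, 10.0],
      [10.0, 10.0]]
t1 = [[10.1, 13.0],    # one plot gained a storey between epochs
      [10.0, 10.2]]
needs_remodelling = change_map(t0, t1)
```

Only the cell with a substantial height change is flagged, so a high-accuracy reconstruction would be re-run there alone, which is the selective aspect of the 4D framework.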

  13. Methodology Evaluation Framework for Component-Based System Development.

    ERIC Educational Resources Information Center

    Dahanayake, Ajantha; Sol, Henk; Stojanovic, Zoran

    2003-01-01

    Explains component-based development (CBD) for distributed information systems and presents an evaluation framework, which highlights the extent to which a methodology is component oriented. Compares prominent CBD methods, discusses ways of modeling, and suggests that this is a first step towards a components-oriented systems development…

  14. Derivation and Implementation of a Model Teaching the Nature of Science Using Informal Science Education Venues

    ERIC Educational Resources Information Center

    Spector, Barbara S.; Burkett, Ruth; Leard, Cyndy

    2012-01-01

    This paper introduces a model for using informal science education venues as contexts within which to teach the nature of science. The model was initially developed to enable university education students to teach science in elementary schools so as to be consistent with "National Science Education Standards" (NSES) (1996) and "A Framework for…

  15. Database integration in a multimedia-modeling environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dorow, Kevin E.

    2002-09-02

    Integration of data from disparate remote sources has direct applicability to modeling, which can support Brownfield assessments. To accomplish this task, a data integration framework needs to be established. A key element in this framework is the metadata that creates the relationship between the pieces of information that are important in the multimedia modeling environment and the information that is stored in the remote data source. The design philosophy is to allow modelers and database owners to collaborate by defining this metadata in such a way that allows interaction between their components. The main parts of this framework include tools to facilitate metadata definition, database extraction plan creation, automated extraction plan execution / data retrieval, and a central clearing house for metadata and modeling / database resources. Cross-platform compatibility (using Java) and standard communications protocols (http / https) allow these parts to run in a wide variety of computing environments (Local Area Networks, Internet, etc.), and, therefore, this framework provides many benefits. Because of the specific data relationships described in the metadata, the amount of data that have to be transferred is kept to a minimum (only the data that fulfill a specific request are provided as opposed to transferring the complete contents of a data source). This allows for real-time data extraction from the actual source. Also, the framework sets up collaborative responsibilities such that the different types of participants have control over the areas in which they have domain knowledge: the modelers are responsible for defining the data relevant to their models, while the database owners are responsible for mapping the contents of the database using the metadata definitions. Finally, the data extraction mechanism allows for the ability to control access to the data and what data are made available.

  16. Multimodal Speaker Diarization.

    PubMed

    Noulas, A; Englebienne, G; Krose, B J A

    2012-01-01

    We present a novel probabilistic framework that fuses information coming from the audio and video modality to perform speaker diarization. The proposed framework is a Dynamic Bayesian Network (DBN) that is an extension of a factorial Hidden Markov Model (fHMM) and models the people appearing in an audiovisual recording as multimodal entities that generate observations in the audio stream, the video stream, and the joint audiovisual space. The framework is very robust to different contexts, makes no assumptions about the location of the recording equipment, and does not require labeled training data as it acquires the model parameters using the Expectation Maximization (EM) algorithm. We apply the proposed model to two meeting videos and a news broadcast video, all of which come from publicly available data sets. The results acquired in speaker diarization are in favor of the proposed multimodal framework, which outperforms the single modality analysis results and improves over the state-of-the-art audio-based speaker diarization.
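    The fusion step of such a multimodal model, combining per-speaker evidence from conditionally independent audio and video streams by summing their log-likelihoods, can be sketched in miniature. The speaker parameters below are fixed and invented purely for illustration; the actual system represents speakers in a factorial HMM and learns its parameters with EM rather than using hand-set Gaussians.

```python
import math

def gauss_logpdf(x, mu, sigma):
    # Log-density of a 1-D Gaussian observation model.
    return -0.5 * math.log(2.0 * math.pi * sigma ** 2) \
           - (x - mu) ** 2 / (2.0 * sigma ** 2)

def diarize(audio_obs, video_obs, speakers):
    # Frame-wise assignment: per-speaker log-likelihoods from the audio
    # and video streams are summed (the streams are treated as
    # conditionally independent given the speaker, as in a factorial
    # model) and the best-scoring speaker is chosen for each frame.
    labels = []
    for a, v in zip(audio_obs, video_obs):
        scores = {name: gauss_logpdf(a, *m["audio"]) +
                        gauss_logpdf(v, *m["video"])
                  for name, m in speakers.items()}
        labels.append(max(scores, key=scores.get))
    return labels

speakers = {
    "spk1": {"audio": (0.0, 1.0), "video": (0.0, 1.0)},
    "spk2": {"audio": (5.0, 1.0), "video": (5.0, 1.0)},
}
labels = diarize([0.2, 4.8, 5.1], [0.1, 5.2, 4.7], speakers)
```

Summing modality log-likelihoods is what lets the joint audiovisual evidence outperform either stream alone when one modality is ambiguous.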

  17. Merging Information Literacy and Evidence-Based Practice in an Undergraduate Health Sciences Curriculum Map

    ERIC Educational Resources Information Center

    Franzen, Susan; Bannon, Colleen M.

    2016-01-01

    The ACRL's "Framework for Information Literacy for Higher Education" offers the opportunity to rethink information literacy teaching and curriculum. However, the ACRL's rescinded "Information Literacy Competency Standards for Higher Education" correlate with the preferred research and decision-making model of the health…

  18. Rapid development of entity-based data models for bioinformatics with persistence object-oriented design and structured interfaces.

    PubMed

    Ezra Tsur, Elishai

    2017-01-01

    Databases are imperative for research in bioinformatics and computational biology. Current challenges in database design include data heterogeneity and context-dependent interconnections between data entities. These challenges drove the development of unified data interfaces and specialized databases. The curation of specialized databases is an ever-growing challenge due to the introduction of new data sources and the emergence of new relational connections between established datasets. Here, an open-source framework for the curation of specialized databases is proposed. The framework supports user-designed models of data encapsulation, object persistency and structured interfaces to local and external data sources such as MalaCards, Biomodels and the National Center for Biotechnology Information (NCBI) databases. The proposed framework was implemented using Java as the development environment, EclipseLink as the data persistency agent and Apache Derby as the database manager. Syntactic analysis was based on the J3D, jsoup, Apache Commons and w3c.dom open libraries. Finally, the construction of a specialized database for aneurysm-associated vascular diseases is demonstrated. This database contains 3-dimensional geometries of aneurysms, patients' clinical information, articles, biological models, related diseases and our recently published model of aneurysms' risk of rupture. The framework is available at: http://nbel-lab.com.
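    The entity-plus-persistence pattern such a framework supports can be illustrated in miniature. This sketch uses Python dataclasses and sqlite3 rather than the paper's Java/EclipseLink/Derby stack, and the entity and its fields are invented for illustration, not taken from the published database schema.

```python
import sqlite3
from dataclasses import dataclass

@dataclass
class Aneurysm:
    # Illustrative data-encapsulation entity; field names are invented.
    patient_id: str
    diameter_mm: float

def save(conn, entity):
    # Persist the entity; in the real framework a persistency agent
    # (EclipseLink) maps objects to the relational store automatically.
    conn.execute("INSERT INTO aneurysm VALUES (?, ?)",
                 (entity.patient_id, entity.diameter_mm))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE aneurysm (patient_id TEXT, diameter_mm REAL)")
save(conn, Aneurysm("p001", 7.2))
rows = list(conn.execute("SELECT * FROM aneurysm"))
```

The point of the pattern is that curators design entities (the encapsulation model) while persistence and structured access to external sources are handled by the framework underneath.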

  19. Novel flood risk assessment framework for rapid decision making

    NASA Astrophysics Data System (ADS)

    Valyrakis, Manousos; Koursari, Eftychia; Solley, Mark

    2016-04-01

    The impacts of catastrophic flooding have increased significantly over the last few decades, primarily because of increased urbanisation in ever-expanding mega-cities and the intensification, in both magnitude and frequency, of extreme hydrologic events. Herein a novel conceptual framework is presented that incorporates the use of real-time information to inform and update low-dimensionality hydraulic models, allowing rapid decision making towards preventing loss of life and safeguarding critical infrastructure. In particular, a case study from the recent UK floods in the area of Whitesands (Dumfries) is presented to demonstrate the utility of this approach. It is demonstrated that effectively combining a wealth of readily available qualitative information (such as crowdsourced visual documentation or live data from sensing techniques) with existing quantitative data can help update hydraulic models appropriately and reduce modelling uncertainties in future flood risk assessments. This approach is even more useful in cases where hydraulic models are limited, do not exist, or were not needed before unpredicted dynamic modifications to the river system took place (for example, reduced or eliminated hydraulic capacity due to blockages). The low computational cost and rapid assessment this framework offers render it promising for innovation in flood management.

  20. Theoretical framework to study exercise motivation for breast cancer risk reduction.

    PubMed

    Wood, Maureen E

    2008-01-01

    To identify an appropriate theoretical framework to study exercise motivation for breast cancer risk reduction among high-risk women. An extensive review of the literature was conducted to gather relevant information pertaining to the Health Promotion Model, self-determination theory, social cognitive theory, Health Belief Model, Transtheoretical Model, theory of planned behavior, and protection motivation theory. An iterative approach was used to summarize the literature related to exercise motivation within each theoretical framework. Protection motivation theory could be used to examine the effects of perceived risk and self-efficacy in motivating women to exercise to facilitate health-related behavioral change. Evidence-based research within a chosen theoretical model can aid practitioners when making practical recommendations to reduce breast cancer risk.

  1. Supporting the Evaluation and Implementation of Musculoskeletal Models of Care: A Globally Informed Framework for Judging Readiness and Success.

    PubMed

    Briggs, Andrew M; Jordan, Joanne E; Jennings, Matthew; Speerin, Robyn; Bragge, Peter; Chua, Jason; Woolf, Anthony D; Slater, Helen

    2017-04-01

    To develop a globally informed framework to evaluate readiness for implementation and success after implementation of musculoskeletal models of care (MOCs). Three phases were undertaken: 1) a qualitative study with 27 Australian subject matter experts (SMEs) to develop a draft framework; 2) an eDelphi study with an international panel of 93 SMEs across 30 nations to evaluate face validity, and refine and establish consensus on the framework components; and 3) translation of the framework into a user-focused resource and evaluation of its acceptability with the eDelphi panel. A comprehensive evaluation framework was developed for judging the readiness and success of musculoskeletal MOCs. The framework consists of 9 domains, with each domain containing a number of themes underpinned by detailed elements. In the first Delphi round, scores of "partly agree" or "completely agree" with the draft framework ranged 96.7%-100%. In the second round, "essential" scores ranged 58.6%-98.9%, resulting in 14 of 34 themes being classified as essential. SMEs strongly agreed or agreed that the final framework was useful (98.8%), usable (95.1%), credible (100%) and appealing (93.9%). Overall, 96.3% strongly supported or supported the final structure of the framework as it was presented, while 100%, 96.3%, and 100% strongly supported or supported the content within the readiness, initiating implementation, and success streams, respectively. An empirically derived framework to evaluate the readiness and success of musculoskeletal MOCs was strongly supported by an international panel of SMEs. The framework provides an important internationally applicable benchmark for the development, implementation, and evaluation of musculoskeletal MOCs. © 2016, American College of Rheumatology.

  2. Hierarchical species distribution models

    USGS Publications Warehouse

    Hefley, Trevor J.; Hooten, Mevin B.

    2016-01-01

    Determining the distribution pattern of a species is important to increase scientific knowledge, inform management decisions, and conserve biodiversity. To infer spatial and temporal patterns, species distribution models have been developed for use with many sampling designs and types of data. Recently, it has been shown that count, presence-absence, and presence-only data can be conceptualized as arising from a point process distribution. Therefore, it is important to understand properties of the point process distribution. We examine how the hierarchical species distribution modeling framework has been used to incorporate a wide array of regression and theory-based components while accounting for the data collection process and making use of auxiliary information. The hierarchical modeling framework allows us to demonstrate how several commonly used species distribution models can be derived from the point process distribution, highlight areas of potential overlap between different models, and suggest areas where further research is needed.
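
    As a concrete illustration of the point-process view, the sketch below (a toy setup, not code from the paper) uses a log-linear intensity over a single covariate; gridded counts then follow a Poisson likelihood whose expected count per cell is the intensity times the cell area:

```python
import math

def intensity(x, beta0, beta1):
    """Log-linear intensity of an inhomogeneous Poisson point process."""
    return math.exp(beta0 + beta1 * x)

def poisson_loglik(counts, covariates, areas, beta0, beta1):
    """Log-likelihood of grid-cell counts under the point-process model:
    the expected count in a cell is intensity(covariate) * cell area."""
    ll = 0.0
    for y, x, a in zip(counts, covariates, areas):
        mu = intensity(x, beta0, beta1) * a
        ll += y * math.log(mu) - mu - math.lgamma(y + 1)
    return ll
```

    Presence-absence and presence-only likelihoods can be derived from the same intensity function, which is what makes the point process a unifying base distribution.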

  3. An Investigation of Information-Seeking Behaviour of Geography Teachers for an Information Service Intervention: The Case of Lesotho

    ERIC Educational Resources Information Center

    Bitso, Constance; Fourie, Ina

    2012-01-01

    Introduction: This study investigates the information needs and information-seeking patterns of secondary level geography teachers in Lesotho to guide the design and implementation of an information service for these teachers. Leckie, Pettigrew and Sylvain's model of professionals' information-seeking served as a theoretical framework but was…

  4. Collaborative Project. A Flexible Atmospheric Modeling Framework for the Community Earth System Model (CESM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gettelman, Andrew

    2015-10-01

    In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.

  5. Building an automated SOAP classifier for emergency department reports.

    PubMed

    Mowery, Danielle; Wiebe, Janyce; Visweswaran, Shyam; Harkema, Henk; Chapman, Wendy W

    2012-02-01

    Information extraction applications that extract structured event and entity information from unstructured text can leverage knowledge of clinical report structure to improve performance. The Subjective, Objective, Assessment, Plan (SOAP) framework, used to structure progress notes to facilitate problem-specific, clinical decision making by physicians, is one example of a well-known, canonical structure in the medical domain. Although its applicability to structuring data is understood, its contribution to information extraction tasks has not yet been determined. The first step to evaluating the SOAP framework's usefulness for clinical information extraction is to apply the model to clinical narratives and develop an automated SOAP classifier that classifies sentences from clinical reports. In this quantitative study, we applied the SOAP framework to sentences from emergency department reports, and trained and evaluated SOAP classifiers built with various linguistic features. We found the SOAP framework can be applied manually to emergency department reports with high agreement (Cohen's kappa coefficients over 0.70). Using a variety of features, we found classifiers for each SOAP class can be created with moderate to outstanding performance with F1 scores of 93.9 (subjective), 94.5 (objective), 75.7 (assessment), and 77.0 (plan). We look forward to expanding the framework and applying the SOAP classification to clinical information extraction tasks. Copyright © 2011. Published by Elsevier Inc.
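
    To give a feel for what sentence-level SOAP classification involves, the sketch below is a minimal keyword baseline, not the authors' learned, feature-based system; the cue lexicons are invented for illustration:

```python
# Hypothetical cue lexicons for each SOAP class; a real classifier would
# learn weights over many linguistic features from labeled sentences.
CUES = {
    "subjective": {"reports", "complains", "denies", "states"},
    "objective":  {"exam", "bp", "temperature", "auscultation"},
    "assessment": {"likely", "consistent", "differential", "diagnosis"},
    "plan":       {"discharge", "prescribe", "follow-up", "admit"},
}

def classify_sentence(sentence):
    """Assign the SOAP class whose cue lexicon overlaps the sentence most."""
    tokens = set(sentence.lower().replace(".", "").split())
    scores = {label: len(tokens & cues) for label, cues in CUES.items()}
    return max(scores, key=scores.get)
```

    A baseline like this makes the task concrete; the reported F1 differences (high for subjective/objective, lower for assessment/plan) suggest where richer features matter most.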

  6. Hospital enterprise Architecture Framework (Study of Iranian University Hospital Organization).

    PubMed

    Haghighathoseini, Atefehsadat; Bobarshad, Hossein; Saghafi, Fatehmeh; Rezaei, Mohammad Sadegh; Bagherzadeh, Nader

    2018-06-01

    Developing smart, fast services for patients and transforming hospitals into modern hospitals is now considered a necessity. In a world inundated with information systems, designing services based on information technology requires a suitable architecture framework. This paper presents a localized enterprise architecture framework for an Iranian university hospital. Based on two dimensions, implementability and possession of appropriate characteristics, 17 of the best-known enterprise architecture frameworks were selected. Five evaluation criteria were then defined from experts' input, and the five highest-ranking frameworks were retained. After careful study, 44 general characteristics were extracted from the 17 frameworks, and a questionnaire was used to judge the necessity of each characteristic through expert opinion and the Delphi method, yielding eight essential criteria. Using the AHP method, TOGAF was then chosen from among the reference frameworks for having the appropriate characteristics and being implementable. An enterprise architecture framework was subsequently designed with TOGAF as a conceptual model and its layers. To determine the parts of the architecture framework, a 145-question questionnaire was prepared based on a literature review and expert opinion. During the localization of TOGAF for Iran, 111 of the 145 parts were selected and certified for use in the hospital, indicating that TOGAF is suitable in this setting. The result is a localized Hospital Enterprise Architecture Model, developed by customizing TOGAF for an Iranian hospital at eight levels and 11 parts, that could also be applied in other Iranian hospitals. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Translation from UML to Markov Model: A Performance Modeling Framework

    NASA Astrophysics Data System (ADS)

    Khan, Razib Hayat; Heegaard, Poul E.

    Performance engineering focuses on the quantitative investigation of a system's behavior during the early phases of the system development life cycle. With this in mind, we delineate a performance modeling framework for communication-system applications that translates high-level UML notation into a Continuous Time Markov Chain (CTMC) model and solves that model for the relevant performance metrics. The framework uses UML collaborations, activity diagrams, and deployment diagrams to generate the performance model of a communication system. System dynamics are captured by UML collaborations and activity diagrams as reusable specification building blocks, while the deployment diagram identifies the components of the system. The collaborations and activities show how reusable building blocks can compose the service components through input and output pins by describing component behavior; a mapping between the collaborations and the system components identified by the deployment diagram is then delineated. Moreover, the UML models are annotated with performance-related quality of service (QoS) information, which is necessary for solving the performance model for the relevant metrics through our proposed framework. The applicability of the framework to performance evaluation is demonstrated by modeling a communication system.
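
    Once the UML models have been translated into a CTMC, solving for performance metrics typically reduces to finding the stationary distribution of the generator matrix. A minimal sketch, assuming a hypothetical two-state (idle/busy) server with invented rates:

```python
import numpy as np

def stationary_distribution(Q):
    """Solve pi @ Q = 0 subject to sum(pi) = 1 for a CTMC generator Q."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])   # append the normalisation constraint
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Assumed toy model: idle (state 0) / busy (state 1), with a hypothetical
# arrival rate of 2.0 and service rate of 3.0.
Q = np.array([[-2.0,  2.0],
              [ 3.0, -3.0]])
pi = stationary_distribution(Q)
```

    Here pi[1] is the long-run utilisation; throughput and delay metrics follow from such stationary probabilities.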

  8. Bayesian calibration for electrochemical thermal model of lithium-ion cells

    NASA Astrophysics Data System (ADS)

    Tagade, Piyush; Hariharan, Krishnan S.; Basu, Suman; Verma, Mohan Kumar Singh; Kolake, Subramanya Mayya; Song, Taewon; Oh, Dukjin; Yeo, Taejung; Doo, Seokgwang

    2016-07-01

    The pseudo-two-dimensional electrochemical thermal (P2D-ECT) model contains many parameters that are difficult to evaluate experimentally. Estimation of these model parameters is challenging due to the computational cost and the transient nature of the model. Due to a lack of complete physical understanding, this issue is aggravated at extreme conditions such as low temperature (LT) operation. This paper presents a Bayesian calibration framework for estimation of the P2D-ECT model parameters. The framework uses a matrix variate Gaussian process representation to obtain a computationally tractable formulation for calibration of the transient model. Performance of the framework is investigated for calibration of the P2D-ECT model across a range of temperatures (333 K-263 K) and operating protocols. In the absence of complete physical understanding, the framework also quantifies structural uncertainty in the calibrated model. This information is used by the framework to test the validity of new physical phenomena before their incorporation in the model. This capability is demonstrated by introducing temperature dependence of Bruggeman's coefficient and lithium plating formation at LT. With the incorporation of the new physics, the calibrated P2D-ECT model accurately predicts the cell voltage with high confidence. The accurate predictions are used to obtain new insights into low temperature lithium ion cell behavior.
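
    The essence of such a calibration can be sketched with a one-parameter toy problem: a grid posterior under a Gaussian measurement-error likelihood and a flat prior, with an exponential-decay stand-in for the expensive P2D-ECT model (all values below are invented for illustration):

```python
import math

def grid_posterior(thetas, data, model, sigma):
    """Normalised grid posterior for one model parameter under a Gaussian
    error likelihood and a flat prior over the candidate grid."""
    post = []
    for th in thetas:
        ll = sum(-0.5 * ((y - model(t, th)) / sigma) ** 2 for t, y in data)
        post.append(math.exp(ll))
    z = sum(post)
    return [p / z for p in post]

# Hypothetical stand-in for the expensive transient model: exponential decay.
model = lambda t, th: math.exp(-th * t)
data = [(0.0, 1.02), (1.0, 0.60), (2.0, 0.38)]   # synthetic observations
thetas = [0.1 * k for k in range(1, 21)]          # candidate parameter grid
posterior = grid_posterior(thetas, data, model, sigma=0.05)
```

    The paper's matrix variate Gaussian process plays the role that the cheap surrogate `model` plays here, making the transient likelihood tractable.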

  9. Depth Reconstruction from Single Images Using a Convolutional Neural Network and a Conditional Random Field Model.

    PubMed

    Liu, Dan; Liu, Xuejun; Wu, Yiguang

    2018-04-24

    This paper presents an effective approach for depth reconstruction from a single image through the incorporation of semantic information and local details from the image. A unified framework for depth acquisition is constructed by joining a deep Convolutional Neural Network (CNN) and a continuous pairwise Conditional Random Field (CRF) model. Semantic information and relative depth trends of local regions inside the image are integrated into the framework. A deep CNN is first used to automatically learn a hierarchical feature representation of the image. To capture more local detail, the relative depth trends of local regions are incorporated into the network. Combined with semantic information from the image, a continuous pairwise CRF is then established and used as the loss function of the unified model. Experiments on real scenes demonstrate that the proposed approach is effective and obtains satisfactory results.

  10. A multiscale framework based on the physiome markup languages for exploring the initiation of osteoarthritis at the bone-cartilage interface.

    PubMed

    Shim, Vickie B; Hunter, Peter J; Pivonka, Peter; Fernandez, Justin W

    2011-12-01

    The initiation of osteoarthritis (OA) has been linked to the onset and progression of pathologic mechanisms at the cartilage-bone interface. Most importantly, this degenerative disease involves cross-talk between the cartilage and subchondral bone environments, so an informative model should contain the complete complex. In order to evaluate this process, we have developed a multiscale model using the open-source ontologies developed for the Physiome Project with cartilage and bone descriptions at the cellular, micro, and macro levels. In this way, we can effectively model the influence of whole body loadings at the macro level and the influence of bone organization and architecture at the micro level, and have cell level processes that determine bone and cartilage remodeling. Cell information is then passed up the spatial scales to modify micro architecture and provide a macro spatial characterization of cartilage inflammation. We evaluate the framework by linking a common knee injury (anterior cruciate ligament deficiency) to proinflammatory mediators as a possible pathway to initiate OA. This framework provides a "virtual bone-cartilage" tool for evaluating hypotheses, treatment effects, and disease onset to inform and strengthen clinical studies.

  11. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    PubMed Central

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739

  12. GIS Application System Design Applied to Information Monitoring

    NASA Astrophysics Data System (ADS)

    Qun, Zhou; Yujin, Yuan; Yuena, Kang

    A natural environment information management system involves on-line instrument monitoring, data communications, database establishment, information management software development, and so on. Its core lies in collecting effective and reliable environmental information, increasing the utilization and sharing of that information through advanced information technology, and providing a timely, scientific foundation for environmental monitoring and management. This thesis adopts C# plug-in application development and uses a complete set of embedded GIS component and tool libraries provided by GIS Engine to build the core of a plug-in GIS application framework, namely the design and implementation of the framework host program and each functional plug-in, as well as the design and implementation of the plug-in GIS application framework platform. Taking advantage of dynamic plug-in loading configuration, the thesis quickly establishes GIS applications through visualized component collaborative modeling and realizes GIS application integration. The developed platform is applicable to any integration involving GIS applications (on the ESRI platform) and can serve as a base platform for GIS application development.

  13. A Multidirectional Model for Assessing Learning Disabled Students' Intelligence: An Information-Processing Framework.

    ERIC Educational Resources Information Center

    Swanson, H. Lee

    1982-01-01

    An information-processing approach to the assessment of learning disabled students' intellectual performance is presented. The model is based on the assumption that intelligent behavior is composed of a variety of problem-solving strategies. An account of child problem solving is explained and illustrated with a "thinking aloud" protocol.…

  14. Framework for Design of Traceability System on Organic Rice Certification

    NASA Astrophysics Data System (ADS)

    Purwandoko, P. B.; Seminar, K. B.; Sutrisno; Sugiyanta

    2018-05-01

    The preference for organic products such as organic rice has increased as people's awareness of healthy, eco-friendly food consumption has grown. It is therefore very important to ensure the organic quality of the products produced. Certification is a series of processes carried out to ensure that a product meets all the criteria of organic standards. Currently, however, no traceability information system for organic rice certification is available: the existing process is conducted manually, causing loss of information during storage. This paper aims to develop a traceability framework for the organic rice certification process. First, the organic certification process itself is discussed. Second, the Unified Modeling Language (UML) is used to model the user requirements of a traceability system for all actors in the certification process. The paper then presents a model of the information captured along the certification process, showing the information flow that must be recorded by each actor. Finally, the challenges in implementing the system are discussed.

  15. Boosting probabilistic graphical model inference by incorporating prior knowledge from multiple sources.

    PubMed

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal-to-noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior for use in graphical model inference. Our first model, called the Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, such as pathway databases, GO terms, and protein domain data, and is flexible enough to integrate new sources, if available.
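
    The Noisy-OR combination itself is compact enough to sketch. Assuming independent support probabilities from each knowledge source (a standard Noisy-OR reading, not the authors' exact implementation), an edge goes unsupported only if every source fails to support it:

```python
def noisy_or(supports):
    """Combine independent per-source support probabilities for an edge
    into one prior probability via the Noisy-OR rule."""
    p_unsupported = 1.0
    for p in supports:
        p_unsupported *= 1.0 - p   # every source must fail to support it
    return 1.0 - p_unsupported
```

    This behaves as the abstract describes: a single strongly supporting source dominates the combined prior, while weak sources accumulate gradually.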

  16. Habitat Suitability Index Models: Muskellunge

    USGS Publications Warehouse

    Cook, Mark F.; Solomon, R. Charles

    1987-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the muskellunge (Esox masquinongy Mitchell). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.
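
    The aggregation step can be sketched under one common HSI convention, a geometric mean of component suitability indices; this is an illustrative convention, not necessarily the exact formula of the muskellunge model:

```python
def hsi(suitability_indices):
    """Combine component suitability indices (each in [0, 1]) into one
    Habitat Suitability Index via a geometric mean, so a single unsuitable
    component (0.0) drives the overall index to 0.0."""
    product = 1.0
    for s in suitability_indices:
        product *= s
    return product ** (1.0 / len(suitability_indices))
```

    The geometric mean reflects a limiting-factor assumption: excellent temperature cannot compensate for absent cover.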

  17. Towards A Topological Framework for Integrating Semantic Information Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Hogan, Emilie A.; Robinson, Michael

    2014-09-07

    In this position paper we argue for the role that topological modeling principles can play in providing a framework for sensor integration. While this methodology has been used successfully with standard (quantitative) sensors, we are developing it in new directions to make it appropriate specifically for semantic information sources, including keyterms, ontology terms, and other general Boolean, categorical, ordinal, and partially ordered data types. We illustrate the basics of the methodology in an extended use case/example and discuss the path forward.

  18. Very Large Graphs for Information Extraction (VLG) Detection and Inference in the Presence of Uncertainty

    DTIC Science & Technology

    2015-09-21

    this framework, MIT LL carried out a one-year proof-of-concept study to determine the capabilities and challenges in the detection of anomalies in...extremely large graphs [5]. Under this effort, two real datasets were considered, and algorithms for data modeling and anomaly detection were developed...is required in a well-defined experimental framework for the detection of anomalies in very large graphs. This study is intended to inform future

  19. A Framework for Engaging Parents in Prevention

    ERIC Educational Resources Information Center

    Randolph, Karen A.; Fincham, Frank; Radey, Melissa

    2009-01-01

    The literature on engaging families in prevention programs is informed by the Health Belief Model (HBM), the Theory of Reasoned Action (TRA), and Family Systems theory. Although useful, these frameworks have not facilitated the development of prevention-based practice strategies that recognize different levels of prevention (i.e., universal,…

  20. Designing and Evaluating Representations to Model Pedagogy

    ERIC Educational Resources Information Center

    Masterman, Elizabeth; Craft, Brock

    2013-01-01

    This article presents the case for a theory-informed approach to designing and evaluating representations for implementation in digital tools to support Learning Design, using the framework of epistemic efficacy as an example. This framework, which is rooted in the literature of cognitive psychology, is operationalised through dimensions of fit…

  1. Using Evidence-Centered Design to Create a Special Educator Observation System

    ERIC Educational Resources Information Center

    Johnson, Evelyn S.; Crawford, Angela R.; Moylan, Laura A.; Zheng, Yuzhu

    2018-01-01

    The Evidence-Centered Design (ECD) framework was used to create a special education teacher observation system, Recognizing Effective Special Education Teachers (RESET). Extensive reviews of research informed the domain analysis and modeling stages, and led to the conceptual framework in which effective special education teaching is…

  2. A satellite-driven, client-server hydro-economic model prototype for agricultural water management

    NASA Astrophysics Data System (ADS)

    Maneta, Marco; Kimball, John; He, Mingzhu; Payton Gardner, W.

    2017-04-01

    Anticipating agricultural water demand, land reallocation, and the impact on farm revenues associated with different policy or climate constraints is a challenge for water managers and policy makers. While current integrated decision support systems based on programming methods provide estimates of farmer reaction to external constraints, they have important shortcomings, such as the high cost of the data collection surveys necessary to calibrate the model, biases associated with inadequate farm sampling, infrequent model updates and recalibration, model overfitting, and their deterministic nature, among other problems. In addition, the administration of water supplies and the generation of policies that promote sustainable agricultural regions depend on more than one bureau or office. Unfortunately, managers from local and regional agencies often use different datasets of variable quality, which complicates coordinated action. To overcome these limitations, we present a client-server, integrated hydro-economic modeling and observation framework driven by satellite remote sensing and other ancillary information from regional monitoring networks. The core of the framework is a stochastic data assimilation system that sequentially ingests remote sensing observations and corrects the parameters of the hydro-economic model at unprecedented spatial and temporal resolutions. An economic model of agricultural production, based on mathematical programming, requires information on crop type and extent, crop yield, crop transpiration, and irrigation technology. A regional hydro-climatologic model provides biophysical constraints to the economic model of agricultural production with a level of detail that permits the study of the spatial impact of large- and small-scale water use decisions. Crop type and extent are obtained from the Cropland Data Layer (CDL), a multi-sensor operational classification of crops maintained by the United States Department of Agriculture.
Because this product is only available for the conterminous United States, the framework is currently only applicable in this region. To obtain information on crop phenology, productivity, and transpiration at adequate spatial and temporal frequencies, we blend high spatial resolution Landsat information with high temporal fidelity MODIS imagery. The result is a 30 m, 8-day fused dataset of crop greenness that is subsequently transformed into productivity and transpiration by adapting existing forest productivity and transpiration algorithms for agricultural applications. To ensure all involved agencies work with identical information and that end-users are sheltered from the computational burden of storing and processing remote sensing data, this modeling framework is integrated in a client-server architecture based on the Hydra platform (www.hydraplatform.org). Assimilation and processing of resource-intensive remote sensing information, as well as hydrologic and other ancillary data, occur on the server side. With this architecture, our decision support system becomes a lightweight 'app' that connects to the server to retrieve the latest information regarding water demands, land use, yields, and the hydrologic information required to run different management scenarios. This architecture ensures that all agencies and teams involved in water management use the same, up-to-date information in their simulations.

  3. REDD+ and climate smart agriculture in landscapes: A case study in Vietnam using companion modelling.

    PubMed

    Salvini, G; Ligtenberg, A; van Paassen, A; Bregt, A K; Avitabile, V; Herold, M

    2016-05-01

    Finding land use strategies that merge land-based climate change mitigation measures and adaptation strategies is still an open issue in climate discourse. This article explores synergies and trade-offs between REDD+, a scheme that focuses mainly on mitigation through forest conservation, and "Climate Smart Agriculture", an approach that emphasizes adaptive agriculture. We introduce a framework for ex-ante assessment of the impact of land management policies and interventions and for quantifying their impacts on land-based mitigation and adaptation goals. The framework includes a companion modelling (ComMod) process informed by interviews with policymakers, local experts, and local farmers. The ComMod process consists of a role-playing game with local farmers and an agent-based model. The game provided a participatory means to develop policy and climate change scenarios. These scenarios were then used as inputs to the agent-based model, a spatially explicit model that simulates landscape dynamics and the associated carbon emissions over decades. We applied the framework using as a case study a community in central Vietnam characterized by deforestation for subsistence agriculture and cultivation of acacias as a cash crop. The main findings show that the framework is useful in guiding consideration of local stakeholders' goals, needs, and constraints. Additionally, the framework provided beneficial information to policymakers, pointing to ways that policies might be re-designed to make them better tailored to local circumstances and therefore more effective in synergistically addressing climate change mitigation and adaptation objectives. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, Allison, E-mail: lewis.allison10@gmail.com; Smith, Ralph; Williams, Brian

    For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.
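
    The sequential selection idea can be sketched with a crude proxy: evaluate the expensive code next at the candidate condition where parameter uncertainty, propagated through the low-fidelity model, yields the widest predictive spread. This is a simplification of the information-theoretic criterion, and the model and numbers below are invented for illustration:

```python
def pick_next_design(candidates, theta_samples, model):
    """Greedy sequential design: return the candidate design condition with
    the largest predictive variance under the current parameter samples,
    a simple proxy for expected information gain."""
    def predictive_variance(x):
        preds = [model(x, th) for th in theta_samples]
        mean = sum(preds) / len(preds)
        return sum((p - mean) ** 2 for p in preds) / len(preds)
    return max(candidates, key=predictive_variance)

# Hypothetical low-fidelity model y = theta * x with three posterior samples
# of theta; predictive spread grows with x, so the largest x is selected.
next_x = pick_next_design([0.0, 1.0, 2.0], [0.5, 1.0, 1.5],
                          lambda x, th: th * x)
```

    In the paper's setting, each selected condition triggers a high-fidelity run whose output then updates the low-fidelity parameter posterior before the next selection.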

  5. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 2: Framework process description

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paula S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    In the second volume of the Demonstration Framework Document, the graphical representation of the demonstration framework is given. This second document was created to facilitate the reading and comprehension of the demonstration framework. It is designed to be viewed in parallel with Section 4.2 of the first volume to help give a picture of the relationships between the UOBs (Units of Behavior) of the model. The model is quite large, and the design team felt that this form of presentation would make it easier for the reader to get a feel for the processes described in this document. The IDEF3 (Process Description Capture Method) diagrams of the processes of an Information System Development are presented. Volume 1 describes the processes and the agents involved with each process, while this volume graphically shows the precedence relationships among the processes.

  6. A Framework for Architectural Heritage HBIM Semantization and Development

    NASA Astrophysics Data System (ADS)

    Brusaporci, S.; Maiezza, P.; Tata, A.

    2018-05-01

    Despite the recognized advantages of the use of BIM in the fields of architecture and engineering, the extension of this procedure to architectural heritage is neither immediate nor trivial. The uniqueness and irregularity of historical architecture, on the one hand, and the great quantity of information necessary for the knowledge of architectural heritage, on the other, require appropriate reflection. The aim of this paper is to define a general framework for the use of BIM procedures for architectural heritage. The proposed methodology distinguishes three Levels of Development (LoD), depending on the characteristics of the building and the objectives of the study: a simplified model with low geometric accuracy and a minimum quantity of information (LoD 200); a model closer to reality but still with a high deviation between the virtual and the real building (LoD 300); and a detailed BIM model that reproduces the geometric irregularities of the building as closely as possible and is enriched with the maximum quantity of information available (LoD 400).

  7. Introduction into the Virtual Olympic Games Framework for online communities.

    PubMed

    Stoilescu, Dorian

    2009-06-01

    This paper presents the design of the Virtual Olympic Games Framework (VOGF), a computer application designed for athletics, health care, general well-being, nutrition and fitness, which offers multiple benefits for its participants. A special interest in starting the design of the framework was exploring how people can connect and participate together using existing computer technologies (i.e. gaming consoles, exercise equipment with computer interfaces, devices for measuring health, speed, force and distance, and Web 2.0 applications). A stationary bike set-up offering users information about their individual health and athletic performance was considered as a starting model. While this model is in the design stage, some preliminary findings are encouraging, suggesting potential for various fields: sports, medicine, theories of learning, technologies and cybercultural studies. First, this framework would allow participants to perform a variety of sports and improve their health. Second, it would involve creating an online environment able to store health information and sport performances, correlated with access to multimedia data and research about performing sports. Third, participants could share experiences with other athletes, coaches and researchers. Fourth, the framework also provides support for the research community in their future investigations.

  8. A surety engineering framework to reduce cognitive systems risks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caudell, Thomas P.; Peercy, David Eugene; Caldera, Eva O.

    Cognitive science research investigates the advancement of human cognition and neuroscience capabilities. Addressing risks associated with these advancements can counter potential program failures, legal and ethical issues, constraints to scientific research, and product vulnerabilities. Survey results, focus group discussions, cognitive science experts, and surety researchers concur technical risks exist that could impact cognitive science research in areas such as medicine, privacy, human enhancement, law and policy, military applications, and national security (SAND2006-6895). This SAND report documents a surety engineering framework and a process for identifying cognitive system technical, ethical, legal and societal risks and applying appropriate surety methods to reduce such risks. The framework consists of several models: Specification, Design, Evaluation, Risk, and Maturity. Two detailed case studies are included to illustrate the use of the process and framework. Several Appendices provide detailed information on existing cognitive system architectures; ethical, legal, and societal risk research; surety methods and technologies; and educing information research with a case study vignette. The process and framework provide a model for how cognitive systems research and full-scale product development can apply surety engineering to reduce perceived and actual risks.

  9. The social determinants of oral health: new approaches to conceptualizing and researching complex causal networks.

    PubMed

    Newton, J Timothy; Bower, Elizabeth J

    2005-02-01

    Oral epidemiological research into the social determinants of oral health has been limited by the absence of a theoretical framework which reflects the complexity of real life social processes and the network of causal pathways between social structure and oral health and disease. In the absence of such a framework, social determinants are treated as isolated risk factors, attributable to the individual, having a direct impact on oral health. There is little sense of how such factors interrelate over time and place and the pathways between the factors and oral health. Features of social life which impact on individuals' oral health but are not reducible to the individual remain under-researched. A conceptual framework informing mainstream epidemiological research into the social determinants of health is applied to oral epidemiology. The framework suggests complex causal pathways between social structure and health via interlinking material, psychosocial and behavioural pathways. Methodological implications for oral epidemiological research informed by the framework, such as the use of multilevel modelling, path analysis and structural equation modelling, combining qualitative and quantitative research methods, and collaborative research, are discussed. Copyright Blackwell Munksgaard, 2005.

  10. Evaluating Health Information Systems Using Ontologies

    PubMed Central

    Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan

    2016-01-01

    Background: There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems.
    Objectives: The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems, whether similar or heterogeneous, by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework.
    Methods: On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries.
    Results: The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project.
    Conclusions: The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems. PMID:27311735
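    The core of the UVON idea, organizing case-specific quality attributes under shared ontology nodes and then aggregating, can be illustrated with a small sketch. The attribute names, category names, and the flat parent map below are invented for illustration; UVON itself builds a richer tree-style ontology:

```python
from collections import Counter

# Hypothetical quality attributes elicited from three eHealth systems.
attributes = {
    "app_a": ["login time", "uptime", "readability"],
    "app_b": ["uptime", "font size"],
    "app_c": ["response time", "readability"],
}

# A one-level stand-in for the ontology: each attribute unified under a node.
parent = {
    "login time": "performance", "response time": "performance",
    "uptime": "reliability",
    "readability": "usability", "font size": "usability",
}

# Aggregate: count how many systems contribute to each unified category.
support = Counter(parent[a] for attrs in attributes.values() for a in set(attrs))

def top_aspects(support, k):
    """Pick at most k evaluation aspects, favouring widely shared categories."""
    return [category for category, _ in support.most_common(k)]
```

    Categories supported by many systems surface first, which is how a unified set of evaluation aspects can be extracted from a heterogeneous set of systems.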

  11. Evaluating Health Information Systems Using Ontologies.

    PubMed

    Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan

    2016-06-16

    There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems, whether similar or heterogeneous, by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems.

  12. Use of Annotations for Component and Framework Interoperability

    NASA Astrophysics Data System (ADS)

    David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.

    2009-12-01

    The popular programming languages Java and C# provide annotations, a form of meta-data construct. Software frameworks for web integration, web services, database access, and unit testing now take advantage of annotations to reduce the complexity of APIs and the quantity of integration code between the application and framework infrastructure. Adopting annotation features in frameworks has been observed to lead to cleaner and leaner application code. The USDA Object Modeling System (OMS) version 3.0 fully embraces the annotation approach and additionally defines a meta-data standard for components and models. In version 3.0, framework/model integration previously accomplished using API calls is achieved using descriptive annotations. This enables the framework to provide additional functionality non-invasively, such as implicit multithreading and auto-documenting capabilities, while achieving a significant reduction in the size of the model source code. Using a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework. Since models and modeling components are not directly bound to the framework by the use of specific APIs and/or data types, they can more easily be reused both within the framework and outside of it. To compare the effectiveness of the annotation-based approach with that of other modeling frameworks, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A monthly water balance model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. As a next step, the PRMS model was implemented in OMS 3.0 and is currently being implemented for water supply forecasting in the western United States at the USDA NRCS National Water and Climate Center. PRMS is a component-based, modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of using the new framework's annotation-based approach. The fully annotated components now provide information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
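    Python decorators can play a role analogous to Java/C# annotations. The sketch below illustrates the non-invasive, metadata-driven integration style described here; the decorator names and the component are invented for illustration and are not the OMS 3.0 API:

```python
def inp(name):
    """Declare a component input as metadata (no framework dependency)."""
    def tag(cls):
        cls.inputs = getattr(cls, "inputs", ()) + (name,)
        return cls
    return tag

def out(name):
    """Declare a component output as metadata."""
    def tag(cls):
        cls.outputs = getattr(cls, "outputs", ()) + (name,)
        return cls
    return tag

@inp("precip")
@inp("temp")
@out("runoff")
class WaterBalance:
    """Toy monthly water balance: runoff is precipitation minus a
    temperature-driven loss, floored at zero."""
    def execute(self, precip, temp):
        return max(precip - 0.5 * temp, 0.0)
```

    A framework can now discover a component's inputs and outputs by introspection (e.g. for dataflow analysis or documentation) without the component ever importing a framework API, which is the non-invasiveness property the study measures.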

  13. Quantifying structural uncertainty on fault networks using a marked point process within a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Aydin, Orhun; Caers, Jef Karel

    2017-08-01

    Faults are one of the building blocks for subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to the location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and the interpreter's intuition about fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods exist that address specific sources of fault network uncertainty and the complexities of fault modeling, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, the marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set-based approach. A Markov chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from the Nankai Trough & Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data, with only partially visible faults and many faults missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone tectonics similar to those of the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces. We show that the proposed methodology generates realistic fault network models conditioned to data and a conceptual model of the underlying tectonics.
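    A minimal sketch of the point-process machinery, assuming a plain (unmarked) Strauss process on the unit square and a simple relocation Metropolis kernel; the paper's marks, level-set fault representation, and conditioning to observations are not reproduced here:

```python
import math
import random

def n_close_pairs(points, r):
    """s(x): number of point pairs closer than the interaction radius r."""
    return sum(math.dist(p, q) < r
               for i, p in enumerate(points) for q in points[i + 1:])

def log_density(points, beta, gamma, r):
    """Unnormalised log-density of a Strauss process: beta^n * gamma^s(x)."""
    return (len(points) * math.log(beta)
            + n_close_pairs(points, r) * math.log(gamma))

def mh_move(points, beta, gamma, r, rng):
    """One Metropolis step: propose relocating a random point in [0,1]^2."""
    if not points:
        return points
    i = rng.randrange(len(points))
    proposal = points[:i] + [(rng.random(), rng.random())] + points[i + 1:]
    logr = (log_density(proposal, beta, gamma, r)
            - log_density(points, beta, gamma, r))
    return proposal if rng.random() < math.exp(min(logr, 0.0)) else points
```

    With gamma < 1, the term gamma^s(x) penalises configurations with many close pairs, which is how pattern regularity can be encoded in a structural prior before conditioning on observations.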

  14. An information maximization model of eye movements

    NASA Technical Reports Server (NTRS)

    Renninger, Laura Walker; Coughlan, James; Verghese, Preeti; Malik, Jitendra

    2005-01-01

    We propose a sequential information maximization model as a general strategy for programming eye movements. The model reconstructs high-resolution visual information from a sequence of fixations, taking into account the fall-off in resolution from the fovea to the periphery. From this framework we get a simple rule for predicting fixation sequences: after each fixation, fixate next at the location that minimizes uncertainty (maximizes information) about the stimulus. By comparing our model performance to human eye movement data and to predictions from a saliency and random model, we demonstrate that our model is best at predicting fixation locations. Modeling additional biological constraints will improve the prediction of fixation sequences. Our results suggest that information maximization is a useful principle for programming eye movements.
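    The fixation rule can be sketched in one dimension, assuming an exponential fall-off of resolution with eccentricity; the fall-off shape, scale, and update rule below are illustrative choices, not the paper's fitted model:

```python
import math

def gain(uncertainty, f, scale=2.0):
    """Information gained by fixating location f: nearby uncertainty is
    resolved well (fovea), distant uncertainty poorly (periphery)."""
    return sum(u * math.exp(-abs(x - f) / scale)
               for x, u in uncertainty.items())

def next_fixation(uncertainty):
    """The model's greedy rule: fixate where the expected gain is maximal."""
    return max(uncertainty, key=lambda f: gain(uncertainty, f))

def update(uncertainty, f, scale=2.0):
    """After fixating f, each location keeps only its unresolved fraction."""
    return {x: u * (1 - math.exp(-abs(x - f) / scale))
            for x, u in uncertainty.items()}
```

    Iterating next_fixation and update produces a fixation sequence that greedily minimises residual uncertainty about the stimulus, which is the strategy the model proposes.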

  15. Confirmatory Factor Analysis of the Patient Reported Outcomes Measurement Information System (PROMIS) Adult Domain Framework Using Item Response Theory Scores.

    PubMed

    Carle, Adam C; Riley, William; Hays, Ron D; Cella, David

    2015-10-01

    To guide measure development, National Institutes of Health-supported Patient reported Outcomes Measurement Information System (PROMIS) investigators developed a hierarchical domain framework. The framework specifies health domains at multiple levels. The initial PROMIS domain framework specified that physical function and symptoms such as Pain and Fatigue indicate Physical Health (PH); Depression, Anxiety, and Anger indicate Mental Health (MH); and Social Role Performance and Social Satisfaction indicate Social Health (SH). We used confirmatory factor analyses to evaluate the fit of the hypothesized framework to data collected from a large sample. We used data (n=14,098) from PROMIS's wave 1 field test and estimated domain scores using the PROMIS item response theory parameters. We then used confirmatory factor analyses to test whether the domains corresponded to the PROMIS domain framework as expected. A model corresponding to the domain framework did not provide ideal fit [root mean square error of approximation (RMSEA)=0.13; comparative fit index (CFI)=0.92; Tucker Lewis Index (TLI)=0.88; standardized root mean square residual (SRMR)=0.09]. On the basis of modification indices and exploratory factor analyses, we allowed Fatigue to load on both PH and MH. This model fit the data acceptably (RMSEA=0.08; CFI=0.97; TLI=0.96; SRMR=0.03). Our findings generally support the PROMIS domain framework. Allowing Fatigue to load on both PH and MH improved fit considerably.
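    For reference, the fit indices quoted above can be computed from model and null-model chi-square statistics. A sketch of common formulas (software packages differ in minor details, such as using N versus N-1 in the RMSEA denominator):

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation (common N-1 variant)."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2, df, chi2_null, df_null):
    """Comparative fit index: 1 minus the ratio of the model's noncentrality
    estimate to the null (baseline) model's, clamped into [0, 1]."""
    d_model = max(chi2 - df, 0.0)
    d_null = max(chi2_null - df_null, 0.0)
    denom = max(d_null, d_model)
    return 1.0 if denom == 0.0 else 1.0 - d_model / denom
```

    A model whose chi-square is no larger than its degrees of freedom yields RMSEA = 0 and CFI = 1, consistent with the conventional reading that smaller RMSEA and larger CFI indicate better fit.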

  16. HIS/BUI: a conceptual model for bottom-up integration of hospital information systems.

    PubMed

    Zviran, M; Armoni, A; Glezer, C

    1998-06-01

    Many successful applications of information systems have been introduced and implemented in hospitals. However, the integration of these applications into a cohesive hospital-wide information system has proved to be more complicated to develop and difficult to accomplish than expected. This paper introduces HIS/BUI, a framework for bottom-up integration of hospital information systems, and demonstrates its application through a real-life case scenario. The scope of the proposed framework is the integration of heterogeneous clinical, administrative, and financial information elements of a hospital into a unified system environment. Under the integrated architecture, all existing local applications are preserved and interconnected to an information hub that serves as a central medical and administrative data warehouse.
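    The hub topology described here, local applications preserved and interconnected through a central store, can be sketched as follows; the class, sources, and record fields are illustrative, not the HIS/BUI specification:

```python
class InformationHub:
    """Central hub: local applications stay intact and publish records here,
    playing the role of the central medical/administrative data warehouse."""
    def __init__(self):
        self.store = []

    def publish(self, source, record):
        """A local application pushes a record, tagged with its origin."""
        self.store.append({"source": source, **record})

    def query(self, **criteria):
        """Hospital-wide lookup across all connected applications."""
        return [r for r in self.store
                if all(r.get(k) == v for k, v in criteria.items())]

hub = InformationHub()
hub.publish("lab_system", {"patient": "p1", "test": "CBC"})
hub.publish("billing", {"patient": "p1", "amount": 120})
```

    The design choice mirrors the bottom-up integration argument: each existing application keeps its own data model and merely publishes to the hub, so no application has to be rewritten.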

  17. An Adaptive Sensor Mining Framework for Pervasive Computing Applications

    NASA Astrophysics Data System (ADS)

    Rashidi, Parisa; Cook, Diane J.

    Analyzing sensor data in pervasive computing applications brings unique challenges to the KDD community. The challenge is heightened when the underlying data source is dynamic and the patterns change. We introduce a new adaptive mining framework that detects patterns in sensor data, and more importantly, adapts to the changes in the underlying model. In our framework, the frequent and periodic patterns of data are first discovered by the Frequent and Periodic Pattern Miner (FPPM) algorithm; and then any changes in the discovered patterns over the lifetime of the system are discovered by the Pattern Adaptation Miner (PAM) algorithm, in order to adapt to the changing environment. This framework also captures vital context information present in pervasive computing applications, such as the startup triggers and temporal information. In this paper, we present a description of our mining framework and validate the approach using data collected in the CASAS smart home testbed.
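    A simplified flavour of periodic-pattern discovery, one ingredient of an FPPM-style miner, is to flag events whose inter-arrival times are nearly constant. A sketch (the tolerance rule and event log format are assumptions, not the published algorithm):

```python
from statistics import median

def periodic_events(log, tolerance=0.2):
    """Return {event: period} for events whose inter-arrival gaps all lie
    within a relative tolerance of the median gap (i.e. near-periodic)."""
    times = {}
    for t, event in log:                    # log: (timestamp, event) pairs
        times.setdefault(event, []).append(t)
    result = {}
    for event, ts in times.items():
        gaps = [b - a for a, b in zip(ts, ts[1:])]
        if len(gaps) >= 2:                  # need at least 3 occurrences
            m = median(gaps)
            if all(abs(g - m) <= tolerance * m for g in gaps):
                result[event] = m
    return result
```

    An adaptation step in the spirit of PAM could then re-run this detection over a sliding window and compare the discovered periods against the previously stored ones to notice drift.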

  18. Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.

    PubMed

    Gomez, Christophe; Hartung, Niklas

    2018-01-01

    Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
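    The first framework can be sketched by simulating emission times of a homogeneous Poisson process and then letting each emitted metastasis emit in turn; the constant rate and two-generation depth are simplifying assumptions, not the chapter's general model:

```python
import random

def emission_times(rate, horizon, rng):
    """Event times of a homogeneous Poisson process on (0, horizon]."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)      # exponential inter-arrival times
        if t > horizon:
            return times
        times.append(t)

def metastatic_tree(rate, horizon, rng, depth=2):
    """Primary emissions from the tumor, plus secondary emissions from each
    metastasis over its remaining time window."""
    events = emission_times(rate, horizon, rng)
    if depth > 1:
        for t0 in list(events):
            events += [t0 + s for s in emission_times(rate, horizon - t0, rng)]
    return sorted(events)
```

    The secondary-emission loop is the stochastic counterpart of the crosslink discussed in the chapter: averaging over many such simulated trees recovers a deterministic, size-structured description of the metastatic population.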

  19. BIM-GIS Integrated Geospatial Information Model Using Semantic Web and RDF Graphs

    NASA Astrophysics Data System (ADS)

    Hor, A.-H.; Jadidi, A.; Sohn, G.

    2016-06-01

    In recent years, 3D virtual indoor/outdoor urban modelling has become a key spatial information framework for many civil and engineering applications such as evacuation planning, emergency and facility management. Accomplishing such sophisticated decision tasks creates a large demand for building multi-scale and multi-sourced 3D urban models. Currently, Building Information Models (BIM) and Geographical Information Systems (GIS) are broadly used as the modelling sources. However, sharing data and exchanging information between the two modelling domains is still a huge challenge: syntactic or semantic approaches alone do not fully support the exchange of rich semantic and geometric information from BIM into GIS or vice versa. This paper proposes a novel approach for integrating BIM and GIS using semantic web technologies and Resource Description Framework (RDF) graphs. The novelty of the proposed solution comes from the benefits of integrating BIM and GIS technologies into one unified model, the so-called Integrated Geospatial Information Model (IGIM). The proposed approach consists of three main modules: construction of BIM-RDF and GIS-RDF graphs, integration of the two RDF graphs, and querying of information through the IGIM-RDF graph using SPARQL. The IGIM generates queries from both the BIM and GIS RDF graphs, resulting in a semantically integrated model with entities representing both BIM classes and GIS feature objects with respect to the target client application. The linkage between BIM-RDF and GIS-RDF is achieved through SPARQL endpoints and defined by a query using a set of datasets and entity classes with complementary properties, relationships and geometries. To validate the proposed approach and its performance, a case study was also tested using the IGIM system design.
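    The integration step, merging two triple graphs and querying across them, can be illustrated without an RDF library; the triples below are invented, and the match function stands in for a SPARQL pattern over the combined graph:

```python
# Toy triples from a BIM graph and a GIS graph (subject, predicate, object).
bim = {("wall_12", "isPartOf", "building_A"),
       ("wall_12", "hasMaterial", "brick")}
gis = {("building_A", "locatedIn", "parcel_7"),
       ("parcel_7", "zone", "residential")}

# The integrated model is simply the union of both graphs.
igim = bim | gis

def match(graph, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts like a SPARQL variable."""
    return [(a, b, c) for (a, b, c) in graph
            if s in (None, a) and p in (None, b) and o in (None, c)]

# Cross-domain query: which parcel contains the building that wall_12 is in?
building = match(igim, s="wall_12", p="isPartOf")[0][2]
parcel = match(igim, s=building, p="locatedIn")[0][2]
```

    The two-step lookup crosses the former BIM/GIS boundary in a single graph, which is the benefit the unified IGIM model aims to provide.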

  20. The Domain of Cognition: An Alternative to Bloom's Cognitive Domain within the Framework of an Information Processing Model.

    ERIC Educational Resources Information Center

    Stahl, Robert J.; Murphy, Gary T.

    Weaknesses in the structure, levels, and sequence of Bloom's taxonomy of cognitive domains emphasize the need for both a new model of how individual learners process information and a new taxonomy of the different levels of memory, thinking, and learning. Both the model and the taxonomy should be consistent with current research findings. The…

  1. Principles Underlying the Use of Multiple Informants’ Reports

    PubMed Central

    De Los Reyes, Andres; Thomas, Sarah A.; Goodman, Kimberly L.; Kundey, Shannon M.A.

    2014-01-01

    Researchers use multiple informants’ reports to assess and examine behavior. However, informants’ reports commonly disagree. Informants’ reports often disagree in their perceived levels of a behavior (“low” vs. “elevated” mood), and examining multiple reports in a single study often results in inconsistent findings. Although researchers often espouse taking a multi-informant assessment approach, they frequently address informant discrepancies using techniques that treat discrepancies as measurement error. Yet, recent work indicates that researchers in a variety of fields often may be unable to justify treating informant discrepancies as measurement error. In this paper, the authors advance a framework (Operations Triad Model) outlining general principles for using and interpreting informants’ reports. Using the framework, researchers can test whether or not they can extract meaningful information about behavior from discrepancies among multiple informants’ reports. The authors provide supportive evidence for this framework and discuss its implications for hypothesis testing, study design, and quantitative review. PMID:23140332

  2. A reference model for scientific information interchange

    NASA Technical Reports Server (NTRS)

    Reich, Lou; Sawyer, Don; Davis, Randy

    1993-01-01

    This paper presents an overview of an Information Interchange Reference Model (IIRM) currently being developed by individuals participating in the Consultative Committee for Space Data Systems (CCSDS) Panel 2, the Planetary Data Systems (PDS), and the Committee on Earth Observing Satellites (CEOS). This is an ongoing research activity and is not an official position by these bodies. This reference model provides a framework for describing and assessing current and proposed methodologies for information interchange within and among the space agencies. It is hoped that this model will improve interoperability between the various methodologies. As such, this model attempts to address key information interchange issues as seen by the producers and users of space-related data and to put them into a coherent framework. Information is understood as the knowledge (e.g., the scientific content) represented by data. Therefore, concern is not primarily on mechanisms for transferring data from user to user (e.g., compact disk read-only memory (CD-ROM), wide-area networks, optical tape, and so forth) but on how information is encoded as data and how the information content is maintained with minimal loss or distortion during transmittal. The model assumes open systems, which means that the protocols or methods used should be fully described and the descriptions publicly available. Ideally these protocols are promoted by recognized standards organizations using processes that permit involvement by those most likely to be affected, thereby enhancing the protocol's stability and the likelihood of wide support.

  3. Evolutionary scalpels for dissecting tumor ecosystems

    PubMed Central

    Rosenbloom, Daniel I. S.; Camara, Pablo G.; Chu, Tim; Rabadan, Raul

    2017-01-01

    Amidst the growing literature on cancer genomics and intratumor heterogeneity, essential principles in evolutionary biology recur time and time again. Here we use these principles to guide the reader through major advances in cancer research, highlighting issues of “hit hard, hit early” treatment strategies, drug resistance, and metastasis. We distinguish between two frameworks for understanding heterogeneous tumors, both of which can inform treatment strategies: (1) The tumor as diverse ecosystem, a Darwinian population of sometimes-competing, sometimes-cooperating cells; (2) The tumor as tightly integrated, self-regulating organ, which may hijack developmental signals to restore functional heterogeneity after treatment. While the first framework dominates literature on cancer evolution, the second framework enjoys support as well. Throughout this review, we illustrate how mathematical models inform understanding of tumor progression and treatment outcomes. Connecting models to genomic data faces computational and technical hurdles, but high-throughput single-cell technologies show promise to clear these hurdles. PMID:27923679

  4. Predicting disease progression from short biomarker series using expert advice algorithm

    NASA Astrophysics Data System (ADS)

    Morino, Kai; Hirata, Yoshito; Tomioka, Ryota; Kashima, Hisashi; Yamanishi, Kenji; Hayashi, Norihiro; Egawa, Shin; Aihara, Kazuyuki

    2015-05-01

    Well-trained clinicians may be able to provide diagnosis and prognosis from very short biomarker series using information and experience gained from previous patients. Although mathematical methods can potentially help clinicians to predict the progression of diseases, there is no method so far that estimates the patient state from very short time-series of a biomarker for making diagnosis and/or prognosis by employing the information of previous patients. Here, we propose a mathematical framework for integrating other patients' datasets to infer and predict the state of the disease in the current patient based on their short history. We extend a machine-learning framework of "prediction with expert advice" to deal with unstable dynamics. We construct this mathematical framework by combining expert advice with a mathematical model of prostate cancer. Our model predicted well the individual biomarker series of patients with prostate cancer that are used as clinical samples.
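    The classical exponentially weighted forecaster underlying "prediction with expert advice" can be sketched as follows. Treating previous patients' fitted models as the experts, the squared-loss weighting, and the fixed learning rate eta are illustrative simplifications of the authors' extension:

```python
import math

def expert_advice(predictions, outcomes, eta=1.0):
    """Exponentially weighted average forecaster: at each step, forecast a
    weighted mean of the experts' predictions, then down-weight each expert
    exponentially in its squared loss against the observed outcome."""
    n_experts = len(predictions[0])
    weights = [1.0] * n_experts
    forecasts = []
    for preds, y in zip(predictions, outcomes):
        total = sum(weights)
        forecasts.append(sum(w * p for w, p in zip(weights, preds)) / total)
        weights = [w * math.exp(-eta * (p - y) ** 2)
                   for w, p in zip(weights, preds)]
    return forecasts
```

    Even from a short series, the forecaster rapidly concentrates weight on the experts (previous patients) whose dynamics best match the current patient, which is the mechanism the framework exploits.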

  5. Predicting disease progression from short biomarker series using expert advice algorithm.

    PubMed

    Morino, Kai; Hirata, Yoshito; Tomioka, Ryota; Kashima, Hisashi; Yamanishi, Kenji; Hayashi, Norihiro; Egawa, Shin; Aihara, Kazuyuki

    2015-05-20

    Well-trained clinicians may be able to provide diagnosis and prognosis from very short biomarker series using information and experience gained from previous patients. Although mathematical methods can potentially help clinicians to predict the progression of diseases, there is no method so far that estimates the patient state from very short time-series of a biomarker for making diagnosis and/or prognosis by employing the information of previous patients. Here, we propose a mathematical framework for integrating other patients' datasets to infer and predict the state of the disease in the current patient based on their short history. We extend a machine-learning framework of "prediction with expert advice" to deal with unstable dynamics. We construct this mathematical framework by combining expert advice with a mathematical model of prostate cancer. Our model accurately predicted the individual biomarker series of the prostate cancer patients used as clinical samples.
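
    The "prediction with expert advice" setting can be illustrated with the classic exponentially weighted average forecaster. This is a generic sketch, not the authors' extended algorithm: the learning rate `eta`, the squared-error loss, and the two toy "experts" are illustrative assumptions.

```python
import math

def aggregate(expert_preds, weights):
    # Forecast as the weight-normalized average of the experts' predictions.
    return sum(w * p for w, p in zip(weights, expert_preds)) / sum(weights)

def update(weights, expert_preds, outcome, eta=0.5):
    # Exponential weight update: experts with larger squared error lose weight.
    return [w * math.exp(-eta * (p - outcome) ** 2)
            for w, p in zip(weights, expert_preds)]

# Toy run with two constant experts; the outcome is always 1.0,
# so the second expert should accumulate most of the weight.
weights = [1.0, 1.0]
for outcome in [1.0, 1.0, 1.0]:
    preds = [0.0, 1.0]
    forecast = aggregate(preds, weights)
    weights = update(weights, preds, outcome)
```

    The authors' extension handles unstable dynamics by combining such expert weighting with a mechanistic prostate cancer model; the per-round forecast here is simply a convex combination of the experts' predictions.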

  6. Effects of model complexity and priors on estimation using sequential importance sampling/resampling for species conservation

    USGS Publications Warehouse

    Dunham, Kylee; Grand, James B.

    2016-01-01

    We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state-space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
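
    One SISR cycle can be sketched as a minimal bootstrap particle filter for a scalar state. This is an illustrative assumption, not the authors' population model: the random-walk process, Gaussian observation likelihood, noise scales, and multinomial resampling are all generic choices.

```python
import math
import random

def sisr_step(particles, weights, obs, process_sd=0.1, obs_sd=0.5):
    # 1. Propagate: push each particle through the (random-walk) process model.
    particles = [p + random.gauss(0.0, process_sd) for p in particles]
    # 2. Reweight: multiply by the Gaussian observation likelihood.
    weights = [w * math.exp(-0.5 * ((obs - p) / obs_sd) ** 2)
               for w, p in zip(weights, particles)]
    total = sum(weights)
    weights = [w / total for w in weights]
    # 3. Resample: redraw particles in proportion to weight to fight degeneracy.
    particles = random.choices(particles, weights=weights, k=len(particles))
    return particles, [1.0 / len(particles)] * len(particles)

random.seed(0)
n = 500
particles = [random.gauss(0.0, 1.0) for _ in range(n)]   # prior draw
weights = [1.0 / n] * n
for obs in [1.0, 1.1, 0.9, 1.0]:
    particles, weights = sisr_step(particles, weights, obs)
estimate = sum(particles) / n
```

    An informative prior corresponds to concentrating the initial particle cloud; the kernel smoothing the authors use to combat particle depletion when parameters are also estimated is omitted here for brevity.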

  7. Knowledge Interaction Design for Creative Knowledge Work

    NASA Astrophysics Data System (ADS)

    Nakakoji, Kumiyo; Yamamoto, Yasuhiro

    This paper describes our approach for the development of application systems for creative knowledge work, particularly for early stages of information design tasks. Being a cognitive tool serving as a means of externalization, an application system affects how the user is engaged in the creative process through its visual interaction design. Knowledge interaction design described in this paper is a framework where a set of application systems for different information design domains are developed based on an interaction model, which is designed for a particular model of a thinking process. We have developed two sets of application systems using the knowledge interaction design framework: one includes systems for linear information design, such as writing, movie-editing, and video-analysis; the other includes systems for network information design, such as file-system navigation and hypertext authoring. Our experience shows that the resulting systems encourage users to follow a certain cognitive path through graceful user experience.

  8. International Perspectives: Polish Post-Secondary Vocational Schools and Canadian Community Colleges: A Comparison Using an Information Technology Conceptual Model

    ERIC Educational Resources Information Center

    Butler, Norman L.; Davidson, Barry S.; Pachocinski, Ryszard; Griffith, Kimberly Grantham; Kritsonis, William Allan

    2007-01-01

    This study compares Polish post-secondary vocational institutions with Canadian community colleges using an information technology conceptual framework. The research concentrated upon programs in information technology delivered by one Polish school Cracow School of Information Technology and two Canadian community colleges Durham (Oshawa,…

  9. Inference in the Wild: A Framework for Human Situation Assessment and a Case Study of Air Combat.

    PubMed

    McAnally, Ken; Davey, Catherine; White, Daniel; Stimson, Murray; Mascaro, Steven; Korb, Kevin

    2018-06-24

    Situation awareness is a key construct in human factors and arises from a process of situation assessment (SA). SA comprises the perception of information, its integration with existing knowledge, the search for new information, and the prediction of the future state of the world, including the consequences of planned actions. Causal models implemented as Bayesian networks (BNs) are attractive for modeling all of these processes within a single, unified framework. We elicited declarative knowledge from two Royal Australian Air Force (RAAF) fighter pilots about the information sources used in the identification (ID) of airborne entities and the causal relationships between these sources. This knowledge was represented in a BN (the declarative model) that was evaluated against the performance of 19 RAAF fighter pilots in a low-fidelity simulation. Pilot behavior was well predicted by a simple associative model (the behavioral model) with only three attributes of ID. Search for information by pilots was largely compensatory and was near-optimal with respect to the behavioral model. The average revision of beliefs in response to evidence was close to Bayesian, but there was substantial variability. Together, these results demonstrate the value of BNs for modeling human SA. Copyright © 2018 Cognitive Science Society, Inc.
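
    The near-Bayesian belief revision reported above can be grounded in a minimal sequential Bayes update over an ID hypothesis. The hypotheses, cues, and likelihood values below are hypothetical illustrations, not the elicited RAAF network.

```python
def bayes_update(prior, likelihoods):
    # Posterior over hypotheses after conditionally independent evidence.
    # prior: {hypothesis: P(h)}; likelihoods: list of {hypothesis: P(e|h)}.
    post = dict(prior)
    for lik in likelihoods:
        post = {h: post[h] * lik[h] for h in post}
        z = sum(post.values())                 # normalize after each cue
        post = {h: p / z for h, p in post.items()}
    return post

prior = {"hostile": 0.5, "friendly": 0.5}
evidence = [
    {"hostile": 0.9, "friendly": 0.2},   # hypothetical cue 1, e.g. no IFF response
    {"hostile": 0.7, "friendly": 0.4},   # hypothetical cue 2, e.g. closing heading
]
posterior = bayes_update(prior, evidence)
```

    With these made-up numbers the posterior probability of "hostile" works out to 0.315/0.355, about 0.89; a full BN generalizes this to cues with causal dependencies rather than assumed conditional independence.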

  10. Continuum Thinking and the Contexts of Personal Information Management

    ERIC Educational Resources Information Center

    Huvila, Isto; Eriksen, Jon; Häusner, Eva-Maria; Jansson, Ina-Maria

    2014-01-01

    Introduction: Recent personal information management literature has underlined the significance of the contextuality of personal information and its use. The present article discusses the applicability of the records continuum model and its generalisation, continuum thinking, as a theoretical framework for explicating the overlap and evolution of…

  11. Information Gatekeepers in California's Ethnic Communities. Final Report.

    ERIC Educational Resources Information Center

    Metoyer-Duran, Cheryl

    This study examined the information seeking behavior of gatekeepers in five ethnolinguistic communities: American Indian, Chinese, Japanese, Korean, and Latino. Personal interviews were conducted with 129 gatekeepers and a conceptual information model was used to generate an analytical framework for presenting and discussing the data. The major…

  12. Conceptual and Adoption of Technology Acceptance Model in Digital Information Resources Usage by Undergraduates: Implication to Higher Institutions Education in Delta and Edo of Nigeria

    ERIC Educational Resources Information Center

    Urhiewhu, Lucky Oghenetega; Emojorho, Daniel

    2015-01-01

    This article presents a conceptual and theoretical framework for digital information resources usage by undergraduates, with implications for higher education institutions in Delta and Edo States of Nigeria. It reviews the concept of digital information resources [DIRs] and the model theory related to the study. Findings show that DIRs are used to a low extent…

  13. Toward Model Building for Visual Aesthetic Perception

    PubMed Central

    Lughofer, Edwin; Zeng, Xianyi

    2017-01-01

    Several models of visual aesthetic perception have been proposed in recent years. Such models have drawn on investigations into the neural underpinnings of visual aesthetics, utilizing neurophysiological techniques and brain imaging techniques including functional magnetic resonance imaging, magnetoencephalography, and electroencephalography. The neural mechanisms underlying the aesthetic perception of the visual arts have been explained from the perspectives of neuropsychology, brain and cognitive science, informatics, and statistics. Although corresponding models have been constructed, the majority of these models contain elements that are difficult to simulate or quantify using simple mathematical functions. In this review, we discuss the hypotheses, conceptions, and structures of six typical models for human aesthetic appreciation in the visual domain: the neuropsychological, information processing, mirror, quartet, and two hierarchical feed-forward layered models. Additionally, the neural foundation of aesthetic perception, appreciation, or judgement for each model is summarized. The development of a unified framework for the neurobiological mechanisms underlying the aesthetic perception of visual art and the validation of this framework via mathematical simulation is an interesting challenge in neuroaesthetics research. This review aims to provide information regarding the most promising proposals for bridging the gap between visual information processing and brain activity involved in aesthetic appreciation. PMID:29270194

  14. Working Group Reports: Working Group 1 - Software Systems Design and Implementation for Environmental Modeling

    EPA Science Inventory

    The purpose of the Interagency Steering Committee on Multimedia Environmental Modeling (ISCMEM) is to foster the exchange of information about environmental modeling tools, modeling frameworks, and environmental monitoring databases that are all in the public domain. It is compos...

  15. Modeling Real-Time Coordination of Distributed Expertise and Event Response in NASA Mission Control Center Operations

    NASA Astrophysics Data System (ADS)

    Onken, Jeffrey

    This dissertation introduces a multidisciplinary framework for enabling future research and analysis of alternatives for control centers for real-time operations of safety-critical systems. The multidisciplinary framework integrates functional and computational models that describe the dynamics in fundamental concepts of previously disparate engineering and psychology research disciplines, such as group performance and processes, supervisory control, situation awareness, events and delays, and expertise. The application in this dissertation is the real-time operations within the NASA Mission Control Center in Houston, TX. This dissertation operationalizes the framework into a model and simulation, which simulates the functional and computational models in the framework according to user-configured scenarios for a NASA human-spaceflight mission. The model and simulation generates data according to the effectiveness of the mission-control team in supporting the completion of mission objectives and detecting, isolating, and recovering from anomalies. Accompanying the multidisciplinary framework is a proof of concept, which demonstrates the feasibility of such a framework. The proof of concept demonstrates that variability occurs where expected based on the models. The proof of concept also demonstrates that the data generated by the model and simulation are useful for analyzing and comparing MCC configuration alternatives, because an investigator can give a diverse set of scenarios to the simulation and compare the output in detail to inform decisions about the effect of MCC configurations on mission operations performance.

  16. Trial Implementation of a Multihazard Risk Assessment Framework for High-Impact Low-Frequency Power Grid Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.

    The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and postevent restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.

  17. Trial Implementation of a Multihazard Risk Assessment Framework for High-Impact Low-Frequency Power Grid Events

    DOE PAGES

    Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.; ...

    2017-08-25

    The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and postevent restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.

  18. Analyzing the U.S. Marine Corps Enterprise Information Technology Framework for IT Acquisition and Portfolio Governance

    DTIC Science & Technology

    2012-09-01

    Five Forces Model (Porter, 2008, p. 80) An example of the usefulness of these two frameworks is the case of IKEA, and its issues during the mid...from India and Pakistan, a proportion of which had been manufactured using child labor. IKEA spent years determining methods to ensure that none of...from the company’s value: a global news media, which could leverage networks and information, and present damaging issues to the public before IKEA

  19. Habitat Suitability Index Models: Hairy woodpecker

    USGS Publications Warehouse

    Sousa, Patrick J.

    1987-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the hairy woodpecker (Picoides villosus). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.
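
    The 0.0-1.0 scaling lends itself to a small numeric sketch. The geometric-mean aggregation below is one common way HSI models combine component suitability indices; the actual hairy woodpecker model's variables and aggregation rule are defined in the report itself, so treat this as an illustrative assumption.

```python
def hsi(suitability_indices):
    # Overall HSI as the geometric mean of component suitability indices,
    # each already scaled to [0, 1]; a 0 anywhere forces the overall index to 0,
    # reflecting that one unsuitable habitat component cannot be compensated.
    product = 1.0
    for si in suitability_indices:
        if not 0.0 <= si <= 1.0:
            raise ValueError("each SI must lie in [0, 1]")
        product *= si
    return product ** (1.0 / len(suitability_indices))

# Hypothetical component indices (e.g. snag density, canopy cover, basal area).
overall = hsi([0.8, 0.5, 0.9])
```

    With these made-up components the overall index is about 0.71, i.e. moderately suitable habitat on the 0.0-1.0 scale.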

  20. Habitat Suitability Index Models: Diamondback terrapin (nesting) - Atlantic coast

    USGS Publications Warehouse

    Palmer, William M.; Cordes, Carroll L.

    1988-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the diamondback terrapin (Malaclemys terrapin). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.

  1. Habitat Suitability Index Models: Snapping turtle

    USGS Publications Warehouse

    Graves, Brent M.; Anderson, Stanley H.

    1987-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the snapping turtle (Chelydra serpentina). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.

  2. Habitat Suitability Index Models: Red-winged blackbird

    USGS Publications Warehouse

    Short, Henry L.

    1985-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the red-winged blackbird (Agelaius phoeniceus L.). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.

  3. Habitat Suitability Index Models: Spotted owl

    USGS Publications Warehouse

    Laymon, Stephen A.; Salwasser, Hal; Barrett, Reginald H.

    1985-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the spotted owl (Strix occidentalis). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.

  4. Investigating uncertainty and emotions in conversations about family health history: a test of the theory of motivated information management.

    PubMed

    Rauscher, Emily A; Hesse, Colin

    2014-01-01

    Although the importance of being knowledgeable of one's family health history is widely known, very little research has investigated how families communicate about this important topic. This study investigated how young adults seek information from parents about family health history. The authors used the Theory of Motivated Information Management as a framework to understand the process of uncertainty discrepancy and emotion in seeking information about family health history. Results of this study show the Theory of Motivated Information Management to be a good model to explain the process young adults go through in deciding to seek information from parents about family health history. Results also show that emotions other than anxiety can be used with success in the Theory of Motivated Information Management framework.

  5. NASA/DoD Aerospace Knowledge Diffusion Research Project. Paper 31: The information-seeking behavior of engineers

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Bishop, Ann P.; Barclay, Rebecca O.; Kennedy, John M.

    1993-01-01

    Engineers are an extraordinarily diverse group of professionals, but an attribute common to all engineers is their use of information. Engineering can be conceptualized as an information processing system that must deal with work-related uncertainty through patterns of technical communications. Throughout the process, data, information, and tacit knowledge are being acquired, produced, transferred, and utilized. While acknowledging that other models exist, we have chosen to view the information-seeking behavior of engineers within a conceptual framework of the engineer as an information processor. This article uses the chosen framework to discuss information-seeking behavior of engineers, reviewing selected literature and empirical studies from library and information science, management, communications, and sociology. The article concludes by proposing a research agenda designed to extend our current, limited knowledge of the way engineers process information.

  6. Proposal for a Spatial Organization Model in Soil Science (The Example of the European Communities Soil Map).

    ERIC Educational Resources Information Center

    King, D.; And Others

    1994-01-01

    Discusses the computational problems of automating paper-based spatial information. A new relational structure for soil science information, based on the main concepts used during conventional cartographic work, is proposed. This model is a computerized framework for coherent description of the geographical variability of soils, combined…

  7. Hierarchical models for informing general biomass equations with felled tree data

    Treesearch

    Brian J. Clough; Matthew B. Russell; Christopher W. Woodall; Grant M. Domke; Philip J. Radtke

    2015-01-01

    We present a hierarchical framework that uses a large multispecies felled tree database to inform a set of general models for predicting tree foliage biomass, with accompanying uncertainty, within the FIA database. Results suggest significant prediction uncertainty for individual trees and reveal higher errors when predicting foliage biomass for larger trees and for...

  8. Nonlocal sparse model with adaptive structural clustering for feature extraction of aero-engine bearings

    NASA Astrophysics Data System (ADS)

    Zhang, Han; Chen, Xuefeng; Du, Zhaohui; Li, Xiang; Yan, Ruqiang

    2016-04-01

    Fault information of aero-engine bearings presents two particular phenomena, i.e., waveform distortion and impulsive feature frequency band dispersion, which lead to a challenging problem for current techniques of bearing fault diagnosis. Moreover, although much progress in sparse representation theory has been made in feature extraction of fault information, the theory also confronts inevitable performance degradation because relatively weak fault information does not have sufficiently prominent and sparse representations. Therefore, this paper proposes a novel nonlocal sparse model (coined NLSM) and its algorithmic framework, which go beyond simple sparsity by introducing more intrinsic structures of feature information. This work exploits the underlying prior information that feature information exhibits nonlocal self-similarity, by clustering similar signal fragments and stacking them together into groups. Within this framework, the prior information is transformed into a regularization term, and a sparse optimization problem, which can be solved through the block coordinate descent (BCD) method, is formulated. Additionally, the adaptive structural clustering sparse dictionary learning technique, which utilizes k-Nearest-Neighbor (kNN) clustering and principal component analysis (PCA) learning, is adopted to further enable sufficient sparsity of feature information. Moreover, the selection rule for the regularization parameter and the computational complexity are described in detail. The performance of the proposed framework is evaluated through numerical experiments, and its superiority with respect to the state-of-the-art method in the field is demonstrated on vibration signals from an experimental rig of aircraft engine bearings.
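
    The nonlocal grouping step, clustering similar signal fragments before sparse coding, can be sketched with plain kNN over overlapping windows. The window length, step, and toy impulse train below are illustrative assumptions; the paper's full pipeline (PCA dictionary learning, BCD solver) is not reproduced.

```python
def fragment(signal, length, step):
    # Slide a window over the signal to extract overlapping fragments.
    return [signal[i:i + length]
            for i in range(0, len(signal) - length + 1, step)]

def knn_group(fragments, ref_idx, k):
    # Nonlocal grouping: the k fragments most similar (Euclidean distance)
    # to a reference fragment, wherever they occur in the signal.
    ref = fragments[ref_idx]
    dist = lambda f: sum((a - b) ** 2 for a, b in zip(ref, f)) ** 0.5
    order = sorted(range(len(fragments)), key=lambda i: dist(fragments[i]))
    return order[:k]

# Toy fault-like signal: a periodic impulse train (one impulse every 8 samples).
signal = [1.0 if i % 8 == 0 else 0.0 for i in range(32)]
frags = fragment(signal, length=8, step=4)
group = knn_group(frags, ref_idx=0, k=4)
```

    Because the impulses repeat with the window period, the group recovers exactly the fragments aligned with the reference impulse; stacking such a group yields the low-dimensional structure the sparse regularizer exploits.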

  9. A Generalized Decision Framework Using Multi-objective Optimization for Water Resources Planning

    NASA Astrophysics Data System (ADS)

    Basdekas, L.; Stewart, N.; Triana, E.

    2013-12-01

    Colorado Springs Utilities (CSU) is currently engaged in an Integrated Water Resource Plan (IWRP) to address the complex planning scenarios, across multiple time scales, currently faced by CSU. The modeling framework developed for the IWRP uses a flexible data-centered Decision Support System (DSS) with a MODSIM-based modeling system to represent the operation of the current CSU raw water system, coupled with a state-of-the-art multi-objective optimization algorithm. Three basic components are required for the framework, which can be implemented for planning horizons ranging from seasonal to interdecadal. First, a water resources system model is required that is capable of reasonable system simulation to resolve performance metrics at the appropriate temporal and spatial scales of interest. The system model should be an existing simulation model, or one developed during the planning process with stakeholders, so that 'buy-in' has already been achieved. Second, a hydrologic scenario tool capable of generating a range of plausible inflows for the planning period of interest is required. This may include paleo-informed or climate-change-informed sequences. Third, a multi-objective optimization model that can be wrapped around the system simulation model is required. The new generation of multi-objective optimization models do not require parameterization, which greatly reduces problem complexity. Bridging the gap between research and practice will be evident as we use a case study from CSU's planning process to demonstrate this framework with specific competing water management objectives. Careful formulation of objective functions, choice of decision variables, and system constraints will be discussed. Rather than treating results as theoretically Pareto optimal in a planning process, we use the powerful multi-objective optimization models as tools to more efficiently and effectively move out of the inferior decision space. The use of this framework will help CSU evaluate tradeoffs in a continually changing world.
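
    Moving out of the inferior (dominated) decision space rests on Pareto dominance, which a few lines make concrete. The (cost, shortage) alternatives below are hypothetical numbers, not CSU planning data.

```python
def dominates(a, b):
    # a dominates b if a is no worse in every objective and strictly better
    # in at least one (all objectives minimized here).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    # Keep solutions not dominated by any other: the non-inferior set.
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# Hypothetical alternatives scored as (cost, expected shortage), both minimized.
alts = [(3.0, 5.0), (4.0, 2.0), (5.0, 5.0), (2.0, 7.0), (4.0, 4.0)]
front = pareto_front(alts)
```

    Here (5.0, 5.0) and (4.0, 4.0) drop out as dominated, leaving three non-inferior trade-offs for planners to weigh; a multi-objective optimizer automates exactly this filtering over a vast decision space.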

  10. Environmental impact assessment of transportation projects: An analysis using an integrated GIS, remote sensing, and spatial modeling approach

    NASA Astrophysics Data System (ADS)

    El-Gafy, Mohamed Anwar

    Transportation projects have an impact on the environment, and the general environmental pollution and damage caused by roads is closely associated with the level of economic activity. Although Environmental Impact Assessments (EIAs) depend on geo-spatial information in order to make an assessment, there are no rules per se for how to conduct an environmental assessment, and the particular objective of each assessment is dictated case by case, based on what information and analyses are required. The conventional approach to an Environmental Impact Assessment (EIA) study is time consuming because it involves a large number of dependent and independent variables, with differing consequences, that must be taken into account. With the emergence of satellite remote sensing technology and Geographic Information Systems (GIS), this research presents a new framework for the analysis phase of the Environmental Impact Assessment (EIA) for transportation projects based on the integration of remote sensing technology, geographic information systems, and spatial modeling. By integrating the merits of the map overlay method and the matrix method, the framework comprehensively analyzes the environmental vulnerability around the road and its impact on the environment. This framework is expected to: (1) improve the quality of the decision making process, (2) be applied both to urban and inter-urban projects, regardless of transport mode, and (3) present the data and make the appropriate analysis to support the decision-makers and allow them to present these data at public hearings in a simple manner. Case studies, transportation projects in the State of Florida, were analyzed to illustrate the use of the decision support framework and demonstrate its capabilities.
This cohesive and integrated system will facilitate rational decisions through cost effective coordination of environmental information and data management that can be tailored to specific projects. The framework would facilitate collecting, organizing, analyzing, archiving, and coordinating the information and data necessary to support technical and policy transportation decisions.
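
    The map overlay method the framework borrows can be sketched as a cell-by-cell weighted combination of vulnerability rasters. The layer names, cell values, and weights below are hypothetical illustrations, not the dissertation's data.

```python
def weighted_overlay(layers, weights):
    # Cell-by-cell weighted sum of equally sized vulnerability rasters,
    # normalized by total weight so scores stay on the layers' 0-1 scale.
    wsum = sum(weights)
    rows, cols = len(layers[0]), len(layers[0][0])
    return [[sum(w * layer[r][c] for w, layer in zip(weights, layers)) / wsum
             for c in range(cols)]
            for r in range(rows)]

# Hypothetical 2x2 vulnerability layers around a proposed road corridor.
noise = [[0.9, 0.2], [0.4, 0.1]]
wetland = [[0.1, 0.8], [0.6, 0.0]]
score = weighted_overlay([noise, wetland], weights=[2.0, 1.0])
```

    Cells with high combined scores flag where the route's environmental impact concentrates; the matrix method then relates those cells to specific project activities.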

  11. The ACTIVE conceptual framework as a structural equation model.

    PubMed

    Gross, Alden L; Payne, Brennan R; Casanova, Ramon; Davoudzadeh, Pega; Dzierzewski, Joseph M; Farias, Sarah; Giovannetti, Tania; Ip, Edward H; Marsiske, Michael; Rebok, George W; Schaie, K Warner; Thomas, Kelsey; Willis, Sherry; Jones, Richard N

    2018-01-01

    Background/Study Context: Conceptual frameworks are analytic models at a high level of abstraction. Their operationalization can inform randomized trial design and sample size considerations. The Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) conceptual framework was empirically tested using structural equation modeling (N=2,802). ACTIVE was guided by a conceptual framework for cognitive training in which proximal cognitive abilities (memory, inductive reasoning, speed of processing) mediate treatment-related improvement in primary outcomes (everyday problem-solving, difficulty with activities of daily living, everyday speed, driving difficulty), which in turn lead to improved secondary outcomes (health-related quality of life, health service utilization, mobility). Measurement models for each proximal, primary, and secondary outcome were developed and tested using baseline data. Each construct was then combined in one model to evaluate fit (RMSEA, CFI, normalized residuals of each indicator). To expand the conceptual model and potentially inform future trials, evidence of modification of structural model parameters was evaluated by age, years of education, sex, race, and self-rated health status. Preconceived measurement models for memory, reasoning, speed of processing, everyday problem-solving, instrumental activities of daily living (IADL) difficulty, everyday speed, driving difficulty, and health-related quality of life each fit the data well (all RMSEA < .05; all CFI > .95). Fit of the full model was excellent (RMSEA = .038; CFI = .924). In contrast with previous findings from ACTIVE regarding who benefits from training, interaction testing revealed that associations between proximal abilities and primary outcomes are stronger on average for participants of nonwhite race, in worse health, of older age, and with less education (p < .005). Empirical data confirm the hypothesized ACTIVE conceptual model. 
Findings suggest that the types of people who show intervention effects on cognitive performance potentially may be different from those with the greatest chance of transfer to real-world activities.

  12. Leveraging the Zachman framework implementation using action-research methodology - a case study: aligning the enterprise architecture and the business goals

    NASA Astrophysics Data System (ADS)

    Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo

    2013-02-01

    With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value to construct an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires a significant effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate for this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework to assist and facilitate its implementation. Following the explanation of cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using an action-research approach.

  13. A nursing-specific model of EPR documentation: organizational and professional requirements.

    PubMed

    von Krogh, Gunn; Nåden, Dagfinn

    2008-01-01

    To present the Norwegian KPO documentation model (quality assurance, problem solving, and caring) and the requirements and multiple electronic patient record (EPR) functions the model is designed to address. The model's professional substance, a conceptual framework for nursing practice, was developed by examining, reorganizing, and completing existing frameworks. The model's methodology, an information management system, was developed using an expert group. Both model elements were clinically tested over a period of 1 year. The model is designed for nursing documentation in step with statutory, organizational, and professional requirements. Complete documentation is arranged for by incorporating the Nursing Minimum Data Set. Systematic and comprehensive documentation is arranged for by establishing categories as provided in the model's framework domains. Consistent documentation is arranged for by incorporating NANDA-I Nursing Diagnoses, the Nursing Interventions Classification, and the Nursing Outcomes Classification. The model can be used as a tool in cooperation with vendors to ensure that the interests of the nursing profession are met when developing EPR solutions in healthcare. The model can provide clinicians with a framework for documentation in step with legal and organizational requirements while retaining the ability to record all aspects of clinical nursing.

  14. Multilevel modeling of single-case data: A comparison of maximum likelihood and Bayesian estimation.

    PubMed

    Moeyaert, Mariola; Rindskopf, David; Onghena, Patrick; Van den Noortgate, Wim

    2017-12-01

    The focus of this article is to describe Bayesian estimation, including construction of prior distributions, and to compare parameter recovery under the Bayesian framework (using weakly informative priors) and the maximum likelihood (ML) framework in the context of multilevel modeling of single-case experimental data. Bayesian estimation results were found similar to ML estimation results in terms of the treatment effect estimates, regardless of the functional form and degree of information included in the prior specification in the Bayesian framework. In terms of the variance component estimates, both the ML and Bayesian estimation procedures result in biased and less precise variance estimates when the number of participants is small (i.e., 3). By increasing the number of participants to 5 or 7, the relative bias is close to 5% and more precise estimates are obtained for all approaches, except for the inverse-Wishart prior using the identity matrix. When a more informative prior was added, more precise estimates for the fixed effects and random effects were obtained, even when only 3 participants were included. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
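
    The small-sample bias in variance estimates noted above can be illustrated with a toy simulation: with only 3 participants, the ML variance estimator (which divides by n) underestimates the true between-participant variance on average. This is a minimal sketch of the general phenomenon, not a reproduction of the authors' multilevel models.

```python
import random
import statistics

random.seed(42)
TAU2 = 4.0          # true between-participant variance (illustrative)
N_PARTICIPANTS = 3  # the small-sample condition discussed above
REPS = 20000

def ml_variance(sample):
    # ML estimator divides by n, not n-1, and is biased downward
    m = sum(sample) / len(sample)
    return sum((x - m) ** 2 for x in sample) / len(sample)

estimates = []
for _ in range(REPS):
    effects = [random.gauss(0.0, TAU2 ** 0.5) for _ in range(N_PARTICIPANTS)]
    estimates.append(ml_variance(effects))

avg = statistics.mean(estimates)
# Expected value is tau2 * (n-1)/n = 4 * 2/3 ≈ 2.67, well below the true 4.0
print(round(avg, 2))
```

    Increasing N_PARTICIPANTS to 5 or 7 shrinks the relative bias toward the roughly 5% level the abstract reports.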

  15. A framework for predicting impacts on ecosystem services ...

    EPA Pesticide Factsheets

    Protection of ecosystem services is increasingly emphasized as a risk-assessment goal, but there are wide gaps between current ecological risk-assessment endpoints and potential effects on services provided by ecosystems. The authors present a framework that links common ecotoxicological endpoints to chemical impacts on populations and communities and the ecosystem services that they provide. This framework builds on considerable advances in mechanistic effects models designed to span multiple levels of biological organization and account for various types of biological interactions and feedbacks. For illustration, the authors introduce 2 case studies that employ well-developed and validated mechanistic effects models: the inSTREAM individual-based model for fish populations and the AQUATOX ecosystem model. They also show how dynamic energy budget theory can provide a common currency for interpreting organism-level toxicity. They suggest that a framework based on mechanistic models that predict impacts on ecosystem services resulting from chemical exposure, combined with economic valuation, can provide a useful approach for informing environmental management. The authors highlight the potential benefits of using this framework as well as the challenges that will need to be addressed in future work. The framework introduced here represents an ongoing initiative supported by the National Institute of Mathematical and Biological Synthesis (NIMBioS; http://www.nimbi

  16. An Ontology and a Software Framework for Competency Modeling and Management

    ERIC Educational Resources Information Center

    Paquette, Gilbert

    2007-01-01

    The importance given to competency management is well justified. Acquiring new competencies is the central goal of any education or knowledge management process. Thus, it must be embedded in any software framework as an instructional engineering tool, to inform the runtime environment of the knowledge that is processed by actors, and their…

  17. MIS Development in Higher Education: A Framework for Systems Planning.

    ERIC Educational Resources Information Center

    St. John, Edward P.

    An institutional management systems development study examined the Management Information Systems (MIS) needs of 23 public institutions of higher education in Missouri. The result was a model framework for other institutions to develop MIS appropriate to their needs. One of five distinct structural development phases could be related to all…

  18. Three-dimensional hydrogeologic framework model for use with a steady-state numerical ground-water flow model of the Death Valley regional flow system, Nevada and California

    USGS Publications Warehouse

    Belcher, Wayne R.; Faunt, Claudia C.; D'Agnese, Frank A.

    2002-01-01

    The U.S. Geological Survey, in cooperation with the Department of Energy and other Federal, State, and local agencies, is evaluating the hydrogeologic characteristics of the Death Valley regional ground-water flow system. The ground-water flow system covers an area of about 100,000 square kilometers from latitude 35? to 38?15' North to longitude 115? to 118? West, with the flow system proper comprising about 45,000 square kilometers. The Death Valley regional ground-water flow system is one of the larger flow systems within the Southwestern United States and includes in its boundaries the Nevada Test Site, Yucca Mountain, and much of Death Valley. Part of this study includes the construction of a three-dimensional hydrogeologic framework model to serve as the foundation for the development of a steady-state regional ground-water flow model. The digital framework model provides a computer-based description of the geometry and composition of the hydrogeologic units that control regional flow. The framework model of the region was constructed by merging two previous framework models constructed for the Yucca Mountain Project and the Environmental Restoration Program Underground Test Area studies at the Nevada Test Site. The hydrologic characteristics of the region result from a currently arid climate and complex geology. Interbasinal regional ground-water flow occurs through a thick carbonate-rock sequence of Paleozoic age, a locally thick volcanic-rock sequence of Tertiary age, and basin-fill alluvium of Tertiary and Quaternary age. Throughout the system, deep and shallow ground-water flow may be controlled by extensive and pervasive regional and local faults and fractures. The framework model was constructed using data from several sources to define the geometry of the regional hydrogeologic units. 
These data sources include (1) a 1:250,000-scale hydrogeologic-map compilation of the region; (2) regional-scale geologic cross sections; (3) borehole information, and (4) gridded surfaces from a previous three-dimensional geologic model. In addition, digital elevation model data were used in conjunction with these data to define ground-surface altitudes. These data, properly oriented in three dimensions by using geographic information systems, were combined and gridded to produce the upper surfaces of the hydrogeologic units used in the flow model. The final geometry of the framework model is constructed as a volumetric model by incorporating the intersections of these gridded surfaces and by applying fault truncation rules to structural features from the geologic map and cross sections. The cells defining the geometry of the hydrogeologic framework model can be assigned several attributes such as lithology, hydrogeologic unit, thickness, and top and bottom altitudes.

  19. Research on Crowdsourcing Emergency Information Extraction Based on Events' Frame

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information; general information retrieval tools cannot completely identify emergency geographic information; and neither provides an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework, designed to solve the problem of extracting emergency information. It comprises an emergency information extraction model (EIEM), a complete address recognition method (CARM), and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest-path algorithm, allowing toponym pieces to be joined into a full address. AEMEI analyzes the results for an emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that the event-frame technology can solve the problem of emergency information extraction and provides reference cases for other applications. Before an emergency occurs, the relevant departments can query data on past emergencies and make arrangements in advance for disaster prevention and mitigation, reducing casualties and property damage. This is of great significance to the state and society.

  20. Framework model and principles for trusted information sharing in pervasive health.

    PubMed

    Ruotsalainen, Pekka; Blobel, Bernd; Nykänen, Pirkko; Seppälä, Antto; Sorvari, Hannu

    2011-01-01

    Trustfulness (i.e. that health and wellness information is processed ethically and privacy is guaranteed) is one of the cornerstones of future Personal Health Systems, ubiquitous healthcare, and pervasive health. Trust in today's healthcare is organizational, static, and predefined. Pervasive health takes place in an open and untrusted information space where a person's lifelong health and wellness information, together with contextual data, are dynamically collected and used by many stakeholders. This generates new threats that do not exist in today's eHealth systems. Our analysis shows that the way security and trust are implemented in today's healthcare cannot guarantee information autonomy and trustfulness in pervasive health. Based on a framework model of pervasive health and a risk analysis of the ubiquitous information space, we have formulated principles which enable trusted information sharing in pervasive health. The principles imply that the data subject should have the right to dynamically verify trust and to control the use of her health information, as well as the right to set situation-based, context-aware personal policies. Data collectors and processors have responsibilities including transparency of information processing and openness of interests, policies, and environmental features. Our principles create a base for successful management of privacy and information autonomy in pervasive health. They also imply that it is necessary to create new data models for personal health information and new architectures which support situation-dependent trust and privacy management.

  1. Query Expansion and Query Translation as Logical Inference.

    ERIC Educational Resources Information Center

    Nie, Jian-Yun

    2003-01-01

    Examines query expansion during query translation in cross language information retrieval and develops a general framework for inferential information retrieval in two particular contexts: using fuzzy logic and probability theory. Obtains evaluation formulas that are shown to strongly correspond to those used in other information retrieval models.…

  2. Blending Student Technology Experiences in Formal and Informal Learning

    ERIC Educational Resources Information Center

    Lai, K.-W.; Khaddage, F.; Knezek, Gerald

    2013-01-01

    In this article, we discuss the importance of recognizing students' technology-enhanced informal learning experiences and develop pedagogies to connect students' formal and informal learning experiences, in order to meet the demands of the knowledge society. The Mobile-Blended Collaborative Learning model is proposed as a framework to…

  3. Information Dissemination and Technology Transfer in Telecommunications.

    ERIC Educational Resources Information Center

    Roderer, Nancy K.; King, Donald W.

    Using a model of scientific and technical information transfer as a framework, this document focuses on four types of activities: the generation or authorship of telecommunications information and its publication, distribution, and use. Different forms of publication are considered in each functional area, though primary emphasis is on the…

  4. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    ERIC Educational Resources Information Center

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  5. High Level Information Fusion (HLIF) with nested fusion loops

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

    Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion, perhaps the most ubiquitous being the JDL Data Fusion Group's initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analysis within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address the challenges of high-level information fusion and to handle bias, ambiguity, and uncertainty (BAU) for situation modeling, threat modeling, and threat prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without double counting of information or other biasing issues. The initial FURNACE project focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analysts' efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  6. Architectural approaches for HL7-based health information systems implementation.

    PubMed

    López, D M; Blobel, B

    2010-01-01

    Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology has to be considered, approaching different aspects of systems architecture such as the business, information, computational, engineering, and technology viewpoints. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server, and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue in any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model, offered by HIS-DF and supported by HL7 v3 artifacts, is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented, and standard-based health information systems.

  7. Computational framework to support integration of biomolecular and clinical data within a translational approach.

    PubMed

    Miyoshi, Newton Shydeo Brandão; Pinheiro, Daniel Guariz; Silva, Wilson Araújo; Felipe, Joaquim Cezar

    2013-06-06

    The use of the knowledge produced by the sciences to promote human health is the main goal of translational medicine. To make it feasible, we need computational methods to handle the large amount of information that arises from bench to bedside and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic, and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular, ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however, it lacks support for representing clinical and socio-demographic information. We have implemented an extension of Chado - the Clinical Module - to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: the data level, to store the data; the semantic level, to integrate and standardize the data by the use of ontologies; the application level, to manage clinical databases, ontologies, and the data integration process; and the web interface level, to allow interaction between the user and the system. The Clinical Module was built based on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from a factual clinical research database. Clinical and demographic data as well as biomaterial data were obtained from patients with tumors of the head and neck.
    We implemented the IPTrans tool, a complete environment for data migration that comprises: the construction of a model to describe the legacy clinical data, based on an ontology; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as other applications. Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information, nor are they integrated with existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information coming from different "omics" technologies with patients' clinical and socio-demographic data. This framework should present some features: flexibility, compression and robustness. The experiments accomplished on a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
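
    A minimal sketch of the Entity-Attribute-Value pattern on which the Clinical Module is based: each fact is stored as an (entity, attribute, value) triple, so new clinical attributes require no schema change. The class and field names here are hypothetical illustrations, not IPTrans or Chado APIs.

```python
# Minimal Entity-Attribute-Value (EAV) store. Every fact is one
# (entity, attribute, value) triple; adding a new clinical attribute
# means adding rows, not altering a table schema.
class EAVStore:
    def __init__(self):
        self.triples = []  # list of (entity, attribute, value)

    def add(self, entity, attribute, value):
        self.triples.append((entity, attribute, value))

    def pivot(self, entity):
        """Reassemble one entity's triples into a conventional record."""
        record = {}
        for e, a, v in self.triples:
            if e == entity:
                record[a] = v
        return record

store = EAVStore()
store.add("patient_001", "diagnosis", "head and neck tumor")
store.add("patient_001", "age", 57)
store.add("patient_002", "diagnosis", "unknown")

print(store.pivot("patient_001"))  # → {'diagnosis': 'head and neck tumor', 'age': 57}
```

    The trade-off is the one the abstract implies: flexibility in exchange for the extra pivot step whenever row-shaped data is needed.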

  8. Developmental and Life-Stage Physiologically-Based Pharmacokinetic (PBPK) Models in Humans and Animal Models.

    EPA Science Inventory

    PBPK models provide a computational framework for incorporating pertinent physiological and biochemical information to estimate in vivo levels of xenobiotics in biological tissues. In general, PBPK models are used to correlate exposures to target tissue levels of chemicals and th...

  9. Multiscale analysis of information dynamics for linear multivariate processes.

    PubMed

    Faes, Luca; Montalto, Alessandro; Stramaglia, Sebastiano; Nollo, Giandomenico; Marinazzo, Daniele

    2016-08-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale information dynamics for simulated unidirectionally and bidirectionally coupled VAR processes, showing that rescaling may lead to insightful patterns of information storage and transfer but also to potentially misleading behaviors.
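
    For the simplest special case, a univariate AR(1) process, information storage has a closed form that illustrates the kind of analytical measure the framework computes (the paper's actual machinery covers full VARMA/state-space models and multiple scales). A sketch in nats:

```python
import math

def ar1_information_storage(a: float) -> float:
    """
    Information storage S of a stationary AR(1) process x_t = a*x_{t-1} + e_t:
    S = 0.5 * ln(var(x) / var(x | past)) = -0.5 * ln(1 - a^2)  [nats],
    since var(x) = sigma^2 / (1 - a^2) and the conditional variance is sigma^2.
    """
    assert abs(a) < 1, "stationarity requires |a| < 1"
    return -0.5 * math.log(1.0 - a * a)

# Stronger autoregression -> more of the process's information is stored
print(round(ar1_information_storage(0.8), 3))  # → 0.511
print(round(ar1_information_storage(0.2), 3))
```

    A white-noise process (a = 0) stores no information, which is the sanity check the closed form should pass.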

  10. Implicit kernel sparse shape representation: a sparse-neighbors-based object segmentation framework.

    PubMed

    Yao, Jincao; Yu, Huimin; Hu, Roland

    2017-01-01

    This paper introduces a new implicit-kernel-sparse-shape-representation-based object segmentation framework. Given an input object whose shape is similar to some of the elements in the training set, the proposed model can automatically find a cluster of implicit kernel sparse neighbors to approximately represent the input shape and guide the segmentation. A distance-constrained probabilistic definition together with a dualization energy term is developed to connect high-level shape representation and low-level image information. We theoretically prove that our model not only derives from two projected convex sets but is also equivalent to a sparse-reconstruction-error-based representation in the Hilbert space. Finally, a "wake-sleep"-based segmentation framework is applied to drive the evolutionary curve to recover the original shape of the object. We test our model on two public datasets. Numerical experiments on both synthetic images and real applications show the superior capabilities of the proposed framework.

  11. Data governance for health care providers.

    PubMed

    Andronis, Katerina; Moysey, Kevin

    2013-01-01

    Data governance is characterised by reference to broader definitions of governance. These characteristics are then mapped to a framework that provides a practical representation of the concepts. This representation is further developed with operating models and roles. Several information-related scenarios covering both clinical and non-clinical domains are considered in information terms and then related back to the data governance framework, assisting the reader in understanding how data governance would help address the issues or achieve a better outcome. Together, these elements enable the reader to gain an understanding of the data governance framework and how it applies in practice. Finally, some practical advice is offered for establishing and operating data governance, as well as approaches for justifying the investment.

  12. Fusion of Imperfect Information in the Unified Framework of Random Sets Theory: Application to Target Identification

    DTIC Science & Technology

    2007-11-01

    Florea, Anne-Laure Jousselme, Éloi Bossé; DRDC Valcartier TR 2003-319; Defence R&D Canada – Valcartier; November 2007.

  13. A framework for predicting impacts on ecosystem services from (sub)organismal responses to chemicals.

    PubMed

    Forbes, Valery E; Salice, Chris J; Birnir, Bjorn; Bruins, Randy J F; Calow, Peter; Ducrot, Virginie; Galic, Nika; Garber, Kristina; Harvey, Bret C; Jager, Henriette; Kanarek, Andrew; Pastorok, Robert; Railsback, Steve F; Rebarber, Richard; Thorbek, Pernille

    2017-04-01

    Protection of ecosystem services is increasingly emphasized as a risk-assessment goal, but there are wide gaps between current ecological risk-assessment endpoints and potential effects on services provided by ecosystems. The authors present a framework that links common ecotoxicological endpoints to chemical impacts on populations and communities and the ecosystem services that they provide. This framework builds on considerable advances in mechanistic effects models designed to span multiple levels of biological organization and account for various types of biological interactions and feedbacks. For illustration, the authors introduce 2 case studies that employ well-developed and validated mechanistic effects models: the inSTREAM individual-based model for fish populations and the AQUATOX ecosystem model. They also show how dynamic energy budget theory can provide a common currency for interpreting organism-level toxicity. They suggest that a framework based on mechanistic models that predict impacts on ecosystem services resulting from chemical exposure, combined with economic valuation, can provide a useful approach for informing environmental management. The authors highlight the potential benefits of using this framework as well as the challenges that will need to be addressed in future work. Environ Toxicol Chem 2017;36:845-859. © 2017 SETAC.

  14. A Model Designed to Enhance Informed Consent: Experiences From the HIV Prevention Trials Network

    PubMed Central

    Woodsong, Cynthia; Karim, Quarraisha Abdool

    2005-01-01

    HIV prevention research in developing countries has resulted in increased attention to and discussion of ethical issues, particularly the issue of the quality of informed consent. We present a conceptual framework for an enhanced informed consent process, drawing on experiences garnered from domestic and international studies conducted by the HIV Prevention Trials Network, funded by the National Institutes of Health. This framework guides the development of an informed consent process designed to help ensure initial and continued comprehension of research participation, with an emphasis on HIV prevention research. Attention is focused at the individual and community levels and on 3 study phases: preenrollment, enrollment, and postenrollment. PMID:15727968

  15. IMAGINE: Interstellar MAGnetic field INference Engine

    NASA Astrophysics Data System (ADS)

    Steininger, Theo

    2018-03-01

    IMAGINE (Interstellar MAGnetic field INference Engine) performs inference on generic parametric models of the Galaxy. The modular open source framework uses highly optimized tools and technology such as the MultiNest sampler (ascl:1109.006) and the information field theory framework NIFTy (ascl:1302.013) to create an instance of the Milky Way based on a set of parameters for physical observables, using Bayesian statistics to judge the mismatch between measured data and model prediction. The flexibility of the IMAGINE framework allows for simple refitting for newly available data sets and makes state-of-the-art Bayesian methods easily accessible particularly for random components of the Galactic magnetic field.

  16. Modular GIS Framework for National Scale Hydrologic and Hydraulic Modeling Support

    NASA Astrophysics Data System (ADS)

    Djokic, D.; Noman, N.; Kopp, S.

    2015-12-01

    Geographic information systems (GIS) have been extensively used for pre- and post-processing of hydrologic and hydraulic models at multiple scales. An extensible GIS-based framework was developed for characterization of drainage systems (stream networks, catchments, floodplain characteristics) and model integration. The framework is implemented as a set of free, open-source Python tools, builds on core ArcGIS functionality, and uses geoprocessing capabilities to ensure extensibility. Utilization of COTS GIS core capabilities allows immediate use of model results in a variety of existing online applications and integration with other data sources and applications. The poster presents the use of this framework to downscale global hydrologic models to the local hydraulic scale and to post-process the hydraulic modeling results and generate floodplains at any local resolution. Flow forecasts from ECMWF or WRF-Hydro are downscaled and combined with other ancillary data for input into the RAPID flood routing model. RAPID model results (stream flow along each reach) are ingested into a GIS-based, scale-dependent stream network database for efficient flow utilization and visualization over space and time. Once the flows are known at localized reaches, the tools can be used to derive the floodplain depth and extent for each time step in the forecast at any available local resolution. If existing rating curves are available, they can be used to relate the flow to the depth of flooding, or synthetic rating curves can be derived using the tools in the toolkit and some ancillary data/assumptions. The results can be published as time-enabled spatial services to be consumed by web applications that use floodplain information as an input. Some of the existing online presentation templates can be easily combined with available online demographic and infrastructure data to present the impact of potential floods on the local community through simple, end-user products.
This framework has been successfully used in both the data rich environments as well as in locales with minimum available spatial and hydrographic data.
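    The rating-curve step described above (relating routed flow to flood depth) can be sketched as a simple piecewise-linear lookup. The discharge-stage pairs below are hypothetical placeholders, not output of the toolkit:

```python
from bisect import bisect_left

# Hypothetical rating curve: paired discharge (m^3/s) and stage (m) points.
# Real curves would come from gauge records or the toolkit's synthetic derivation.
RATING = [(5.0, 0.3), (20.0, 0.9), (60.0, 1.8), (150.0, 3.1), (400.0, 5.2)]

def flow_to_depth(q, curve=RATING):
    """Piecewise-linear interpolation of flood stage from routed flow."""
    flows = [f for f, _ in curve]
    if q <= flows[0]:
        return curve[0][1]
    if q >= flows[-1]:
        return curve[-1][1]
    i = bisect_left(flows, q)
    (q0, s0), (q1, s1) = curve[i - 1], curve[i]
    return s0 + (q - q0) / (q1 - q0) * (s1 - s0)

# Depth at each forecast time step for one reach
forecast_flows = [12.0, 85.0, 210.0]
depths = [flow_to_depth(q) for q in forecast_flows]
```

    Applied per reach and per time step, such a lookup yields the depth grids from which floodplain extent is mapped.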

  17. Technology for a Purpose: Technology for Information Problem-Solving with the Big6[R].

    ERIC Educational Resources Information Center

    Eisenberg, Mike B.

    2003-01-01

    Explains the Big6 model of information problem solving as a conceptual framework for learning and teaching information and technology skills. Highlights include information skills; examples of integrating technology in Big6 contexts; and the Big6 and the Internet, including email, listservs, chat, Web browsers, search engines, portals, Web…

  18. Boosting Probabilistic Graphical Model Inference by Incorporating Prior Knowledge from Multiple Sources

    PubMed Central

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available. PMID:23826291
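    The Noisy-OR idea is straightforward to sketch: an interaction receives consensus prior support if at least one information source "fires" for it. This is an illustrative reading of the model, not the authors' implementation, and the per-source probabilities are invented:

```python
from math import prod

def noisy_or_prior(supports, leak=0.0):
    """Noisy-OR combination of per-source edge support.

    supports: probabilities that each source (pathway database, GO terms,
    protein domain data, ...) indicates the interaction.
    leak: baseline probability of the edge absent any source.
    """
    return 1.0 - (1.0 - leak) * prod(1.0 - p for p in supports)

# Strong support from any single source dominates the consensus prior
p = noisy_or_prior([0.8, 0.1, 0.3])
```

    The resulting edge probabilities would then enter Bayesian network structure learning as a structure prior.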

  19. A framework for the evaluation of patient information leaflets

    PubMed Central

    Garner, Mark; Ning, Zhenye; Francis, Jill

    2011-01-01

    Abstract Background  The provision of patient information leaflets (PILs) is an important part of health care. PILs require evaluation, but the frameworks that are used for evaluation are largely under‐informed by theory. Most evaluation to date has been based on indices of readability, yet several writers argue that readability is not enough. We propose a framework for evaluating PILs that reflects the central role of the patient perspective in communication and uses evaluation methods based on simple linguistic principles. The proposed framework  The framework has three elements that give rise to three approaches to evaluation. Each element is a necessary but not sufficient condition for effective communication. Readability (focussing on text) may be assessed using existing well‐established procedures. Comprehensibility (focussing on reader and text) may be assessed using multiple‐choice questions based on the lexical and semantic features of the text. Communicative effectiveness (focussing on reader) explores the relationship between the emotional, cognitive and behavioural responses of the reader and the objectives of the PIL. Suggested methods for assessment are described, based on our preliminary empirical investigations. Conclusions  The tripartite model of communicative effectiveness is a patient‐centred framework for evaluating PILs. It may assist the field in moving beyond readability to broader indicators of the quality and appropriateness of printed information provided to patients. PMID:21332620
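    The readability element can be sketched with the standard Flesch Reading Ease formula, using a crude vowel-group syllable estimate; a real evaluation would use one of the well-established procedures the authors mention:

```python
import re

def flesch_reading_ease(text):
    """Flesch Reading Ease:
    206.835 - 1.015 * (words/sentence) - 84.6 * (syllables/word).
    Syllables are estimated crudely by counting vowel groups per word.
    """
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n = max(1, len(words))
    syllables = sum(
        max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words
    )
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

score = flesch_reading_ease("Take one tablet twice a day with food.")
```

    Higher scores indicate easier text; short sentences of short words score high, dense polysyllabic prose scores low or negative.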

  20. Vulnerability Assessment of Water Supply Systems: Status, Gaps and Opportunities

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2015-12-01

    Conventional frameworks for assessing the impacts of climate change on water resource systems use cascades of climate and hydrological models to provide 'top-down' projections of future water availability, but these are subject to high uncertainty and are model and scenario-specific. Hence there has been recent interest in 'bottom-up' frameworks, which aim to evaluate system vulnerability to change in the context of possible future climate and/or hydrological conditions. Such vulnerability assessments are generic, and can be combined with updated information from top-down assessments as they become available. While some vulnerability methods use hydrological models to estimate water availability, fully bottom-up schemes have recently been proposed that directly map system vulnerability as a function of feasible changes in water supply characteristics. These use stochastic algorithms, based on reconstruction or reshuffling methods, by which multiple water supply realizations can be generated under feasible ranges of change in water supply conditions. The paper reports recent successes, and points to areas of future improvement. Advances in stochastic modeling and optimization can address some technical limitations in flow reconstruction, while various data mining and system identification techniques can provide possibilities to better condition realizations for consistency with top-down scenarios. Finally, we show that probabilistic and Bayesian frameworks together can provide a potential basis to combine information obtained from fully bottom-up analyses with projections available from climate and/or hydrological models in a fully integrated risk assessment framework for deep uncertainty.
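    The reshuffling idea behind such fully bottom-up schemes can be sketched as follows: permute a historical flow sequence and scale it across a feasible range of mean-change factors to generate many water supply realizations. The function name, inputs, and change factors are hypothetical:

```python
import random

def reshuffled_realizations(flows, mean_changes, n=100, seed=0):
    """Generate flow realizations by reshuffling the historical sequence
    and scaling each copy by a sampled mean-change factor."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        factor = rng.choice(mean_changes)   # feasible change in mean supply
        seq = flows[:]
        rng.shuffle(seq)                    # reshuffle temporal ordering
        out.append([q * factor for q in seq])
    return out

hist = [120.0, 95.0, 140.0, 80.0, 110.0]    # hypothetical annual inflows
realizations = reshuffled_realizations(hist, [0.8, 0.9, 1.0, 1.1])
```

    Running the supply system model over each realization then maps vulnerability as a function of the imposed change, without committing to any single climate scenario.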

  1. A proposed analytic framework for determining the impact of an antimicrobial resistance intervention.

    PubMed

    Grohn, Yrjo T; Carson, Carolee; Lanzas, Cristina; Pullum, Laura; Stanhope, Michael; Volkova, Victoriya

    2017-06-01

    Antimicrobial use (AMU) is increasingly threatened by antimicrobial resistance (AMR). The FDA is implementing risk mitigation measures promoting prudent AMU in food animals. Their evaluation is crucial: the AMU/AMR relationship is complex; a suitable framework to analyze interventions is unavailable. Systems science analysis, depicting variables and their associations, would help integrate mathematics/epidemiology to evaluate the relationship. This would identify informative data and models to evaluate interventions. This National Institute for Mathematical and Biological Synthesis AMR Working Group's report proposes a system framework to address the methodological gap linking livestock AMU and AMR in foodborne bacteria. It could evaluate how AMU (and interventions) impact AMR. We will evaluate pharmacokinetic/dynamic modeling techniques for projecting AMR selection pressure on enteric bacteria. We study two methods to model phenotypic AMR changes in bacteria in the food supply and evolutionary genotypic analyses determining molecular changes in phenotypic AMR. Systems science analysis integrates the methods, showing how resistance in the food supply is explained by AMU and concurrent factors influencing the whole system. This process is updated with data and techniques to improve prediction and inform improvements for AMU/AMR surveillance. Our proposed framework reflects both the AMR system's complexity, and desire for simple, reliable conclusions.

  2. Detecting Anomalous Insiders in Collaborative Information Systems

    PubMed Central

    Chen, You; Nyemba, Steve; Malin, Bradley

    2012-01-01

    Collaborative information systems (CISs) are deployed within a diverse array of environments that manage sensitive information. Current security mechanisms detect insider threats, but they are ill-suited to monitor systems in which users function in dynamic teams. In this paper, we introduce the community anomaly detection system (CADS), an unsupervised learning framework to detect insider threats based on the access logs of collaborative environments. The framework is based on the observation that typical CIS users tend to form community structures based on the subjects accessed (e.g., patients’ records viewed by healthcare providers). CADS consists of two components: 1) relational pattern extraction, which derives community structures and 2) anomaly prediction, which leverages a statistical model to determine when users have sufficiently deviated from communities. We further extend CADS into MetaCADS to account for the semantics of subjects (e.g., patients’ diagnoses). To empirically evaluate the framework, we perform an assessment with three months of access logs from a real electronic health record (EHR) system in a large medical center. The results illustrate our models exhibit significant performance gains over state-of-the-art competitors. When the number of illicit users is low, MetaCADS is the best model, but as the number grows, commonly accessed semantics lead to hiding in a crowd, such that CADS is more prudent. PMID:24489520
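    The intuition behind CADS-style anomaly prediction can be sketched by scoring each user's dissimilarity to their nearest peers in a user-by-subject access matrix; the paper's actual statistical model differs, and the toy access log below is invented:

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two access vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def deviation_scores(access, k=2):
    """Score each user by dissimilarity to their k most similar peers.

    access: user -> binary vector over subjects (e.g., patient records).
    A high score means the user fits no community of co-accessors.
    """
    users = list(access)
    scores = {}
    for u in users:
        sims = sorted(
            (cosine(access[u], access[v]) for v in users if v != u),
            reverse=True)
        scores[u] = 1.0 - sum(sims[:k]) / k
    return scores

logs = {
    "a": [1, 1, 1, 0, 0],
    "b": [1, 1, 0, 0, 0],
    "c": [0, 1, 1, 0, 0],
    "x": [0, 0, 0, 1, 1],  # accesses disjoint subjects: no community overlap
}
scores = deviation_scores(logs)
```

    Users "a", "b", and "c" form a community around shared subjects, so "x", whose accesses overlap with no one, receives the highest deviation score.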

  3. Integration Framework of Process Planning based on Resource Independent Operation Summary to Support Collaborative Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulvatunyou, Boonserm; Wysk, Richard A.; Cho, Hyunbo

    2004-06-01

    In today's global manufacturing environment, manufacturing functions are distributed as never before. Design, engineering, fabrication, and assembly of new products are done routinely in many different enterprises scattered around the world. Successful business transactions require the sharing of design and engineering data on an unprecedented scale. This paper describes a framework that facilitates the collaboration of engineering tasks, particularly process planning and analysis, to support such globalized manufacturing activities. The information models of data and the software components that integrate those information models are described. The integration framework uses an Integrated Product and Process Data (IPPD) representation called a Resource Independent Operation Summary (RIOS) to facilitate the communication of business and manufacturing requirements. Hierarchical process modeling, process planning decomposition and an augmented AND/OR directed graph are used in this representation. The Resource Specific Process Planning (RSPP) module assigns required equipment and tools, selects process parameters, and determines manufacturing costs based on two-level hierarchical RIOS data. The shop floor knowledge (resource and process knowledge) and a hybrid approach (heuristic and linear programming) to linearize the AND/OR graph provide the basis for the planning. Finally, a prototype system is developed and demonstrated with an exemplary part. Java and XML (Extensible Markup Language) are used to ensure software and information portability.

  4. A Conceptual Framework and Principles for Trusted Pervasive Health

    PubMed Central

    Blobel, Bernd Gerhard; Seppälä, Antto Veikko; Sorvari, Hannu Olavi; Nykänen, Pirkko Anneli

    2012-01-01

    Background Ubiquitous computing technology, sensor networks, wireless communication and the latest developments of the Internet have enabled the rise of a new concept—pervasive health—which takes place in an open, unsecure, and highly dynamic environment (ie, in the information space). To be successful, pervasive health requires implementable principles for privacy and trustworthiness. Objective This research has two interconnected objectives. The first is to define pervasive health as a system and to understand its trust and privacy challenges. The second goal is to build a conceptual model for pervasive health and use it to develop principles and policies which can make pervasive health trustworthy. Methods In this study, a five-step system analysis method is used. Pervasive health is defined using a metaphor of digital bubbles. A conceptual framework model focused on trustworthiness and privacy is then developed for pervasive health. On that model, principles and rules for trusted information management in pervasive health are defined. Results In the first phase of this study, a new definition of pervasive health was created. Using this model, differences between pervasive health and health care are stated. Reviewed publications demonstrate that the widely used principles of predefined and static trust cannot guarantee trustworthiness and privacy in pervasive health. Instead, such an environment requires personal dynamic and context-aware policies, awareness, and transparency. A conceptual framework model focused on information processing in pervasive health is developed. Using features of pervasive health and relations from the framework model, new principles for trusted pervasive health have been developed. The principles propose that personal health data should be under control of the data subject. The person shall have the right to verify the level of trust of any system which collects or processes his or her health information.
Principles require that any stakeholder or system collecting or processing health data must support transparency and shall publish its trust and privacy attributes and even its domain specific policies. Conclusions The developed principles enable trustworthiness and guarantee privacy in pervasive health. The implementation of principles requires new infrastructural services such as trust verification and policy conflict resolution. After implementation, the accuracy and usability of principles should be analyzed. PMID:22481297

  5. A conceptual framework and principles for trusted pervasive health.

    PubMed

    Ruotsalainen, Pekka Sakari; Blobel, Bernd Gerhard; Seppälä, Antto Veikko; Sorvari, Hannu Olavi; Nykänen, Pirkko Anneli

    2012-04-06

    Ubiquitous computing technology, sensor networks, wireless communication and the latest developments of the Internet have enabled the rise of a new concept-pervasive health-which takes place in an open, unsecure, and highly dynamic environment (ie, in the information space). To be successful, pervasive health requires implementable principles for privacy and trustworthiness. This research has two interconnected objectives. The first is to define pervasive health as a system and to understand its trust and privacy challenges. The second goal is to build a conceptual model for pervasive health and use it to develop principles and policies which can make pervasive health trustworthy. In this study, a five-step system analysis method is used. Pervasive health is defined using a metaphor of digital bubbles. A conceptual framework model focused on trustworthiness and privacy is then developed for pervasive health. On that model, principles and rules for trusted information management in pervasive health are defined. In the first phase of this study, a new definition of pervasive health was created. Using this model, differences between pervasive health and health care are stated. Reviewed publications demonstrate that the widely used principles of predefined and static trust cannot guarantee trustworthiness and privacy in pervasive health. Instead, such an environment requires personal dynamic and context-aware policies, awareness, and transparency. A conceptual framework model focused on information processing in pervasive health is developed. Using features of pervasive health and relations from the framework model, new principles for trusted pervasive health have been developed. The principles propose that personal health data should be under control of the data subject. The person shall have the right to verify the level of trust of any system which collects or processes his or her health information. 
Principles require that any stakeholder or system collecting or processing health data must support transparency and shall publish its trust and privacy attributes and even its domain specific policies. The developed principles enable trustworthiness and guarantee privacy in pervasive health. The implementation of principles requires new infrastructural services such as trust verification and policy conflict resolution. After implementation, the accuracy and usability of principles should be analyzed.

  6. Solving Homeland Security’s Wicked Problems: A Design Thinking Approach

    DTIC Science & Technology

    2015-09-01

    This thesis provides a framework for how S&T can incorporate design-thinking principles that are working well in other domains to spur solutions. Galbraith's Star Model was used to analyze how DHS S&T, MindLab, and DARPA apply design-thinking principles to inform the framework…

  7. Implementing Accountable Care Organizations: Lessons From a Qualitative Analysis of Four Private Sector Organizations.

    PubMed

    Walker, Daniel M; Hefner, Jennifer L; Sova, Lindsey N; Hilligoss, Brian; Song, Paula H; McAlearney, Ann Scheck

    Accountable care organizations (ACOs) are emerging across the healthcare marketplace and now include Medicare, Medicaid, and private sector payers covering more than 24 million lives. However, little is known about the process of organizational change required to achieve cost savings and quality improvements from the ACO model. This study applies the complex innovation implementation framework to understand the challenges and facilitators associated with the ACO implementation process. We conducted four case studies of private sector ACOs, selected to achieve variation in terms of geography and organizational maturity. Across sites, we used semistructured interviews with 68 key informants to elicit information regarding ACO implementation. Our analysis found challenges and facilitators across all domains in the conceptual framework. Notably, our findings deviated from the framework in two ways. First, findings from the financial resource availability domain revealed both financial and nonfinancial (i.e., labor) resources that contributed to implementation effectiveness. Second, a new domain, patient engagement, emerged as an important factor in implementation effectiveness. We present these deviations in an adapted framework. As the ACO model proliferates, these findings can support implementation efforts, and they highlight the importance of focusing on patients throughout the process. Importantly, this study extends the complex innovation implementation framework to incorporate consumers into the implementation framework, making it more patient centered and aiding future efforts.

  8. CONCEPTUAL MODEL DEVELOPMENT AND INFORMATION MANAGEMENT FRAMEWORK FOR DIAGNOSTICS RESEARCH

    EPA Science Inventory

    Conceptual model development will focus on the effects of habitat alteration, nutrients,suspended and bedded sediments, and toxic chemicals on appropriate endpoints (individuals, populations, communities, ecosystems) across spatial scales (habitats, water body, watershed, region)...

  9. A Framework for Modeling Emerging Diseases to Inform Management

    PubMed Central

    Katz, Rachel A.; Richgels, Katherine L.D.; Walsh, Daniel P.; Grant, Evan H.C.

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge. PMID:27983501

  10. A Framework for Modeling Emerging Diseases to Inform Management.

    PubMed

    Russell, Robin E; Katz, Rachel A; Richgels, Katherine L D; Walsh, Daniel P; Grant, Evan H C

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge.

  11. A framework for modeling emerging diseases to inform management

    USGS Publications Warehouse

    Russell, Robin E.; Katz, Rachel A.; Richgels, Katherine L. D.; Walsh, Daniel P.; Grant, Evan H. Campbell

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge.

  12. Using a theoretical framework to investigate whether the HIV/AIDS information needs of the AfroAIDSinfo Web portal members are met: a South African eHealth study.

    PubMed

    Van Zyl, Hendra; Kotze, Marike; Laubscher, Ria

    2014-03-28

    eHealth has been identified as a useful approach to disseminate HIV/AIDS information. Together with Consumer Health Informatics (CHI), the Web-to-Public Knowledge Transfer Model (WPKTM) has been applied as a theoretical framework to identify consumer needs for AfroAIDSinfo, a South African Web portal. As part of the CHI practice, regular eSurveys are conducted to determine whether these needs are changing and are continually being met. eSurveys show high rates of satisfaction with the content as well as the modes of delivery. Members consider the information reliable enough to reuse, both for education and for referencing. Using CHI and the WPKTM as a theoretical framework ensures that the needs of consumers are met and that they find the tailored methods of presenting the information agreeable. Combining ICTs and theories in eHealth interventions, this approach can be expanded to deliver information in other sectors of public health.

  13. Using a Theoretical Framework to Investigate Whether the HIV/AIDS Information Needs of the AfroAIDSinfo Web Portal Members Are Met: A South African eHealth Study

    PubMed Central

    Van Zyl, Hendra; Kotze, Marike; Laubscher, Ria

    2014-01-01

    eHealth has been identified as a useful approach to disseminate HIV/AIDS information. Together with Consumer Health Informatics (CHI), the Web-to-Public Knowledge Transfer Model (WPKTM) has been applied as a theoretical framework to identify consumer needs for AfroAIDSinfo, a South African Web portal. As part of the CHI practice, regular eSurveys are conducted to determine whether these needs are changing and are continually being met. eSurveys show high rates of satisfaction with the content as well as the modes of delivery. Members consider the information reliable enough to reuse, both for education and for referencing. Using CHI and the WPKTM as a theoretical framework ensures that the needs of consumers are met and that they find the tailored methods of presenting the information agreeable. Combining ICTs and theories in eHealth interventions, this approach can be expanded to deliver information in other sectors of public health. PMID:24686487

  14. Effective information channels for reducing costs of environmentally- friendly technologies: evidence from residential PV markets

    NASA Astrophysics Data System (ADS)

    Rai, Varun; Robinson, Scott A.

    2013-03-01

    Realizing the environmental benefits of solar photovoltaics (PV) will require reducing costs associated with perception, informational gaps and technological uncertainties. To identify opportunities to decrease costs associated with residential PV adoption, in this letter we use multivariate regression models to analyze a unique, household-level dataset of PV adopters in Texas (USA) to systematically quantify the effect of different information channels on aspiring PV adopters’ decision-making. We find that the length of the decision period depends on the business model, such as whether the system was bought or leased, and on special opportunities to learn, such as the influence of other PV owners in the neighborhood. This influence accrues passively through merely witnessing PV systems in the neighborhood, increasing confidence and motivation, as well as actively through peer-to-peer communications. Using these insights we propose a new framework to provide public information on PV that could drastically reduce barriers to PV adoption, thereby accelerating its market penetration and environmental benefits. This framework could also serve as a model for other distributed generation technologies.

  15. Generalized information fusion and visualization using spatial voting and data modeling

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger M.; Handley, James W.

    2013-05-01

    We present a novel and innovative information fusion and visualization framework for multi-source intelligence (multiINT) data using Spatial Voting (SV) and Data Modeling. We describe how different sources of information can be converted into numerical form for further processing downstream, followed by a short description of how this information can be fused using the SV grid. As an illustrative example, we show the modeling of cyberspace as cyber layers for the purpose of tracking cyber personas. Finally we describe a path ahead for creating interactive agile networks through defender customized Cyber-cubes for network configuration and attack visualization.

  16. Model architecture of intelligent data mining oriented urban transportation information

    NASA Astrophysics Data System (ADS)

    Yang, Bogang; Tao, Yingchun; Sui, Jianbo; Zhang, Feizhou

    2007-06-01

    Aiming to solve practical problems in urban traffic, the paper presents a model architecture for intelligent data mining from a hierarchical view. By incorporating artificial intelligence technologies into the framework, the data mining process improves and better adapts to changing real-time road conditions. The architecture also provides efficient technical support for urban transport information distribution, transmission, and display.

  17. Stochastic Online Learning in Dynamic Networks under Unknown Models

    DTIC Science & Technology

    2016-08-02

    Repeated Game with Incomplete Information, IEEE International Conference on Acoustics, Speech, and Signal Processing, 20-MAR-16, Shanghai, China. …in a game-theoretic framework for the application of multi-seller dynamic pricing with unknown demand models. We formulated the problem as an infinitely repeated game with incomplete information and developed a dynamic pricing strategy referred to as Competitive and Cooperative Demand Learning

  18. Using Social Learning Theory to Reduce Small Business Breakdown along the Internet Superhighway: An Exploratory Model.

    ERIC Educational Resources Information Center

    Barker, Randolph T.; Sturm, Philip R.; Camarata, Michael

    1997-01-01

    Calls upon the innovators of the communications field to look in the rear view mirror to see the businesses left behind in the information expansion race. Puts forth a model utilizing social learning theory to define a framework for "road service," getting the small business "resister" up to the information superhighway speed…

  19. Highway extraction from high resolution aerial photography using a geometric active contour model

    NASA Astrophysics Data System (ADS)

    Niu, Xutong

    Highway extraction and vehicle detection are two of the most important steps in traffic-flow analysis from multi-frame aerial photographs. The traditional method of deriving traffic flow trajectories relies on manual vehicle counting from a sequence of aerial photographs, which is tedious and time-consuming. This research presents a new framework for semi-automatic highway extraction. The basis of the new framework is an improved geometric active contour (GAC) model. This novel model seeks to minimize an objective function that transforms a problem of propagation of regular curves into an optimization problem. The implementation of curve propagation is based on level set theory. By using an implicit representation of a two-dimensional curve, a level set approach can be used to deal with topological changes naturally, and the output is unaffected by different initial positions of the curve. However, the original GAC model, on which the new model is based, only incorporates boundary information into the curve propagation process. An error-producing phenomenon called leakage is inevitable wherever there is an uncertain weak edge. In this research, region-based information is added as a constraint into the original GAC model, thereby giving this proposed method the ability to integrate both boundary and region-based information during the curve propagation. Adding the region-based constraint eliminates the leakage problem. This dissertation applies the proposed augmented GAC model to the problem of highway extraction from high-resolution aerial photography. First, an optimized stopping criterion is designed and used in the implementation of the GAC model. It effectively saves processing time and computations. Second, a seed point propagation framework is designed and implemented. This framework incorporates highway extraction, tracking, and linking into one procedure.
A seed point is usually placed at an end node of highway segments close to the boundary of the image or at a position where possible blocking may occur, such as at an overpass bridge or near vehicle crowds. These seed points can be automatically propagated throughout the entire highway network. During the process, road center points are also extracted, which introduces a search direction for solving possible blocking problems. This new framework has been successfully applied to highway network extraction from a large orthophoto mosaic. In the process, vehicles on the highway extracted from the mosaic were detected with an 83% success rate.

  20. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key to improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed, and other model and reanalysis datasets in their native formats. In addition to traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information-theory measures, spatial similarity metrics, and scale decomposition techniques that provide novel ways of performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization, and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model-data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
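
    As an illustration of the two families of measures the abstract contrasts, the sketch below computes traditional accuracy metrics (bias, RMSE) alongside one simple information-theory diagnostic (Shannon entropy of a binned series). This is not LVT code; the function names and the binning scheme are invented for illustration.

```python
# Illustrative sketch (not LVT's API) of accuracy-based measures versus an
# information-theory diagnostic for a model/observation time-series pair.
import math

def accuracy_metrics(model, obs):
    """Traditional accuracy measures: mean bias and root-mean-square error."""
    n = len(model)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    return bias, rmse

def shannon_entropy(series, bins=4):
    """Entropy (bits) of a series after simple equal-width binning."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / bins or 1.0  # guard against a constant series
    counts = [0] * bins
    for x in series:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    probs = [c / len(series) for c in counts if c]
    return -sum(p * math.log2(p) for p in probs)
```

    The accuracy metrics ask "how close is the model to the observation?", while the entropy-style diagnostic asks "how much variability does each signal carry?", which is the kind of complementary view the toolkit is described as providing.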

  1. Soliciting scientific information and beliefs in predictive modeling and adaptive management

    NASA Astrophysics Data System (ADS)

    Glynn, P. D.; Voinov, A. A.; Shapiro, C. D.

    2015-12-01

    Post-normal science requires public engagement and adaptive corrections in addressing issues with high complexity and uncertainty. An adaptive management framework is presented for the improved management of natural resources and environments through a public participation process. The framework solicits not only the gathering, transformation, and/or modeling of scientific information but also the explicit expression of participant beliefs. Beliefs and information are compared, explicitly discussed for alignments or misalignments, and ultimately melded back together as a "knowledge" basis for making decisions. An effort is made to recognize the human or participant biases that may affect the information base and the potential decisions. In a separate step, an attempt is made to recognize and predict the potential "winners" and "losers" (perceived or real) of any decision or action. These "winners" and "losers" include present human communities with different spatial, demographic or socio-economic characteristics as well as more dispersed or more diffusely characterized regional or global communities. "Winners" and "losers" may also include future human communities as well as communities of other biotic species. As in any adaptive management framework, assessment of predictions, iterative follow-through, and adaptation of policies or actions are essential, and commonly very difficult or impossible to achieve. Recognizing beforehand the limits of adaptive management is essential. More generally, knowledge of the behavioral and economic sciences and of ethics and sociology will be key to a successful implementation of this adaptive management framework. Knowledge of biogeophysical processes will also be essential, but by definition of the issues being addressed, will always be incomplete and highly uncertain. The human dimensions of the issues addressed and the participatory processes used carry their own complexities and uncertainties. 
Some ideas and principles are provided that may help guide and implement the proposed adaptive management framework and its public and stakeholder engagement processes. Examples and characteristics of issues that could be beneficially addressed through the proposed framework will also be presented.

  2. Developing a performance measurement framework and indicators for community health service facilities in urban China.

    PubMed

    Wong, Sabrina T; Yin, Delu; Bhattacharyya, Onil; Wang, Bin; Liu, Liqun; Chen, Bowen

    2010-11-18

    China has had no effective and systematic information system to provide guidance for strengthening PHC (Primary Health Care) or account to citizens on progress. We report on the development of the China results-based Logic Model for Community Health Facilities and Stations (CHS) and a set of relevant PHC indicators intended to measure CHS priorities. We adapted the PHC Results Based Logic Model developed in Canada and current work conducted in the community health system in China to create the China CHS Logic Model framework. We used a staged approach by first constructing the framework and indicators and then validating their content through an interactive process involving policy analysis, critical review of relevant literature and multiple stakeholder consultation. The China CHS Logic Model includes inputs, activities, outputs and outcomes, with a total of 287 detailed performance indicators. Of these, 31 indicators measure inputs, 64 measure activities, 105 measure outputs, and 87 measure immediate (n = 65), intermediate (n = 15), or final (n = 7) outcomes. A Logic Model framework can be useful in planning, implementation, analysis and evaluation of PHC at a system and service level. The development and content validation of the China CHS Logic Model and subsequent indicators provides a means for stronger accountability and a clearer sense of overall direction and purpose needed to renew and strengthen the PHC system in China. Moreover, this work will be useful in moving towards developing a PHC information system and performance measurement across districts in urban China, and guiding the pursuit of quality in PHC.
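
    The indicator arithmetic in the abstract can be checked directly; the data structure below is hypothetical, with category names taken from the abstract.

```python
# Hypothetical organization of the China CHS Logic Model indicator counts
# reported in the abstract; only the numbers come from the source.
chs_indicators = {
    "inputs": 31,
    "activities": 64,
    "outputs": 105,
    "outcomes": {"immediate": 65, "intermediate": 15, "final": 7},
}

# The outcome subcategories sum to the reported 87 outcome indicators,
# and all four categories sum to the reported 287 total.
outcome_total = sum(chs_indicators["outcomes"].values())
grand_total = (chs_indicators["inputs"] + chs_indicators["activities"]
               + chs_indicators["outputs"] + outcome_total)
```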

  3. Developing a Performance Measurement Framework and Indicators for Community Health Service Facilities in Urban China

    PubMed Central

    2010-01-01

    Background China has had no effective and systematic information system to provide guidance for strengthening PHC (Primary Health Care) or account to citizens on progress. We report on the development of the China results-based Logic Model for Community Health Facilities and Stations (CHS) and a set of relevant PHC indicators intended to measure CHS priorities. Methods We adapted the PHC Results Based Logic Model developed in Canada and current work conducted in the community health system in China to create the China CHS Logic Model framework. We used a staged approach by first constructing the framework and indicators and then validating their content through an interactive process involving policy analysis, critical review of relevant literature and multiple stakeholder consultation. Results The China CHS Logic Model includes inputs, activities, outputs and outcomes, with a total of 287 detailed performance indicators. Of these, 31 indicators measure inputs, 64 measure activities, 105 measure outputs, and 87 measure immediate (n = 65), intermediate (n = 15), or final (n = 7) outcomes. Conclusion A Logic Model framework can be useful in planning, implementation, analysis and evaluation of PHC at a system and service level. The development and content validation of the China CHS Logic Model and subsequent indicators provides a means for stronger accountability and a clearer sense of overall direction and purpose needed to renew and strengthen the PHC system in China. Moreover, this work will be useful in moving towards developing a PHC information system and performance measurement across districts in urban China, and guiding the pursuit of quality in PHC. PMID:21087516

  4. Reasons Preventing Teachers from Acting within the Framework of Ethical Principles

    ERIC Educational Resources Information Center

    Dag, Nilgün; Arslantas, Halis Adnan

    2015-01-01

    This study aims at putting forth the reasons preventing teachers from acting ethically, acting within the framework of ethical principles and having an ethical tendency. This study featuring a qualitative research model taking as a basis the case study approach followed a path of selecting people that can be a rich source of information for…

  5. Building health behavior models to guide the development of just-in-time adaptive interventions: A pragmatic framework

    PubMed Central

    Nahum-Shani, Inbal; Hekler, Eric B.; Spruijt-Metz, Donna

    2016-01-01

    Advances in wireless devices and mobile technology offer many opportunities for delivering just-in-time adaptive interventions (JITAIs): suites of interventions that adapt over time to an individual’s changing status and circumstances, with the goal of addressing the individual’s need for support whenever that need arises. A major challenge confronting behavioral scientists aiming to develop a JITAI concerns the selection and integration of existing empirical, theoretical and practical evidence into a scientific model that can inform the construction of a JITAI and help identify scientific gaps. The purpose of this paper is to establish a pragmatic framework that can be used to organize existing evidence into a useful model for JITAI construction. This framework involves clarifying the conceptual purpose of a JITAI, namely the provision of just-in-time support via adaptation, as well as describing the components of a JITAI and articulating a list of concrete questions to guide the establishment of a useful model for JITAI construction. The proposed framework includes an organizing scheme for translating the relatively static scientific models underlying many health behavior interventions into a more dynamic model that better incorporates the element of time. This framework will help to guide the next generation of empirical work to support the creation of effective JITAIs. PMID:26651462

  6. A Conceptual Framework for SAHRA Integrated Multi-resolution Modeling in the Rio Grande Basin

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Gupta, H.; Springer, E.; Wagener, T.; Brookshire, D.; Duffy, C.

    2004-12-01

    The sustainable management of water resources in a river basin requires an integrated analysis of the social, economic, environmental and institutional dimensions of the problem. Numerical models are commonly used for integration of these dimensions and for communication of the analysis results to stakeholders and policy makers. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated multi-resolution models to assess impacts of climate variability and land use change on water resources in the Rio Grande Basin. These models not only couple natural systems such as surface and ground waters, but will also include engineering, economic and social components that may be involved in water resources decision-making processes. This presentation will describe the conceptual framework being developed by SAHRA to guide and focus the multiple modeling efforts and to assist the modeling team in planning, data collection and interpretation, communication, evaluation, etc. One of the major components of this conceptual framework is a Conceptual Site Model (CSM), which describes the basin and its environment based on existing knowledge and identifies what additional information must be collected to develop technically sound models at various resolutions. The initial CSM is based on analyses of basin profile information that has been collected, including a physical profile (e.g., topographic and vegetative features), a man-made facility profile (e.g., dams, diversions, and pumping stations), and a land use and ecological profile (e.g., demographics, natural habitats, and endangered species). Based on the initial CSM, a Conceptual Physical Model (CPM) is developed to guide and evaluate the selection of a model code (or numerical model) for each resolution to conduct simulations and predictions. 
A CPM identifies, conceptually, all the physical processes and engineering and socio-economic activities occurring (or to occur) in the real system that the corresponding numerical models are required to address, such as riparian evapotranspiration responses to vegetation change and groundwater pumping impacts on soil moisture contents. Simulation results from different resolution models and observations of the real system will then be compared to evaluate the consistency among the CSM, the CPMs, and the numerical models, and feedbacks will be used to update the models. In a broad sense, the evaluation of the models (conceptual or numerical), as well as the linkages between them, can be viewed as part of the overall conceptual framework. As new data are generated and understanding improves, the models will evolve and the overall conceptual framework will be refined; its development is thus an ongoing process. We will describe the current state of this framework and the open questions that have to be addressed in the future.

  7. Bayesian Knowledge Fusion in Prognostics and Health Management—A Case Study

    NASA Astrophysics Data System (ADS)

    Rabiei, Masoud; Modarres, Mohammad; Mohammad-Djafari, Ali

    2011-03-01

    In the past few years, a research effort has been in progress at the University of Maryland to develop a Bayesian framework based on Physics of Failure (PoF) for risk assessment and fleet management of aging airframes. Despite significant achievements in modeling crack growth behavior using fracture mechanics, it is still of great interest to find practical techniques for monitoring crack growth using nondestructive inspection and to integrate such inspection results with the fracture mechanics models to improve the predictions. The ultimate goal of this effort is to develop an integrated probabilistic framework for utilizing all of the available information to produce enhanced (less uncertain) predictions of the structural health of the aircraft in future missions. Such information includes material-level fatigue models and test data, health monitoring measurements, and inspection field data. In this paper, a case study of using a Bayesian fusion technique for integrating information from multiple sources in a structural health management problem is presented.

  8. Development of a Conceptual Framework to Measure the Social Impact of Burns.

    PubMed

    Marino, Molly; Soley-Bori, Marina; Jette, Alan M; Slavin, Mary D; Ryan, Colleen M; Schneider, Jeffrey C; Resnik, Linda; Acton, Amy; Amaya, Flor; Rossi, Melinda; Soria-Saucedo, Rene; Kazis, Lewis E

    Measuring community reintegration following burn injury is important to assess the efficacy of therapies designed to optimize recovery. This project aims to develop and validate a conceptual framework for understanding the social impact of burn injuries in adults. The framework is critical for developing the item banks used for a computerized adaptive test. We performed a comprehensive literature review and consulted with clinical experts and burn survivors about the areas of social life affected by burn injury. Focus groups with burn survivors and clinicians were conducted to inform and validate the framework. Transcripts were coded using grounded theory methodology. The World Health Organization's International Classification of Functioning, Disability and Health was chosen to ground the content model. The primary construct identified was social participation, which contains two concepts: societal role and personal relationships. The subdomains chosen for item development were work, recreation and leisure, relating with strangers, and romantic, sexual, family, and informal relationships. Qualitative results strongly suggest that the conceptual model fits the constructs for societal role and personal relationships with the respective subdomains. This conceptual framework has guided the implementation of a large-scale calibration study currently underway, which will lead to a computerized adaptive test for monitoring the social impacts of burn injuries during recovery.

  9. Information Technology for Schools: Creating Practical Knowledge To Improve Student Performance. The Jossey-Bass Education Series.

    ERIC Educational Resources Information Center

    Kallick, Bena, Ed.; Wilson, James M., III, Ed.

    This book chronicles practitioners' struggles in implementing information technology, identifies the existing barriers to implementation, and provides a set of frameworks from the current understanding of this process to support learning through information creation. The chapters are: chapter 1, "A Model for Organizational Learning: The…

  10. Building a Foundation to Study Distributed Information Behaviour

    ERIC Educational Resources Information Center

    von Thaden, Terry L.

    2007-01-01

    Introduction: The purpose of this research is to assess information behaviour as it pertains to operational teams in dynamic safety critical operations. Method: In this paper, I describe some of the problems faced by crews on modern flight decks and suggest a framework modelled on Information Science, Human Factors, and Activity Theory research to…

  11. Acceptance Factors Influencing Adoption of National Institute of Standards and Technology Information Security Standards: A Quantitative Study

    ERIC Educational Resources Information Center

    Kiriakou, Charles M.

    2012-01-01

    Adoption of a comprehensive information security governance model and security controls is the best option organizations may have to protect their information assets and comply with regulatory requirements. Understanding acceptance factors of the National Institute of Standards and Technology (NIST) Risk Management Framework (RMF) comprehensive…

  12. Vulnerable Populations in Hospital and Health Care Emergency Preparedness Planning: A Comprehensive Framework for Inclusion.

    PubMed

    Kreisberg, Debra; Thomas, Deborah S K; Valley, Morgan; Newell, Shannon; Janes, Enessa; Little, Charles

    2016-04-01

    As attention to emergency preparedness becomes a critical element of health care facility operations planning, efforts to recognize and integrate the needs of vulnerable populations in a comprehensive manner have lagged. This not only results in decreased levels of equitable service, but also affects the functioning of the health care system in disasters. While this report emphasizes the United States context, the concepts and approaches apply beyond this setting. This report: (1) describes a conceptual framework that provides a model for the inclusion of vulnerable populations into integrated health care and public health preparedness; and (2) applies this model to a pilot study. The framework is derived from literature, hospital regulatory policy, and health care standards, laying out the communication and relational interfaces that must occur at the systems, organizational, and community levels for a successful multi-level health care systems response that is explicitly inclusive of diverse populations. The pilot study illustrates the application of key elements of the framework, using a four-pronged approach that incorporates both quantitative and qualitative methods for deriving information that can inform hospital and health facility preparedness planning. The conceptual framework and model, applied to a pilot project, guide expanded work that ultimately can result in methodologically robust approaches to comprehensively incorporating vulnerable populations into the fabric of hospital disaster preparedness at levels from local to national, thus supporting best practices for a community resilience approach to disaster preparedness.

  13. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    NASA Astrophysics Data System (ADS)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

    Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can remain in their own local environment, making it easy for modelers to maintain and update them. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) model metadata describing the external features of a model (e.g., variable name, unit, computational grid, etc.) and (3) a model integration framework. We present an architecture for coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing, exposing their metadata through BMI functions. Once a BMI-enabled model is exposed as a service, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2014), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing the model interface using BMI, as well as providing a set of utilities that smooth the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of BMI-enabled web service models. Using the revised EMELI, an example will be presented of integrating a set of TopoFlow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. 
(2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014, 11th International Conf. on Hydroinformatics, New York, NY.
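
    The BMI pattern the abstract depends on can be illustrated with a toy component: every model exposes the same lifecycle and introspection calls, so a coupler can drive it without knowing its internals. The class below is a simplified stand-in (in the real BMI specification, `initialize` takes a configuration file, and EMELI's coupling logic is far richer); the variable name and the reservoir dynamics are invented.

```python
# Toy BMI-style component and coupler, illustrating how a framework can drive
# any model through a fixed, self-describing interface. Not real BMI/EMELI code.

class LinearReservoir:
    """Toy component: storage drains at a fixed fractional rate each step."""

    def initialize(self, storage=10.0, rate=0.5):
        self.storage, self.rate, self.time = storage, rate, 0

    def get_output_var_names(self):
        # Self-description: the coupler discovers outputs instead of hardcoding them.
        return ["reservoir__outflow_volume"]

    def get_value(self, name):
        assert name == "reservoir__outflow_volume"
        return self.rate * self.storage

    def update(self):
        self.storage -= self.get_value("reservoir__outflow_volume")
        self.time += 1

    def finalize(self):
        pass

def run_coupled(model, steps):
    """Toy coupler: interacts with the model through its BMI-style calls only."""
    model.initialize()
    series = []
    for _ in range(steps):
        series.append(model.get_value(model.get_output_var_names()[0]))
        model.update()
    model.finalize()
    return series
```

    Because `run_coupled` touches the model only through the shared interface, the same loop would work if the calls were forwarded over HTTP to a remote service, which is the step the paper's web service architecture adds.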

  14. A Framework for Organizing Current and Future Electric Utility Regulatory and Business Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Satchwell, Andrew; Cappers, Peter; Schwartz, Lisa

    In this report, we will present a descriptive and organizational framework for incremental and fundamental changes to regulatory and utility business models in the context of clean energy public policy goals. We will also discuss the regulated utility's role in providing value-added services that relate to distributed energy resources, identify the "openness" of customer information and utility networks necessary to facilitate change, and discuss the relative risks, and the shifting of risks, for utilities and customers.

  15. Automatic classification of animal vocalizations

    NASA Astrophysics Data System (ADS)

    Clemins, Patrick J.

    2005-11-01

    Bioacoustics, the study of animal vocalizations, has begun to use increasingly sophisticated analysis techniques in recent years. Some common tasks in bioacoustics are repertoire determination, call detection, individual identification, stress detection, and behavior correlation. Each research study, however, uses a wide variety of different measured variables, called features, and classification systems to accomplish these tasks. The well-established field of human speech processing has developed a number of techniques to perform many of the aforementioned bioacoustics tasks. Mel-frequency cepstral coefficients (MFCCs) and perceptual linear prediction (PLP) coefficients are two popular feature sets. The hidden Markov model (HMM), a statistical model similar to a finite automaton, is the most commonly used supervised classification model and is capable of modeling both temporal and spectral variations. This research designs a framework that applies models from human speech processing to bioacoustic analysis tasks. The development of the generalized perceptual linear prediction (gPLP) feature extraction model is one of the more important novel contributions of the framework. Perceptual information from the species under study can be incorporated into the gPLP feature extraction model to represent the vocalizations as the animals might perceive them. By including this perceptual information and modifying parameters of the HMM classification system, the framework can be applied to a wide range of species. The effectiveness of the framework is shown by analyzing African elephant and beluga whale vocalizations. The features extracted from the African elephant data are used as input to a supervised classification system and compared to results from traditional statistical tests. The gPLP features extracted from the beluga whale data are used in an unsupervised classification system and the results are compared to labels assigned by experts. 
The development of a framework from which to build animal vocalization classifiers will provide bioacoustics researchers with a consistent platform to analyze and classify vocalizations. A common framework will also allow studies to compare results across species and institutions. In addition, the use of automated classification techniques can speed analysis and uncover behavioral correlations not readily apparent using traditional techniques.
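
    The HMM classifier named above rests on the forward recursion, which scores how well an observation sequence fits a model; a classifier then picks the model (e.g., one per call type or species) with the highest score. The sketch below assumes feature vectors such as MFCC or gPLP coefficients have already been quantized to discrete symbols, and all model parameters here are invented.

```python
# Forward algorithm for a discrete-observation hidden Markov model: computes
# P(obs | model). Parameters (start, trans, emit) are illustrative, not from
# the dissertation.

def forward_likelihood(obs, start, trans, emit):
    """P(obs | HMM) via the forward recursion.

    obs   : list of discrete observation symbols (ints)
    start : start[s] = P(first state is s)
    trans : trans[p][s] = P(next state s | current state p)
    emit  : emit[s][o] = P(observing symbol o | state s)
    """
    n_states = len(start)
    # Initialization: alpha[s] = P(first obs, first state = s).
    alpha = [start[s] * emit[s][obs[0]] for s in range(n_states)]
    # Recursion: sum over predecessor states, then emit the next symbol.
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in range(n_states)) * emit[s][o]
                 for s in range(n_states)]
    return sum(alpha)
```

    To classify a vocalization one would evaluate `forward_likelihood` under each trained model and report the argmax; in practice the recursion is run in log space to avoid underflow on long sequences.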

  16. Translating Extreme Precipitation Data from Climate Change Projections into Resilient Engineering Applications

    NASA Astrophysics Data System (ADS)

    Cook, L. M.; Samaras, C.; Anderson, C.

    2016-12-01

    Engineers generally use historical precipitation trends to inform assumptions and parameters for long-lived infrastructure designs. However, resilient design calls for adjusting current engineering practice to incorporate a range of future climate conditions that are likely to differ from the past. Despite the availability of future projections from downscaled climate models, there remains a considerable mismatch between climate model outputs and the inputs the engineering community needs to incorporate climate resiliency. These factors include differences in temporal and spatial scales, model uncertainties, and a lack of criteria for selecting an ensemble of models. This research addresses the limitations of working with climate data by providing a framework for using publicly available downscaled climate projections to inform engineering resiliency. The framework consists of five steps: 1) selecting the data source based on the engineering application, 2) extracting the data at a specific location, 3) validating performance against observed data, 4) post-processing for bias or scale, and 5) selecting the ensemble and calculating statistics. The framework is illustrated with an example application to extreme precipitation-frequency statistics, the 25-year daily precipitation depth, using four publicly available climate data sources: NARCCAP, USGS, Reclamation, and MACA. The attached figure presents the results for step 5 of the framework, analyzing how the 24H25Y depth changes when the model ensemble is culled based on model performance against observed data, for both post-processing techniques: bias correction and change factor. Culling the model ensemble increases both the mean and median values for all data sources, and reduces the range for the NARCCAP and MACA ensembles due to the elimination of poorer-performing models and, in some cases, those that predict a decrease in future 24H25Y precipitation volumes. 
This result is especially relevant to engineers who wish to reduce the range of the ensemble and remove contradicting models; however, this result is not generalizable for all cases. Finally, this research highlights the need for the formation of an intermediate entity that is able to translate climate projections into relevant engineering information.
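
    The change-factor post-processing named in step 4 and the ensemble statistics of step 5 can be sketched as follows; the function names and numbers are illustrative, not the paper's code.

```python
# Sketch of change-factor post-processing and ensemble statistics for a design
# precipitation depth. All values and names here are invented for illustration.

def change_factor_depth(obs_depth, model_hist, model_future):
    """Scale an observed design depth by the modeled future/historical ratio."""
    return obs_depth * (model_future / model_hist)

def ensemble_stats(depths):
    """Mean and median across an ensemble of adjusted design depths."""
    ordered = sorted(depths)
    n = len(ordered)
    mean = sum(ordered) / n
    median = (ordered[n // 2] if n % 2 else
              (ordered[n // 2 - 1] + ordered[n // 2]) / 2.0)
    return mean, median
```

    Culling the ensemble corresponds simply to filtering the `depths` list to models that meet a performance threshold before calling `ensemble_stats`, which is why both the mean and the spread of the statistics shift.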

  17. A Framework for Analysis of Research Risks and Benefits to Participants in Standard of Care Pragmatic Clinical Trials

    PubMed Central

    Chen, Stephanie C; Kim, Scott Y H

    2016-01-01

    Background/Aims Standard of care pragmatic clinical trials (SCPCTs) that compare treatments already in use could improve care and reduce cost but there is considerable debate about the research risks of SCPCTs and how to apply informed consent regulations to such trials. We sought to develop a framework integrating the insights from opposing sides of the debate. Methods We developed a formal risk-benefit analysis framework for SCPCTs and then applied it to key provisions of the U.S. federal regulations. Results Our formal framework for SCPCT risk-benefit analysis takes into account three key considerations: the ex ante estimates of risks and benefits of the treatments to be compared in a SCPCT, the allocation ratios of treatments inside and outside a SCPCT, and the significance of some participants receiving a different treatment inside a SCPCT than outside the trial. The framework provides practical guidance on how the research ethics regulations on informed consent should be applied to SCPCTs. Conclusions Our proposed formal model makes explicit the relationship between the concepts used by opposing sides of the debate about the research risks of SCPCTs and can be used to clarify the implications for informed consent. PMID:27365010
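
    The three ingredients of the formal model (ex ante expected outcomes of the compared treatments, the allocation ratio inside the trial, and the allocation outside it) suggest a simple calculation, sketched below with invented numbers: the net effect of enrolling is the difference in expected outcome inside versus outside the SCPCT. This is a paraphrase of the framework's logic, not the authors' exact formalization.

```python
# Illustrative sketch of the SCPCT risk-benefit comparison: expected outcome
# under the trial's allocation versus the allocation in usual care.
# All probabilities and outcome values are hypothetical.

def expected_outcome(p_a, value_a, value_b):
    """Expected outcome when treatment A is given with probability p_a."""
    return p_a * value_a + (1 - p_a) * value_b

def net_effect_of_enrolling(p_in, p_out, value_a, value_b):
    """Difference in expected outcome inside vs. outside the trial.

    Zero when the trial's allocation matches usual care; nonzero when some
    participants receive a different treatment inside the trial than they
    would have outside it, which is the 'research risk' at issue in the debate.
    """
    return (expected_outcome(p_in, value_a, value_b)
            - expected_outcome(p_out, value_a, value_b))
```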

  18. Open data models for smart health interconnected applications: the example of openEHR.

    PubMed

    Demski, Hans; Garde, Sebastian; Hildebrand, Claudia

    2016-10-22

    Smart Health is a concept that combines networking, intelligent data processing, and the fusion of patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.

  19. The fusion of large scale classified side-scan sonar image mosaics.

    PubMed

    Reed, Scott; Tena Ruiz, Ioseba; Capus, Chris; Petillot, Yvan

    2006-07-01

    This paper presents a unified framework for the creation of classified maps of the seafloor from sonar imagery. Significant challenges in photometric correction, classification, navigation and registration, and image fusion are addressed. The techniques described are directly applicable to a range of remote sensing problems. Recent advances in side-scan data correction are incorporated to compensate for the sonar beam pattern and motion of the acquisition platform. The corrected images are segmented using pixel-based textural features and standard classifiers. In parallel, the navigation of the sonar device is processed using Kalman filtering techniques. A simultaneous localization and mapping framework is adopted to improve the navigation accuracy and produce georeferenced mosaics of the segmented side-scan data. These are fused within a Markovian framework and two fusion models are presented. The first uses a voting scheme regularized by an isotropic Markov random field and is applicable when the reliability of each information source is unknown. The Markov model is also used to inpaint regions where no final classification decision can be reached using pixel level fusion. The second model formally introduces the reliability of each information source into a probabilistic model. Evaluation of the two models using both synthetic images and real data from a large scale survey shows significant quantitative and qualitative improvement using the fusion approach.
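    The first fusion model (voting regularized by an isotropic Markov random field) can be caricatured in a few lines. The sketch below is illustrative only: it does per-pixel majority voting across classified maps, then a single ICM-style sweep that rewards agreement with 4-neighbours, standing in for the full MRF optimisation described in the paper.

```python
# Illustrative sketch (not the authors' implementation): fuse several
# classified label maps by per-pixel voting, then regularize with one
# ICM-style pass that favours agreement with 4-neighbours, mimicking an
# isotropic Markov random field prior.
from collections import Counter

def vote(maps, x, y):
    """Count the labels the sources assign to pixel (x, y)."""
    return Counter(m[y][x] for m in maps)

def fuse(maps, beta=1.0):
    h, w = len(maps[0]), len(maps[0][0])
    labels = sorted({l for m in maps for row in m for l in row})
    # initial decision: plain majority vote
    fused = [[vote(maps, x, y).most_common(1)[0][0] for x in range(w)]
             for y in range(h)]
    # single ICM sweep: score = votes + beta * neighbour agreement
    for y in range(h):
        for x in range(w):
            counts = vote(maps, x, y)
            nbrs = [fused[j][i] for i, j in
                    ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1))
                    if 0 <= i < w and 0 <= j < h]
            fused[y][x] = max(labels, key=lambda l:
                              counts[l] + beta * nbrs.count(l))
    return fused

# Three toy 2x2 classified maps from different "sources"
maps = [[[0, 0], [0, 1]], [[0, 1], [0, 1]], [[0, 0], [1, 1]]]
result = fuse(maps)
```

    Raising `beta` strengthens the spatial prior relative to the votes, the same trade-off the MRF regularization controls in the paper's model.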

  20. The Effect of Subsurface Parameterizations on Modeled Flows in the Catchment Land Surface Model, Fortuna 2.5

    NASA Astrophysics Data System (ADS)

    Roningen, J. M.; Eylander, J. B.

    2014-12-01

    Groundwater use and management is subject to economic, legal, technical, and informational constraints and incentives at a variety of spatial and temporal scales. Planned and de facto management practices influenced by tax structures, legal frameworks, and agricultural and trade policies that vary at the country scale may have medium- and long-term effects on the ability of a region to support current and projected agricultural and industrial development. USACE is working to explore and develop global-scale, physically-based frameworks to serve as a baseline for hydrologic policy comparisons and consequence assessment, and such frameworks must include a reasonable representation of groundwater systems. To this end, we demonstrate the effects of different subsurface parameterizations, scaling, and meteorological forcings on surface and subsurface components of the Catchment Land Surface Model Fortuna v2.5 (Koster et al. 2000). We use the Land Information System 7 (Kumar et al. 2006) to process model runs using meteorological components of the Air Force Weather Agency's AGRMET forcing data from 2006 through 2011. Seasonal patterns and trends are examined in areas of the Upper Nile basin, northern China, and the Mississippi Valley. We also discuss the relevance of the model's representation of the catchment deficit with respect to local hydrogeologic structures.

  1. The water-energy nexus at water supply and its implications on the integrated water and energy management.

    PubMed

    Khalkhali, Masoumeh; Westphal, Kirk; Mo, Weiwei

    2018-09-15

    Water and energy are highly interdependent in the modern world, and hence, it is important to understand their constantly changing and nonlinear interconnections to inform the integrated management of water and energy. In this study, a hydrologic model, a water systems model, and an energy model were developed and integrated into a system dynamics modeling framework. This framework was then applied to a water supply system in the northeast US to capture its water-energy interactions under a set of future population, climate, and system operation scenarios. A hydrologic model was first used to simulate the system's hydrologic inflows and outflows under temperature and precipitation changes on a weekly basis. A water systems model that combines the hydrologic model and management rules (e.g., water release and transfer) was then developed to dynamically simulate the system's water storage and water head. Outputs from the water systems model were used in the energy model to estimate hydropower generation. It was found that critical water-energy synergies and tradeoffs exist, and there is a possibility for integrated water and energy management to achieve better outcomes. This analysis also shows the importance of a holistic understanding of the systems as a whole, which would allow utility managers to make proactive long-term management decisions. The modeling framework is generalizable to other water supply systems with hydropower generation capacities to inform the integrated management of water and energy resources. Copyright © 2018 Elsevier B.V. All rights reserved.
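    The chain from hydrologic inflow through storage and head to hydropower can be sketched as a minimal weekly water-balance loop. The structure and every parameter below are illustrative assumptions, not the authors' system dynamics model.

```python
# Minimal system dynamics sketch of the coupled water-energy loop described
# above: a weekly reservoir water balance whose storage sets the head used
# for hydropower. Structure and all parameter values are illustrative
# assumptions, not the authors' model.
RHO_G = 1000 * 9.81          # water density (kg/m3) times gravity (m/s2)
EFFICIENCY = 0.85            # assumed turbine efficiency
WEEK_SECONDS = 7 * 24 * 3600

def simulate(storage, inflows, release, area=1e6, max_storage=5e7):
    """Step weekly storage (m3); return final storage and weekly energy (J)."""
    energy = []
    for inflow in inflows:
        storage = min(max_storage, max(0.0, storage + inflow - release))
        head = storage / area                # crude storage-to-head relation
        flow = min(release, storage)         # cannot release more than stored
        power = EFFICIENCY * RHO_G * (flow / WEEK_SECONDS) * head   # watts
        energy.append(power * WEEK_SECONDS)  # joules per week
    return storage, energy

# Four wet weeks: inflow exceeds release, so storage, head and energy rise
final, energy = simulate(storage=2e7, inflows=[1.5e6] * 4, release=1e6)
```

    Even this caricature reproduces the qualitative coupling the abstract emphasises: management choices about release change storage, which changes head, which changes hydropower output.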

  2. Inferring the nature of anthropogenic threats from long-term abundance records.

    PubMed

    Shoemaker, Kevin T; Akçakaya, H Resit

    2015-02-01

    Diagnosing the processes that threaten species persistence is critical for recovery planning and risk forecasting. Dominant threats are typically inferred by experts on the basis of a patchwork of informal methods. Transparent, quantitative diagnostic tools would contribute much-needed consistency, objectivity, and rigor to the process of diagnosing anthropogenic threats. Long-term census records, available for an increasingly large and diverse set of taxa, may exhibit characteristic signatures of specific threatening processes and thereby provide information for threat diagnosis. We developed a flexible Bayesian framework for diagnosing threats on the basis of long-term census records and diverse ancillary sources of information. We tested this framework with simulated data from artificial populations subjected to varying degrees of exploitation and habitat loss and several real-world abundance time series for which threatening processes are relatively well understood: southern bluefin tuna (Thunnus maccoyii) and Atlantic cod (Gadus morhua) (exploitation) and Red Grouse (Lagopus lagopus scotica) and Eurasian Skylark (Alauda arvensis) (habitat loss). Our method correctly identified the process driving population decline for over 90% of time series simulated under moderate to severe threat scenarios. Successful identification of threats approached 100% for severe exploitation and habitat loss scenarios. Our method identified threats less successfully when threatening processes were weak and when populations were simultaneously affected by multiple threats. Our method selected the presumed true threat model for all real-world case studies, although results were somewhat ambiguous in the case of the Eurasian Skylark.
In the latter case, incorporation of an ancillary source of information (records of land-use change) increased the weight assigned to the presumed true model from 70% to 92%, illustrating the value of the proposed framework in bringing diverse sources of information into a common rigorous framework. Ultimately, our framework may greatly assist conservation organizations in documenting threatening processes and planning species recovery. © 2014 Society for Conservation Biology.
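    The core model-comparison step can be sketched as likelihood-based weighting of candidate threat models against an abundance time series. This is a hedged toy version (Gaussian likelihood on log abundance, equal priors, synthetic data), not the authors' full Bayesian framework.

```python
# Toy sketch of threat diagnosis by model weighting: score each candidate
# threat model by how well it explains an observed abundance series.
# Likelihood form, priors and data are illustrative assumptions.
import math

def log_lik(observed, predicted, sigma=0.1):
    """Gaussian log-likelihood of log-abundance residuals."""
    return sum(-0.5 * ((math.log(o) - math.log(p)) / sigma) ** 2
               for o, p in zip(observed, predicted))

def model_weights(observed, candidates):
    """Normalized posterior weights under equal priors."""
    lls = {name: log_lik(observed, pred) for name, pred in candidates.items()}
    m = max(lls.values())
    raw = {name: math.exp(ll - m) for name, ll in lls.items()}
    z = sum(raw.values())
    return {name: w / z for name, w in raw.items()}

# Synthetic decline driven by constant 10% exploitation per step
observed = [1000 * 0.9 ** t for t in range(10)]
candidates = {
    "exploitation": [1000 * 0.9 ** t for t in range(10)],   # geometric decline
    "habitat_loss": [1000 - 80 * t for t in range(10)],     # linear decline
}
weights = model_weights(observed, candidates)
```

    Ancillary data (such as the land-use records mentioned above) would enter the same calculation as an additional likelihood term multiplying each model's weight.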

  3. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Rasmussen, Martin

    2016-06-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals: integration with a high-fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients; consideration of a PRA context; incorporation of a solid psychological basis for operator performance; and demonstration of a functional dynamic model of a plant upset condition and appropriate operator response. This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  4. A Conceptual Framework for Predicting Error in Complex Human-Machine Environments

    NASA Technical Reports Server (NTRS)

    Freed, Michael; Remington, Roger; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    We present a Goals, Operators, Methods, and Selection Rules-Model Human Processor (GOMS-MHP) style model-based approach to the problem of predicting human habit capture errors. Habit captures occur when the model fails to allocate limited cognitive resources to retrieve task-relevant information from memory. Lacking the unretrieved information, decision mechanisms act in accordance with implicit default assumptions, resulting in error when relied upon assumptions prove incorrect. The model helps interface designers identify situations in which such failures are especially likely.

  5. The geospatial modeling interface (GMI) framework for deploying and assessing environmental models

    USDA-ARS?s Scientific Manuscript database

    Geographical information systems (GIS) software packages have been used for close to three decades as analytical tools in environmental management for geospatial data assembly, processing, storage, and visualization of input data and model output. However, with increasing availability and use of ful...

  6. Statement Verification: A Stochastic Model of Judgment and Response.

    ERIC Educational Resources Information Center

    Wallsten, Thomas S.; Gonzalez-Vallejo, Claudia

    1994-01-01

    A stochastic judgment model (SJM) is presented as a framework for addressing issues in statement verification and probability judgment. Results of 5 experiments with 264 undergraduates support the validity of the model and provide new information that is interpreted in terms of the SJM. (SLD)

  7. Using coupled hydrogeophysical models and data assimilation to enhance the information content in geoelectrical leak detection

    NASA Astrophysics Data System (ADS)

    Tso, C. H. M.; Johnson, T. C.; Song, X.; Chen, X.; Binley, A. M.

    2017-12-01

    Time-lapse electrical resistivity tomography (ERT) measurements provide indirect observation of hydrological processes in the Earth's shallow subsurface at high spatial and temporal resolutions. ERT has been used for a number of decades to detect leaks and monitor the evolution of associated contaminant plumes. However, this has been limited to a few hazardous environmental sites. Furthermore, an assessment of uncertainty in such applications has thus far been neglected, despite the clear need to provide site managers with appropriate information for decision making purposes. There is a need to establish a framework that allows leak detection with uncertainty assessment from geophysical observations. Ideally such a framework should allow the incorporation of additional data sources in order to reduce uncertainty in predictions. To tackle these issues, we propose an ensemble-based data assimilation framework that evaluates proposed hydrological models (i.e. different hydrogeological units, different leak locations and loads) against observed time-lapse ERT measurements. Each proposed hydrological model is run through the parallel coupled hydrogeophysical code PFLOTRAN-E4D (Johnson et al 2016) to obtain simulated ERT measurements. The ensemble of model proposals is then updated based on data misfit. Our approach does not focus on obtaining detailed images of hydraulic properties or plume movement. Rather, it seeks to estimate the contaminant mass discharge (CMD) across a user-defined plane in space probabilistically. The proposed approach avoids the ambiguity in interpreting detailed hydrological processes from geophysical images. The resultant distributions of CMD give a straightforward metric, with realistic uncertainty bounds, for decision making. The proposed framework is also computationally efficient so that it can exploit large, long-term ERT datasets, making it possible to track time-varying loadings of plume sources.
In this presentation, we illustrate our framework on synthetic data and field data collected from an ERT trial simulating a leak at the Sellafield nuclear facility in the UK (Kuras et al 2016). We compare our results to interpretation from geophysical inversion and discuss the additional information that hydrological model proposals provide.
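    Stripped to its essentials, the ensemble update amounts to weighting each model proposal by its misfit against the observed data and reading off a probability-weighted contaminant mass discharge. The sketch below assumes a Gaussian misfit model and invented numbers; it is not PFLOTRAN-E4D or the authors' workflow.

```python
# Simplified sketch of the ensemble idea (assumed form, not the authors'
# code): weight proposed leak models by misfit between simulated and
# observed ERT data, then form a probability-weighted estimate of
# contaminant mass discharge (CMD).
import math

def update_ensemble(observed, proposals, sigma=1.0):
    """Return posterior weights and weighted-mean CMD for an ensemble.

    proposals: list of (simulated_data, cmd) pairs, one per model proposal.
    """
    lls = []
    for simulated, _ in proposals:
        misfit = sum((o - s) ** 2 for o, s in zip(observed, simulated))
        lls.append(-0.5 * misfit / sigma ** 2)      # Gaussian log-likelihood
    m = max(lls)
    raw = [math.exp(ll - m) for ll in lls]
    z = sum(raw)
    weights = [x / z for x in raw]
    cmd_mean = sum(wi * cmd for wi, (_, cmd) in zip(weights, proposals))
    return weights, cmd_mean

# Invented data: proposal 0 matches the observations, proposal 1 fits poorly
observed = [1.0, 1.2, 1.5]
proposals = [([1.0, 1.2, 1.5], 4.0),
             ([0.5, 0.5, 0.5], 9.0)]
weights, cmd = update_ensemble(observed, proposals)
```

    The weighted CMD, rather than a geophysical image, is the decision metric the record advocates; the spread of the weighted ensemble supplies the uncertainty bounds.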

  8. Toward a Hedonic Value Framework in Health Care.

    PubMed

    Basu, Anirban; Sullivan, Sean D

    2017-02-01

    In well-functioning markets, a hedonic pricing scheme can reflect the marginal valuation of various attributes of a differentiated product at market equilibrium. It serves as an important tool to inform pricing of a new product with a specific combination of attributes. Because health cannot be bought and sold in a market setting, and health care markets are distorted by insurance or government subsidies, direct valuation of a health intervention as a differentiated good through observed market prices is difficult. In this article, we discuss the rationale of using stated preference methods for developing a hedonic value framework for health insurance products to inform the decision on whether a product should be covered or subsidized by insurance, given its price. This value index will not reflect marginal value at market equilibrium, as in a hedonic pricing scheme, but would capture the distribution of marginal value in the population. We discuss how affordability concerns can be integrated into the development of a hedonic valuation model. We compare this framework with traditional cost-effectiveness analysis and also the existing value frameworks put forth by various organizations. The framework can be adopted to inform other decisions such as pricing. We argue that developing such a comprehensive and decision-theoretic value framework is feasible and, if successful, can serve to inform health care resource allocation in this country for decades to come in a systematic manner. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
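    For background, the hedonic pricing idea the article starts from reduces, in its simplest form, to regressing observed prices on product attributes so that the slope is the implicit marginal price of an attribute. The single-attribute sketch below uses synthetic, noise-free data purely as illustration.

```python
# Background sketch of hedonic pricing (a textbook simplification, not the
# article's stated-preference framework): the OLS slope of price on an
# attribute is that attribute's implicit marginal price. Data are synthetic.
def hedonic_slope(attribute, price):
    """Implicit marginal price of one attribute (simple OLS slope)."""
    n = len(price)
    ma = sum(attribute) / n
    mp = sum(price) / n
    cov = sum((a - ma) * (p - mp) for a, p in zip(attribute, price))
    var = sum((a - ma) ** 2 for a in attribute)
    return cov / var

# Synthetic market: price = 50 + 20 * efficacy_score, no noise
efficacy = [1, 2, 3, 4, 5]
price = [50 + 20 * e for e in efficacy]
marginal_value = hedonic_slope(efficacy, price)   # 20.0
```

    The article's point is precisely that this market-equilibrium reading fails for health care, motivating stated-preference methods that recover a distribution of marginal values instead of a single slope.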

  9. Crops in silico: A community wide multi-scale computational modeling framework of plant canopies

    NASA Astrophysics Data System (ADS)

    Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.

    2016-12-01

    Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increases are insufficient to meet future projected food demand. Furthermore, with projected reduction in arable land, decrease in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an inter-connected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high performance and parallel computing. We are currently designing a user friendly interface that will make this tool equally accessible to biologists and computer scientists. 
Critically, this framework will provide the community with much needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment.

  10. Applying the Ottawa Charter to inform health promotion programme design.

    PubMed

    Fry, Denise; Zask, Avigdor

    2017-10-01

    There is evidence of a correlation between adoption of the Ottawa Charter's framework of five action areas and health promotion programme effectiveness, but the Charter's framework has not been as fully implemented as hoped, nor is it generally used by formal programme design models. In response, we aimed to translate the Charter's framework into a method to inform programme design. Our resulting design process uses detailed definitions of the Charter's action areas and evidence of predicted effectiveness to prompt greater consideration and use of the Charter's framework. We piloted the process by applying it to the design of four programmes of the Healthy Children's Initiative in New South Wales, Australia; refined the criteria via consensus; and made consensus decisions on the extent to which programme designs reflected the Charter's framework. The design process has broad potential applicability to health promotion programmes, facilitating greater use of the Ottawa Charter framework, which evidence indicates can increase programme effectiveness. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. The use of mode of action information in risk assessment: quantitative key events/dose-response framework for modeling the dose-response for key events.

    PubMed

    Simon, Ted W; Simons, S Stoney; Preston, R Julian; Boobis, Alan R; Cohen, Samuel M; Doerrer, Nancy G; Fenner-Crisp, Penelope A; McMullin, Tami S; McQueen, Charlene A; Rowlands, J Craig

    2014-08-01

    The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Action/Human Relevance Framework and Key Events/Dose Response Framework (KEDRF) to make the best use of quantitative dose-response and timing information for Key Events (KEs). The resulting Quantitative Key Events/Dose-Response Framework (Q-KEDRF) provides a structured quantitative approach for systematic examination of the dose-response and timing of KEs resulting from a dose of a bioactive agent that causes a potential adverse outcome. Two concepts are described as aids to increasing the understanding of mode of action: Associative Events and Modulating Factors. These concepts are illustrated in two case studies: 1) cholinesterase inhibition by the pesticide chlorpyrifos, which illustrates the necessity of considering quantitative dose-response information when assessing the effect of a Modulating Factor, that is, enzyme polymorphisms in humans, and 2) estrogen-induced uterotrophic responses in rodents, which demonstrate how quantitative dose-response modeling for KEs, the understanding of temporal relationships between KEs, and a counterfactual examination of hypothesized KEs can determine whether they are Associative Events or true KEs.
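    A common way to put a quantitative dose-response on a key event, consistent in spirit with the Q-KEDRF (though the framework itself does not prescribe a functional form), is a Hill curve whose potency parameter a Modulating Factor can shift. All parameter values below are illustrative.

```python
# Illustrative dose-response for a key event using a Hill function; the
# Q-KEDRF does not mandate this form, and all parameters are invented.
def hill(dose, top=100.0, ec50=10.0, n=2.0):
    """Percent of maximal key-event response at a given dose."""
    return top * dose ** n / (ec50 ** n + dose ** n)

def modulated_ec50(ec50, factor):
    """A Modulating Factor (e.g. an enzyme polymorphism) shifting potency."""
    return ec50 * factor

responses = [hill(d) for d in (0.0, 10.0, 100.0)]
# A polymorphism halving clearance might halve the effective EC50:
shifted = hill(10.0, ec50=modulated_ec50(10.0, 0.5))
```

    Comparing the baseline and shifted curves at the same dose is exactly the kind of quantitative assessment of a Modulating Factor that the chlorpyrifos case study calls for.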

  12. Justification of automated decision-making: medical explanations as medical arguments.

    PubMed Central

    Shankar, R. D.; Musen, M. A.

    1999-01-01

    People use arguments to justify their claims. Computer systems use explanations to justify their conclusions. We are developing WOZ, an explanation framework that justifies the conclusions of a clinical decision-support system. WOZ's central component is the explanation strategy that decides what information justifies a claim. The strategy uses Toulmin's argument structure to define pieces of information and to orchestrate their presentation. WOZ uses explicit models that abstract the core aspects of the framework such as the explanation strategy. In this paper, we present the use of arguments, the modeling of explanations, and the explanation process used in WOZ. WOZ exploits the wealth of naturally occurring arguments, and thus can generate convincing medical explanations. PMID:10566388

  13. Treating adult survivors of childhood emotional abuse and neglect: A new framework.

    PubMed

    Grossman, Frances K; Spinazzola, Joseph; Zucker, Marla; Hopper, Elizabeth

    2017-01-01

    This article provides the outline of a new framework for treating adult survivors of childhood emotional abuse and neglect. Component-based psychotherapy (CBP) is an evidence-informed model that bridges, synthesizes, and expands upon several existing schools, or theories, of treatment for adult survivors of traumatic stress. These include approaches to therapy that stem from more classic traditions in psychology, such as psychoanalysis, to more modern approaches including those informed by feminist thought. Moreover, CBP places particular emphasis on integration of key concepts from evidence-based treatment models developed in the past few decades predicated upon thinking and research on the effects of traumatic stress and processes of recovery for survivors. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Frameworks for change in healthcare organisations: a formative evaluation of the NHS Change Model.

    PubMed

    Martin, Graham P; Sutton, Elizabeth; Willars, Janet; Dixon-Woods, Mary

    2013-08-01

    Organisational change in complex healthcare systems is a multifaceted process. The English National Health Service recently introduced a 'Change Model' that seeks to offer an evidence-based framework for guiding change. We report findings from a formative evaluation of the NHS Change Model and make recommendations for those developing the Model and its users. The evaluation involved 28 interviews with managers and clinicians making use of the Change Model in relation to a variety of projects. Interviews were fully transcribed and were analysed using an approach based on the Framework method. Participants saw the Change Model as valuable and practically useful. Fidelity to core principles of the Model was variable: participants often altered the Model, especially when using it to orchestrate the work of others. In challenging organisational contexts, the Change Model was sometimes used to delegitimise opposition rather than identify shared purpose among different interest groups. Those guiding change may benefit from frameworks, guidance and toolkits to structure and inform their planning and activities. Participants' experiences suggested the Change Model has much potential. Further work on its design and on supporting materials may optimise the approach, but its utility rests in particular on organisational cultures that support faithful application. © The Author(s) 2013. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  15. Intelligence-aided multitarget tracking for urban operations - a case study: counter terrorism

    NASA Astrophysics Data System (ADS)

    Sathyan, T.; Bharadwaj, K.; Sinha, A.; Kirubarajan, T.

    2006-05-01

    In this paper, we present a framework for tracking multiple mobile targets in an urban environment based on data from multiple sources of information, and for evaluating the threat these targets pose to assets of interest (AOI). The motivating scenario is one where we have to track many targets, each with different (unknown) destinations and/or intents. The tracking algorithm is aided by information about the urban environment (e.g., road maps, buildings, hideouts), and strategic and intelligence data. The tracking algorithm needs to be dynamic in that it has to handle a time-varying number of targets and the ever-changing urban environment depending on the locations of the moving objects and AOI. Our solution uses the variable structure interacting multiple model (VS-IMM) estimator, which has been shown to be effective in tracking targets based on road map information. Intelligence information is represented as target class information and incorporated through a combined likelihood calculation within the VS-IMM estimator. In addition, we develop a model to calculate the probability that a particular target can attack a given AOI. This model for the calculation of the probability of attack is based on the target kinematic and class information. Simulation results are presented to demonstrate the operation of the proposed framework on a representative scenario.
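    The combined use of class information and kinematics can be caricatured as a Bayes update of the target class followed by a kinematic discount. Both the numbers and the `p_attack` form below are invented for illustration; the paper's VS-IMM machinery and likelihood calculations are far richer.

```python
# Toy sketch of fusing intelligence (class) information with kinematics to
# score threat against an asset of interest (AOI). The p_attack form and
# all numbers are invented; this is not the paper's VS-IMM model.
def bayes_update(prior, likelihoods):
    """Update class probabilities given per-class measurement likelihoods."""
    post = {c: prior[c] * likelihoods[c] for c in prior}
    z = sum(post.values())
    return {c: p / z for c, p in post.items()}

def p_attack(class_probs, distance_to_aoi, speed_toward_aoi, scale=500.0):
    """Crude attack probability: hostile-class probability discounted by a
    kinematic plausibility factor (closer and faster toward the AOI = higher)."""
    kinematic = max(0.0, min(1.0, speed_toward_aoi * scale / max(distance_to_aoi, 1.0)))
    return class_probs["hostile"] * kinematic

probs = {"hostile": 0.5, "neutral": 0.5}
# an intelligence report fits the hostile profile 3x better than neutral
probs = bayes_update(probs, {"hostile": 0.6, "neutral": 0.2})
threat = p_attack(probs, distance_to_aoi=1000.0, speed_toward_aoi=1.0)
```

    In the paper, the class likelihood enters the VS-IMM filter's combined likelihood at every update rather than as a one-off step, but the fusion of the two information types follows the same Bayesian logic.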

  16. Discrete Element Framework for Modelling Extracellular Matrix, Deformable Cells and Subcellular Components

    PubMed Central

    Gardiner, Bruce S.; Wong, Kelvin K. L.; Joldes, Grand R.; Rich, Addison J.; Tan, Chin Wee; Burgess, Antony W.; Smith, David W.

    2015-01-01

    This paper presents a framework for modelling biological tissues based on discrete particles. Cell components (e.g. cell membranes, cell cytoskeleton, cell nucleus) and extracellular matrix (e.g. collagen) are represented using collections of particles. Simple particle to particle interaction laws are used to simulate and control complex physical interaction types (e.g. cell-cell adhesion via cadherins, integrin basement membrane attachment, cytoskeletal mechanical properties). Particles may be given the capacity to change their properties and behaviours in response to changes in the cellular microenvironment (e.g., in response to cell-cell signalling or mechanical loadings). Each particle is in effect an ‘agent’, meaning that the agent can sense local environmental information and respond according to pre-determined or stochastic events. The behaviour of the proposed framework is exemplified through several biological problems of ongoing interest. These examples illustrate how the modelling framework allows enormous flexibility for representing the mechanical behaviour of different tissues, and we argue this is a more intuitive approach than perhaps offered by traditional continuum methods. Because of this flexibility, we believe the discrete modelling framework provides an avenue for biologists and bioengineers to explore the behaviour of tissue systems in a computational laboratory. PMID:26452000
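    The particle-interaction idea can be reduced to a toy 1-D example: two particles joined by a linear spring law, advanced with an overdamped explicit update. This is a minimal sketch of the general discrete-particle approach, not the authors' framework; all constants are arbitrary.

```python
# Minimal 1-D discrete-particle sketch (illustrative, not the authors'
# framework): particles interact via a linear spring law about a rest
# length, and an overdamped explicit Euler step advances positions.
def spring_force(xa, xb, rest=1.0, k=10.0):
    """Pairwise force on particle a from particle b (1-D)."""
    d = xb - xa
    stretch = abs(d) - rest          # positive: stretched, negative: compressed
    return k * stretch * (1.0 if d > 0 else -1.0)

def step(positions, dt=0.01):
    """Overdamped explicit update: dx = F * dt (unit drag coefficient)."""
    forces = [sum(spring_force(xi, xj)
                  for j, xj in enumerate(positions) if j != i)
              for i, xi in enumerate(positions)]
    return [x + f * dt for x, f in zip(positions, forces)]

positions = [0.0, 2.0]               # two "membrane" particles, rest length 1
for _ in range(100):
    positions = step(positions)      # relaxes toward unit separation
```

    Making `spring_force` depend on particle state (adhesion molecules, signalling, loading history) is what turns each particle into the responsive 'agent' the abstract describes.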

  18. Obesity in sub-Saharan Africa: development of an ecological theoretical framework.

    PubMed

    Scott, Alison; Ejikeme, Chinwe Stella; Clottey, Emmanuel Nii; Thomas, Joy Goens

    2013-03-01

    The prevalence of overweight and obesity is increasing in sub-Saharan Africa (SSA). There is a need for theoretical frameworks to catalyze further research and to inform the development of multi-level, context-appropriate interventions. In this commentary, we propose a preliminary ecological theoretical framework to conceptualize factors that contribute to increases in overweight and obesity in SSA. The framework is based on a Causality Continuum model [Coreil et al. Social and Behavioral Foundations of Public Health. Sage Publications, Thousand Oaks] that considers distant, intermediate and proximate influences. The influences incorporated in the model include globalization and urbanization as distant factors; occupation, social relationships, built environment and cultural perceptions of weight as intermediate factors and caloric intake, physical inactivity and genetics as proximate factors. The model illustrates the interaction of factors along a continuum, from the individual to the global marketplace, in shaping trends in overweight and obesity in SSA. The framework will be presented, each influence elucidated and implications for research and intervention development discussed. There is a tremendous need for further research on obesity in SSA. An improved evidence base will serve to validate and develop the proposed framework further.

  19. Multi-View Budgeted Learning under Label and Feature Constraints Using Label-Guided Graph-Based Regularization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Symons, Christopher T; Arel, Itamar

    2011-01-01

    Budgeted learning under constraints on both the amount of labeled information and the availability of features at test time pertains to a large number of real world problems. Ideas from multi-view learning, semi-supervised learning, and even active learning have applicability, but a common framework whose assumptions fit these problem spaces is non-trivial to construct. We leverage ideas from these fields based on graph regularizers to construct a robust framework for learning from labeled and unlabeled samples in multiple views that are non-independent and include features that are inaccessible at the time the model would need to be applied. We describe examples of applications that fit this scenario, and we provide experimental results to demonstrate the effectiveness of knowledge carryover from training-only views. As learning algorithms are applied to more complex applications, relevant information can be found in a wider variety of forms, and the relationships between these information sources are often quite complex. The assumptions that underlie most learning algorithms do not readily or realistically permit the incorporation of many of the data sources that are available, despite an implicit understanding that useful information exists in these sources. When multiple information sources are available, they are often partially redundant, highly interdependent, and contain noise as well as other information that is irrelevant to the problem under study. In this paper, we are focused on a framework whose assumptions match this reality, as well as the reality that labeled information is usually sparse. Most significantly, we are interested in a framework that can also leverage information in scenarios where many features that would be useful for learning a model are not available when the resulting model will be applied. As with constraints on labels, there are many practical limitations on the acquisition of potentially useful features.
A key difference in the case of feature acquisition is that the same constraints often don't pertain to the training samples. This difference provides an opportunity to allow features that are impractical in an applied setting to nevertheless add value during the model-building process. Unfortunately, there are few machine learning frameworks built on assumptions that allow effective utilization of features that are only available at training time. In this paper we formulate a knowledge carryover framework for the budgeted learning scenario with constraints on features and labels. The approach is based on multi-view and semi-supervised learning methods that use graph-encoded regularization. Our main contributions are the following: (1) we propose and provide justification for a methodology for ensuring that changes in the graph regularizer using alternate views are performed in a manner that is target-concept specific, allowing value to be obtained from noisy views; and (2) we demonstrate how this general set-up can be used to effectively improve models by leveraging features unavailable at test time. The rest of the paper is structured as follows. In Section 2, we outline real-world problems to motivate the approach and describe relevant prior work. Section 3 describes the graph construction process and the learning methodologies that are employed. Section 4 provides preliminary discussion regarding theoretical motivation for the method. In Section 5, effectiveness of the approach is demonstrated in a series of experiments employing modified versions of two well-known semi-supervised learning algorithms. Section 6 concludes the paper.
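    The graph-regularized approach described in this abstract can be illustrated with a minimal harmonic label-propagation sketch. The toy graph, weights, and penalty scheme below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def propagate_labels(W, y, labeled_mask, reg=1.0):
    """Harmonic label propagation: minimize f^T L f plus a penalty tying
    labeled nodes to their known labels, where L is the graph Laplacian."""
    L = np.diag(W.sum(axis=1)) - W                     # unnormalized Laplacian
    A = L + reg * np.diag(labeled_mask.astype(float))  # penalty on labeled nodes
    b = reg * labeled_mask * y
    return np.linalg.solve(A, b)

# Toy graph: two triangles {0,1,2} and {3,4,5} joined by one weak edge,
# with a single labeled node in each cluster.
W = np.zeros((6, 6))
for i, j, w in [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0),
                (3, 4, 1.0), (4, 5, 1.0), (3, 5, 1.0), (2, 3, 0.1)]:
    W[i, j] = W[j, i] = w
y = np.array([1.0, 0, 0, -1.0, 0, 0])
labeled = np.array([True, False, False, True, False, False])
f = propagate_labels(W, y, labeled)
print(np.sign(f))
```

    Because the inter-cluster edge is weak, the two labels propagate within their own triangles, which is the intuition behind making the regularizer target-concept specific.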

  20. Building a Progressive-Situational Model of Post-Diagnosis Information Seeking for Parents of Individuals With Down Syndrome

    PubMed Central

    Gibson, Amelia N.

    2016-01-01

    This grounded theory study used in-depth, semi-structured interviews to examine the information-seeking behaviors of 35 parents of children with Down syndrome. Emergent themes include a progressive pattern of behavior comprising information overload and avoidance, passive attention, and active information seeking; varying preferences between tacit and explicit information at different stages; and selection of information channels and sources that varied based on personal and situational constraints. Based on the findings, the author proposes a progressive model of health information seeking and a framework for using this model to collect data in practice. The author also discusses the practical and theoretical implications of a responsive, progressive approach to understanding parents’ health information–seeking behavior. PMID:28462351

  1. Determination of network origin-destination matrices using partial link traffic counts and virtual sensor information in an integrated corridor management framework.

    DOT National Transportation Integrated Search

    2014-04-01

    Trip origin-destination (O-D) demand matrices are critical components in transportation network modeling, and provide essential information on trip distributions and corresponding spatiotemporal traffic patterns in traffic zones in vehicular netw...

  2. Information Processing of Trauma.

    ERIC Educational Resources Information Center

    Hartman, Carol R.; Burgess, Ann W.

    1993-01-01

    This paper presents a neuropsychosocial model of information processing to explain a victimization experience, specifically child sexual abuse. It surveys the relation of sensation, perception, and cognition as a systematic way to provide a framework for studying human behavior and describing human response to traumatic events. (Author/JDD)

  3. Analyzing Evolving Social Network 2 (EVOLVE2)

    DTIC Science & Technology

    2015-04-01

    Facebook friendship graph. We simulated two different interaction models: one-to-one and one-to-many interactions. Both types of models revealed...to an unbiased random walk on the reweighted "interaction graph" W with entries w_ij = α_i A_ij α_j. The generalized Laplacian framework is flexible enough...

  4. Diagnostic Reasoning using Prognostic Information for Unmanned Aerial Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Roychoudhury, Indranil; Kulkarni, Chetan

    2015-01-01

    With the increasing popularity of unmanned aircraft, continuous monitoring of their systems, software, and health status is becoming more and more important to ensure safe, correct, and efficient operation and fulfillment of missions. The paper presents the integration of prognosis models and prognostic information with the R2U2 (REALIZABLE, RESPONSIVE, and UNOBTRUSIVE Unit) monitoring and diagnosis framework. This integration makes statistically reliable predictions of future health status available at a much earlier time to enable autonomous decision making. The prognostic information can be used in the R2U2 model to improve diagnostic accuracy and to enable decisions to be made in the present to deal with events in the future. This is an advance over the current state of the art, in which temporal logic observers can only perform such valuation at the end of the time interval. The usefulness and effectiveness of this integrated diagnostics and prognostics framework were demonstrated using simulation experiments with the NASA Dragon Eye electric unmanned aircraft.

  5. Model-based decoding, information estimation, and change-point detection techniques for multineuron spike trains.

    PubMed

    Pillow, Jonathan W; Ahmadian, Yashar; Paninski, Liam

    2011-01-01

    One of the central problems in systems neuroscience is to understand how neural spike trains convey sensory information. Decoding methods, which provide an explicit means for reading out the information contained in neural spike responses, offer a powerful set of tools for studying the neural coding problem. Here we develop several decoding methods based on point-process neural encoding models, or forward models that predict spike responses to stimuli. These models have concave log-likelihood functions, which allow efficient maximum-likelihood model fitting and stimulus decoding. We present several applications of the encoding model framework to the problem of decoding stimulus information from population spike responses: (1) a tractable algorithm for computing the maximum a posteriori (MAP) estimate of the stimulus, the most probable stimulus to have generated an observed single- or multiple-neuron spike train response, given some prior distribution over the stimulus; (2) a gaussian approximation to the posterior stimulus distribution that can be used to quantify the fidelity with which various stimulus features are encoded; (3) an efficient method for estimating the mutual information between the stimulus and the spike trains emitted by a neural population; and (4) a framework for the detection of change-point times (the time at which the stimulus undergoes a change in mean or variance) by marginalizing over the posterior stimulus distribution. We provide several examples illustrating the performance of these estimators with simulated and real neural data.
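    The MAP estimation idea described in this abstract can be sketched with a small example: a hypothetical population of Poisson neurons with exponential-linear tuning, decoded by gradient ascent on the concave log posterior. The filters, population size, and step size below are illustrative assumptions, not the authors' code or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoding model: neuron i fires Poisson spikes with rate
# exp(K[i] @ s) in response to stimulus vector s.
n_neurons, dim = 40, 5
K = rng.normal(size=(n_neurons, dim)) / np.sqrt(dim)
s_true = rng.normal(size=dim)
spikes = rng.poisson(np.exp(K @ s_true))

def map_decode(spikes, K, prior_var=1.0, steps=2000, lr=0.01):
    """Gradient ascent on the concave log posterior
    sum_i [r_i (K_i @ s) - exp(K_i @ s)] - ||s||^2 / (2 * prior_var),
    assuming a zero-mean Gaussian prior over the stimulus."""
    s = np.zeros(K.shape[1])
    for _ in range(steps):
        rate = np.exp(K @ s)
        s = s + lr * (K.T @ (spikes - rate) - s / prior_var)
    return s

s_map = map_decode(spikes, K)
print(np.round(s_map, 2))
```

    Concavity of the log likelihood is what makes this simple ascent reliable: there is a single posterior mode, so the gradient iteration converges to the MAP estimate.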

  6. PHOTOTOXIC POLYCYCLIC AROMATIC HYDROCARBONS IN SEDIMENTS: A MODEL-BASED APPROACH FOR ASSESSING RISK

    EPA Science Inventory

    Over the past five years we have developed a number of models which will be combined in an integrated framework with chemical-monitoring information to assess the potential for widespread risk of phototoxic PAHs in sediments.

  7. MULTIMEDIA ENVIRONMENTAL DISTRIBUTION OF TOXICS (MEND-TOX): PART I, HYBRID COMPARTMENTAL-SPATIAL MODELING FRAMEWORK

    EPA Science Inventory

    An integrated hybrid spatial-compartmental modeling approach is presented for analyzing the dynamic distribution of chemicals in the multimedia environment. Information obtained from such analysis, which includes temporal chemical concentration profiles in various media, mass ...

  8. Interactive Sonification Exploring Emergent Behavior Applying Models for Biological Information and Listening

    PubMed Central

    Choi, Insook

    2018-01-01

    Sonification is an open-ended design task to construct sound informing a listener of data. Understanding application context is critical for shaping design requirements for data translation into sound. Sonification requires methodology to maintain reproducibility when data sources exhibit non-linear properties of self-organization and emergent behavior. This research formalizes interactive sonification in an extensible model to support reproducibility when data exhibits emergent behavior. In the absence of sonification theory, extensibility demonstrates relevant methods across case studies. The interactive sonification framework foregrounds three factors: reproducible system implementation for generating sonification; interactive mechanisms enhancing a listener's multisensory observations; and reproducible data from models that characterize emergent behavior. Supramodal attention research suggests interactive exploration with auditory feedback can generate context for recognizing irregular patterns and transient dynamics. The sonification framework provides circular causality as a signal pathway for modeling a listener interacting with emergent behavior. The extensible sonification model adopts a data acquisition pathway to formalize functional symmetry across three subsystems: Experimental Data Source, Sound Generation, and Guided Exploration. To differentiate time criticality and dimensionality of emerging dynamics, tuning functions are applied between subsystems to maintain scale and symmetry of concurrent processes and temporal dynamics. Tuning functions accommodate sonification design strategies that yield order parameter values to render emerging patterns discoverable as well as rehearsable, to reproduce desired instances for clinical listeners. Case studies are implemented with two computational models, Chua's circuit and Swarm Chemistry social agent simulation, generating data in real-time that exhibits emergent behavior. 
Heuristic Listening is introduced as an informal model of a listener's clinical attention to data sonification through multisensory interaction in a context of structured inquiry. Three methods are introduced to assess the proposed sonification framework: Listening Scenario classification, data flow Attunement, and Sonification Design Patterns to classify sound control. Case study implementations are assessed against these methods comparing levels of abstraction between experimental data and sound generation. Outcomes demonstrate the framework performance as a reference model for representing experimental implementations, also for identifying common sonification structures having different experimental implementations, identifying common functions implemented in different subsystems, and comparing impact of affordances across multiple implementations of listening scenarios. PMID:29755311

  9. A Conceptual Framework for Institutional Research in Community Colleges.

    ERIC Educational Resources Information Center

    Alfred, Richard L.; Ivens, Stephen H.

    This paper defines a conceptual model for institutional research in the community college and identifies sources of information, programs, and services that provide data necessary for implementation of the model. The model contains four specific subsystems: goal setting, program development, program review, and cost effectiveness. Each subsystem…

  10. A Comprehensive Leadership Education Model To Train, Teach, and Develop Leadership in Youth.

    ERIC Educational Resources Information Center

    Ricketts, John C.; Rudd, Rick D.

    2002-01-01

    Meta-analysis of youth leadership development literature resulted in a conceptual model and curriculum framework. Model dimensions are leadership knowledge and information; leadership attitudes, will, and desire; decision making, reasoning, and critical thinking; oral and written communication; and intra/interpersonal relations. Dimensions have…

  11. The Dual-Factor Model of Mental Health: Further Study of the Determinants of Group Differences

    ERIC Educational Resources Information Center

    Lyons, Michael D.; Huebner, E. Scott; Hills, Kimberly J.; Shinkareva, Svetlana V.

    2012-01-01

    Consistent with a positive psychology framework, this study examined the contributions of personality, environmental, and perceived social support variables in classifying adolescents using Greenspoon and Saklofske's Dual-Factor model of mental health. This model incorporates information about positive subjective well-being (SWB), along with…

  12. Averaging Models: Parameters Estimation with the R-Average Procedure

    ERIC Educational Resources Information Center

    Vidotto, G.; Massidda, D.; Noventa, S.

    2010-01-01

    The Functional Measurement approach, proposed within the theoretical framework of Information Integration Theory (Anderson, 1981, 1982), can be a useful multi-attribute analysis tool. Compared to the majority of statistical models, the averaging model can account for interaction effects without adding complexity. The R-Average method (Vidotto &…

  13. A case study to illustrate the utility of the Aggregate Exposure Pathway and Adverse Outcome Pathway frameworks for integrating human health and ecological data into cumulative risk assessment

    EPA Science Inventory

    Cumulative risk assessment (CRA) methods, which evaluate the risk of multiple adverse outcomes (AOs) from multiple chemicals, promote the use of a conceptual site model (CSM) to integrate risk from relevant stressors. The Adverse Outcome Pathway (AOP) framework can inform these r...

  14. A Bayesian framework based on a Gaussian mixture model and radial-basis-function Fisher discriminant analysis (BayGmmKda V1.1) for spatial prediction of floods

    NASA Astrophysics Data System (ADS)

    Tien Bui, Dieu; Hoang, Nhat-Duc

    2017-09-01

    In this study, a probabilistic model, named as BayGmmKda, is proposed for flood susceptibility assessment in a study area in central Vietnam. The new model is a Bayesian framework constructed by a combination of a Gaussian mixture model (GMM), radial-basis-function Fisher discriminant analysis (RBFDA), and a geographic information system (GIS) database. In the Bayesian framework, GMM is used for modeling the data distribution of flood-influencing factors in the GIS database, whereas RBFDA is utilized to construct a latent variable that aims at enhancing the model performance. As a result, the posterior probabilistic output of the BayGmmKda model is used as flood susceptibility index. Experiment results showed that the proposed hybrid framework is superior to other benchmark models, including the adaptive neuro-fuzzy inference system and the support vector machine. To facilitate the model implementation, a software program of BayGmmKda has been developed in MATLAB. The BayGmmKda program can accurately establish a flood susceptibility map for the study region. Accordingly, local authorities can overlay this susceptibility map onto various land-use maps for the purpose of land-use planning or management.
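    The Bayesian construction described in this abstract can be sketched in simplified form: class-conditional densities fitted to flood and non-flood samples, with the posterior probability of the flood class serving as the susceptibility index. A single Gaussian per class stands in here for the paper's GMM, and the data are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-factor samples for flooded and non-flooded locations.
flood    = rng.normal(loc=[2.0, 1.5], scale=0.7, size=(100, 2))
nonflood = rng.normal(loc=[0.0, 0.0], scale=0.7, size=(100, 2))

def gaussian_logpdf(x, mu, cov):
    """Log density of a multivariate Gaussian at point x."""
    d = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.inv(cov) @ d + logdet + len(mu) * np.log(2 * np.pi))

def fit(samples):
    return samples.mean(axis=0), np.cov(samples, rowvar=False)

mu1, c1 = fit(flood)
mu0, c0 = fit(nonflood)

def susceptibility(x, prior=0.5):
    """Posterior P(flood | x), used as the susceptibility index."""
    l1 = gaussian_logpdf(x, mu1, c1) + np.log(prior)
    l0 = gaussian_logpdf(x, mu0, c0) + np.log(1 - prior)
    return 1.0 / (1.0 + np.exp(l0 - l1))

print(round(susceptibility(np.array([2.0, 1.5])), 2),
      round(susceptibility(np.array([0.0, 0.0])), 2))
```

    In the paper's full framework a multi-component GMM replaces each single Gaussian, and the RBFDA latent variable augments the factor vector before the posterior is computed.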

  15. LOD BIM Element specification for Railway Turnout Systems Risk Mitigation using the Information Delivery Manual

    NASA Astrophysics Data System (ADS)

    Gigante-Barrera, Ángel; Dindar, Serdar; Kaewunruen, Sakdirat; Ruikar, Darshan

    2017-10-01

    Railway turnouts are complex systems designed using complex geometries and grades, which makes them difficult to manage in terms of risk prevention. This feature poses a substantial peril to rail users, as it is considered a cause of derailment. In addition, derailment leads to financial losses due to operational downtimes and monetary compensation in case of death or injury. These are fundamental drivers for mitigating risks arising from poor risk management during design. Prevention through Design (PtD) is a process that introduces tacit knowledge from industry professionals during the design process. There is evidence that Building Information Modelling (BIM) can help to mitigate risk from the inception of the project. BIM is considered an Information System (IS) where tacit knowledge can be stored and retrieved from a digital database, making it easy to take prompt decisions as information is ready to be analysed. BIM at the model element level entails working with 3D elements and embedded data, therefore adding a layer of complexity to the management of information along the different stages of the project and across different disciplines. In order to overcome this problem, the industry has created a framework for model progression specification named Level of Development (LOD). The paper presents an IDM-based framework for design risk mitigation through code validation using the LOD. This effort resulted in risk datasets which describe a rail turnout graphically and non-graphically as the model progresses, thus permitting its inclusion within risk information systems. The assignment of an LOD construct to a set of data requires specialised management and process-related expertise. Furthermore, the selection of a set of LOD constructs requires a purpose-based analysis. Therefore, a framework for LOD construct implementation within the IDM for code checking is required for the industry to progress in this particular field.

  16. Expert decision-making strategies

    NASA Technical Reports Server (NTRS)

    Mosier, Kathleen L.

    1991-01-01

    A recognition-primed decision (RPD) model is employed as a framework to investigate crew decision-making processes. The quality of information transfer, a critical component of the team RPD model and an indicator of the team's 'collective consciousness', is measured and analyzed with respect to crew performance. As indicated by the RPD model, the timing and patterns of information search and transfer were expected to reflect extensive and continual situation assessment, and serial evaluation of alternative states of the world or decision response options.

  17. General Nonlinear Ferroelectric Model v. Beta

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Wen; Robbins, Josh

    2017-03-14

    The purpose of this software is to function as a generalized ferroelectric material model. The material model is designed to work with existing finite element packages by providing updated information on material properties that are nonlinear and dependent on loading history. The two major nonlinear phenomena this model captures are domain-switching and phase transformation. The software itself does not contain potentially sensitive material information and instead provides a framework for different physical phenomena observed within ferroelectric materials. The model is calibrated to a specific ferroelectric material through input parameters provided by the user.

  18. An ecological framework for informing permitting decisions on scientific activities in protected areas

    PubMed Central

    Saarman, Emily T.; Owens, Brian; Murray, Steven N.; Weisberg, Stephen B.; Field, John C.; Nielsen, Karina J.

    2018-01-01

    There are numerous reasons to conduct scientific research within protected areas, but research activities may also negatively impact organisms and habitats, and thus conflict with a protected area’s conservation goals. We developed a quantitative ecological decision-support framework that estimates these potential impacts so managers can weigh costs and benefits of proposed research projects and make informed permitting decisions. The framework generates quantitative estimates of the ecological impacts of the project and the cumulative impacts of the proposed project and all other projects in the protected area, and then compares the estimated cumulative impacts of all projects with policy-based acceptable impact thresholds. We use a series of simplified equations (models) to assess the impacts of proposed research to: a) the population of any targeted species, b) the major ecological assemblages that make up the community, and c) the physical habitat that supports protected area biota. These models consider both targeted and incidental impacts to the ecosystem and include consideration of the vulnerability of targeted species, assemblages, and habitats, based on their recovery time and ecological role. We parameterized the models for a wide variety of potential research activities that regularly occur in the study area using a combination of literature review and expert judgment with a precautionary approach to uncertainty. We also conducted sensitivity analyses to examine the relationships between model input parameters and estimated impacts to understand the dominant drivers of the ecological impact estimates. Although the decision-support framework was designed for and adopted by the California Department of Fish and Wildlife for permitting scientific studies in the state-wide network of marine protected areas (MPAs), the framework can readily be adapted for terrestrial and freshwater protected areas. PMID:29920527

  19. Measuring individual disaster recovery: a socioecological framework.

    PubMed

    Abramson, David M; Stehling-Ariza, Tasha; Park, Yoon Soo; Walsh, Lauren; Culp, Derrin

    2010-09-01

    Disaster recovery is a complex phenomenon. Too often, recovery is measured in singular fashion, such as by quantifying rebuilt infrastructure or lifelines, without taking into account the affected population's individual and community recovery. A comprehensive framework is needed that encompasses a much broader and far-reaching construct with multiple underlying dimensions and numerous causal pathways; without a comprehensive framework that investigates the relationships between these factors, a measurement of recovery may not be valid. This study proposes a model that encapsulates these ideas in a single framework, the Socio-Ecological Model of Recovery. Using confirmatory factor analysis, an operational measure of recovery was developed and validated using the five measures of housing stability, economic stability, physical health, mental health, and social role adaptation. The data were drawn from a sample of households displaced by Hurricane Katrina. Measures of psychological strength, risk, disaster exposure, neighborhood contextual effects, and formal and informal help were modeled to examine their direct and indirect effects on recovery using a structural equation model. All five elements of the recovery measure were positively correlated with a latent measure of recovery, although mental health and social role adaptation displayed the strongest associations. An individual's psychological strength had the greatest association with positive recovery, followed by having a household income greater than $20,000 and having informal social support. The factors most strongly associated with an absence of recovery included the time displaced since the hurricane, being disabled, and living in a community with substantial social disorder. The socio-ecological framework provides a robust means for measuring recovery, and for testing those factors associated with the presence or absence of recovery.

  20. Simulation and experimental studies of operators' decision styles and crew composition while using an ecological and traditional user interface for the control room of a nuclear power plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meshkati, N.; Buller, B.J.; Azadeh, M.A.

    1995-04-01

    The goal of this research is threefold: (1) to use the Skill-, Rule-, and Knowledge-based levels of cognitive control -- the SRK framework -- to develop an integrated information processing conceptual framework (for integration of workstation, job, and team design); (2) to evaluate the user interface component of this framework -- the Ecological display; and (3) to analyze the effect of operators' individual information processing behavior and decision styles on handling plant disturbances, plus their performance on, and preference for, Traditional and Ecological user interfaces. A series of studies was conducted. In Part I, a computer simulation model and a mathematical model were developed. In Part II, an experiment was designed and conducted at the EBR-II plant of the Argonne National Laboratory-West in Idaho Falls, Idaho. It is concluded that: the integrated SRK-based information processing model for control room operations is superior to the conventional rule-based model; operators' individual decision styles and the combination of their styles play a significant role in effective handling of nuclear power plant disturbances; use of the Ecological interface results in significantly more accurate event diagnosis and recall of various plant parameters, faster response to plant transients, and higher ratings of subject preference; and operators' decision styles affect both their performance and their preference for the Ecological interface.

  1. Information and the Experience of Wonder a Rhetorical Study of Information Design

    ERIC Educational Resources Information Center

    Jun, Soojin

    2011-01-01

    In the last two decades, emotion has emerged as an important theme in discussions of design. However, there is no framework to date that encompasses both emotion and information design in a single theory. This dissertation was motivated by a lack of substantive theory that would allow design researchers and educators to model the relationships…

  2. Health Sciences Libraries Forecasting Information Service Trends for Researchers: Models Applicable to All Academic Libraries

    ERIC Educational Resources Information Center

    Cain, Timothy J.; Cheek, Fern M.; Kupsco, Jeremy; Hartel, Lynda J.; Getselman, Anna

    2016-01-01

    To better understand the value of current information services and to forecast the evolving information and data management needs of researchers, a study was conducted at two research-intensive universities. The methodology and planning framework applied by health science librarians at Emory University and The Ohio State University focused on…

  3. Refining the aggregate exposure pathway.

    PubMed

    Tan, Yu-Mei; Leonard, Jeremy A; Edwards, Stephen; Teeguarden, Justin; Egeghy, Peter

    2018-03-01

    Advancements in measurement technologies and modeling capabilities continue to result in an abundance of exposure information, adding to that currently in existence. However, fragmentation within the exposure science community acts as an obstacle to realizing the vision set forth in the National Research Council's report on exposure science in the 21st century: to consider exposures from source to dose, on multiple levels of integration, and to multiple stressors. The concept of an Aggregate Exposure Pathway (AEP) was proposed as a framework for organizing and integrating diverse exposure information that exists across numerous repositories and among multiple scientific fields. A workshop held in May 2016 followed the introduction of the AEP concept, allowing members of the exposure science community to provide extensive evaluation and feedback regarding the framework's structure, key components, and applications. The current work briefly introduces the topics discussed at the workshop and attempts to address key challenges involved in refining this framework. The resulting evolution of the AEP framework's features facilitates the acquisition, integration, organization, and transparent application and communication of exposure knowledge in a manner that is independent of its ultimate use, thereby enabling reuse of such information in many applications.

  4. A distributed cloud-based cyberinfrastructure framework for integrated bridge monitoring

    NASA Astrophysics Data System (ADS)

    Jeong, Seongwoon; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2017-04-01

    This paper describes a cloud-based cyberinfrastructure framework for the management of the diverse data involved in bridge monitoring. Bridge monitoring involves various hardware systems, software tools, and laborious activities, including, for example, a structural health monitoring (SHM) sensor network, engineering analysis programs, and visual inspection. Very often these monitoring systems, tools, and activities are not coordinated, and the collected information is not shared. A well-designed integrated data management framework can support the effective use of the data and thereby enhance bridge management and maintenance operations. The cloud-based cyberinfrastructure framework presented herein is designed to manage not only sensor measurement data acquired from the SHM system but also other relevant information, such as bridge engineering models and traffic videos, in an integrated manner. For scalability and flexibility, cloud computing services and distributed database systems are employed. The stored information can be accessed through standard web interfaces. For demonstration, the cyberinfrastructure system is implemented for the monitoring of bridges located along the I-275 corridor in the state of Michigan.

  5. STATISTICAL GROWTH MODELING OF LONGITUDINAL DT-MRI FOR REGIONAL CHARACTERIZATION OF EARLY BRAIN DEVELOPMENT.

    PubMed

    Sadeghi, Neda; Prastawa, Marcel; Fletcher, P Thomas; Gilmore, John H; Lin, Weili; Gerig, Guido

    2012-01-01

    A population growth model that represents the growth trajectories of individual subjects is critical to study and understand neurodevelopment. This paper presents a framework for jointly estimating and modeling individual and population growth trajectories, and determining significant regional differences in growth pattern characteristics applied to longitudinal neuroimaging data. We use non-linear mixed effect modeling where temporal change is modeled by the Gompertz function. The Gompertz function uses intuitive parameters related to delay, rate of change, and expected asymptotic value; all descriptive measures which can answer clinical questions related to growth. Our proposed framework combines nonlinear modeling of individual trajectories, population analysis, and testing for regional differences. We apply this framework to the study of early maturation in white matter regions as measured with diffusion tensor imaging (DTI). Regional differences between anatomical regions of interest that are known to mature differently are analyzed and quantified. Experiments with image data from a large ongoing clinical study show that our framework provides descriptive, quantitative information on growth trajectories that can be directly interpreted by clinicians. To our knowledge, this is the first longitudinal analysis of growth functions to explain the trajectory of early brain maturation as it is represented in DTI.
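    The Gompertz function at the core of this abstract can be written down directly. One common parameterization (an assumption here; the paper's exact form may differ) exposes the delay, rate, and asymptote parameters the authors describe:

```python
import numpy as np

def gompertz(t, asymptote, delay, rate):
    """Gompertz growth curve: rises toward `asymptote`, with its inflection
    at t = delay (where the value equals asymptote / e); `rate` sets the
    steepness of growth around the inflection."""
    return asymptote * np.exp(-np.exp(-rate * (t - delay)))

t = np.linspace(0, 24, 9)   # hypothetical ages in months
y = gompertz(t, asymptote=1.0, delay=6.0, rate=0.4)
print(np.round(y, 3))
```

    These three parameters map directly onto clinically interpretable quantities: how late maturation starts, how fast it proceeds, and the expected asymptotic value.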

  6. Ecosystem Services and Climate Change Considerations for ...

    EPA Pesticide Factsheets

    Freshwater habitats provide fishable, swimmable, and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities, and ecosystems. Climate change and land-use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease, in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technological basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework “iemWatersheds” has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM), which automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES), which manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE), which provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water

  7. A rhetorical approach to environmental information sharing

    NASA Astrophysics Data System (ADS)

    Woolf, Andrew

    2014-05-01

    `Faceted search' has recently been widely adopted as a powerful information discovery framework, enabling users to navigate a complex landscape of information by successive refinement along key dimensions. The compelling user experience that results has seen adoption of faceted search by online retailers, media outlets, and encyclopedic publishers. A key challenge with faceted browse is the choice of suitable search dimensions, or facets. Conventional facet analysis adopts principles of exclusivity and exhaustiveness, identifying facets based on their relevance to the subject and their discriminating ability (Spiteri, 1998). The rhetoricians of ancient Greece defined seven dimensions (`circumstances') of analytical enquiry: who, what, when, where, why, in what way, by what means. These provide a broadly applicable framework that may be seen in Ranganathan's classic (`PMEST') scheme for facet analysis. The utility of the `Five Ws' is also manifest through their adoption in daily discourse and pedagogical frameworks. If we apply the `Five Ws' to environmental information, we arrive at a model very close to the `O&M' (ISO 19156) conceptual model for standardised exchange of environmental observations and measurements data:

    * who: metadata
    * what: observed property
    * when: time of observation
    * where: feature of interest
    * why: metadata
    * how: procedure

    Thus, we adopt an approach for distributed environmental information sharing which factors the architecture into components aligned with the `Five Ws' (or O&M). We give an overview of this architecture and its information classes, components, interfaces and standards. We also describe how it extends the classic SDI architecture to provide additional specific benefit for environmental information. Finally, we offer a perspective on the architecture which may be seen as a `brokering' overlay to environmental information resources, enabling an O&M-conformant view. 
The approach to be presented is being adopted by the Australian Bureau of Meteorology as the basis for a National Environmental Information Infrastructure.
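
    The `Five Ws'-to-O&M mapping above can be sketched as a simple record type; the field names below are illustrative stand-ins, not the normative ISO 19156 class names:

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    """Toy observation record aligned with the Five Ws / O&M mapping."""
    observed_property: str    # what
    phenomenon_time: str      # when
    feature_of_interest: str  # where
    procedure: str            # how
    metadata: dict = field(default_factory=dict)  # who and why

obs = Observation(
    observed_property="air_temperature",
    phenomenon_time="2014-05-01T00:00:00Z",
    feature_of_interest="station-12345",
    procedure="thermometer-reading",
    metadata={"who": "Bureau of Meteorology", "why": "routine monitoring"},
)
```

    Factoring the architecture along these fields is what lets each component (vocabulary service for `what', gazetteer for `where', and so on) evolve independently.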

  8. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. 
These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications. In this poster, we summarize the key components of the UCVM framework and describe the impact it has had in various computational geoscientific applications.
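
    The standard-query-over-alternative-models idea described above might look like the following sketch; the class and function names are hypothetical and do not reflect the actual UCVM API:

```python
# Hypothetical sketch of a unified query layer over alternative seismic
# velocity models: regional 3D models are tried in priority order, with a
# simple 1D background model as the fallback. Units and values are toy.

def background_1d(lon, lat, depth_m):
    """Simplistic 1D background model: properties depend on depth only."""
    vp = 5000.0 + 0.5 * depth_m        # P-wave velocity, m/s
    vs = vp / 1.73                     # crude Vp/Vs ratio
    density = 2600.0 + 0.05 * depth_m  # kg/m^3
    return vp, vs, density

class UnifiedQuery:
    def __init__(self):
        self.models = []  # (coverage_predicate, model_fn), in priority order

    def register(self, covers, model_fn):
        self.models.append((covers, model_fn))

    def query(self, lon, lat, depth_m):
        """Return (vp, vs, density) from the first covering model."""
        for covers, model_fn in self.models:
            if covers(lon, lat, depth_m):
                return model_fn(lon, lat, depth_m)
        return background_1d(lon, lat, depth_m)  # fall back to 1D background

uq = UnifiedQuery()
# A toy regional model covering a longitude band, returning fixed properties.
uq.register(lambda lon, lat, d: -119 < lon < -117,
            lambda lon, lat, d: (6000.0, 3464.0, 2700.0))
inside = uq.query(-118.0, 34.0, 1000.0)   # served by the regional model
outside = uq.query(-100.0, 34.0, 1000.0)  # served by the 1D background
```

    The design point is that callers see one interface regardless of how each registered model stores its data, which is the interoperability property the abstract emphasizes.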

  9. Generalized Full-Information Item Bifactor Analysis

    ERIC Educational Resources Information Center

    Cai, Li; Yang, Ji Seung; Hansen, Mark

    2011-01-01

    Full-information item bifactor analysis is an important statistical method in psychological and educational measurement. Current methods are limited to single-group analysis and inflexible in the types of item response models supported. We propose a flexible multiple-group item bifactor analysis framework that supports a variety of…

  10. The quadriceps muscle of knee joint modelling Using Hybrid Particle Swarm Optimization-Neural Network (PSO-NN)

    NASA Astrophysics Data System (ADS)

    Kamaruddin, Saadi Bin Ahmad; Marponga Tolos, Siti; Hee, Pah Chin; Ghani, Nor Azura Md; Ramli, Norazan Mohamed; Nasir, Noorhamizah Binti Mohamed; Ksm Kader, Babul Salam Bin; Saiful Huq, Mohammad

    2017-03-01

    Neural networks have long been known for their ability to handle complex nonlinear systems without an analytical model and to learn sophisticated nonlinear associations. The best-known algorithm for training such networks is backpropagation (BP), which relies on minimization of the mean square error (MSE). However, this algorithm is not fully efficient in the presence of outliers, which commonly exist in dynamic data. This paper presents the modelling of the quadriceps muscle using artificial intelligence techniques, namely combined backpropagation neural network nonlinear autoregressive (BPNN-NAR) and backpropagation neural network nonlinear autoregressive moving average (BPNN-NARMA) models, based on functional electrical stimulation (FES). We adapted a particle swarm optimization (PSO) approach to enhance the performance of the backpropagation algorithm. In this research, a series of experiments using FES was conducted, and the data obtained were used to develop the quadriceps muscle model: 934 training, 200 testing and 200 validation data sets were used in the development of the muscle model. It was found that both BPNN-NAR and BPNN-NARMA performed well in modelling this type of data. In conclusion, the neural network time series models performed reasonably efficiently for nonlinear modelling such as the active properties of the quadriceps muscle, with a single output, namely muscle force.
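
    Using PSO in place of gradient-based backpropagation can be sketched with a minimal swarm optimizing the weights of a one-neuron model against an MSE cost; this illustrates the general technique only, not the authors' BPNN-NAR/NARMA implementation:

```python
import math, random

random.seed(1)

# Toy regression target: y = tanh(0.7*x + 0.1) sampled on [-1, 1].
data = [(x / 10.0, math.tanh(0.7 * (x / 10.0) + 0.1)) for x in range(-10, 11)]

def mse(params):
    """Mean square error of a one-neuron model y = tanh(w*x + b)."""
    w, b = params
    return sum((math.tanh(w * x + b) - y) ** 2 for x, y in data) / len(data)

def pso(cost, dim=2, n_particles=20, iters=200, inertia=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm: velocities pulled toward personal and global bests."""
    pos = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = pbest[min(range(n_particles), key=lambda i: pbest_cost[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (inertia * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < cost(g):
                    g = pos[i][:]
    return g

best = pso(mse)  # should recover weights near w = 0.7, b = 0.1
```

    Because PSO uses only cost evaluations, not gradients, it is less sensitive to the outlier-driven gradient distortions the abstract attributes to plain BP.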

  11. Development of a GIS-based integrated framework for coastal seiches monitoring and forecasting: A North Jiangsu shoal case study

    NASA Astrophysics Data System (ADS)

    Qin, Rufu; Lin, Liangzhao

    2017-06-01

    Coastal seiches have become an increasingly important issue in coastal science and present many challenges, particularly when attempting to provide warning services. This paper presents the methodologies, techniques and integrated services adopted for the design and implementation of a Seiches Monitoring and Forecasting Integration Framework (SMAF-IF). The SMAF-IF is an integrated system with different types of sensors and numerical models that incorporates Geographic Information System (GIS) and web techniques, focusing on coastal seiche event detection and early warning in the North Jiangsu shoal, China. The in situ sensors perform automatic and continuous monitoring of the marine environment status, and the numerical models provide the meteorological and physical oceanographic parameter estimates. Model output processing software was developed in C# using ArcGIS Engine functions, providing the capability to automatically generate visualization maps and warning information. Leveraging the ArcGIS Flex API and ASP.NET web services, a web-based GIS framework was designed to facilitate quasi real-time data access, interactive visualization and analysis, and provision of early warning services for end users. The integrated framework proposed in this study enables decision-makers and the public to respond quickly to emergency coastal seiche events, and allows easy adaptation to other regional and scientific domains related to real-time monitoring and forecasting.

  12. A Query Expansion Framework in Image Retrieval Domain Based on Local and Global Analysis

    PubMed Central

    Rahman, M. M.; Antani, S. K.; Thoma, G. R.

    2011-01-01

    We present an image retrieval framework based on automatic query expansion in a concept feature space by generalizing the vector space model of information retrieval. In this framework, images are represented by vectors of weighted concepts similar to the keyword-based representation used in text retrieval. To generate the concept vocabularies, a statistical model is built by utilizing Support Vector Machine (SVM)-based classification techniques. The images are represented as “bag of concepts” that comprise perceptually and/or semantically distinguishable color and texture patches from local image regions in a multi-dimensional feature space. To explore the correlation between the concepts and overcome the assumption of feature independence in this model, we propose query expansion techniques in the image domain from a new perspective based on both local and global analysis. For the local analysis, the correlations between the concepts based on the co-occurrence pattern, and the metrical constraints based on the neighborhood proximity between the concepts in encoded images, are analyzed by considering local feedback information. We also analyze the concept similarities in the collection as a whole in the form of a similarity thesaurus and propose an efficient query expansion based on the global analysis. The experimental results on a photographic collection of natural scenes and a biomedical database of different imaging modalities demonstrate the effectiveness of the proposed framework in terms of precision and recall. PMID:21822350
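
    The global-analysis expansion via a similarity thesaurus described above can be illustrated with a toy sketch; the concept co-occurrence profiles, weights, and threshold below are invented for illustration, not taken from the paper:

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse vectors held as dicts."""
    dot = sum(u[k] * v.get(k, 0.0) for k in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Toy collection-wide co-occurrence profile per concept: the "similarity
# thesaurus" built from global analysis of the encoded images.
profiles = {
    "sky":   {"d1": 1.0, "d2": 0.8},
    "cloud": {"d1": 0.9, "d2": 0.7},
    "grass": {"d3": 1.0},
}

def expand(query, alpha=0.5, threshold=0.9):
    """Add to the query any concept whose profile is similar to a query concept,
    weighted down by alpha so original query concepts keep priority."""
    out = dict(query)
    for q in query:
        for c, prof in profiles.items():
            if c not in out and cosine(profiles[q], prof) >= threshold:
                out[c] = alpha * query[q]
    return out

expanded = expand({"sky": 1.0})  # "cloud" co-occurs with "sky", so it is added
```

    This captures the core move of global query expansion: correlated concepts absent from the original query are pulled in, relaxing the feature-independence assumption of the plain vector space model.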

  13. An Exploratory Study of Cost Engineering in Axiomatic Design: Creation of the Cost Model Based on an FR-DP Map

    NASA Technical Reports Server (NTRS)

    Lee, Taesik; Jeziorek, Peter

    2004-01-01

    Large complex projects cost large sums of money throughout their life cycle for a variety of reasons and causes. For such large programs, the credible estimation of the project cost, a quick assessment of the cost of making changes, and the management of the project budget with effective cost reduction determine the viability of the project. Cost engineering that deals with these issues requires a rigorous method and systematic processes. This paper introduces a logical framework to aid effective cost engineering. The framework is built upon the Axiomatic Design process. The structure in the Axiomatic Design process provides a good foundation to closely tie engineering design and cost information together. The cost framework presented in this paper is a systematic link between the functional domain (FRs), physical domain (DPs), cost domain (CUs), and a task/process-based model. The FR-DP map relates a system's functional requirements to design solutions across all levels and branches of the decomposition hierarchy. DPs are mapped into CUs, which provides a means to estimate the cost of design solutions - DPs - from the cost of the physical entities in the system - CUs. The task/process model describes the iterative process of developing each of the CUs, and is used to estimate the cost of CUs. By linking the four domains, this framework provides superior traceability from requirements to cost information.

  14. Excess mortality in persons with severe mental disorders: a multilevel intervention framework and priorities for clinical practice, policy and research agendas

    PubMed Central

    Liu, Nancy H.; Daumit, Gail L.; Dua, Tarun; Aquila, Ralph; Charlson, Fiona; Cuijpers, Pim; Druss, Benjamin; Dudek, Kenn; Freeman, Melvyn; Fujii, Chiyo; Gaebel, Wolfgang; Hegerl, Ulrich; Levav, Itzhak; Munk Laursen, Thomas; Ma, Hong; Maj, Mario; Elena Medina‐Mora, Maria; Nordentoft, Merete; Prabhakaran, Dorairaj; Pratt, Karen; Prince, Martin; Rangaswamy, Thara; Shiers, David; Susser, Ezra; Thornicroft, Graham; Wahlbeck, Kristian; Fekadu Wassie, Abe; Whiteford, Harvey; Saxena, Shekhar

    2017-01-01

    Excess mortality in persons with severe mental disorders (SMD) is a major public health challenge that warrants action. The number and scope of truly tested interventions in this area remain limited, and strategies for implementation and scaling up of programmes with a strong evidence base are scarce. Furthermore, the majority of available interventions focus on a single or an otherwise limited number of risk factors. Here we present a multilevel model highlighting risk factors for excess mortality in persons with SMD at the individual, health system and socio‐environmental levels. Informed by that model, we describe a comprehensive framework that may be useful for designing, implementing and evaluating interventions and programmes to reduce excess mortality in persons with SMD. This framework includes individual‐focused, health system‐focused, and community level and policy‐focused interventions. Incorporating lessons learned from the multilevel model of risk and the comprehensive intervention framework, we identify priorities for clinical practice, policy and research agendas. PMID:28127922

  15. Physiologically based pharmacokinetic (PBPK) modeling considering methylated trivalent arsenicals

    EPA Science Inventory

    PBPK modeling provides a quantitative biologically-based framework to integrate diverse types of information for application to risk analysis. For example, genetic polymorphisms in arsenic metabolizing enzymes (AS3MT) can lead to differences in target tissue dosimetry for key tri...

  16. The criterion of subscale sufficiency and its application to the relationship between static capillary pressure, saturation and interfacial areas.

    PubMed

    Kurzeja, Patrick

    2016-05-01

    Modern imaging techniques, increased simulation capabilities and extended theoretical frameworks naturally drive the development of multiscale modelling with the question: which new information should be considered? Given the need for concise constitutive relationships and efficient data evaluation, however, one important question is often neglected: which information is sufficient? For this reason, this work introduces the formalized criterion of subscale sufficiency. This criterion states whether a chosen constitutive relationship transfers all necessary information from micro to macroscale within a multiscale framework. It further provides a scheme to improve constitutive relationships. Direct application to static capillary pressure demonstrates usefulness and conditions for subscale sufficiency of saturation and interfacial areas.

  17. A conceptual framework for the collection of food products in a Total Diet Study.

    PubMed

    Turrini, Aida; Lombardi-Boccia, Ginevra; Aureli, Federica; Cubadda, Francesco; D'Addezio, Laura; D'Amato, Marilena; D'Evoli, Laura; Darnerud, PerOla; Devlin, Niamh; Dias, Maria Graça; Jurković, Marina; Kelleher, Cecily; Le Donne, Cinzia; López Esteban, Maite; Lucarini, Massimo; Martinez Burgos, Maria Alba; Martínez-Victoria, Emilio; McNulty, Breige; Mistura, Lorenza; Nugent, Anne; Oktay Basegmez, Hatice Imge; Oliveira, Luisa; Ozer, Hayrettin; Perelló, Gemma; Pite, Marina; Presser, Karl; Sokolić, Darja; Vasco, Elsa; Volatier, Jean-Luc

    2018-02-01

    A total diet study (TDS) provides representative and realistic data for assessing the dietary intake of chemicals, such as contaminants and residues, and nutrients, at a population level. Reproducing the diet through collection of customarily consumed foods and their preparation as habitually eaten is crucial to ensure representativeness, i.e., all relevant foods are included and all potential dietary sources of the substances investigated are captured. Having this in mind, a conceptual framework for building a relevant food-shopping list was developed as a research task in the European Union's 7th Framework Program project, 'Total Diet Study Exposure' (TDS-Exposure), aimed at standardising methods for food sampling, analyses, exposure assessment calculations and modelling, priority foods, and selection of chemical contaminants. A stepwise approach following the knowledge translation (KT) model for concept analysis is proposed to set up a general protocol for the collection of food products in a TDS in terms of steps (characterisation of the food list, development of the food-shopping list, food products collection) and pillars (background documentation, procedures, and tools). A simple model for structuring the information in a way to support the implementation of the process, by presenting relevant datasets, forms to store inherent information, and folders to record the results is also proposed. Reproducibility of the process and possibility to exploit the gathered information are two main features of such a system for future applications.

  18. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based approaches and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin, located on the border of the United States and Canada.
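
    The Bayesian step of combining prior and likelihood information about a model parameter can be illustrated with a conjugate normal-normal update; the actual framework's priors and likelihoods are more elaborate, so the example below is a generic sketch:

```python
def posterior_normal(prior_mean, prior_var, obs_mean, obs_var, n):
    """Posterior for a parameter with a normal prior, given n observations
    of known variance obs_var whose sample mean is obs_mean.
    Precisions (inverse variances) add; the mean is precision-weighted."""
    precision = 1.0 / prior_var + n / obs_var
    mean = (prior_mean / prior_var + n * obs_mean / obs_var) / precision
    return mean, 1.0 / precision

# E.g. a hydrologic model parameter: vague prior at 2.0, ten observations
# averaging 2.6 pull the posterior most of the way toward the data.
m, v = posterior_normal(prior_mean=2.0, prior_var=1.0,
                        obs_mean=2.6, obs_var=0.25, n=10)
```

    The posterior variance is always smaller than the prior variance, which is the sense in which each modeling step narrows the uncertainty carried into the next.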

  19. A Riemannian framework for orientation distribution function computing.

    PubMed

    Cheng, Jian; Ghosh, Aurobrata; Jiang, Tianzi; Deriche, Rachid

    2009-01-01

    Compared with Diffusion Tensor Imaging (DTI), High Angular Resolution Imaging (HARDI) can better explore the complex microstructure of white matter. The Orientation Distribution Function (ODF) is used to describe the probability of the fiber direction. The Fisher information metric has been constructed for probability density families in Information Geometry theory, and it has been successfully applied to tensor computing in DTI. In this paper, we present a state-of-the-art Riemannian framework for ODF computing based on Information Geometry and sparse representation of orthonormal bases. In this Riemannian framework, the exponential map, logarithmic map and geodesic have closed forms, and the weighted Fréchet mean exists uniquely on this manifold. We also propose a novel scalar measurement, named Geometric Anisotropy (GA), which is the Riemannian geodesic distance between the ODF and the isotropic ODF. The Rényi entropy H1/2 of the ODF can be computed from the GA. Moreover, we present an Affine-Euclidean framework and a Log-Euclidean framework so that we can work in a Euclidean space. As an application, Lagrange interpolation on ODF fields is proposed based on the weighted Fréchet mean. We validate our methods on synthetic and real data experiments. Compared with existing Riemannian frameworks on ODF, our framework is model-free. The estimation of the parameters, i.e. the Riemannian coordinates, is robust and linear. Moreover, it should be noted that our theoretical results can be used for any probability density function (PDF) under an orthonormal basis representation.
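
    A common Riemannian treatment of densities uses the square-root representation, under which the Fisher-metric geodesic distance is the spherical angle between square roots; whether this matches the authors' exact construction is an assumption, but it shows how a GA-style measure works on a discrete ODF:

```python
import math

def geodesic_distance(p, q):
    """Fisher-metric geodesic distance between two discrete densities:
    the angle between their square-root representations on the unit sphere."""
    bc = sum(math.sqrt(a * b) for a, b in zip(p, q))  # Bhattacharyya coefficient
    return math.acos(min(1.0, bc))  # clamp against floating-point round-off

def geometric_anisotropy(odf):
    """GA-style measure: geodesic distance from the ODF to the uniform ODF."""
    n = len(odf)
    return geodesic_distance(odf, [1.0 / n] * n)

iso_ga = geometric_anisotropy([0.25] * 4)                   # isotropic: distance 0
peaked_ga = geometric_anisotropy([0.97, 0.01, 0.01, 0.01])  # sharply peaked: large
```

    Note how the Bhattacharyya coefficient against the uniform density is a rescaled sum of square roots of the ODF values, which is why an entropy of order 1/2 is recoverable from such a distance.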

  20. GIS and Geodatabase Disaster Risk for Spatial Planning

    NASA Astrophysics Data System (ADS)

    Hendriawan Nur, Wawan; Kumoro, Yugo; Susilowati, Yuliana

    2018-02-01

    Spatial planning in Indonesia needs to consider information on potential disasters, because disasters are a serious and damaging problem that frequently occurs, causes casualties in some areas of Indonesia, and inhibits development. Various models and studies have been developed to calculate disaster risk. GIS is a system for assembling, storing, analyzing, and displaying geographically referenced disaster information. This information can be combined with geodatabases to model and estimate disaster risk in an automated way, which also offers the possibility to customize most of the parameters used in the models. This paper describes a framework that improves the use of GIS and geodatabases for vulnerability, capacity, or disaster risk assessment to support spatial planning activities and make them more adaptable. Using this framework, a GIS application can be applied in any location by adjusting variables or calculation methods, without changing or rebuilding the system from scratch.

  1. Knowledge Extraction from Atomically Resolved Images.

    PubMed

    Vlcek, Lukas; Maksov, Artem; Pan, Minghu; Vasudevan, Rama K; Kalinin, Sergei V

    2017-10-24

    Tremendous strides in the experimental capabilities of scanning transmission electron microscopy and scanning tunneling microscopy (STM) over the past 30 years have made atomically resolved imaging routine. However, consistent integration and use of atomically resolved data with generative models is unavailable, so information on local thermodynamics and other microscopic driving forces encoded in the observed atomic configurations remains hidden. Here, we present a framework based on statistical distance minimization to consistently utilize the information available from atomic configurations obtained from an atomically resolved image and extract meaningful physical interaction parameters. We illustrate the applicability of the framework on an STM image of an FeSexTe1-x superconductor, with the segregation of the chalcogen atoms investigated using a nonideal interacting solid solution model. This universal method makes full use of the microscopic degrees of freedom sampled in an atomically resolved image and can be extended via Bayesian inference toward unbiased model selection with uncertainty quantification.

  2. Using a theory-driven conceptual framework in qualitative health research.

    PubMed

    Macfarlane, Anne; O'Reilly-de Brún, Mary

    2012-05-01

    The role and merits of highly inductive research designs in qualitative health research are well established, and there has been a powerful proliferation of grounded theory method in the field. However, tight qualitative research designs informed by social theory can be useful to sensitize researchers to concepts and processes that they might not necessarily identify through inductive processes. In this article, we provide a reflexive account of our experience of using a theory-driven conceptual framework, the Normalization Process Model, in a qualitative evaluation of general practitioners' uptake of a free, pilot, language interpreting service in the Republic of Ireland. We reflect on our decisions about whether or not to use the Model, and describe our actual use of it to inform research questions, sampling, coding, and data analysis. We conclude with reflections on the added value that the Model and tight design brought to our research.

  3. An Integrated Modeling Framework Forecasting Ecosystem Exposure-- A Systems Approach to the Cumulative Impacts of Multiple Stressors

    NASA Astrophysics Data System (ADS)

    Johnston, J. M.

    2013-12-01

    Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and landuse change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. 
Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water quality Analysis and Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.

  4. Exploiting Acoustic and Syntactic Features for Automatic Prosody Labeling in a Maximum Entropy Framework

    PubMed Central

    Sridhar, Vivek Kumar Rangarajan; Bangalore, Srinivas; Narayanan, Shrikanth S.

    2009-01-01

    In this paper, we describe a maximum entropy-based automatic prosody labeling framework that exploits both language and speech information. We apply the proposed framework to both prominence and phrase structure detection within the Tones and Break Indices (ToBI) annotation scheme. Our framework utilizes novel syntactic features in the form of supertags and a quantized acoustic–prosodic feature representation that is similar to linear parameterizations of the prosodic contour. The proposed model is trained discriminatively and is robust in the selection of appropriate features for the task of prosody detection. The proposed maximum entropy acoustic–syntactic model achieves pitch accent and boundary tone detection accuracies of 86.0% and 93.1% on the Boston University Radio News corpus, and 79.8% and 90.3% on the Boston Directions corpus. Phrase structure detection through prosodic break index labeling provides accuracies of 84% and 87% on the two corpora, respectively. The reported results are significantly better than previously reported results and demonstrate the strength of the maximum entropy model in jointly modeling simple lexical, syntactic, and acoustic features for automatic prosody labeling. PMID:19603083
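
    A minimal binary maximum-entropy (logistic) model for a detection task like pitch-accent labeling can be sketched as follows; the two features and the toy data are invented placeholders, not the supertag and acoustic features of the paper:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=2000, lr=0.5):
    """Fit a binary maximum-entropy model by gradient ascent on log-likelihood."""
    w = [0.0, 0.0, 0.0]  # bias plus two feature weights
    for _ in range(epochs):
        for x, y in data:
            p = sigmoid(w[0] + w[1] * x[0] + w[2] * x[1])
            g = y - p  # gradient of the per-example log-likelihood
            w[0] += lr * g
            w[1] += lr * g * x[0]
            w[2] += lr * g * x[1]
    return w

# Toy examples: (normalized f0 peak, syllable duration) -> accented (1) or not (0).
data = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.1, 0.2), 0), ((0.2, 0.1), 0)]
w = train(data)
predict = lambda x: sigmoid(w[0] + w[1] * x[0] + w[2] * x[1])
```

    The discriminative training and free mixing of heterogeneous features in one exponential model is what the abstract credits for the joint lexical-syntactic-acoustic modeling.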

  5. A framework for characterizing eHealth literacy demands and barriers.

    PubMed

    Chan, Connie V; Kaufman, David R

    2011-11-17

    Consumer eHealth interventions are of a growing importance in the individual management of health and health behaviors. However, a range of access, resources, and skills barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies, and communicating health concepts effectively. We propose a theoretical and methodological framework for characterizing complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained and potential analyses that can be performed using this method. The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. 
Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase accuracy of predictions. The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum.

  6. A Framework for Characterizing eHealth Literacy Demands and Barriers

    PubMed Central

    Chan, Connie V

    2011-01-01

    Background Consumer eHealth interventions are of a growing importance in the individual management of health and health behaviors. However, a range of access, resources, and skills barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy names a set of skills and knowledge that are essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies, and communicating health concepts effectively. Objective We propose a theoretical and methodological framework for characterizing complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. Methods We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user as well as of 20 users on the same task to illustrate both the detailed analysis and the aggregate measures obtained and potential analyses that can be performed using this method. Results The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. 
Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase the accuracy of predictions. Conclusions The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can provide a potentially powerful generative research platform to inform the development of rigorous eHealth examination and design instruments, for example to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum. PMID:22094891

  7. Divergences and estimating tight bounds on Bayes error with applications to multivariate Gaussian copula and latent Gaussian copula

    NASA Astrophysics Data System (ADS)

    Thelen, Brian J.; Xique, Ismael J.; Burns, Joseph W.; Goley, G. Steven; Nolan, Adam R.; Benson, Jonathan W.

    2017-04-01

In Bayesian decision theory, there has been a great amount of research into theoretical frameworks and information-theoretic quantities that can be used to provide lower and upper bounds for the Bayes error. These include well-known bounds such as the Chernoff, Bhattacharyya, and J-divergence bounds. Part of the challenge of utilizing these various metrics in practice is (i) whether they are "loose" or "tight" bounds, (ii) how they might be estimated via either parametric or non-parametric methods, and (iii) how accurate the estimates are for limited amounts of data. In general, what is desired is a methodology for generating relatively tight lower and upper bounds, and then an approach to estimate these bounds efficiently from data. In this paper, we explore the so-called triangle divergence, which has long been known but was recently made more prominent in research on non-parametric estimation of information metrics. Part of this work is motivated by applications for quantifying fundamental information content in SAR/LIDAR data; to help in this, we have developed a flexible multivariate modeling framework based on multivariate Gaussian copula models, which can be combined with the triangle divergence framework to quantify this information and provide approximate bounds on Bayes error. In this paper we present an overview of the bounds, including those based on triangle divergence, and verify that under a number of multivariate models, the upper and lower bounds derived from triangle divergence are significantly tighter than the other common bounds, often dramatically so. We also propose some simple but effective means for computing the triangle divergence using Monte Carlo methods, and then discuss estimation of the triangle divergence from empirical data based on Gaussian copula models.
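A Monte Carlo computation of the triangle divergence in the spirit described above can be sketched for two univariate Gaussian class densities; the densities, their separation, and the sample size here are illustrative assumptions, not the paper's SAR/LIDAR copula models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two class-conditional densities with equal priors (illustrative choice)
def p(x):
    return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

def q(x):
    return np.exp(-0.5 * (x - 1.5)**2) / np.sqrt(2 * np.pi)

# Triangle divergence: T = integral of (p - q)^2 / (p + q) dx.
# Sampling x from the equal mixture m = (p + q)/2 gives the estimator
#   T ~= mean of 2 (p - q)^2 / (p + q)^2 over the draws.
n = 200_000
from_p = rng.random(n) < 0.5
x = np.where(from_p, rng.normal(0.0, 1.0, n), rng.normal(1.5, 1.0, n))
px, qx = p(x), q(x)
T = np.mean(2 * (px - qx) ** 2 / (px + qx) ** 2)

# For reference, the exact Bayes error has the same mixture-expectation form:
#   Pe = (1/2) integral of min(p, q) dx = E_m[min(p, q) / (p + q)]
Pe = np.mean(np.minimum(px, qx) / (px + qx))
```

For these two unit-variance Gaussians the true Bayes error is Φ(-0.75) ≈ 0.227, which the mixture-sampling estimator recovers closely; the same sampling scheme extends directly to multivariate copula densities.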

  8. Building health behavior models to guide the development of just-in-time adaptive interventions: A pragmatic framework.

    PubMed

    Nahum-Shani, Inbal; Hekler, Eric B; Spruijt-Metz, Donna

    2015-12-01

Advances in wireless devices and mobile technology offer many opportunities for delivering just-in-time adaptive interventions (JITAIs)-suites of interventions that adapt over time to an individual's changing status and circumstances with the goal of addressing the individual's need for support whenever it arises. A major challenge confronting behavioral scientists aiming to develop a JITAI concerns the selection and integration of existing empirical, theoretical, and practical evidence into a scientific model that can inform the construction of a JITAI and help identify scientific gaps. The purpose of this paper is to establish a pragmatic framework that can be used to organize existing evidence into a useful model for JITAI construction. This framework involves clarifying the conceptual purpose of a JITAI, namely, the provision of just-in-time support via adaptation, as well as describing the components of a JITAI and articulating a list of concrete questions to guide the establishment of a useful model for JITAI construction. The proposed framework includes an organizing scheme for translating the relatively static scientific models underlying many health behavior interventions into a more dynamic model that better incorporates the element of time. This framework will help to guide the next generation of empirical work to support the creation of effective JITAIs. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  9. The ACTIVE conceptual framework as a structural equation model

    PubMed Central

    Gross, Alden L.; Payne, Brennan R.; Casanova, Ramon; Davoudzadeh, Pega; Dzierzewski, Joseph M.; Farias, Sarah; Giovannetti, Tania; Ip, Edward H.; Marsiske, Michael; Rebok, George W.; Schaie, K. Warner; Thomas, Kelsey; Willis, Sherry; Jones, Richard N.

    2018-01-01

Background/Study Context Conceptual frameworks are analytic models at a high level of abstraction. Their operationalization can inform randomized trial design and sample size considerations. Methods The Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) conceptual framework was empirically tested using structural equation modeling (N=2,802). ACTIVE was guided by a conceptual framework for cognitive training in which proximal cognitive abilities (memory, inductive reasoning, speed of processing) mediate treatment-related improvement in primary outcomes (everyday problem-solving, difficulty with activities of daily living, everyday speed, driving difficulty), which in turn lead to improved secondary outcomes (health-related quality of life, health service utilization, mobility). Measurement models for each proximal, primary, and secondary outcome were developed and tested using baseline data. All constructs were then combined into one model to evaluate fit (RMSEA, CFI, normalized residuals of each indicator). To expand the conceptual model and potentially inform future trials, evidence of modification of structural model parameters was evaluated by age, years of education, sex, race, and self-rated health status. Results Preconceived measurement models for memory, reasoning, speed of processing, everyday problem-solving, instrumental activities of daily living (IADL) difficulty, everyday speed, driving difficulty, and health-related quality of life each fit well to the data (all RMSEA < .05; all CFI > .95). Fit of the full model was good (RMSEA = .038; CFI = .924). In contrast with previous findings from ACTIVE regarding who benefits from training, interaction testing revealed that associations between proximal abilities and primary outcomes were stronger, on average, for participants of nonwhite race, in worse health, of older age, and with less education (p < .005). Conclusions Empirical data confirm the hypothesized ACTIVE conceptual model. 
Findings suggest that the types of people who show intervention effects on cognitive performance may be different from those with the greatest chance of transfer to real-world activities. PMID:29303475

  10. Generic, network schema agnostic sparse tensor factorization for single-pass clustering of heterogeneous information networks

    PubMed Central

Wu, Jibing; Meng, Qinggang; Deng, Su; Huang, Hongbin; Wu, Yahui; Badii, Atta

    2017-01-01

Heterogeneous information networks (e.g. bibliographic networks and social media networks) that consist of multiple interconnected objects are ubiquitous. Clustering analysis is an effective method to understand the semantic information and interpretable structure of the heterogeneous information networks, and it has attracted the attention of many researchers in recent years. However, most studies assume that heterogeneous information networks usually follow some simple schemas, such as bi-typed networks or star network schema, and they can only cluster one type of object in the network each time. In this paper, a novel clustering framework is proposed based on sparse tensor factorization for heterogeneous information networks, which can cluster multiple types of objects simultaneously in a single pass without any network schema information. The types of objects and the relations between them in the heterogeneous information networks are modeled as a sparse tensor. The clustering issue is modeled as an optimization problem, which is similar to the well-known Tucker decomposition. Then, an Alternating Least Squares (ALS) algorithm and a feasible initialization method are proposed to solve the optimization problem. Based on the tensor factorization, we simultaneously partition different types of objects into different clusters. The experimental results on both synthetic and real-world datasets have demonstrated that our proposed clustering framework, STFClus, can model heterogeneous information networks efficiently and can outperform state-of-the-art clustering algorithms as a generally applicable, network schema agnostic, single-pass clustering method for heterogeneous networks. PMID:28245222

  11. Generic, network schema agnostic sparse tensor factorization for single-pass clustering of heterogeneous information networks.

    PubMed

    Wu, Jibing; Meng, Qinggang; Deng, Su; Huang, Hongbin; Wu, Yahui; Badii, Atta

    2017-01-01

Heterogeneous information networks (e.g. bibliographic networks and social media networks) that consist of multiple interconnected objects are ubiquitous. Clustering analysis is an effective method to understand the semantic information and interpretable structure of the heterogeneous information networks, and it has attracted the attention of many researchers in recent years. However, most studies assume that heterogeneous information networks usually follow some simple schemas, such as bi-typed networks or star network schema, and they can only cluster one type of object in the network each time. In this paper, a novel clustering framework is proposed based on sparse tensor factorization for heterogeneous information networks, which can cluster multiple types of objects simultaneously in a single pass without any network schema information. The types of objects and the relations between them in the heterogeneous information networks are modeled as a sparse tensor. The clustering issue is modeled as an optimization problem, which is similar to the well-known Tucker decomposition. Then, an Alternating Least Squares (ALS) algorithm and a feasible initialization method are proposed to solve the optimization problem. Based on the tensor factorization, we simultaneously partition different types of objects into different clusters. The experimental results on both synthetic and real-world datasets have demonstrated that our proposed clustering framework, STFClus, can model heterogeneous information networks efficiently and can outperform state-of-the-art clustering algorithms as a generally applicable, network schema agnostic, single-pass clustering method for heterogeneous networks.
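A minimal dense stand-in for the Tucker-style alternating least squares scheme can be sketched with NumPy. The tensor, its dimensions, and the cluster counts below are invented for illustration, and a plain HOSVD initialization replaces STFClus's own feasible initialization for sparse tensors:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((8, 6, 5))   # toy tensor over three object types (illustrative)
R = (2, 2, 2)               # clusters per object type (assumed)

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def leading(T, mode, r):
    # top-r left singular vectors of the mode unfolding
    return np.linalg.svd(unfold(T, mode), full_matrices=False)[0][:, :r]

# HOSVD initialization, then HOOI (an ALS scheme for Tucker decomposition)
A = [leading(X, m, R[m]) for m in range(3)]
for _ in range(25):
    A[0] = leading(np.einsum('ijk,jb,kc->ibc', X, A[1], A[2]), 0, R[0])
    A[1] = leading(np.einsum('ijk,ia,kc->ajc', X, A[0], A[2]), 1, R[1])
    A[2] = leading(np.einsum('ijk,ia,jb->abk', X, A[0], A[1]), 2, R[2])

# assign each object to the factor column where it loads most strongly,
# giving one clustering per object type in a single pass
clusters = [np.argmax(np.abs(a), axis=1) for a in A]
```

Each sweep fixes all but one factor matrix, projects the tensor onto the fixed factors, and updates the free factor from an SVD, which is the least-squares-optimal update for that mode.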

  12. Comparative and Predictive Multimedia Assessments Using Monte Carlo Uncertainty Analyses

    NASA Astrophysics Data System (ADS)

    Whelan, G.

    2002-05-01

Multiple-pathway frameworks (sometimes referred to as multimedia models) provide a platform for combining medium-specific environmental models and databases, such that they can be utilized in a more holistic assessment of contaminant fate and transport in the environment. These frameworks provide a relatively seamless transfer of information from one model to the next and from databases to models. Within these frameworks, multiple models are linked, resulting in models that consume information from upstream models and produce information to be consumed by downstream models. The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) is an example, which allows users to link their models to other models and databases. FRAMES is an icon-driven, site-layout platform that is an open-architecture, object-oriented system that interacts with environmental databases; helps the user construct a Conceptual Site Model that is real-world based; allows the user to choose the most appropriate models to solve simulation requirements; solves the standard risk paradigm of release, transport, and fate, followed by exposure/risk assessment for people and ecology; and presents graphical packages for analyzing results. FRAMES is specifically designed to allow users to link their own models into a system that contains models developed by others. This paper will present the use of FRAMES to evaluate potential human health exposures using real site data and realistic assumptions from sources, through the vadose and saturated zones, to exposure and risk assessment at three real-world sites, using the Multimedia Environmental Pollutant Assessment System (MEPAS), which is a multimedia model contained within FRAMES. These real-world examples use predictive and comparative approaches coupled with a Monte Carlo analysis. 
A predictive analysis is one in which models are calibrated to monitored site data prior to the assessment; a comparative analysis is one in which models are not calibrated but are based solely on literature or judgement, and is usually used to compare alternatives. In many cases, a combination is employed where the model is calibrated to a portion of the data (e.g., to determine hydrodynamics), then used to compare alternatives. Three subsurface-based multimedia examples are presented, increasing in complexity. The first presents the application of a predictive, deterministic assessment; the second presents a predictive and comparative Monte Carlo analysis; and the third presents a comparative, multi-dimensional Monte Carlo analysis. Endpoints are typically presented in terms of concentration, hazard, risk, and dose, and because the vadose zone model typically represents a connection between a source and the aquifer, it does not generally represent the final medium in a multimedia risk assessment.
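The flavor of a linked, Monte Carlo multimedia assessment can be illustrated with a toy source-to-dose chain; the distributions, attenuation factors, and units below are invented placeholders, not MEPAS models:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Each draw propagates one sampled parameter set through the linked "models"
release = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # source term (kg/yr)
vadose_frac = rng.uniform(0.1, 0.4, size=n)            # fraction passing the vadose zone
dilution = rng.lognormal(mean=2.0, sigma=0.3, size=n)  # aquifer dilution factor

conc = release * vadose_frac / dilution                # aquifer concentration (toy units)
dose = conc * 2.0e-2                                   # intake times unit dose factor (assumed)

# the uncertainty summary a linked framework would report downstream
p5, p50, p95 = np.percentile(dose, [5, 50, 95])
```

The point of the sketch is structural: each "model" consumes the output array of the one upstream, so parameter uncertainty at the source propagates through the whole chain to the endpoint percentiles.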

  13. Clustering and Bayesian hierarchical modeling for the definition of informative prior distributions in hydrogeology

    NASA Astrophysics Data System (ADS)

    Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.

    2017-12-01

In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all data available. However, classic inverse modeling frameworks typically only make use of information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information coming from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework to represent such ex-situ information by virtue of the prior distribution and to combine it with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists of two steps, both relying on statistical and machine learning methods. The first step is data selection; it consists of selecting sites similar to the target site. We use clustering methods to select similar sites based on observable hydrogeological features. The second step is data assimilation; it consists of assimilating data from the selected similar sites into the informative prior. We use a Bayesian hierarchical model to account for inter-site variability and to allow for the assimilation of multiple types of site-specific data. We present the application and validation of the presented methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R package, which facilitates easy use by other practitioners.
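The data-selection step can be sketched with a plain k-means clustering of site features; the features, site counts, and cluster number are hypothetical, and the subsequent hierarchical-model assimilation step is not shown:

```python
import numpy as np

rng = np.random.default_rng(5)
# hypothetical standardized site features, e.g. mean log-conductivity,
# aquifer depth, porosity (40 previously investigated sites + 1 target)
sites = rng.normal(size=(40, 3))
target = rng.normal(size=3)

def kmeans(X, k, iters=50):
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

labels, centers = kmeans(sites, k=4)

# sites falling in the target's cluster supply the ex-situ data that a
# hierarchical prior would then assimilate
tgt_cluster = int(np.argmin(((target - centers) ** 2).sum(-1)))
similar_sites = np.where(labels == tgt_cluster)[0]
```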

  14. A judgment and decision-making model for plant behavior.

    PubMed

    Karban, Richard; Orrock, John L

    2018-06-12

Recently, plant biologists have documented that plants, like animals, engage in many activities that can be considered as behaviors, although plant biologists currently lack a conceptual framework to understand these processes. Borrowing the well-established framework developed by psychologists, we propose that plant behaviors can be constructively modeled by identifying four distinct components: 1) a cue or stimulus that provides information, 2) a judgment whereby the plant perceives and processes this informative cue, 3) a decision whereby the plant chooses among several options based on their relative costs and benefits, and 4) an action. Judgment for plants can be determined empirically by monitoring signaling associated with electrical, calcium, or hormonal fluxes. Decision-making can be evaluated empirically by monitoring gene expression or differential allocation of resources. We provide examples of the utility of this judgment and decision-making framework by considering cases in which plants either successfully or unsuccessfully induced resistance against attacking herbivores. Separating judgment from decision-making suggests new analytical paradigms (i.e., Bayesian methods for judgment and economic utility models for decision-making). Following this framework, we propose an experimental approach to plant behavior that explicitly manipulates the stimuli provided to plants, uses plants that vary in sensory abilities, and examines how environmental context affects plant responses. The concepts and approaches that follow from the judgment and decision-making framework can shape how we study and understand plant-herbivore interactions, biological invasions, plant responses to climate change, and the susceptibility of plants to evolutionary traps. This article is protected by copyright. All rights reserved.

  15. Framework for a clinical information system.

    PubMed

    Van De Velde, R; Lansiers, R; Antonissen, G

    2002-01-01

The design and implementation of a Clinical Information System architecture is presented. This architecture has been developed and implemented based on components following a strong underlying conceptual and technological model. Common Object Request Broker Architecture (CORBA) and n-tier technology are used, featuring centralised and departmental clinical information systems as the back-end store for all clinical data. Servers located in the "middle" tier apply the clinical (business) model and application rules. The main characteristics are the focus on modelling and reuse of both data and business logic. Scalability, as well as adaptability to constantly changing requirements via component-driven computing, are the main reasons for this approach.

  16. The Effect of Visual Information on the Manual Approach and Landing

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1982-01-01

The effect of visual information, in combination with basic display information, on approach performance was investigated. A pre-experimental model analysis was performed in terms of the optimal control model. The resulting aircraft approach performance predictions were compared with the results of a moving-base simulator program. The results illustrate that the model provides a meaningful description of the visual (scene) perception process involved in the complex (multi-variable, time-varying) manual approach task, with a useful predictive capability. The theoretical framework was shown to allow a straightforward investigation of the complex interaction of a variety of task variables.

  17. Improving imperfect data from health management information systems in Africa using space-time geostatistics.

    PubMed

    Gething, Peter W; Noor, Abdisalan M; Gikandi, Priscilla W; Ogara, Esther A A; Hay, Simon I; Nixon, Mark S; Snow, Robert W; Atkinson, Peter M

    2006-06-01

    Reliable and timely information on disease-specific treatment burdens within a health system is critical for the planning and monitoring of service provision. Health management information systems (HMIS) exist to address this need at national scales across Africa but are failing to deliver adequate data because of widespread underreporting by health facilities. Faced with this inadequacy, vital public health decisions often rely on crudely adjusted regional and national estimates of treatment burdens. This study has taken the example of presumed malaria in outpatients within the largely incomplete Kenyan HMIS database and has defined a geostatistical modelling framework that can predict values for all data that are missing through space and time. The resulting complete set can then be used to define treatment burdens for presumed malaria at any level of spatial and temporal aggregation. Validation of the model has shown that these burdens are quantified to an acceptable level of accuracy at the district, provincial, and national scale. The modelling framework presented here provides, to our knowledge for the first time, reliable information from imperfect HMIS data to support evidence-based decision-making at national and sub-national levels.
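The core interpolation idea can be sketched as simple kriging over a space-time metric; the covariance model, anisotropy scaling, and facility data below are invented stand-ins, not the paper's fitted geostatistical model:

```python
import numpy as np

rng = np.random.default_rng(2)

# reported facility-months: (x km, y km, month) plus a treatment count
obs = rng.random((30, 3)) * np.array([100.0, 100.0, 12.0])
counts = 50 + 10 * np.sin(obs[:, 2]) + rng.normal(0, 2, size=30)

def st_dist(a, b):
    # separable space-time metric; the 30 km and 3 month range scalings
    # are assumptions, not fitted values
    d = a[:, None, :] - b[None, :, :]
    return np.sqrt((d[..., 0] / 30) ** 2 + (d[..., 1] / 30) ** 2
                   + (d[..., 2] / 3) ** 2)

def cov(h):
    return 25.0 * np.exp(-h)   # assumed exponential covariance, sill 25

# predict missing facility-months by simple kriging on residuals from the mean
missing = np.array([[50.0, 50.0, 6.0], [20.0, 80.0, 3.0]])
K = cov(st_dist(obs, obs)) + 1e-6 * np.eye(len(obs))   # small nugget for stability
k = cov(st_dist(obs, missing))
weights = np.linalg.solve(K, k)
pred = counts.mean() + weights.T @ (counts - counts.mean())
```

Filling every missing facility-month this way yields the complete space-time set that can then be aggregated to district, provincial, or national treatment burdens.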

  18. Improving Imperfect Data from Health Management Information Systems in Africa Using Space–Time Geostatistics

    PubMed Central

    Gething, Peter W; Noor, Abdisalan M; Gikandi, Priscilla W; Ogara, Esther A. A; Hay, Simon I; Nixon, Mark S; Snow, Robert W; Atkinson, Peter M

    2006-01-01

    Background Reliable and timely information on disease-specific treatment burdens within a health system is critical for the planning and monitoring of service provision. Health management information systems (HMIS) exist to address this need at national scales across Africa but are failing to deliver adequate data because of widespread underreporting by health facilities. Faced with this inadequacy, vital public health decisions often rely on crudely adjusted regional and national estimates of treatment burdens. Methods and Findings This study has taken the example of presumed malaria in outpatients within the largely incomplete Kenyan HMIS database and has defined a geostatistical modelling framework that can predict values for all data that are missing through space and time. The resulting complete set can then be used to define treatment burdens for presumed malaria at any level of spatial and temporal aggregation. Validation of the model has shown that these burdens are quantified to an acceptable level of accuracy at the district, provincial, and national scale. Conclusions The modelling framework presented here provides, to our knowledge for the first time, reliable information from imperfect HMIS data to support evidence-based decision-making at national and sub-national levels. PMID:16719557

  19. Total Force Fitness in units part 1: military demand-resource model.

    PubMed

    Bates, Mark J; Fallesen, Jon J; Huey, Wesley S; Packard, Gary A; Ryan, Diane M; Burke, C Shawn; Smith, David G; Watola, Daniel J; Pinder, Evette D; Yosick, Todd M; Estrada, Armando X; Crepeau, Loring; Bowles, Stephen V

    2013-11-01

The military unit is a critical center of gravity in the military's efforts to enhance resilience and the health of the force. The purpose of this article is to augment the military's Total Force Fitness (TFF) guidance with a framework of TFF in units. The framework is based on a Military Demand-Resource model that highlights the dynamic interactions across demands, resources, and outcomes. A joint team of subject-matter experts identified key variables representing unit fitness demands, resources, and outcomes. The resulting framework informs and supports leaders, support agencies, and enterprise efforts to strengthen TFF in units by (1) identifying TFF unit variables aligned with current evidence and operational practices, (2) standardizing communication about TFF in units across the Department of Defense enterprise in a variety of military organizational contexts, (3) improving current resources, including evidence-based actions for leaders, (4) identifying and addressing gaps, and (5) directing future research for enhancing TFF in units. These goals are intended to inform and enhance Service efforts to develop Service-specific TFF models, as well as provide the conceptual foundation for a follow-on article about TFF metrics for units. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  20. Generalized multiple kernel learning with data-dependent priors.

    PubMed

    Mao, Qi; Tsang, Ivor W; Gao, Shenghua; Wang, Li

    2015-06-01

    Multiple kernel learning (MKL) and classifier ensemble are two mainstream methods for solving learning problems in which some sets of features/views are more informative than others, or the features/views within a given set are inconsistent. In this paper, we first present a novel probabilistic interpretation of MKL such that maximum entropy discrimination with a noninformative prior over multiple views is equivalent to the formulation of MKL. Instead of using the noninformative prior, we introduce a novel data-dependent prior based on an ensemble of kernel predictors, which enhances the prediction performance of MKL by leveraging the merits of the classifier ensemble. With the proposed probabilistic framework of MKL, we propose a hierarchical Bayesian model to learn the proposed data-dependent prior and classification model simultaneously. The resultant problem is convex and other information (e.g., instances with either missing views or missing labels) can be seamlessly incorporated into the data-dependent priors. Furthermore, a variety of existing MKL models can be recovered under the proposed MKL framework and can be readily extended to incorporate these priors. Extensive experiments demonstrate the benefits of our proposed framework in supervised and semisupervised settings, as well as in tasks with partial correspondence among multiple views.
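A much-simplified data-dependent weighting of multiple kernels can be sketched via kernel-target alignment feeding a kernel ridge classifier; this is a stand-in illustration of data-dependent kernel combination, not the paper's maximum entropy discrimination formulation, and all widths and sizes are assumed:

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(60, 4))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])     # synthetic binary labels

def rbf(X, gamma):
    d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

kernels = [rbf(X, g) for g in (0.1, 1.0, 10.0)]   # candidate "views"
yy = np.outer(y, y)                                # ideal label kernel

# data-dependent kernel weights from alignment with the label kernel;
# each alignment is nonnegative because the base kernels are PSD
align = np.array([(K * yy).sum() / (np.linalg.norm(K) * np.linalg.norm(yy))
                  for K in kernels])
w = align / align.sum()
K = sum(wi * Ki for wi, Ki in zip(w, kernels))

# kernel ridge classifier trained with the combined kernel
alpha = np.linalg.solve(K + 1e-2 * np.eye(len(y)), y)
train_acc = np.mean(np.sign(K @ alpha) == y)
```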

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F

In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. Particularly, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.

  2. Extending information retrieval methods to personalized genomic-based studies of disease.

    PubMed

    Ye, Shuyun; Dawson, John A; Kendziorski, Christina

    2014-01-01

    Genomic-based studies of disease now involve diverse types of data collected on large groups of patients. A major challenge facing statistical scientists is how best to combine the data, extract important features, and comprehensively characterize the ways in which they affect an individual's disease course and likelihood of response to treatment. We have developed a survival-supervised latent Dirichlet allocation (survLDA) modeling framework to address these challenges. Latent Dirichlet allocation (LDA) models have proven extremely effective at identifying themes common across large collections of text, but applications to genomics have been limited. Our framework extends LDA to the genome by considering each patient as a "document" with "text" detailing his/her clinical events and genomic state. We then further extend the framework to allow for supervision by a time-to-event response. The model enables the efficient identification of collections of clinical and genomic features that co-occur within patient subgroups, and then characterizes each patient by those features. An application of survLDA to The Cancer Genome Atlas ovarian project identifies informative patient subgroups showing differential response to treatment, and validation in an independent cohort demonstrates the potential for patient-specific inference.
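The patients-as-documents analogy can be made concrete with a tiny collapsed Gibbs sampler for plain (unsupervised) LDA; the "patients" and event vocabulary are toy data, and survLDA's survival supervision is omitted:

```python
import numpy as np

rng = np.random.default_rng(3)

# four "patients", each a bag of clinical/genomic event codes (toy data)
docs = [[0, 1, 1, 2], [0, 0, 1, 3], [4, 5, 5, 6], [4, 6, 6, 5]]
V, K, alpha, beta = 7, 2, 0.5, 0.5   # vocab size, topics, Dirichlet priors

# random initial topic assignments and the count tables Gibbs sampling needs
z = [[int(rng.integers(K)) for _ in d] for d in docs]
ndk = np.zeros((len(docs), K)); nkw = np.zeros((K, V)); nk = np.zeros(K)
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        ndk[d, z[d][i]] += 1; nkw[z[d][i], w] += 1; nk[z[d][i]] += 1

for _ in range(200):                  # collapsed Gibbs sweeps
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]               # remove the current assignment
            ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
            p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
            k = int(rng.choice(K, p=p / p.sum()))
            z[d][i] = k               # resample a topic for this event
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

# per-patient topic mixtures, characterizing each patient by its themes
theta = (ndk + alpha) / (ndk + alpha).sum(axis=1, keepdims=True)
```

In the supervised extension, the sampler would additionally condition each patient's topic mixture on a time-to-event likelihood, which is the part this sketch leaves out.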

  3. Communication Privacy Disclosure Management: An Empirical Study of Socialization Support in a Pseudo-Online Course

    ERIC Educational Resources Information Center

    Heo, Misook

    2011-01-01

    This study investigated the boundaries of online learners' information disclosure, relationship building, interpersonal integration, and motivation by drawing upon the theoretical frameworks of the social information processing and communication privacy management theories and the hyperpersonal model. A total of 103 students from a higher…

  4. Aligning business and information technology domains: strategic planning in hospitals.

    PubMed

    Henderson, J C; Thomas, J B

    1992-01-01

    This article develops a framework for strategic information technology (IT) management in hospitals, termed the Strategic Alignment Model. This model is defined in terms of four domains--business strategy, IT strategy, organizational infrastructure, and IT infrastructure--each with its constituent components. The concept of strategic alignment is developed using two fundamental dimensions--strategic fit and integration. Different perspectives that hospitals use for aligning the various domains are discussed, and a prescriptive model of strategic IT planning is proposed.

  5. A conceptual evolutionary aseismic decision support framework for hospitals

    NASA Astrophysics Data System (ADS)

    Hu, Yufeng; Dargush, Gary F.; Shao, Xiaoyun

    2012-12-01

In this paper, a conceptual evolutionary framework for aseismic decision support for hospitals that attempts to integrate a range of engineering and sociotechnical models is presented. Genetic algorithms are applied to find the optimal decision sets. A case study is completed to demonstrate how the framework may apply to a specific hospital. The simulations show that the proposed evolutionary decision support framework is able to discover robust policy sets in either uncertain or fixed environments. The framework also qualitatively identifies some of the characteristic behavior of the critical care organization. Thus, by utilizing the proposed framework, the decision makers are able to make more informed decisions, especially to enhance the seismic safety of the hospitals.

  6. Bayesian Modeling of Exposure and Airflow Using Two-Zone Models

    PubMed Central

    Zhang, Yufen; Banerjee, Sudipto; Yang, Rui; Lungu, Claudiu; Ramachandran, Gurumurthy

    2009-01-01

    Mathematical modeling is being increasingly used as a means for assessing occupational exposures. However, predicting exposure in real settings is constrained by lack of quantitative knowledge of exposure determinants. Validation of models in occupational settings is, therefore, a challenge. Not only do the model parameters need to be known, the models also need to predict the output with some degree of accuracy. In this paper, a Bayesian statistical framework is used for estimating model parameters and exposure concentrations for a two-zone model. The model predicts concentrations in a zone near the source and far away from the source as functions of the toluene generation rate, air ventilation rate through the chamber, and the airflow between near and far fields. The framework combines prior or expert information on the physical model along with the observed data. The framework is applied to simulated data as well as data obtained from the experiments conducted in a chamber. Toluene vapors are generated from a source under different conditions of airflow direction, the presence of a mannequin, and simulated body heat of the mannequin. The Bayesian framework accounts for uncertainty in measurement as well as in the unknown rate of airflow between the near and far fields. The results show that estimates of the interzonal airflow are always close to the estimated equilibrium solutions, which implies that the method works efficiently. The predictions of near-field concentration for both the simulated and real data show nice concordance with the true values, indicating that the two-zone model assumptions agree with the reality to a large extent and the model is suitable for predicting the contaminant concentration. Comparison of the estimated model and its margin of error with the experimental data thus enables validation of the physical model assumptions. 
The approach illustrates how exposure models and information on model parameters, together with knowledge of the uncertainty and variability in these quantities, can be used to provide better estimates not only of model outputs but also of model parameters. PMID:19403840
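    The two-zone model in this record can be sketched concretely. The snippet below assumes the standard near-field/far-field mass-balance formulation with generation rate G, room ventilation Q, and interzonal airflow beta; all numerical values are illustrative, not the chamber conditions of the study. The equilibrium solutions the authors compare against are C_far = G/Q and C_near = G/Q + G/beta.

```python
def two_zone_steady_state(G, Q, beta):
    """Equilibrium concentrations of the standard two-zone model.

    G: contaminant generation rate (mg/min)
    Q: ventilation rate through the room (m^3/min)
    beta: interzonal airflow between near and far fields (m^3/min)
    """
    C_far = G / Q              # far field: generation balanced by ventilation
    C_near = C_far + G / beta  # near field: far field plus interzonal term
    return C_near, C_far

def simulate_two_zone(G, Q, beta, V_N, V_F, t_end=120.0, dt=0.01):
    """Forward-Euler integration of the coupled mass balances,
    starting from clean air (C = 0 in both zones)."""
    C_N = C_F = 0.0
    for _ in range(int(t_end / dt)):
        dC_N = (G + beta * (C_F - C_N)) / V_N
        dC_F = (beta * (C_N - C_F) - Q * C_F) / V_F
        C_N += dC_N * dt
        C_F += dC_F * dt
    return C_N, C_F

# Illustrative parameters: the simulation approaches the equilibrium solution
print(two_zone_steady_state(100.0, 2.0, 5.0))  # (70.0, 50.0)
print(simulate_two_zone(100.0, 2.0, 5.0, 1.0, 24.0))
```

In the Bayesian treatment the record describes, beta and the measurement error would be given priors and updated from observed concentrations rather than fixed as here.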

  7. A modeling framework for the establishment and spread of invasive species in heterogeneous environments.

    PubMed

    Lustig, Audrey; Worner, Susan P; Pitt, Joel P W; Doscher, Crile; Stouffer, Daniel B; Senay, Senait D

    2017-10-01

    Natural and human-induced events are continuously altering the structure of our landscapes and as a result impacting the spatial relationships between individual landscape elements and the species living in the area. Yet, only recently has the influence of the surrounding landscape on invasive species spread started to be considered. The scientific community increasingly recognizes the need for a broader modeling framework that focuses on cross-study comparisons at different spatiotemporal scales. Using two illustrative examples, we introduce a general modeling framework that allows for a systematic investigation of the effect of habitat change on invasive species establishment and spread. The essential parts of the framework are (i) a mechanistic spatially explicit model (a modular dispersal framework, MDIG) that allows population dynamics and dispersal to be modeled in a geographical information system (GIS), (ii) a landscape generator that allows replicated landscape patterns with partially controllable spatial properties to be generated, and (iii) landscape metrics that depict the essential aspects of landscape with which dispersal and demographic processes interact. The modeling framework provides functionality for a wide variety of applications ranging from predictions of the spatiotemporal spread of real species and comparison of potential management strategies, to theoretical investigation of the effect of habitat change on population dynamics. Such a framework makes it possible to quantify how small-grain landscape characteristics, such as habitat size and habitat connectivity, interact with life-history traits to determine the dynamics of invasive species spread in fragmented landscapes. As such, it will give deeper insights into species traits and landscape features that lead to establishment and spread success and may be key to preventing new incursions and to developing efficient monitoring, surveillance, control or eradication programs.
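    The landscape-generator and landscape-metric components (ii) and (iii) can be sketched in a few lines. The snippet below is a generic illustration, not the framework of this record: it builds the simplest neutral landscape model (each cell independently habitat with probability p) and computes one common metric, the largest-patch fraction, via flood fill.

```python
import random
from collections import deque

rng = random.Random(3)

def random_landscape(n, p):
    """Binary habitat map: each cell is habitat with probability p
    (the simplest neutral landscape model)."""
    return [[1 if rng.random() < p else 0 for _ in range(n)] for _ in range(n)]

def largest_patch_index(grid):
    """Landscape metric: area of the largest 4-connected habitat patch
    as a fraction of total landscape area."""
    n = len(grid)
    seen = [[False] * n for _ in range(n)]
    best = 0
    for i in range(n):
        for j in range(n):
            if grid[i][j] and not seen[i][j]:
                size, q = 0, deque([(i, j)])
                seen[i][j] = True
                while q:                      # breadth-first flood fill
                    x, y = q.popleft()
                    size += 1
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        a, b = x + dx, y + dy
                        if 0 <= a < n and 0 <= b < n and grid[a][b] and not seen[a][b]:
                            seen[a][b] = True
                            q.append((a, b))
                best = max(best, size)
    return best / (n * n)

# Habitat connectivity rises sharply with habitat amount (percolation effect)
for p in (0.3, 0.5, 0.7):
    print(p, round(largest_patch_index(random_landscape(50, p)), 3))
```

Replicated maps of this kind, fed into a dispersal model, are what allow the systematic habitat-change experiments the record describes.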

  8. Nonlinear finite element model updating for damage identification of civil structures using batch Bayesian estimation

    NASA Astrophysics Data System (ADS)

    Ebrahimian, Hamed; Astroza, Rodrigo; Conte, Joel P.; de Callafon, Raymond A.

    2017-02-01

    This paper presents a framework for structural health monitoring (SHM) and damage identification of civil structures. This framework integrates advanced mechanics-based nonlinear finite element (FE) modeling and analysis techniques with a batch Bayesian estimation approach to estimate time-invariant model parameters used in the FE model of the structure of interest. The framework uses input excitation and dynamic response of the structure and updates a nonlinear FE model of the structure to minimize the discrepancies between predicted and measured response time histories. The updated FE model can then be interrogated to detect, localize, classify, and quantify the state of damage and predict the remaining useful life of the structure. As opposed to recursive estimation methods, in the batch Bayesian estimation approach, the entire time history of the input excitation and output response of the structure are used as a batch of data to estimate the FE model parameters through a number of iterations. In the case of a non-informative prior, the batch Bayesian method leads to an extended maximum likelihood (ML) estimation method that jointly estimates the time-invariant model parameters and the measurement noise amplitude. The extended ML estimation problem is solved efficiently using a gradient-based interior-point optimization algorithm. Gradient-based optimization algorithms require the FE response sensitivities with respect to the model parameters to be identified. The FE response sensitivities are computed accurately and efficiently using the direct differentiation method (DDM). The estimation uncertainties are evaluated based on the Cramér-Rao lower bound (CRLB) theorem by computing the exact Fisher Information matrix using the FE response sensitivities with respect to the model parameters. The accuracy of the proposed uncertainty quantification approach is verified using a sampling approach based on the unscented transformation. 
Two validation studies, based on realistic structural FE models of a bridge pier and a moment resisting steel frame, are performed to validate the performance and accuracy of the presented nonlinear FE model updating approach and demonstrate its application to SHM. These validation studies show the excellent performance of the proposed framework for SHM and damage identification even in the presence of high measurement noise and/or poor initial estimates of the model parameters. Furthermore, the detrimental effects of the input measurement noise on the performance of the proposed framework are illustrated and quantified through one of the validation studies.
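    The batch estimation idea in this record can be illustrated on a toy problem. The sketch below replaces the nonlinear FE model with a two-parameter damped-exponential response and the gradient-based interior-point solver (with DDM sensitivities) with a simple derivative-free pattern search; it shows only the structure of the approach (whole response history in, ML parameter estimates out), not the paper's actual algorithm. All names and values are illustrative.

```python
import math
import random

def response(theta, n=50, dt=0.1):
    """Toy stand-in for the FE model: response history predicted
    from parameters theta = (amplitude, decay rate)."""
    a, b = theta
    return [a * math.exp(-b * k * dt) for k in range(n)]

def sse(theta, measured):
    """Batch discrepancy between measured and predicted histories
    (for Gaussian noise, minimizing this is ML estimation)."""
    return sum((m - p) ** 2 for m, p in zip(measured, response(theta)))

def estimate(measured, theta0, step=0.5, iters=60):
    """Derivative-free pattern search over the parameter vector
    (stand-in for the paper's gradient-based optimizer)."""
    theta, best, s = list(theta0), sse(theta0, measured), step
    for _ in range(iters):
        improved = False
        for i in range(len(theta)):
            for d in (s, -s):
                cand = theta[:]
                cand[i] += d
                f = sse(cand, measured)
                if f < best:
                    theta, best, improved = cand, f, True
        if not improved:
            s *= 0.5  # refine the search once no move helps
    return theta

rng = random.Random(0)
true_theta = (2.0, 0.5)
measured = [y + rng.gauss(0.0, 0.05) for y in response(true_theta)]
print(estimate(measured, [1.0, 1.0]))  # close to [2.0, 0.5]
```

In the paper's setting the measurement-noise amplitude is estimated jointly with the model parameters, and the Fisher information of this same likelihood yields the Cramér-Rao uncertainty bounds.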

  9. Scientific Digital Libraries, Interoperability, and Ontologies

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.

    2009-01-01

    Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.

  10. A New Mathematical Framework for Design Under Uncertainty

    DTIC Science & Technology

    2016-05-05

    blending multiple information sources via auto-regressive stochastic modeling. A computationally efficient machine learning framework is developed based on...sion and machine learning approaches; see Fig. 1. This will lead to a comprehensive description of system performance with less uncertainty than in the...Bayesian optimization of super-cavitating hy- drofoils The goal of this study is to demonstrate the capabilities of statistical learning and

  11. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine their effect on the sizing of the integrated vehicle. The development of such a framework builds upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  12. A methodology proposal for collaborative business process elaboration using a model-driven approach

    NASA Astrophysics Data System (ADS)

    Mu, Wenxin; Bénaben, Frédérick; Pingaud, Hervé

    2015-05-01

    Business process management (BPM) principles are commonly used to improve processes within an organisation. But they can equally be applied to supporting the design of an Information System (IS). In a collaborative situation involving several partners, this type of BPM approach may be useful to support the design of a Mediation Information System (MIS), which would ensure interoperability between the partners' ISs (which are assumed to be service oriented). To achieve this objective, the first main task is to build a collaborative business process cartography. The aim of this article is to present a method for bringing together collaborative information and elaborating collaborative business processes from the information gathered (by using a collaborative situation framework, an organisational model, an informational model, a functional model and a metamodel and by using model transformation rules).

  13. Trauma-Informed Social Policy: A Conceptual Framework for Policy Analysis and Advocacy

    PubMed Central

    Murshid, Nadine Shaanta

    2016-01-01

    Trauma-informed care is a service provision model used across a range of practice settings. Drawing on an extensive body of research on trauma (broadly defined as experiences that produce enduring emotional pain and distress) and health outcomes, we have argued that the principles of trauma-informed care can be extended to social policy. Citing a variety of health-related policy examples, we have described how policy can better reflect 6 core principles of trauma-informed care: safety, trustworthiness and transparency, collaboration, empowerment, choice, and intersectionality. This framework conveys a politicized understanding of trauma, reflecting the reality that trauma and its effects are not equally distributed, and offers a pathway for public health professionals to disrupt trauma-driven health disparities through policy action. PMID:26691122

  14. Words in Puddles of Sound: Modelling Psycholinguistic Effects in Speech Segmentation

    ERIC Educational Resources Information Center

    Monaghan, Padraic; Christiansen, Morten H.

    2010-01-01

    There are numerous models of how speech segmentation may proceed in infants acquiring their first language. We present a framework for considering the relative merits and limitations of these various approaches. We then present a model of speech segmentation that aims to reveal important sources of information for speech segmentation, and to…

  15. Getting a Picture that Is Both Accurate and Stable: Situation Models and Epistemic Validation

    ERIC Educational Resources Information Center

    Schroeder, Sascha; Richter, Tobias; Hoever, Inga

    2008-01-01

    Text comprehension entails the construction of a situation model that prepares individuals for situated action. In order to meet this function, situation model representations are required to be both accurate and stable. We propose a framework according to which comprehenders rely on epistemic validation to prevent inaccurate information from…

  16. Assessing Two Theoretical Frameworks of Civic Engagement

    ERIC Educational Resources Information Center

    García-Cabrero, Benilde; Pérez-Martínez, María Guadalupe; Sandoval-Hernández, Andrés; Caso-Niebla, Joaquín; Díaz-López, Carlos David

    2016-01-01

    The purpose of this study was to empirically test two major theoretical models: a modified version of the social capital model (Pattie, Seyd and Whiteley, 2003), and the Informed Social Engagement Model (Barr and Selman, 2014; Selman and Kwok, 2010), to explain civic participation and civic knowledge of adolescents from Chile, Colombia and Mexico,…

  17. Video quality assessment using a statistical model of human visual speed perception.

    PubMed

    Wang, Zhou; Li, Qiang

    2007-12-01

    Motion is one of the most important types of information contained in natural video, but direct use of motion information in the design of video quality assessment algorithms has not been deeply investigated. Here we propose to incorporate a recent model of human visual speed perception [Nat. Neurosci. 9, 578 (2006)] and model visual perception in an information communication framework. This allows us to estimate both the motion information content and the perceptual uncertainty in video signals. Improved video quality assessment algorithms are obtained by incorporating the model as spatiotemporal weighting factors, where the weight increases with the information content and decreases with the perceptual uncertainty. Consistent improvement over existing video quality assessment algorithms is observed in our validation with the Video Quality Experts Group (VQEG) Phase I test data set.

  18. Habitat Suitability Index Models: Red king crab

    USGS Publications Warehouse

    Jewett, Stephen C.; Onuf, Christopher P.

    1988-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for evaluating habitat of different life stages of red king crab (Paralithodes camtschatica). The model consolidates habitat use information into a framework appropriate for field application and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat) for Alaskan coastal waters, especially the Gulf of Alaska and the southeastern Bering Sea. HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.
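    As a sketch of how an HSI model of this kind combines information, the snippet below aggregates component suitability indices with their geometric mean, a common rule in the HSI literature because any unsuitable component (SI = 0) drives the overall index to 0. The component variables are illustrative placeholders, not the actual red king crab model variables.

```python
def hsi(suitability_indices):
    """Combine component suitability indices (each in [0, 1]) into a
    single Habitat Suitability Index via the geometric mean: one
    unsuitable component (SI = 0) makes the habitat unsuitable (HSI = 0)."""
    if not suitability_indices:
        raise ValueError("need at least one component index")
    if any(not 0.0 <= s <= 1.0 for s in suitability_indices):
        raise ValueError("component indices must lie in [0, 1]")
    product = 1.0
    for s in suitability_indices:
        product *= s
    return product ** (1.0 / len(suitability_indices))

# Illustrative component indices (e.g. substrate, depth, temperature)
print(hsi([0.8, 0.9, 0.5]))  # ~0.71
print(hsi([0.8, 0.9, 0.0]))  # 0.0 -- one unsuitable factor
```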

  19. LEARNING SEMANTICS-ENHANCED LANGUAGE MODELS APPLIED TO UNSUPERVISED WSD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VERSPOOR, KARIN; LIN, SHOU-DE

    An N-gram language model aims at capturing statistical syntactic word order information from corpora. Although the concept of language models has been applied extensively to handle a variety of NLP problems with reasonable success, the standard model does not incorporate semantic information, and consequently limits its applicability to semantic problems such as word sense disambiguation. We propose a framework that integrates semantic information into the language model schema, allowing a system to exploit both syntactic and semantic information to address NLP problems. Furthermore, acknowledging the limited availability of semantically annotated data, we discuss how the proposed model can be learned without annotated training examples. Finally, we report on a case study showing how the semantics-enhanced language model can be applied to unsupervised word sense disambiguation with promising results.
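    A minimal version of the baseline this record starts from, an N-gram model, fits in a few lines. The bigram model below uses add-one smoothing; the toy corpus, and the idea of training on sense-tagged tokens to fold in semantics, are illustrative only and are not the authors' actual integration scheme.

```python
from collections import Counter

def train_bigram(corpus):
    """Collect unigram and bigram counts over tokenized sentences,
    with sentence-boundary markers."""
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]
        unigrams.update(toks)
        bigrams.update(zip(toks, toks[1:]))
    return unigrams, bigrams

def bigram_prob(w_prev, w, unigrams, bigrams):
    """P(w | w_prev) with add-one (Laplace) smoothing."""
    vocab_size = len(unigrams)
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + vocab_size)

# Tiny toy corpus; a semantics-enhanced variant could instead train on
# sense-tagged tokens such as "bank%finance" rather than plain "bank".
corpus = [["the", "bank", "approved", "the", "loan"],
          ["she", "sat", "on", "the", "river", "bank"]]
uni, bi = train_bigram(corpus)
print(bigram_prob("the", "bank", uni, bi))  # 2/13, about 0.154
```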

  20. Nature as a network of morphological infocomputational processes for cognitive agents

    NASA Astrophysics Data System (ADS)

    Dodig-Crnkovic, Gordana

    2017-01-01

    This paper presents a view of nature as a network of infocomputational agents organized in a dynamical hierarchy of levels. It provides a framework for unification of currently disparate understandings of natural, formal, technical, behavioral and social phenomena based on information as a structure, differences in one system that cause the differences in another system, and computation as its dynamics, i.e. physical process of morphological change in the informational structure. We address some of the frequent misunderstandings regarding the natural/morphological computational models and their relationships to physical systems, especially cognitive systems such as living beings. Natural morphological infocomputation as a conceptual framework necessitates generalization of models of computation beyond the traditional Turing machine model presenting symbol manipulation, and requires agent-based concurrent resource-sensitive models of computation in order to be able to cover the whole range of phenomena from physics to cognition. The central role of agency, particularly material vs. cognitive agency is highlighted.

  1. Evaluation of hazard and integrity monitor functions for integrated alerting and notification using a sensor simulation framework

    NASA Astrophysics Data System (ADS)

    Bezawada, Rajesh; Uijt de Haag, Maarten

    2010-04-01

    This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (like conflict detection and avoidance) and determines required performance of these information sources as part of that function; (ii) monitors or evaluates the required performance of the individual information sources and performs consistency checks among various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for the conflict probes or conflict prediction for various time horizons including the 10, 5, 3, and <3 minutes used in our scenario; (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to the Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling language developed by The MathWorks™. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.

  2. Frameworks for risk communication and disease management: the case of Lyme disease and countryside users

    PubMed Central

    Quine, Christopher P.; Barnett, Julie; Dobson, Andrew D. M.; Marcu, Afrodita; Marzano, Mariella; Moseley, Darren; O'Brien, Liz; Randolph, Sarah E.; Taylor, Jennifer L.; Uzzell, David

    2011-01-01

    Management of zoonotic disease is necessary if countryside users are to gain benefit rather than suffer harm from their activities, and to avoid disproportionate reaction to novel threats. We introduce a conceptual framework based on the pressure–state–response model with five broad responses to disease incidence. Influencing public behaviour is one response and requires risk communication based on an integration of knowledge about the disease with an understanding of how publics respond to precautionary advice. A second framework emphasizes how risk communication involves more than information provision and should address dimensions including points-of-intervention over time, place and audience. The frameworks are developed by reference to tick-borne Lyme borreliosis (also known as Lyme disease), for which informed precautionary behaviour is particularly relevant. Interventions to influence behaviour can be directed by knowledge of spatial and temporal variation of tick abundance, what constitutes risky behaviour, how people respond to information of varying content, and an understanding of the social practices related to countryside use. The frameworks clarify the response options and help identify who is responsible for risk communication. These aspects are not consistently understood, and may result in an underestimation of the role of land-based organizations in facilitating appropriate precautionary behaviour. PMID:21624921

  3. Frameworks for risk communication and disease management: the case of Lyme disease and countryside users.

    PubMed

    Quine, Christopher P; Barnett, Julie; Dobson, Andrew D M; Marcu, Afrodita; Marzano, Mariella; Moseley, Darren; O'Brien, Liz; Randolph, Sarah E; Taylor, Jennifer L; Uzzell, David

    2011-07-12

    Management of zoonotic disease is necessary if countryside users are to gain benefit rather than suffer harm from their activities, and to avoid disproportionate reaction to novel threats. We introduce a conceptual framework based on the pressure-state-response model with five broad responses to disease incidence. Influencing public behaviour is one response and requires risk communication based on an integration of knowledge about the disease with an understanding of how publics respond to precautionary advice. A second framework emphasizes how risk communication involves more than information provision and should address dimensions including points-of-intervention over time, place and audience. The frameworks are developed by reference to tick-borne Lyme borreliosis (also known as Lyme disease), for which informed precautionary behaviour is particularly relevant. Interventions to influence behaviour can be directed by knowledge of spatial and temporal variation of tick abundance, what constitutes risky behaviour, how people respond to information of varying content, and an understanding of the social practices related to countryside use. The frameworks clarify the response options and help identify who is responsible for risk communication. These aspects are not consistently understood, and may result in an underestimation of the role of land-based organizations in facilitating appropriate precautionary behaviour.

  4. Goal setting and action planning in the rehabilitation setting: development of a theoretically informed practice framework.

    PubMed

    Scobbie, Lesley; Dixon, Diane; Wyke, Sally

    2011-05-01

    Setting and achieving goals is fundamental to rehabilitation practice but has been criticized as atheoretical, and the key components of replicable goal-setting interventions are not well established. The aim was to describe the development of a theory-based goal-setting practice framework for use in rehabilitation settings and to detail its component parts. Causal modelling was used to map theories of behaviour change onto the process of setting and achieving rehabilitation goals, and to suggest the mechanisms through which patient outcomes are likely to be affected. A multidisciplinary task group developed the causal model into a practice framework for use in rehabilitation settings through iterative discussion and implementation with six patients. Four components of a goal-setting and action-planning practice framework were identified: (i) goal negotiation, (ii) goal identification, (iii) planning, and (iv) appraisal and feedback. The variables hypothesized to effect change in patient outcomes were self-efficacy and action plan attainment. A theory-based goal-setting practice framework for use in rehabilitation settings is described. The framework requires further development and systematic evaluation in a range of rehabilitation settings.

  5. The ISACA Business Model for Information Security: An Integrative and Innovative Approach

    NASA Astrophysics Data System (ADS)

    von Roessing, Rolf

    In recent years, information security management has matured into a professional discipline that covers both technical and managerial aspects in an organisational environment. Information security is increasingly dependent on business-driven parameters and interfaces to a variety of organisational units and departments. In contrast, common security models and frameworks have remained largely technical. A review of extant models ranging from [LaBe73] to more recent models shows that technical aspects are covered in great detail, while the managerial aspects of security are often neglected. Likewise, the business view on organisational security is frequently at odds with the demands of information security personnel or information technology management. In practice, senior and executive level management remain comparatively distant from technical requirements. As a result, information security is generally regarded as a cost factor rather than a benefit to the organisation.

  6. A Conceptual Model of the Information Requirements of Nursing Organizations

    PubMed Central

    Miller, Emmy

    1989-01-01

    Three related issues play a role in the identification of the information requirements of nursing organizations. These issues are the current state of computer systems in health care organizations, the lack of a well-defined data set for nursing, and the absence of models representing data and information relevant to clinical and administrative nursing practice. This paper will examine current methods of data collection, processing, and storage in clinical and administrative nursing practice for the purpose of identifying the information requirements of nursing organizations. To satisfy these information requirements, database technology can be used; however, a model for database design is needed that reflects the conceptual framework of nursing and the professional concerns of nurses. A conceptual model of the types of data necessary to produce the desired information will be presented and the relationships among data will be delineated.

  7. Modelling Biogeochemistry Across Domains with The Modular System for Shelves and Coasts (MOSSCO)

    NASA Astrophysics Data System (ADS)

    Burchard, H.; Lemmen, C.; Hofmeister, R.; Knut, K.; Nasermoaddeli, M. H.; Kerimoglu, O.; Koesters, F.; Wirtz, K.

    2016-02-01

    Coastal biogeochemical processes extend from the atmosphere through the water column and the epibenthos into the ocean floor; laterally they are determined by freshwater inflows and open water exchange, and in situ they are mediated by physical, chemical and biological interactions. We use the new Modular System for Shelves and Coasts (MOSSCO, http://www.mossco.de) to obtain an integrated view of coastal biogeochemistry. MOSSCO is a coupling framework that builds on existing coupling technologies like the Earth System Modeling Framework (ESMF, for domain coupling) and the Framework for Aquatic Biogeochemistry (FABM, for process coupling). MOSSCO facilitates the communication about and the integration of existing and new process models into a three-dimensional regional coastal modelling context. In the MOSSCO concept, the integrating framework imposes very few restrictions on contributed data or models; in fact, there is no distinction made between data and models. The few requirements are: (1) in-principle coupleability, i.e. access to I/O and timing information in submodels, which has recently been referred to as the Basic Model Interface (BMI); (2) open source/open data access and licensing; and (3) communication of metadata, such as spatiotemporal information, naming conventions, and physical units. These requirements suffice to integrate different models and data sets into the MOSSCO infrastructure and subsequently build a modular integrated modeling tool that can span a diversity of processes and domains. Here, we demonstrate a MOSSCO application for the southern North Sea, where atmospheric deposition, biochemical processing in the water column and the ocean floor, lateral nutrient replenishment, and wave- and current-dependent remobilization from sediments are accounted for by modular components. 
A multi-annual simulation yields realistic succession of the spatial gradients of dissolved nutrients, of chlorophyll variability and gross primary production rates and of benthic denitrification rates for this intriguing coastal system.
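    The coupling contract the record describes can be sketched minimally: a framework needs access only to each submodel's timing and named I/O, not its internals. The toy components below paraphrase a BMI-style interface (the real BMI standardizes a larger function set); the nutrient/decay models and all values are invented for illustration.

```python
class Decay:
    """Toy water-column component: first-order nutrient decay."""
    def initialize(self, config):
        self.t, self.dt = 0.0, config["dt"]
        self.state = {"nutrient": config["nutrient0"]}
    def update(self):                       # advance one time step
        self.state["nutrient"] *= (1.0 - 0.1 * self.dt)
        self.t += self.dt
    def get_current_time(self):
        return self.t
    def get_value(self, name):
        return self.state[name]
    def set_value(self, name, value):
        self.state[name] = value

class Source:
    """Toy benthic component: releases nutrient at a constant rate."""
    def initialize(self, config):
        self.t, self.dt = 0.0, config["dt"]
        self.flux = config["flux"]
    def update(self):
        self.t += self.dt
    def get_value(self, name):
        return self.flux * self.dt

# Driver: the coupler talks only through the interface above
water, bed = Decay(), Source()
water.initialize({"dt": 1.0, "nutrient0": 10.0})
bed.initialize({"dt": 1.0, "flux": 0.5})
for _ in range(10):
    bed.update()
    water.set_value("nutrient",
                    water.get_value("nutrient") + bed.get_value("flux"))
    water.update()
print(round(water.get_value("nutrient"), 3))
```

Because the driver never touches component internals, either toy model could be swapped for a full process model that exposes the same calls, which is the modularity argument the record makes.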

  8. Model-Based Fatigue Prognosis of Fiber-Reinforced Laminates Exhibiting Concurrent Damage Mechanisms

    NASA Technical Reports Server (NTRS)

    Corbetta, M.; Sbarufatti, C.; Saxena, A.; Giglio, M.; Goebel, K.

    2016-01-01

    Prognostics of large composite structures is a topic of increasing interest in the field of structural health monitoring for aerospace, civil, and mechanical systems. Along with recent advancements in real-time structural health data acquisition and processing for damage detection and characterization, model-based stochastic methods for life prediction are showing promising results in the literature. Among various model-based approaches, particle-filtering algorithms are particularly capable in coping with uncertainties associated with the process. These include uncertainties about information on the damage extent and the inherent uncertainties of the damage propagation process. Some efforts have shown successful applications of particle filtering-based frameworks for predicting the matrix crack evolution and structural stiffness degradation caused by repetitive fatigue loads. Effects of other damage modes such as delamination, however, are not incorporated in these works. It is well established that delamination and matrix cracks not only co-exist in most laminate structures during the fatigue degradation process but also affect each other's progression. Furthermore, delamination significantly alters the stress-state in the laminates and accelerates the material degradation leading to catastrophic failure. Therefore, the work presented herein proposes a particle filtering-based framework for predicting a structure's remaining useful life with consideration of multiple co-existing damage-mechanisms. The framework uses an energy-based model from the composite modeling literature. The multiple damage-mode model has been shown to suitably estimate the energy release rate of cross-ply laminates as affected by matrix cracks and delamination modes. The model is also able to estimate the reduction in stiffness of the damaged laminate. This information is then used in the algorithms for life prediction capabilities. 
First, a brief summary of the energy-based damage model is provided. Then, the paper describes how the model is embedded within the prognostic framework and how the prognostic performance is assessed using observations from run-to-failure experiments.
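    The particle-filtering machinery this record relies on can be sketched independently of the composite damage model. Below, a hypothetical multiplicative growth law stands in for the energy-based matrix-crack/delamination model, and a sequential importance resampling (SIR) filter tracks the damage state from noisy measurements; all parameter values are illustrative.

```python
import math
import random

rng = random.Random(42)

def propagate(x, growth=0.05, noise=0.02):
    """Hypothetical damage-growth step (stand-in for the paper's
    energy-based model): multiplicative growth with process noise."""
    return x * (1.0 + growth + rng.gauss(0.0, noise))

def likelihood(z, x, sigma=0.05):
    """Gaussian likelihood of measurement z given damage state x."""
    return math.exp(-0.5 * ((z - x) / sigma) ** 2)

def sir_step(particles, z):
    """One SIR cycle: propagate, weight by the measurement, resample."""
    particles = [propagate(x) for x in particles]
    weights = [likelihood(z, x) for x in particles]
    total = sum(weights) or 1.0
    cum, acc = [], 0.0
    for w in weights:
        acc += w / total
        cum.append(acc)
    n = len(particles)
    out, i = [], 0
    for j in range(n):  # stratified resampling
        u = (j + rng.random()) / n
        while i < n - 1 and cum[i] < u:
            i += 1
        out.append(particles[i])
    return out

# Track a synthetic damage trajectory from noisy observations
particles = [rng.uniform(0.05, 0.15) for _ in range(500)]
truth = 0.1
for _ in range(20):
    truth *= 1.05
    z = truth + rng.gauss(0.0, 0.03)
    particles = sir_step(particles, z)
print(sum(particles) / len(particles), truth)
```

For prognosis, the surviving particles would each be propagated forward without further measurements until a failure threshold is reached, yielding a remaining-useful-life distribution.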

  9. National-scale aboveground biomass geostatistical mapping with FIA inventory and GLAS data: Preparation for sparsely sampled lidar assisted forest inventory

    NASA Astrophysics Data System (ADS)

    Babcock, C. R.; Finley, A. O.; Andersen, H. E.; Moskal, L. M.; Morton, D. C.; Cook, B.; Nelson, R.

    2017-12-01

    Upcoming satellite lidar missions, such as GEDI and IceSat-2, are designed to collect laser altimetry data from space for narrow bands along orbital tracts. As a result lidar metric sets derived from these sources will not be of complete spatial coverage. This lack of complete coverage, or sparsity, means traditional regression approaches that consider lidar metrics as explanatory variables (without error) cannot be used to generate wall-to-wall maps of forest inventory variables. We implement a coregionalization framework to jointly model sparsely sampled lidar information and point-referenced forest variable measurements to create wall-to-wall maps with full probabilistic uncertainty quantification of all inputs. We inform the model with USFS Forest Inventory and Analysis (FIA) in-situ forest measurements and GLAS lidar data to spatially predict aboveground forest biomass (AGB) across the contiguous US. We cast our model within a Bayesian hierarchical framework to better model complex space-varying correlation structures among the lidar metrics and FIA data, which yields improved prediction and uncertainty assessment. To circumvent computational difficulties that arise when fitting complex geostatistical models to massive datasets, we use a Nearest Neighbor Gaussian process (NNGP) prior. Results indicate that a coregionalization modeling approach to leveraging sampled lidar data to improve AGB estimation is effective. Further, fitting the coregionalization model within a Bayesian mode of inference allows for AGB quantification across scales ranging from individual pixel estimates of AGB density to total AGB for the continental US with uncertainty. The coregionalization framework examined here is directly applicable to future spaceborne lidar acquisitions from GEDI and IceSat-2. 
Pairing these lidar sources with the extensive FIA forest monitoring plot network using a joint prediction framework, such as the coregionalization model explored here, offers the potential to improve forest AGB accounting certainty and provide maps for post-model fitting analysis of the spatial distribution of AGB.
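The coregionalization idea in this record can be illustrated with a toy linear model of coregionalization (LMC). This is a minimal numpy sketch with invented covariance parameters and a 1-D domain; it is not the authors' NNGP-based model, only the mechanism by which a lower-triangular mixing matrix induces cross-correlation between a lidar-metric surface and a biomass surface:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D "map" locations.
s = np.linspace(0.0, 10.0, 50)                 # prediction locations
d = np.abs(s[:, None] - s[None, :])            # pairwise distances

# Exponential covariances for two independent latent spatial processes.
K1 = np.exp(-d / 2.0)
K2 = np.exp(-d / 0.5)
jitter = 1e-8 * np.eye(len(s))
w1 = np.linalg.cholesky(K1 + jitter) @ rng.standard_normal(len(s))
w2 = np.linalg.cholesky(K2 + jitter) @ rng.standard_normal(len(s))

# Linear model of coregionalization: a lower-triangular A mixes the
# latent processes, inducing cross-correlation between the lidar
# metric surface (y_lidar) and the biomass surface (y_agb).
A = np.array([[1.0, 0.0],
              [0.7, 0.5]])
W = np.vstack([w1, w2])          # 2 x n latent matrix
Y = A @ W                        # 2 x n correlated outcomes
y_lidar, y_agb = Y[0], Y[1]

# The induced between-process covariance at a single location is A @ A.T.
cross_cov = A @ A.T
```

Because the two surfaces share the latent process `w1`, observing the lidar surface at sampled locations carries information about biomass everywhere, which is what makes wall-to-wall prediction from sparse lidar possible.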

  10. A new framework for modeling decisions about changing information: The Piecewise Linear Ballistic Accumulator model

    PubMed Central

    Heathcote, Andrew

    2016-01-01

    In the real world, decision making processes must be able to integrate non-stationary information that changes systematically while the decision is in progress. Although theories of decision making have traditionally been applied to paradigms with stationary information, non-stationary stimuli are now of increasing theoretical interest. We use a random-dot motion paradigm along with cognitive modeling to investigate how the decision process is updated when a stimulus changes. Participants viewed a cloud of moving dots, where the motion switched directions midway through some trials, and were asked to determine the direction of motion. Behavioral results revealed a strong delay effect: after presentation of the initial motion direction there is a substantial time delay before the changed motion information is integrated into the decision process. To further investigate the underlying changes in the decision process, we developed a Piecewise Linear Ballistic Accumulator model (PLBA). The PLBA is efficient to simulate, enabling it to be fit to participant choice and response-time distribution data in a hierarchical modeling framework using a non-parametric approximate Bayesian algorithm. Consistent with behavioral results, PLBA fits confirmed the presence of a long delay between presentation and integration of new stimulus information, but did not support increased response caution in reaction to the change. We also found the decision process was not veridical, as symmetric stimulus change had an asymmetric effect on the rate of evidence accumulation. Thus, the perceptual decision process was slow to react to, and underestimated, new contrary motion information. PMID:26760448
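The piecewise-linear accumulation the PLBA describes can be sketched in simulation. The parameterization below (threshold, start-point range, rates, switch time, integration delay) is illustrative and not the fitted model from the paper; it shows only the core mechanism: two accumulators race linearly to a threshold, and their slopes change some delay after the stimulus switches:

```python
import numpy as np

rng = np.random.default_rng(1)

def plba_trial(b=1.0, A=0.5, tc=0.3, delay=0.15,
               v_pre=(1.2, 0.4), v_post=(0.4, 1.2), s=0.3, t0=0.2):
    """One trial of a toy Piecewise Linear Ballistic Accumulator.

    Two accumulators race linearly to threshold b from uniform start
    points; rates drawn around v_pre change to draws around v_post at
    tc + delay (stimulus switch time plus integration lag).
    """
    k = rng.uniform(0.0, A, size=2)             # start points
    v1 = np.maximum(rng.normal(v_pre, s), 1e-6)  # pre-switch rates
    v2 = np.maximum(rng.normal(v_post, s), 1e-6) # post-switch rates
    switch = tc + delay
    t_hit = np.empty(2)
    for i in range(2):
        t_lin = (b - k[i]) / v1[i]              # crossing if slope never changed
        if t_lin <= switch:
            t_hit[i] = t_lin
        else:                                    # continue with post-switch slope
            x_at_switch = k[i] + v1[i] * switch
            t_hit[i] = switch + (b - x_at_switch) / v2[i]
    choice = int(np.argmin(t_hit))               # first accumulator to threshold
    return choice, t0 + t_hit[choice]            # t0 = non-decision time

choices, rts = zip(*(plba_trial() for _ in range(2000)))
```

Because each trial reduces to solving piecewise-linear crossing times rather than integrating noisy paths, the model is cheap to simulate, which is what makes likelihood-free (approximate Bayesian) fitting practical.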

  11. Rule-based graph theory to enable exploration of the space system architecture design space

    NASA Astrophysics Data System (ADS)

    Arney, Dale Curtis

    The primary goal of this research is to improve upon system architecture modeling in order to enable the exploration of design space options. A system architecture is the description of the functional and physical allocation of elements and the relationships, interactions, and interfaces between those elements necessary to satisfy a set of constraints and requirements. The functional allocation defines the functions that each system (element) performs, and the physical allocation defines the systems required to meet those functions. Trading the functionality between systems leads to the architecture-level design space that is available to the system architect. The research presents a methodology that enables the modeling of complex space system architectures using a mathematical framework. To accomplish the goal of improved architecture modeling, the framework meets five goals: technical credibility, adaptability, flexibility, intuitiveness, and exhaustiveness. The framework is technically credible, in that it produces an accurate and complete representation of the system architecture under consideration. The framework is adaptable, in that it provides the ability to create user-specified locations, steady states, and functions. The framework is flexible, in that it allows the user to model system architectures to multiple destinations without changing the underlying framework. The framework is intuitive for user input while still creating a comprehensive mathematical representation that maintains the necessary information to completely model complex system architectures. Finally, the framework is exhaustive, in that it provides the ability to explore the entire system architecture design space. After an extensive search of the literature, graph theory presents a valuable mechanism for representing the flow of information or vehicles within a simple mathematical framework. 
Graph theory has been used in developing mathematical models of many transportation and network flow problems in the past, where nodes represent physical locations and edges represent the means by which information or vehicles travel between those locations. In space system architecting, expressing the physical locations (low-Earth orbit, low-lunar orbit, etc.) and steady states (interplanetary trajectory) as nodes and the different means of moving between the nodes (propulsive maneuvers, etc.) as edges formulates a mathematical representation of this design space. The selection of a given system architecture using graph theory entails defining the paths that the systems take through the space system architecture graph. A path through the graph is defined as a list of edges that are traversed, which in turn defines functions performed by the system. A structure to compactly represent this information is a matrix, called the system map, in which the column indices are associated with the systems that exist and row indices are associated with the edges, or functions, to which each system has access. Several contributions have been added to the state of the art in space system architecture analysis. The framework adds the capability to rapidly explore the design space without the need to limit trade options or the need for user interaction during the exploration process. The unique mathematical representation of a system architecture, through the use of the adjacency, incidence, and system map matrices, enables automated design space exploration using stochastic optimization processes. The innovative rule-based graph traversal algorithm ensures functional feasibility of each system architecture that is analyzed, and the automatic generation of the system hierarchy eliminates the need for the user to manually determine the relationships between systems during or before the design space exploration process. 
Finally, the rapid evaluation of system architectures for various mission types enables analysis of the system architecture design space for multiple destinations within an evolutionary exploration program. (Abstract shortened by UMI.)
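The adjacency, incidence, and system map matrices described in this record can be sketched in toy form. Node and edge names below are invented for illustration and this is not the dissertation's actual framework; it only shows how a system map (rows = edges/functions, columns = systems) supports a feasibility check on a system's path through the graph:

```python
import numpy as np

# Hypothetical nodes (locations/steady states) and edges (functions).
nodes = ["LEO", "LLO", "LunarSurface"]
edges = [("LEO", "LLO"),            # trans-lunar transfer
         ("LLO", "LunarSurface"),   # descent
         ("LunarSurface", "LLO")]   # ascent

n, m = len(nodes), len(edges)
adjacency = np.zeros((n, n), dtype=int)
incidence = np.zeros((n, m), dtype=int)
for j, (u, v) in enumerate(edges):
    iu, iv = nodes.index(u), nodes.index(v)
    adjacency[iu, iv] = 1
    incidence[iu, j] = -1     # edge leaves u
    incidence[iv, j] = +1     # edge enters v

# System map: rows = edges (functions), columns = systems; a 1 means
# the system has access to (can perform) that function.
systems = ["TransferStage", "Lander"]
system_map = np.array([[1, 0],    # transfer stage performs LEO -> LLO
                       [0, 1],    # lander performs descent
                       [0, 1]])   # lander performs ascent

def path_feasible(edge_ids, system):
    """A system's path through the graph is feasible only if the
    system has access to every edge (function) on the path."""
    col = systems.index(system)
    return all(system_map[e, col] == 1 for e in edge_ids)
```

A rule-based traversal would enumerate candidate paths over the graph and discard any assignment where `path_feasible` fails, which is how functional feasibility of each architecture can be enforced automatically.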

  12. A framework for understanding cancer comparative effectiveness research data needs.

    PubMed

    Carpenter, William R; Meyer, Anne-Marie; Abernethy, Amy P; Stürmer, Til; Kosorok, Michael R

    2012-11-01

    Randomized controlled trials remain the gold standard for evaluating cancer intervention efficacy. Randomized trials are not always feasible, practical, or timely, and often do not adequately reflect patient heterogeneity and real-world clinical practice. Comparative effectiveness research can leverage secondary data to help fill knowledge gaps randomized trials leave unaddressed; however, comparative effectiveness research also faces shortcomings. The goal of this project was to develop a new model and inform an evolving framework articulating cancer comparative effectiveness research data needs. We examined prevalent models and conducted semi-structured discussions with 76 clinicians and comparative effectiveness researchers affiliated with the Agency for Healthcare Research and Quality's cancer comparative effectiveness research programs. A new model was iteratively developed and presents cancer comparative effectiveness research and important measures in a patient-centered, longitudinal chronic care model better reflecting contemporary cancer care in the context of the cancer care continuum, rather than a single-episode, acute-care perspective. Immediately relevant for federally funded comparative effectiveness research programs, the model informs an evolving framework articulating cancer comparative effectiveness research data needs, including evolutionary enhancements to registries and epidemiologic research data systems. We discuss elements of contemporary clinical practice, methodology improvements, and related needs affecting comparative effectiveness research's ability to yield findings clinicians, policy makers, and stakeholders can confidently act on. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. A framework for understanding cancer comparative effectiveness research data needs

    PubMed Central

    Carpenter, William R; Meyer, Anne-Marie; Abernethy, Amy P.; Stürmer, Til; Kosorok, Michael R.

    2012-01-01

    Objective Randomized controlled trials remain the gold standard for evaluating cancer intervention efficacy. Randomized trials are not always feasible, practical, or timely, and often do not adequately reflect patient heterogeneity and real-world clinical practice. Comparative effectiveness research can leverage secondary data to help fill knowledge gaps randomized trials leave unaddressed; however, comparative effectiveness research also faces shortcomings. The goal of this project was to develop a new model and inform an evolving framework articulating cancer comparative effectiveness research data needs. Study Design and Setting We examined prevalent models and conducted semi-structured discussions with 76 clinicians and comparative effectiveness researchers affiliated with the Agency for Healthcare Research and Quality’s cancer comparative effectiveness research programs. Results A new model was iteratively developed, and presents cancer comparative effectiveness research and important measures in a patient-centered, longitudinal chronic care model better reflecting contemporary cancer care in the context of the cancer care continuum, rather than a single-episode, acute-care perspective. Conclusion Immediately relevant for federally funded comparative effectiveness research programs, the model informs an evolving framework articulating cancer comparative effectiveness research data needs, including evolutionary enhancements to registries and epidemiologic research data systems. We discuss elements of contemporary clinical practice, methodology improvements, and related needs affecting comparative effectiveness research’s ability to yield findings clinicians, policymakers, and stakeholders can confidently act on. PMID:23017633

  14. The OME Framework for genome-scale systems biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palsson, Bernhard O.; Ebrahim, Ali; Federowicz, Steve

    The life sciences are undergoing continuous and accelerating integration with computational and engineering sciences. The biology that many in the field have been trained on may be hardly recognizable in ten to twenty years. One of the major drivers for this transformation is the blistering pace of advancements in DNA sequencing and synthesis. These advances have resulted in unprecedented amounts of new data, information, and knowledge. Many software tools have been developed to deal with aspects of this transformation and each is sorely needed [1-3]. However, few of these tools have been forced to deal with the full complexity of genome-scale models along with high-throughput genome-scale data. This particular situation represents a unique challenge, as it is simultaneously necessary to deal with the vast breadth of genome-scale models and the dizzying depth of high-throughput datasets. It has been observed time and again that as the pace of data generation continues to accelerate, the pace of analysis significantly lags behind [4]. It is also evident that, given the plethora of databases and software efforts [5-12], it is still a significant challenge to work with genome-scale metabolic models, let alone next-generation whole cell models [13-15]. We work at the forefront of model creation and systems-scale data generation [16-18]. The OME Framework was borne out of a practical need to enable genome-scale modeling and data analysis under a unified framework to drive the next generation of genome-scale biological models. Here we present the OME Framework. It exists as a set of Python classes. However, we want to emphasize the importance of the underlying design as an addition to the discussions on specifications of a digital cell. A great deal of work and valuable progress has been made by a number of communities [13, 19-24] towards interchange formats and implementations designed to achieve similar goals. 
While many software tools exist for handling genome-scale metabolic models or for genome-scale data analysis, no implementations exist that explicitly handle data and models concurrently. The OME Framework structures data in a connected loop with models and the components those models are composed of. This results in the first full, practical implementation of a framework that can enable genome-scale design-build-test. Over the coming years many more software packages will be developed and tools will necessarily change. However, we hope that the underlying designs shared here can help to inform the design of future software.

  15. Mathematical Modeling of Cellular Metabolism.

    PubMed

    Berndt, Nikolaus; Holzhütter, Hermann-Georg

    Cellular metabolism basically consists of the conversion of chemical compounds taken up from the extracellular environment into energy (conserved in energy-rich bonds of organic phosphates) and a wide array of organic molecules serving as catalysts (enzymes), information carriers (nucleic acids), and building blocks for cellular structures such as membranes or ribosomes. Metabolic modeling aims at the construction of mathematical representations of cellular metabolism that can be used to calculate the concentrations of cellular molecules and the rates of their mutual chemical interconversion in response to varying external conditions such as hormonal stimuli or the supply of essential nutrients. Based on such calculations, it is possible to quantify complex cellular functions such as cellular growth, detoxification of drugs and xenobiotic compounds, or synthesis of exported molecules. Depending on the specific questions about metabolism being addressed, the methodological expertise of the researcher, and the available experimental information, different conceptual frameworks have been established, allowing the usage of computational methods to condense experimental information from various layers of organization into (self-)consistent models. Here, we briefly outline the main conceptual frameworks that are currently exploited in metabolism research.
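The kind of kinetic model this record describes can be sketched for a toy two-enzyme pathway S -> I -> P with Michaelis-Menten rate laws. Parameter values are invented; a real model would be fit to measured enzyme data:

```python
def michaelis_menten(substrate, vmax, km):
    """Michaelis-Menten rate law for a single enzymatic step."""
    return vmax * substrate / (km + substrate)

def simulate_pathway(s0=5.0, dt=0.01, t_end=50.0,
                     vmax=(1.0, 0.8), km=(0.5, 0.3)):
    """Euler integration of a toy pathway S -> I -> P.

    Returns the final concentrations of substrate S, intermediate I,
    and product P; kinetic parameters are illustrative only.
    """
    s, i, p = s0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        v1 = michaelis_menten(s, vmax[0], km[0])   # S -> I flux
        v2 = michaelis_menten(i, vmax[1], km[1])   # I -> P flux
        s += dt * (-v1)
        i += dt * (v1 - v2)
        p += dt * v2
    return s, i, p

s, i, p = simulate_pathway()
```

Note that total mass s + i + p is conserved at every Euler step, a basic consistency check that larger kinetic models inherit from correctly balanced stoichiometry.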

  16. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    PubMed Central

    Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice

    2017-01-01

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24–25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. PMID:27994170

  17. Computational models of neuromodulation.

    PubMed

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulators on their computational and functional roles rather than on anatomical or chemical criteria. We review the main frameworks in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, and memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single-cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.
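One canonical idiom in such models is to treat a neuromodulator as a gain parameter on a neuron's input-output function. The sketch below is illustrative, not a specific model from the review: raising the gain steepens the sigmoidal f-I curve, one simple way neuromodulation is modeled as controlling a unit's computation:

```python
import numpy as np

def firing_rate(current, gain, theta=1.0):
    """Sigmoidal f-I curve; 'gain' stands in for a neuromodulatory
    effect on excitability, as in many firing-rate model studies."""
    return 1.0 / (1.0 + np.exp(-gain * (current - theta)))

currents = np.linspace(0.0, 2.0, 201)
low = firing_rate(currents, gain=1.0)    # weak modulation: shallow curve
high = firing_rate(currents, gain=8.0)   # strong modulation: near-threshold switch

# Higher gain sharpens the transition around threshold, switching the
# unit from a graded integrator toward a binary classifier of its input.
slope_low = np.max(np.gradient(low, currents))
slope_high = np.max(np.gradient(high, currents))
```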

  18. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing the flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as a basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionality for POIS development.
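The basic place/transition firing rule underlying Petri net process models can be sketched as follows. The order-handling places are hypothetical; XML nets, as high-level Petri nets, additionally attach structured (XML) data to tokens, which this minimal sketch omits:

```python
# A transition is enabled when every input place holds at least the
# required number of tokens; firing consumes tokens from the input
# places and produces tokens in the output places.
def enabled(marking, pre):
    return all(marking.get(p, 0) >= w for p, w in pre.items())

def fire(marking, pre, post):
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m

# Hypothetical business-process step: an incoming order plus a free
# clerk yields an order in processing.
marking = {"order_received": 1, "clerk_free": 1}
t_start = ({"order_received": 1, "clerk_free": 1},   # pre-set
           {"order_processing": 1})                  # post-set
marking = fire(marking, *t_start)
```

The same two functions suffice to simulate any place/transition net, which is why Petri nets lend themselves to executable software process models.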

  19. Brands as Intentional Agents Framework: How Perceived Intentions and Ability Can Map Brand Perception.

    PubMed

    Kervyn, Nicolas; Fiske, Susan T; Malone, Chris

    2012-04-01

    Building on the Stereotype Content Model, this paper introduces and tests the Brands as Intentional Agents Framework. A growing body of research suggests that consumers have relationships with brands that resemble relations between people. We propose that consumers perceive brands in the same way they perceive people. This approach allows us to explore how social perception theories and processes can predict brand purchase interest and loyalty. Brands as Intentional Agents Framework is based on a well-established social perception approach: the Stereotype Content Model. Two studies support the Brands as Intentional Agents Framework prediction that consumers assess a brand's perceived intentions and ability and that these perceptions elicit distinct emotions and drive differential brand behaviors. The research shows that human social interaction relationships translate to consumer-brand interactions in ways that are useful to inform brand positioning and brand communications.

  20. Brands as Intentional Agents Framework: How Perceived Intentions and Ability Can Map Brand Perception

    PubMed Central

    Kervyn, Nicolas; Fiske, Susan T.; Malone, Chris

    2013-01-01

    Building on the Stereotype Content Model, this paper introduces and tests the Brands as Intentional Agents Framework. A growing body of research suggests that consumers have relationships with brands that resemble relations between people. We propose that consumers perceive brands in the same way they perceive people. This approach allows us to explore how social perception theories and processes can predict brand purchase interest and loyalty. Brands as Intentional Agents Framework is based on a well-established social perception approach: the Stereotype Content Model. Two studies support the Brands as Intentional Agents Framework prediction that consumers assess a brand’s perceived intentions and ability and that these perceptions elicit distinct emotions and drive differential brand behaviors. The research shows that human social interaction relationships translate to consumer-brand interactions in ways that are useful to inform brand positioning and brand communications. PMID:24403815

  1. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analyzing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. The social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
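The role of a social-rationality factor in scheduling can be caricatured with a toy assignment rule. The weighting scheme, agent names, and costs below are invented for illustration and are not the paper's algorithm; they only show how a single rationality parameter can trade an agent's self-interest against a social objective:

```python
# Per-agent cost of performing each activity; agent_a is cheapest for
# everything, so a purely self-interested policy funnels all work to it.
cost = {"agent_a": {"check": 1.0, "pack": 1.0, "ship": 1.0},
        "agent_b": {"check": 3.0, "pack": 3.0, "ship": 3.0}}

def assign(activities, agents, rationality):
    """rationality = 0: each activity goes to the lowest-cost agent
    (pure self-interest); rationality = 1: pure load balancing
    across agents (the social objective)."""
    schedule, load = {}, {a: 0.0 for a in agents}
    for act, duration in activities:
        best = min(agents, key=lambda a: (1 - rationality) * cost[a][act]
                                         + rationality * load[a])
        schedule[act] = best
        load[best] += duration
    return schedule

acts = [("check", 1.0), ("pack", 1.0), ("ship", 1.0)]
selfish = assign(acts, ["agent_a", "agent_b"], rationality=0.0)
social = assign(acts, ["agent_a", "agent_b"], rationality=1.0)
```

Sliding the rationality parameter between 0 and 1 moves the schedule between these extremes, which is the kind of flexibility a fixed queuing-theoretic simulation does not offer.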

  2. Design of Knowledge Management System for Diabetic Complication Diseases

    NASA Astrophysics Data System (ADS)

    Fiarni, Cut

    2017-01-01

    This paper examines how to develop a model for a Knowledge Management System (KMS) for diabetes complication diseases. People with diabetes have a higher risk of developing a series of serious health problems. Each patient has a different condition that can lead to different diseases and health problems. But with the right information, patients can achieve early detection, so health risks can be minimized or avoided. Hence, the objective of this research is to propose a conceptual framework that integrates a social network model, knowledge management activities, and case-based reasoning (CBR) for designing such a diabetes health and complication disease KMS. The framework indicates that the critical knowledge management activities lie in the process of finding similar cases and in the index table that allows the algorithm to fit the framework to the social media component. With this framework, KMS developers can work with healthcare providers to easily identify the suitable IT associated with the CBR process when developing a diabetes KMS.
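Retrieval of similar cases, the critical activity the framework identifies, can be sketched as nearest-neighbor matching over patient features. The cases, feature set, and labels below are invented for illustration:

```python
import math

# Hypothetical diabetic-patient cases: (features, known complication).
# Feature order: [hba1c, years_with_diabetes, bmi]
cases = [
    ([9.1, 12.0, 31.0], "retinopathy"),
    ([7.2,  3.0, 24.0], "none"),
    ([8.5, 10.0, 29.0], "nephropathy"),
    ([6.8,  1.0, 22.0], "none"),
]

def retrieve_similar(query, k=2):
    """CBR-style retrieval: rank stored cases by Euclidean distance
    to the query patient and return the k most similar."""
    ranked = sorted(cases, key=lambda c: math.dist(query, c[0]))
    return ranked[:k]

# A new patient whose profile resembles the long-duration cases.
similar = retrieve_similar([9.0, 11.5, 30.5])
```

A production system would normalize features, weight them clinically, and index the case base for fast lookup (the "index table" role in the framework), but the retrieve step reduces to this similarity ranking.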

  3. Case analysis online: a strategic management case model for the health industry.

    PubMed

    Walsh, Anne; Bearden, Eithne

    2004-01-01

    Despite the plethora of methods and tools available to support strategic management, the challenge for health executives in the next century will relate to their ability to access and interpret data from multiple and intricate communication networks. Integrated digital networks and satellite systems will expand the scope and ease of sharing information between business divisions, and networked systems will facilitate the use of virtual case discussions across universities. While the internet is frequently used to support clinical decisions in the healthcare industry, few executives rely upon the internet for strategic analysis. Although electronic technologies can easily synthesize data from multiple information channels, research as well as technical issues may deter their application in strategic analysis. As digital models transform access to information, online models may become increasingly relevant in designing strategic solutions. While there are various pedagogical models available to support the strategic management process, this framework was designed to enhance strategic analysis through the application of technology and electronic research. A strategic analysis framework, which incorporated internet research and case analysis in a strategic management course, is described along with design and application issues that emerged during the case analysis process.

  4. What role does performance information play in securing improvement in healthcare? a conceptual framework for levers of change

    PubMed Central

    Levesque, Jean-Frederic; Sutherland, Kim

    2017-01-01

    Objective Across healthcare systems, there is consensus on the need for independent and impartial assessment of performance. There is less agreement about how measurement and reporting performance improves healthcare. This paper draws on academic theories to develop a conceptual framework—one that classifies in an integrated manner the ways in which change can be leveraged by healthcare performance information. Methods A synthesis of published frameworks. Results The framework identifies eight levers for change enabled by performance information, spanning internal and external drivers, and emergent and planned processes: (1) cognitive levers provide awareness and understanding; (2) mimetic levers inform about the performance of others to encourage emulation; (3) supportive levers provide facilitation, implementation tools or models of care to actively support change; (4) formative levers develop capabilities and skills through teaching, mentoring and feedback; (5) normative levers set performance against guidelines, standards, certification and accreditation processes; (6) coercive levers use policies, regulations, incentives and disincentives to force change; (7) structural levers modify the physical environment or professional cultures and routines; (8) competitive levers attract patients or funders. Conclusion This framework highlights how performance measurement and reporting can contribute to eight different levers for change. It provides guidance on how to align performance measurement and reporting with quality improvement programmes. PMID:28851769

  5. Patient-specific parameter estimation in single-ventricle lumped circulation models under uncertainty

    PubMed Central

    Schiavazzi, Daniele E.; Baretta, Alessia; Pennati, Giancarlo; Hsia, Tain-Yen; Marsden, Alison L.

    2017-01-01

    Summary Computational models of cardiovascular physiology can inform clinical decision-making, providing a physically consistent framework to assess vascular pressures and flow distributions, and aiding in treatment planning. In particular, lumped parameter network (LPN) models that make an analogy to electrical circuits offer a fast and surprisingly realistic method to reproduce the circulatory physiology. The complexity of LPN models can vary significantly to account, for example, for cardiac and valve function, respiration, autoregulation, and time-dependent hemodynamics. More complex models provide insight into detailed physiological mechanisms, but their utility is maximized if one can quickly identify patient specific parameters. The clinical utility of LPN models with many parameters will be greatly enhanced by automated parameter identification, particularly if parameter tuning can match non-invasively obtained clinical data. We present a framework for automated tuning of 0D lumped model parameters to match clinical data. We demonstrate the utility of this framework through application to single ventricle pediatric patients with Norwood physiology. Through a combination of local identifiability, Bayesian estimation and maximum a posteriori simplex optimization, we show the ability to automatically determine physiologically consistent point estimates of the parameters and to quantify uncertainty induced by errors and assumptions in the collected clinical data. We show that multi-level estimation, that is, updating the parameter prior information through sub-model analysis, can lead to a significant reduction in the parameter marginal posterior variance. We first consider virtual patient conditions, with clinical targets generated through model solutions, and second application to a cohort of four single-ventricle patients with Norwood physiology. PMID:27155892
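The electrical-circuit analogy behind LPN models can be illustrated with the simplest case, a two-element Windkessel, where vascular compliance plays the role of a capacitor and resistance that of a resistor. Parameter values below are illustrative, not patient-specific, and this is far simpler than the multi-compartment single-ventricle models tuned in the paper:

```python
import math

def windkessel(R=1.0, C=1.5, heart_rate=80, beats=20, dt=1e-3):
    """Two-element Windkessel: C dP/dt = Q(t) - P/R, driven by a
    half-sine inflow during each systole (Euler integration).

    R in mmHg*s/mL, C in mL/mmHg; returns the pressure trace in mmHg.
    """
    period = 60.0 / heart_rate
    systole = 0.35 * period
    p, trace = 80.0, []
    steps = round(beats * period / dt)
    for n in range(steps):
        t = (n * dt) % period
        q = 100.0 * math.sin(math.pi * t / systole) if t < systole else 0.0
        p += dt * (q - p / R) / C          # capacitor charge balance
        trace.append(p)
    return trace

trace = windkessel()
```

Parameter tuning in the clinical setting amounts to adjusting values like R and C (and many more in richer networks) until model outputs match measured pressures and flows, which is the identification problem the paper automates under uncertainty.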

  6. A technology selection framework for supporting delivery of patient-oriented health interventions in developing countries

    PubMed Central

    Chan, Connie V.; Kaufman, David R.

    2009-01-01

    Health information technologies (HIT) have great potential to advance health care globally. In particular, HIT can provide innovative approaches and methodologies to overcome the range of access and resource barriers specific to developing countries. However, there is a paucity of models and empirical evidence informing the technology selection process in these settings. We propose a framework for selecting patient-oriented technologies in developing countries. The selection guidance process is structured by a set of filters that impose particular constraints and serve to narrow the space of possible decisions. The framework consists of three levels of factors: 1) situational factors, 2) the technology and its relationship with health interventions and with target patients, and 3) empirical evidence. We demonstrate the utility of the framework in the context of mobile phones for behavioral health interventions to reduce risk factors for cardiovascular disease. This framework can be applied to health interventions across health domains to explore how and whether available technologies can support delivery of the associated types of interventions and with the target populations. PMID:19796709
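The three-level filtering the framework describes can be sketched generically: each level imposes constraints that narrow the candidate set. The technologies, attributes, and thresholds below are invented for illustration:

```python
# Candidate patient-oriented technologies with toy attributes.
candidates = [
    {"name": "sms",        "needs_literacy": True,  "cost": 1, "evidence": 3},
    {"name": "ivr_voice",  "needs_literacy": False, "cost": 2, "evidence": 2},
    {"name": "smartphone", "needs_literacy": True,  "cost": 8, "evidence": 4},
]

# Level 1: situational factors (e.g. resource limits).
situational = [lambda t: t["cost"] <= 5]
# Level 2: fit between technology, intervention, and target patients.
patient_fit = [lambda t: not t["needs_literacy"]]
# Level 3: empirical evidence supporting the technology.
empirical = [lambda t: t["evidence"] >= 2]

def select(techs, *filter_levels):
    """Apply each level of filters in turn, narrowing the space of
    possible decisions at every stage."""
    for level in filter_levels:
        techs = [t for t in techs if all(f(t) for f in level)]
    return [t["name"] for t in techs]

chosen = select(candidates, situational, patient_fit, empirical)
```

Here the low-cost but literacy-dependent option and the high-cost smartphone option are filtered out at levels 2 and 1 respectively, leaving the voice-based channel, which mirrors how successive constraints structure the selection decision.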

  7. Illustrative case using the RISK21 roadmap and matrix: prioritization for evaluation of chemicals found in drinking water.

    PubMed

    Wolf, Douglas C; Bachman, Ammie; Barrett, Gordon; Bellin, Cheryl; Goodman, Jay I; Jensen, Elke; Moretto, Angelo; McMullin, Tami; Pastoor, Timothy P; Schoeny, Rita; Slezak, Brian; Wend, Korinna; Embry, Michelle R

    2016-01-01

    The HESI-led RISK21 effort has developed a framework supporting the use of twenty-first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach that allows an informed decision on human health safety to be made when sufficient evidence is available. It provides a transparent and consistent approach to evaluate information in order to maximize the ability of assessments to inform decisions and to optimize the use of resources. To demonstrate the application of the framework's roadmap and matrix, this case study evaluates a large number of chemicals that could be present in drinking water. The focus is to prioritize which of these should be considered for human health risk as individual contaminants. The example evaluates 20 potential drinking water contaminants, using the tiered RISK21 approach in combination with graphical representation of information at each step, using the RISK21 matrix. Utilizing the framework, 11 of the 20 chemicals were assigned low priority based on available exposure data alone, which demonstrated that exposure was extremely low. The remaining nine chemicals were further evaluated, using refined estimates of toxicity based on readily available data, with three deemed high priority for further evaluation. In the present case study, it was determined that the greatest value of additional information would be from improved exposure models and not from additional hazard characterization.
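The exposure-first, tiered logic can be caricatured as follows. All chemical names and numbers are invented, and RISK21 itself works with estimate ranges plotted on an exposure-versus-toxicity matrix rather than point-value cutoffs; the sketch shows only how a cheap exposure screen defers toxicity refinement to the chemicals that need it:

```python
# name: (estimated exposure, toxicity threshold), both in mg/kg/day.
chemicals = {
    "chem_a": (0.00001, 1.0),
    "chem_b": (0.002,   0.5),
    "chem_c": (0.2,     0.25),
}

def prioritize(chems, exposure_cutoff=0.001, moe_cutoff=100.0):
    """Tier 1 screens on exposure alone; tier 2 refines survivors
    using a margin of exposure (threshold / exposure)."""
    low, needs_review, high = [], [], []
    for name, (exposure, threshold) in chems.items():
        if exposure < exposure_cutoff:                # tier 1
            low.append(name)
        elif threshold / exposure >= moe_cutoff:      # tier 2
            needs_review.append(name)
        else:
            high.append(name)
    return low, needs_review, high

low, review, high = prioritize(chemicals)
```

In the case study this style of triage retired 11 of 20 chemicals on exposure data alone, concentrating further toxicity work on the remainder.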

  8. The influence of social-cognitive factors on personal hygiene practices to protect against influenzas: using modelling to compare avian A/H5N1 and 2009 pandemic A/H1N1 influenzas in Hong Kong.

    PubMed

    Liao, Qiuyan; Cowling, Benjamin J; Lam, Wendy Wing Tak; Fielding, Richard

    2011-06-01

    Understanding population responses to influenza helps optimize public health interventions. Relevant theoretical frameworks remain nascent. The aim was to model associations between trust in information, perceived hygiene effectiveness, knowledge about the causes of influenza, perceived susceptibility and worry, and personal hygiene practices (PHPs) associated with influenza. Cross-sectional household telephone surveys on avian influenza A/H5N1 (2006) and pandemic influenza A/H1N1 (2009) gathered comparable data on trust in formal and informal sources of influenza information, influenza-related knowledge, perceived hygiene effectiveness, worry, perceived susceptibility, and PHPs. Exploratory factor analysis confirmed domain content while confirmatory factor analysis was used to evaluate the extracted factors. The hypothesized model, compiled from different theoretical frameworks, was optimized with structural equation modelling using the A/H5N1 data. The optimized model was then tested against the A/H1N1 dataset. The model was robust across datasets though corresponding path weights differed. Trust in formal information was positively associated with perceived hygiene effectiveness, which was positively associated with PHPs in both datasets. Trust in formal information was positively associated with influenza worry in the A/H5N1 data, and with knowledge of influenza cause in the A/H1N1 data, both variables being positively associated with PHPs. Trust in informal information was positively associated with influenza worry in both datasets. Independent of information trust, perceived influenza susceptibility was associated with influenza worry. Worry was associated with PHPs in the A/H5N1 data only. Knowledge of influenza cause and perceived PHP effectiveness were associated with PHPs. Improving trust in formal information should increase PHPs. Worry was significantly associated with PHPs in the A/H5N1 data.

  9. Information-based models for finance and insurance

    NASA Astrophysics Data System (ADS)

    Hoyle, Edward

    2010-10-01

    In financial markets, the information that traders have about an asset is reflected in its price. The arrival of new information then leads to price changes. The 'information-based framework' of Brody, Hughston and Macrina (BHM) isolates the emergence of information, and examines its role as a driver of price dynamics. This approach has led to the development of new models that capture a broad range of price behaviour. This thesis extends the work of BHM by introducing a wider class of processes for the generation of the market filtration. In the BHM framework, each asset is associated with a collection of random cash flows. The asset price is the sum of the discounted expectations of the cash flows. Expectations are taken with respect to (i) an appropriate measure, and (ii) the filtration generated by a set of so-called information processes that carry noisy or imperfect market information about the cash flows. To model the flow of information, we introduce a class of processes termed Lévy random bridges (LRBs), generalising the Brownian and gamma information processes of BHM. Conditioned on its terminal value, an LRB is identical in law to a Lévy bridge. We consider in detail the case where the asset generates a single cash flow X_T at a fixed date T. The flow of information about X_T is modelled by an LRB with random terminal value X_T. An explicit expression for the price process is found by working out the discounted conditional expectation of X_T with respect to the natural filtration of the LRB. New models are constructed using information processes related to the Poisson process, the Cauchy process, the stable-1/2 subordinator, the variance-gamma process, and the normal inverse-Gaussian process. These are applied to the valuation of credit-risky bonds, vanilla and exotic options, and non-life insurance liabilities.
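    For the single-cash-flow case, the price is the discounted conditional expectation of X_T given the current value of the information process. A minimal sketch for the discrete-payoff Brownian case of BHM follows; the posterior weights implement the standard Brownian-bridge conditioning, but the function and parameter names are ours, not the thesis notation.

    ```python
    import numpy as np

    def bhm_price(xi_t, t, T, sigma, payoffs, priors, r=0.0):
        """Price at time t < T of a claim paying X_T at T, given the
        Brownian information process value xi_t = sigma*t*X_T + bridge
        noise. Discrete payoff distribution: payoffs[i] w.p. priors[i]."""
        payoffs = np.asarray(payoffs, float)
        priors = np.asarray(priors, float)
        # Posterior weights on each payoff given xi_t (Bayes with the
        # Brownian-bridge transition density).
        w = priors * np.exp((T / (T - t)) * (sigma * payoffs * xi_t
                                             - 0.5 * sigma**2 * payoffs**2 * t))
        posterior = w / w.sum()
        return np.exp(-r * (T - t)) * float(posterior @ payoffs)
    ```

    At t = 0 (no information yet, xi_0 = 0) the price reduces to the discounted prior mean; as xi_t rises, the posterior shifts toward the larger payoffs and the price increases.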

  10. Metadata mapping and reuse in caBIG.

    PubMed

    Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis

    2009-02-05

    This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG framework or other frameworks that use metadata repositories. The Dice (di-gram) and Dynamic algorithms are compared, and both show similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG framework and potentially any framework that uses a metadata repository. This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG. This effort contributes to facilitating the development of interoperable systems within caBIG as well as other metadata frameworks. Such efforts are critical to address the need to develop systems to handle enormous amounts of diverse data that can be leveraged from new biomedical methodologies.
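    The Dice similarity over di-grams that the paper benchmarks can be illustrated in a few lines. This set-based variant is an assumption on our part; the caBIG implementation may differ in tokenization and case handling.

    ```python
    def bigrams(s):
        """Set of overlapping two-character substrings, case-folded."""
        s = s.lower()
        return {s[i:i + 2] for i in range(len(s) - 1)}

    def dice(a, b):
        """Dice coefficient over bigram sets: 2|A∩B| / (|A|+|B|)."""
        A, B = bigrams(a), bigrams(b)
        if not A and not B:
            return 1.0
        return 2 * len(A & B) / (len(A) + len(B))
    ```

    Scores near 1 flag candidate matches between a UML class-attribute name and a CDE object-property pair; a threshold on the score then drives the automated mapping.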

  11. An interdisciplinary framework for participatory modeling design and evaluation—What makes models effective participatory decision tools?

    NASA Astrophysics Data System (ADS)

    Falconi, Stefanie M.; Palmer, Richard N.

    2017-02-01

    Increased requirements for public involvement in water resources management (WRM) over the past century have stimulated the development of more collaborative decision-making methods. Participatory modeling (PM) uses computer models to inform and engage stakeholders in the planning process in order to influence collaborative decisions in WRM. Past evaluations of participatory models focused on process and final outcomes, yet were hindered by diversity of purpose and inconsistent documentation. This paper presents a two-stage framework for evaluating PM based on mechanisms for improving model effectiveness as participatory tools. Five dimensions characterize the "who, when, how, and why" of each participatory effort (stage 1). Models are evaluated as "boundary objects," a concept used to describe tools that bridge understanding and translate different bodies of knowledge to improve credibility, salience, and legitimacy (stage 2). This evaluation framework is applied to five existing case studies from the literature. Though the goals of participation can be diverse, the novel contribution of the proposed two-stage framework is its flexibility to evaluate a wide range of cases that differ in scope, modeling approach, and participatory context. Also, the evaluation criteria provide a structured vocabulary based on clear mechanisms that extend beyond previous process-based and outcome-based evaluations. Effective models are those that take advantage of mechanisms that facilitate dialogue and resolution and improve the accessibility and applicability of technical knowledge. Furthermore, the framework can help build more complete records and systematic documentation of evidence to help standardize the field of PM.

  12. The Common Patterns of Nature

    PubMed Central

    Frank, Steven A.

    2010-01-01

    We typically observe large-scale outcomes that arise from the interactions of many hidden, small-scale processes. Examples include age of disease onset, rates of amino acid substitutions, and composition of ecological communities. The macroscopic patterns in each problem often vary around a characteristic shape that can be generated by neutral processes. A neutral generative model assumes that each microscopic process follows unbiased or random stochastic fluctuations: random connections of network nodes; amino acid substitutions with no effect on fitness; species that arise or disappear from communities randomly. These neutral generative models often match common patterns of nature. In this paper, I present the theoretical background by which we can understand why these neutral generative models are so successful. I show where the classic patterns come from, such as the Poisson pattern, the normal or Gaussian pattern, and many others. Each classic pattern was often discovered by a simple neutral generative model. The neutral patterns share a special characteristic: they describe the patterns of nature that follow from simple constraints on information. For example, any aggregation of processes that preserves information only about the mean and variance attracts to the Gaussian pattern; any aggregation that preserves information only about the mean attracts to the exponential pattern; any aggregation that preserves information only about the geometric mean attracts to the power law pattern. I present a simple and consistent informational framework of the common patterns of nature based on the method of maximum entropy. This framework shows that each neutral generative model is a special case that helps to discover a particular set of informational constraints; those informational constraints define a much wider domain of non-neutral generative processes that attract to the same neutral pattern. PMID:19538344
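    The Gaussian attractor described above is easy to demonstrate numerically: aggregating many independent components preserves only mean and variance information, and the sums settle onto the normal pattern regardless of the components' shape. The uniform components below are an arbitrary choice made for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Each observation aggregates 50 uniform components. Only the mean
    # and variance of the components survive aggregation, so the sums
    # are attracted to the Gaussian pattern (central limit behaviour).
    sums = rng.uniform(size=(100_000, 50)).sum(axis=1)
    z = (sums - sums.mean()) / sums.std()
    skew = (z**3).mean()             # ~0 for a Gaussian
    excess_kurt = (z**4).mean() - 3  # ~0 for a Gaussian
    ```

    Repeating the experiment with exponential or Bernoulli components lands on the same Gaussian shape, which is the attractor property the paper formalizes through maximum entropy.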

  13. Basic Information

    EPA Pesticide Factsheets

    NHDPlus is a geospatial, hydrologic framework dataset that is intended for use by geospatial analysts and modelers to support water resources related applications. NHDPlus was developed by the USEPA in partnership with the US Geological Survey.

  14. Integrating human health and environmental health into the DPSIR framework: a tool to identify research opportunities for sustainable and healthy communities.

    PubMed

    Yee, Susan H; Bradley, Patricia; Fisher, William S; Perreault, Sally D; Quackenboss, James; Johnson, Eric D; Bousquin, Justin; Murphy, Patricia A

    2012-12-01

    The U.S. Environmental Protection Agency has recently realigned its research enterprise around the concept of sustainability. Scientists from across multiple disciplines have a role to play in contributing the information, methods, and tools needed to more fully understand the long-term impacts of decisions on the social and economic sustainability of communities. Success will depend on a shift in thinking to integrate, organize, and prioritize research within a systems context. We used the Driving forces-Pressures-State-Impact-Response (DPSIR) framework as a basis for integrating social, cultural, and economic aspects of environmental and human health into a single framework. To make the framework broadly applicable to sustainability research planning, we provide a hierarchical system of DPSIR keywords and guidelines for use as a communication tool. The applicability of the integrated framework was first tested on a public health issue (asthma disparities) for purposes of discussion. We then applied the framework at a science planning meeting to identify opportunities for sustainable and healthy communities research. We conclude that an integrated systems framework has many potential roles in science planning, including identifying key issues, visualizing interactions within the system, identifying research gaps, organizing information, developing computational models, and identifying indicators.

  15. A critique of the Uganda district league table using a normative health system performance assessment framework.

    PubMed

    Kirunga Tashobya, Christine; Ssengooba, Freddie; Nabyonga-Orem, Juliet; Bataringaya, Juliet; Macq, Jean; Marchal, Bruno; Musila, Timothy; Criel, Bart

    2018-05-10

    In 2003 the Uganda Ministry of Health (MoH) introduced the District League Table (DLT) to track district performance. This review of the DLT is intended to add to the evidence base on Health Systems Performance Assessment (HSPA) globally, with emphasis on Low and Middle Income Countries (LMICs), and provide recommendations for adjustments to the current Ugandan reality. A normative HSPA framework was used to inform the development of a Key Informant Interview (KII) tool. Thirty Key Informants were interviewed, purposively selected from the Ugandan health system on the basis of having developed or used the DLT. KII data and information from published and grey literature on the Uganda health system was analyzed using deductive analysis. Stakeholder involvement in the development of the DLT was limited, including MoH officials and development partners, and a few district technical managers. Uganda policy documents articulate a conceptually broad health system whereas the DLT focuses on a healthcare system. The complexity and dynamism of the Uganda health system was insufficiently acknowledged by the HSPA framework. Though DLT objectives and indicators were articulated, there was no conceptual reference model and a lack of clarity on the constitutive dimensions. The DLT mechanisms for change were not explicit. The DLT compared markedly different districts and did not identify factors behind observed performance. Uganda lacks a designated institutional unit for the analysis and presentation of HSPA data, and there are challenges in data quality and range. The critique of the DLT using a normative model supported the development of recommendations for Uganda district HSPA and provides lessons for other LMICs. A similar approach can be used by researchers and policy makers elsewhere for the review and development of other frameworks.
Adjustments in Uganda district HSPA should consider: wider stakeholder involvement with more district managers including political, administrative and technical; better anchoring within the national health system framework; integration of the notion of complexity in the design of the framework; and emphasis on facilitating district decision-making and learning. There is need to improve data quality and range and additional approaches for data analysis and presentation.

  16. Actor Interdependence in Collaborative Telelearning.

    ERIC Educational Resources Information Center

    Wasson, Barbara; Bourdeau, Jacqueline

    This paper presents a model of collaborative telelearning and describes how coordination theory has provided a framework for the analysis of actor (inter)dependencies in this scenario. The model is intended to inform the instructional design of learning scenarios, the technological design of the telelearning environment, and the design of…

  17. Towards a Lifecycle Information Framework and Technology in Manufacturing.

    PubMed

    Hedberg, Thomas; Feeney, Allison Barnard; Helu, Moneer; Camelio, Jaime A

    2017-06-01

    Industry has been chasing the dream of integrating and linking data across the product lifecycle and enterprises for decades. However, industry has been challenged by the fact that the context in which data is used varies based on the function / role in the product lifecycle that is interacting with the data. Holistically, the data across the product lifecycle must be considered an unstructured data-set because multiple data repositories and domain-specific schema exist in each phase of the lifecycle. This paper explores a concept called the Lifecycle Information Framework and Technology (LIFT). LIFT is a conceptual framework for lifecycle information management and the integration of emerging and existing technologies, which together form the basis of a research agenda for dynamic information modeling in support of digital-data curation and reuse in manufacturing. This paper provides a discussion of the existing technologies and activities that the LIFT concept leverages. Also, the paper describes the motivation for applying such work to the domain of manufacturing. Then, the LIFT concept is discussed in detail, while underlying technologies are further examined and a use case is detailed. Lastly, potential impacts are explored.

  18. Using expert knowledge to incorporate uncertainty in cause-of-death assignments for modeling of cause-specific mortality

    USGS Publications Warehouse

    Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.

    2018-01-01

    Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. 
We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
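    The data-augmentation idea, observer-elicited cause probabilities used as prior predictive values for latent causes-of-death, can be sketched with a simple Gibbs-style sampler. This is our own stripped-down illustration (no covariates, a symmetric Dirichlet prior on cause-specific mortality fractions), not the authors' full hierarchical model.

    ```python
    import numpy as np

    def augment_causes(elicited, n_iter=2000, alpha=1.0, seed=0):
        """elicited: (n_deaths, n_causes) array of observer probabilities
        that each death was due to each cause. Each iteration samples the
        latent causes, then updates the Dirichlet posterior over the
        cause-specific mortality fractions. Returns the posterior mean."""
        rng = np.random.default_rng(seed)
        elicited = np.asarray(elicited, float)
        n, k = elicited.shape
        draws = np.zeros((n_iter, k))
        theta = np.full(k, 1.0 / k)
        for it in range(n_iter):
            # Combine elicited prior-predictive values with the current
            # fraction estimates to get per-death cause probabilities.
            w = elicited * theta
            w /= w.sum(axis=1, keepdims=True)
            causes = np.array([rng.choice(k, p=row) for row in w])
            counts = np.bincount(causes, minlength=k)
            theta = rng.dirichlet(alpha + counts)   # conjugate update
            draws[it] = theta
        return draws[n_iter // 2:].mean(axis=0)     # discard burn-in
    ```

    When every observer is certain (rows of 0s and 1s), the sampler collapses to the traditional fixed-assignment analysis; spreading probability across causes is what propagates the observer's uncertainty into the parameter estimates.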

  19. SAS - Semantic Annotation Service for Geoscience resources on the web

    NASA Astrophysics Data System (ADS)

    Elag, M.; Kumar, P.; Marini, L.; Li, R.; Jiang, P.

    2015-12-01

    There is a growing need for increased integration across the data and model resources that are disseminated on the web to advance their reuse across different earth science applications. Meaningful reuse of resources requires semantic metadata to realize the semantic web vision for allowing pragmatic linkage and integration among resources. Semantic metadata associates standard metadata with resources to turn them into semantically-enabled resources on the web. However, the lack of a common standardized metadata framework, as well as the uncoordinated use of metadata fields across different geo-information systems, has led to a situation in which standards and related Standard Names abound. To address this need, we have designed SAS to provide a bridge between the core ontologies required to annotate resources and information systems, in order to enable queries and analysis over annotations from a single environment (the web). SAS is one of the services provided by the Geosemantic framework, a decentralized semantic framework that supports the integration between models and data and allows semantically heterogeneous resources to interact with minimal human intervention. Here we present the design of SAS and demonstrate its application for annotating data and models. First we describe how predicates and their attributes are extracted from standards and ingested into the knowledge base of the Geosemantic framework. Then we illustrate the application of SAS in annotating data managed by SEAD and annotating simulation models that have a web interface. SAS is a step in a broader approach to raise the quality of geoscience data and models that are published on the web and to allow users to better search, access, and use existing resources based on standard vocabularies that are encoded and published using semantic technologies.

  20. A deep learning framework for modeling structural features of RNA-binding protein targets

    PubMed Central

    Zhang, Sai; Zhou, Jingtian; Hu, Hailin; Gong, Haipeng; Chen, Ligong; Cheng, Chao; Zeng, Jianyang

    2016-01-01

    RNA-binding proteins (RBPs) play important roles in the post-transcriptional control of RNAs. Identifying RBP binding sites and characterizing RBP binding preferences are key steps toward understanding the basic mechanisms of post-transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of the RBP targets by integrating their available structural features in all three dimensions is still a challenging task. In this paper, we develop a general and flexible deep learning framework for modeling structural binding preferences and predicting binding sites of RBPs, which takes (predicted) RNA tertiary structural information into account for the first time. Our framework constructs a unified representation that characterizes the structural specificities of RBP targets in all three dimensions, which can be further used to predict novel candidate binding sites and discover potential binding motifs. Through testing on real CLIP-seq datasets, we have demonstrated that our deep learning framework can automatically extract effective hidden structural features from the encoded raw sequence and structural profiles, and predict accurate RBP binding sites. In addition, we have conducted the first study to show that integrating additional RNA tertiary structural features can improve model performance in predicting RBP binding sites, especially for the polypyrimidine tract-binding protein (PTB), which also provides new evidence supporting the view that RBPs may have specific tertiary structural binding preferences. In particular, the tests on the internal ribosome entry site (IRES) segments yield satisfactory results with experimental support from the literature and further demonstrate the necessity of incorporating RNA tertiary structural information into the prediction model.
The source code of our approach can be found at https://github.com/thucombio/deepnet-rbp. PMID:26467480

  1. What role does performance information play in securing improvement in healthcare? a conceptual framework for levers of change.

    PubMed

    Levesque, Jean-Frederic; Sutherland, Kim

    2017-08-28

    Across healthcare systems, there is consensus on the need for independent and impartial assessment of performance. There is less agreement about how measuring and reporting performance improves healthcare. This paper draws on academic theories to develop a conceptual framework, one that classifies in an integrated manner the ways in which change can be leveraged by healthcare performance information. A synthesis of published frameworks. The framework identifies eight levers for change enabled by performance information, spanning internal and external drivers, and emergent and planned processes: (1) cognitive levers provide awareness and understanding; (2) mimetic levers inform about the performance of others to encourage emulation; (3) supportive levers provide facilitation, implementation tools or models of care to actively support change; (4) formative levers develop capabilities and skills through teaching, mentoring and feedback; (5) normative levers set performance against guidelines, standards, certification and accreditation processes; (6) coercive levers use policies, regulations, incentives and disincentives to force change; (7) structural levers modify the physical environment or professional cultures and routines; (8) competitive levers attract patients or funders. This framework highlights how performance measurement and reporting can contribute to eight different levers for change. It provides guidance on how to align performance measurement and reporting with quality improvement programmes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. A Conceptual Framework for Evaluating the Domains of Applicability of Ecological Models and its Implementation in the Ecological Production Function Library - International Society for Ecological Modelling Conference

    EPA Science Inventory

    The use of computational ecological models to inform environmental management and policy has proliferated in the past 25 years. These models have become essential tools as linkages and feedbacks between human actions and ecological responses can be complex, and as funds for sampl...

  3. Toward an Integrated Gender-Linked Model of Aggression Subtypes in Early and Middle Childhood

    ERIC Educational Resources Information Center

    Ostrov, Jamie M.; Godleski, Stephanie A.

    2010-01-01

    An integrative model is proposed for understanding the development of physical and relational aggression in early and middle childhood. The central goal was to posit a new theoretical framework that expands on existing social-cognitive and gender schema models (i.e., Social Information-Processing Model of Children's Adjustment [N. R. Crick & K. A.…

  4. Open semantic annotation of scientific publications using DOMEO.

    PubMed

    Ciccarese, Paolo; Ocana, Marco; Clark, Tim

    2012-04-24

    Our group has developed a useful shared software framework for performing, versioning, sharing and viewing Web annotations of a number of kinds, using an open representation model. The Domeo Annotation Tool was developed in tandem with this open model, the Annotation Ontology (AO). Development of both the Annotation Framework and the open model was driven by requirements of several different types of alpha users, including bench scientists and biomedical curators from university research labs, online scientific communities, publishing and pharmaceutical companies. Several use cases were incrementally implemented by the toolkit. These use cases in biomedical communications include personal note-taking, group document annotation, semantic tagging, claim-evidence-context extraction, reagent tagging, and curation of textmining results from entity extraction algorithms. We report on the Domeo user interface here. Domeo has been deployed in beta release as part of the NIH Neuroscience Information Framework (NIF, http://www.neuinfo.org) and is scheduled for production deployment in the NIF's next full release. Future papers will describe other aspects of this work in detail, including Annotation Framework Services and components for integrating with external textmining services, such as the NCBO Annotator web service, and with other textmining applications using the Apache UIMA framework.

  5. Open semantic annotation of scientific publications using DOMEO

    PubMed Central

    2012-01-01

    Background Our group has developed a useful shared software framework for performing, versioning, sharing and viewing Web annotations of a number of kinds, using an open representation model. Methods The Domeo Annotation Tool was developed in tandem with this open model, the Annotation Ontology (AO). Development of both the Annotation Framework and the open model was driven by requirements of several different types of alpha users, including bench scientists and biomedical curators from university research labs, online scientific communities, publishing and pharmaceutical companies. Several use cases were incrementally implemented by the toolkit. These use cases in biomedical communications include personal note-taking, group document annotation, semantic tagging, claim-evidence-context extraction, reagent tagging, and curation of textmining results from entity extraction algorithms. Results We report on the Domeo user interface here. Domeo has been deployed in beta release as part of the NIH Neuroscience Information Framework (NIF, http://www.neuinfo.org) and is scheduled for production deployment in the NIF’s next full release. Future papers will describe other aspects of this work in detail, including Annotation Framework Services and components for integrating with external textmining services, such as the NCBO Annotator web service, and with other textmining applications using the Apache UIMA framework. PMID:22541592

  6. The Nature Index: a general framework for synthesizing knowledge on the state of biodiversity.

    PubMed

    Certain, Grégoire; Skarpaas, Olav; Bjerke, Jarle-Werner; Framstad, Erik; Lindholm, Markus; Nilsen, Jan-Erik; Norderhaug, Ann; Oug, Eivind; Pedersen, Hans-Christian; Schartau, Ann-Kristin; van der Meeren, Gro I; Aslaksen, Iulie; Engen, Steinar; Garnåsjordet, Per-Arild; Kvaløy, Pål; Lillegård, Magnar; Yoccoz, Nigel G; Nybø, Signe

    2011-04-22

    The magnitude and urgency of the biodiversity crisis is widely recognized within scientific and political organizations. However, a lack of integrated measures for biodiversity has greatly constrained the national and international response to the biodiversity crisis. Thus, integrated biodiversity indexes will greatly facilitate information transfer from science toward other areas of human society. The Nature Index framework samples scientific information on biodiversity from a variety of sources, synthesizes this information, and then transmits it in a simplified form to environmental managers, policymakers, and the public. The Nature Index optimizes information use by incorporating expert judgment, monitoring-based estimates, and model-based estimates. The index relies on a network of scientific experts, each of whom is responsible for one or more biodiversity indicators. The resulting set of indicators is supposed to represent the best available knowledge on the state of biodiversity and ecosystems in any given area. The value of each indicator is scaled relative to a reference state, i.e., a predicted value assessed by each expert for a hypothetical undisturbed or sustainably managed ecosystem. Scaled indicator values can be aggregated or disaggregated over different axes representing spatiotemporal dimensions or thematic groups. A range of scaling models can be applied to allow for different ways of interpreting the reference states, e.g., optimal situations or minimum sustainable levels. Statistical testing for differences in space or time can be implemented using Monte-Carlo simulations. This study presents the Nature Index framework and details its implementation in Norway. The results suggest that the framework is a functional, efficient, and pragmatic approach for gathering and synthesizing scientific knowledge on the state of biodiversity in any marine or terrestrial ecosystem and has general applicability worldwide.
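    The core scaling step, each indicator divided by its expert-assessed reference state, truncated, and then aggregated, can be sketched as follows. The truncation to [0, 1] and the weighted-mean aggregation are just one of the several scaling and aggregation models the framework allows, and the function and argument names are ours.

    ```python
    import numpy as np

    def nature_index(values, references, weights=None):
        """Scale each indicator by its reference state (the expert's
        predicted value for an undisturbed or sustainably managed
        ecosystem), truncate to [0, 1], and aggregate with a weighted
        mean over the chosen spatial or thematic axis."""
        values = np.asarray(values, float)
        references = np.asarray(references, float)
        scaled = np.clip(values / references, 0.0, 1.0)
        return float(np.average(scaled, weights=weights))
    ```

    An indicator at half its reference state contributes 0.5; one above its reference state is capped at 1, so overshooting a reference cannot mask declines elsewhere. Uncertainty in the indicator estimates can then be propagated by recomputing the index over Monte-Carlo draws, as the paper describes for testing differences in space or time.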

  7. Investigating nurse practitioners in the private sector: a theoretically informed research protocol.

    PubMed

    Adams, Margaret; Gardner, Glenn; Yates, Patsy

    2017-06-01

    To report a study protocol and the theoretical framework, normalisation process theory, that informs this protocol for a case study investigation of private sector nurse practitioners. Most research evaluating nurse practitioner service is focused on public, mainly acute care environments, where nurse practitioner service is well established with strong structures for governance and sustainability. Conversely, there is a lack of clarity in governance for emerging models in the private sector. In a climate of healthcare reform, nurse practitioner service is extending beyond the familiar public health sector. Further research is required to inform knowledge of the practice, operational framework and governance of new nurse practitioner models. The proposed research will use a multiple exploratory case study design to examine private sector nurse practitioner service. Data collection includes interviews, surveys and audits. A sequential mixed method approach to analysis of each case will be conducted. Findings from within-case analysis will lead to a meta-synthesis across all four cases to gain a holistic understanding of the phenomenon under study: private sector nurse practitioner service. Normalisation process theory will be used to guide the research process, specifically the coding and analysis of data using theory constructs and the relevant components associated with those constructs. This article provides a blueprint for the research and describes normalisation process theory in terms of its flexibility as an analytical framework. Consistent with the goals of best research practice, this study protocol will inform the research community in the field of primary health care about emerging research in this field. Publishing a study protocol ensures researcher fidelity to the analysis plan and supports research collaboration across teams. © 2016 John Wiley & Sons Ltd.

  8. The Nature Index: A General Framework for Synthesizing Knowledge on the State of Biodiversity

    PubMed Central

    Certain, Grégoire; Skarpaas, Olav; Bjerke, Jarle-Werner; Framstad, Erik; Lindholm, Markus; Nilsen, Jan-Erik; Norderhaug, Ann; Oug, Eivind; Pedersen, Hans-Christian; Schartau, Ann-Kristin; van der Meeren, Gro I.; Aslaksen, Iulie; Engen, Steinar; Garnåsjordet, Per-Arild; Kvaløy, Pål; Lillegård, Magnar; Yoccoz, Nigel G.; Nybø, Signe

    2011-01-01

    The magnitude and urgency of the biodiversity crisis are widely recognized within scientific and political organizations. However, a lack of integrated measures for biodiversity has greatly constrained the national and international response to the crisis. Integrated biodiversity indexes can therefore greatly facilitate information transfer from science to other areas of human society. The Nature Index framework samples scientific information on biodiversity from a variety of sources, synthesizes this information, and then transmits it in a simplified form to environmental managers, policymakers, and the public. The Nature Index optimizes information use by incorporating expert judgment, monitoring-based estimates, and model-based estimates. The index relies on a network of scientific experts, each of whom is responsible for one or more biodiversity indicators. The resulting set of indicators is intended to represent the best available knowledge on the state of biodiversity and ecosystems in any given area. The value of each indicator is scaled relative to a reference state, i.e., a predicted value assessed by each expert for a hypothetical undisturbed or sustainably managed ecosystem. Scaled indicator values can be aggregated or disaggregated over different axes representing spatiotemporal dimensions or thematic groups. A range of scaling models can be applied to allow for different interpretations of the reference states, e.g., optimal situations or minimum sustainable levels. Statistical testing for differences in space or time can be implemented using Monte Carlo simulations. This study presents the Nature Index framework and details its implementation in Norway. The results suggest that the framework is a functional, efficient, and pragmatic approach for gathering and synthesizing scientific knowledge on the state of biodiversity in any marine or terrestrial ecosystem and has general applicability worldwide. PMID:21526118
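
    The scaling and aggregation steps described in this abstract can be illustrated with a short sketch. This is a hypothetical simplification, not the actual Nature Index software: the ratio-based scaling capped at 1 and the unweighted mean are illustrative stand-ins for the framework's range of scaling and aggregation models, and the indicator values are invented.

```python
def scale_indicator(observed: float, reference: float) -> float:
    """Scale an indicator relative to its expert-assessed reference state.

    Values at or above the reference state are capped at 1.0, so the scaled
    value reads as "fraction of the reference (e.g. undisturbed) state".
    """
    if reference <= 0:
        raise ValueError("reference state must be positive")
    return min(observed / reference, 1.0)


def aggregate(scaled_values, weights=None) -> float:
    """Aggregate scaled indicator values with a (weighted) mean."""
    if weights is None:
        weights = [1.0] * len(scaled_values)
    return sum(v * w for v, w in zip(scaled_values, weights)) / sum(weights)


# Invented indicators: (observed value, reference-state value)
indicators = [(40.0, 100.0), (75.0, 100.0), (120.0, 100.0)]
scaled = [scale_indicator(obs, ref) for obs, ref in indicators]
print(aggregate(scaled))  # unweighted mean of 0.4, 0.75 and 1.0
```

    The same scaled values could be aggregated along different axes (by region, ecosystem, or thematic group) simply by changing which indicators and weights enter the mean, which is what makes the index both aggregable and disaggregable.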

  9. Beacon communities' public health initiatives: a case study analysis.

    PubMed

    Massoudi, Barbara L; Marcial, Laura H; Haque, Saira; Bailey, Robert; Chester, Kelley; Cunningham, Shellery; Riley, Amanda; Soper, Paula

    2014-01-01

    The Beacon Communities for Public Health (BCPH) project was launched in 2011 to gain a better understanding of the range of activities currently being conducted in population and public health by the Beacon Communities. The project highlighted the successes and challenges of these efforts with the aim of sharing this information broadly among the public health community. The Beacon Community Program, designed to showcase technology-enabled, community-based initiatives to improve outcomes, focused on: building and strengthening health information technology (IT) infrastructure and exchange capabilities; translating investments in health IT into measurable improvements in cost, quality, and population health; and developing innovative approaches to performance measurement, technology, and care delivery. Four multimethod case studies were conducted, based on a modified sociotechnical framework, to learn more about public health initiative implementation and use in the Beacon Communities. The methodological approach included document review and semistructured key informant interviews. NACCHO Model Practice Program criteria were used to select the public health initiatives included in the case studies. Despite differences among the case studies, common barriers and facilitators were present in all areas of the sociotechnical framework, including structure, people, technology, tasks, overarching considerations, and sustainability. Overall, there were many more facilitators (range = 7-14) for each Beacon than barriers (range = 4-6). Four influential promising practices were identified through the work: forging strong and sustainable partnerships; ensuring a good task-technology fit and a flexible and iterative design; fostering technology acceptance; and providing education and demonstrating value. A common weakness was the lack of a framework or model for the Beacon Communities' evaluation work. Sharing a framework or approach to evaluation at the beginning of implementation would have made the work more effective. Supporting evaluation to inform future implementations is important.

  10. Exploiting the information revolution: call for independent evaluation of the latest English national experiment.

    PubMed

    Scott, Philip

    2015-02-19

    The English National Health Service (NHS) has a long history of national experiments with information technology; some successful, others less so. The NHS England Five Year Forward View aspires to 'Exploit the information revolution' through the transformational work of the National Information Board (NIB). The NIB has published a 'Framework for Action' that promotes citizen empowerment, information availability, transparency, public trust, innovation, informatics skills and societal value. The framework sets out many laudable and common-sense ambitions, but is light on evidence to support its aspirations and on plans for its evaluation. Considerable resources were invested in evaluation studies in the later stages of the National Programme for IT in England, but those analyses do not seem to have been included. Most of the cited evidence is from an unpublished report by management consultants rather than independent peer-reviewed work. National experiments of this importance should be evidence based and properly evaluated, so that each iteration of the 'information revolution' produces an evidence base to inform subsequent generations of care provision models and technology innovation. Evaluation should be planned from the very start rather than added as an afterthought. Like any good business plan, there should be defined critical success factors for health and social care, and a declaration of how they might be measured. Unintended consequences should be qualitatively explored. Evaluation should also consider critical-interpretive social perspectives to understand the human factors in technology deployment, and should seek a theoretically informed insight into the mechanisms of change. The NHS Five Year Forward View and the NIB framework set out challenging ambitions based on transformative use of information technology and collaborative partnerships with commissioners and providers. The framework stands up well against the Hayes principles, but the learning opportunities associated with this programme should not be missed.

  11. Research on classified real-time flood forecasting framework based on K-means cluster and rough set.

    PubMed

    Xu, Wei; Peng, Yong

    2015-01-01

    This research presents a new classified real-time flood forecasting framework. In this framework, historical floods are classified by K-means clustering according to the spatial and temporal distribution of precipitation, the time variance of precipitation intensity and other hydrological factors. Based on the classification results, a rough set is used to extract the identification rules for real-time flood forecasting. The parameters of the conceptual hydrological model are then calibrated for each category using a genetic algorithm. In real-time forecasting, the parameter set of the corresponding category is selected according to the incoming flood information. This research tests the new classified framework on Guanyinge Reservoir and compares it with the traditional flood forecasting method, finding that the classified framework is significantly more accurate. Furthermore, the framework can also be applied in catchments with fewer historical floods.
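
    The classify-then-select idea in this abstract can be sketched in a few lines. This is a hypothetical illustration, not the authors' code: the two flood features, the synthetic events, and the per-category parameter sets are invented, and a minimal K-means stands in for the full pipeline (the rough-set rule extraction and genetic-algorithm calibration are omitted).

```python
import math
import random


def kmeans(points, k, iters=50, seed=0):
    """Minimal K-means for small 2-D feature vectors; returns k centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[i].append(p)
        # Recompute centroids; keep the old one if a cluster went empty.
        centroids = [
            tuple(sum(x) / len(c) for x in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids


def nearest_category(features, centroids):
    """Pick the flood category whose centroid is closest to the new event."""
    return min(range(len(centroids)),
               key=lambda c: math.dist(features, centroids[c]))


# Invented historical floods: (mean precipitation intensity, areal index)
history = [(10, 0.2), (12, 0.25), (11, 0.22), (40, 0.8), (42, 0.85), (39, 0.78)]
centroids = kmeans(history, k=2)

# Invented per-category parameter sets, standing in for the GA-calibrated
# conceptual-model parameters of each flood category.
params_by_category = {0: {"runoff_coeff": 0.3}, 1: {"runoff_coeff": 0.7}}

# At forecast time, the incoming flood's features select the category
# whose calibrated parameters are then used for forecasting.
category = nearest_category((41, 0.82), centroids)
print(params_by_category[category])
```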

  12. Understanding Global Change (UGC) as a Unifying Conceptual Framework for Teaching Ecology: Using UGC in a High School Biology Program to Integrate Earth Science and Biology, and to Demonstrate the Value of Modeling Global Systems in Promoting Conceptual Learning

    NASA Astrophysics Data System (ADS)

    Levine, J.; Bean, J. R.

    2017-12-01

    Global change science is ideal for NGSS-informed teaching, but presents a serious challenge to K-12 educators because it is complex and interdisciplinary, combining earth science, biology, chemistry, and physics. Global systems are themselves complex. Adding anthropogenic influences on those systems creates a formidable list of topics (greenhouse effect, climate change, nitrogen enrichment, introduced species, and land-use change among them) which are often presented as a disconnected "laundry list" of "facts." This complexity, combined with public and mass-media scientific illiteracy, leaves global change science vulnerable to misrepresentation and politicization, creating additional challenges for teachers in public schools. Ample stand-alone online resources, many of them excellent, are (to date) underutilized by teachers in the high school science course taken by most students: biology. The Understanding Global Change project (UGC) from the UC Berkeley Museum of Paleontology has created a conceptual framework that organizes, connects, and explains global systems, human and non-human drivers of change in those systems, and measurable changes in those systems. The framework employs core ideas, crosscutting concepts, structure/function relationships, and system models in a unique format that facilitates authentic understanding rather than memorization. It serves as an organizing framework for the entire ecology unit of a forthcoming mainstream high school biology program. The UGC system model is introduced up front with its core informational graphic and then elaborated, step by step, as concepts and processes are introduced and explained in each chapter. The informational graphic is thus used in several ways: to organize material as it is presented, to summarize topics in each chapter and put them in perspective, and for review and critical thinking exercises that supplement the usual end-of-chapter lists of key terms.

  13. The criterion of subscale sufficiency and its application to the relationship between static capillary pressure, saturation and interfacial areas

    PubMed Central

    2016-01-01

    Modern imaging techniques, increased simulation capabilities and extended theoretical frameworks naturally drive the development of multiscale modelling through the question: which new information should be considered? Given the need for concise constitutive relationships and efficient data evaluation, however, one important question is often neglected: which information is sufficient? For this reason, this work introduces the formalized criterion of subscale sufficiency. This criterion states whether a chosen constitutive relationship transfers all necessary information from the micro- to the macroscale within a multiscale framework. It further provides a scheme for improving constitutive relationships. Direct application to static capillary pressure demonstrates the usefulness of, and conditions for, subscale sufficiency of saturation and interfacial areas. PMID:27279769

  14. "When information is not enough": A model for understanding BRCA-positive previvors' information needs regarding hereditary breast and ovarian cancer risk.

    PubMed

    Dean, Marleah; Scherr, Courtney L; Clements, Meredith; Koruo, Rachel; Martinez, Jennifer; Ross, Amy

    2017-09-01

    To investigate the information needs of BRCA-positive, unaffected patients, referred to as previvors, after testing positive for a deleterious BRCA genetic mutation. Twenty-five qualitative interviews were conducted with previvors. Data were analyzed using the constant comparison method of grounded theory. Analysis revealed a theoretical model of previvors' information needs related to the stage of their health journey. Specifically, a four-stage model was developed based on the data: (1) pre-testing information needs, (2) post-testing information needs, (3) pre-management information needs, and (4) post-management information needs. Two recurring dimensions of desired knowledge also emerged within the stages: personal/social knowledge and medical knowledge. While previvors may be genetically predisposed to develop cancer, they have not been diagnosed with cancer, and therefore have different information needs than cancer patients and cancer survivors. This model can serve as a framework for assisting healthcare providers in meeting the specific information needs of cancer previvors. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. A framework for modeling scenario-based barrier island storm impacts

    USGS Publications Warehouse

    Mickey, Rangley; Long, Joseph W.; Dalyander, P. Soupy; Plant, Nathaniel G.; Thompson, David M.

    2018-01-01

    Methods for investigating the vulnerability of existing or proposed coastal features to storm impacts often rely on simplified parametric models or one-dimensional process-based modeling studies that focus on changes to a profile across a dune or barrier island. These simple studies tend to neglect the impacts on curvilinear or alongshore-varying island planforms, the influence of non-uniform nearshore hydrodynamics and sediment transport, irregular offshore bathymetry, and impacts from low-magnitude wave events (e.g. cold fronts). Presented here is a framework for simulating regionally specific, low- and high-magnitude scenario-based storm impacts to assess the alongshore-variable vulnerabilities of a coastal feature. Storm scenarios based on historic hydrodynamic conditions were derived and simulated using the process-based morphologic evolution model XBeach. Model results show that the scenarios predicted similar patterns of erosion and overwash when compared to observed qualitative morphologic changes from recent storm events that were not included in the dataset used to build the scenarios. The framework simulations were capable of predicting specific areas of vulnerability in the existing feature, and the results illustrate how this storm vulnerability simulation framework could be used as a tool to help inform the decision-making process for scientists, engineers, and stakeholders involved in coastal zone management or restoration projects.

  16. Finding Family Facts in the Digital Age: Family History Research and Production Literacies

    ERIC Educational Resources Information Center

    Willever-Farr, Heather Lynn

    2017-01-01

    This study examines the online information behaviors of experienced and novice family history researchers, through the lens of accuracy and an increasingly digital research and production environment. It presents a model of the information behaviors of family history researchers, as well as a literacies framework, which visualizes the skills and…

  17. Towards Improved Student Experiences in Service Learning in Information Systems Courses

    ERIC Educational Resources Information Center

    Petkova, Olga

    2017-01-01

    The paper explores relevant past research on service-learning in Information Systems courses since 2000. One of the conclusions from this is that most of the publications are not founded on specific theoretical models and are mainly about sharing instructor or student experiences. Then several theoretical frameworks from Education and other…

  18. Pre-Service Teachers' Attitude towards Information and Communication Technology Usage: A Ghanaian Survey

    ERIC Educational Resources Information Center

    Gyamfi, Stephen Adu

    2017-01-01

    This study employed the Technology Acceptance Model (TAM) to empirically investigate factors that influence Ghanaian pre-service teachers' attitudes towards Information and Communication Technology (ICT) usage. To achieve this aim, the study extended the TAM framework by adding leadership support and job relevance as exogenous variables. Data were…

  19. A Model for an Information Security Risk Management (ISRM) Framework for Saudi Arabian Organisations

    ERIC Educational Resources Information Center

    Alshareef, Naser

    2016-01-01

    Countries in the Gulf represent thriving, globally important commercial centres. They have embraced technology and modern management methods, often originating in Western countries. When adapted to quite different cultures, these do not always operate as successfully. The adoption and practices of the Information Security Risk Management (ISRM)…

  20. Metadata Design in the New PDS4 Standards - Something for Everybody

    NASA Astrophysics Data System (ADS)

    Raugh, Anne C.; Hughes, John S.

    2015-11-01

    The Planetary Data System (PDS) archives, supports, and distributes data of diverse targets, from diverse sources, to diverse users. One of the core problems addressed by the PDS4 data standard redesign was that of metadata: how to accommodate the increasingly sophisticated demands of search interfaces, analytical software, and observational documentation in label standards without imposing limits and constraints that would impinge on the quality or quantity of metadata that any particular observer or team could supply. And yet, as an archive, PDS must have detailed documentation for the metadata in the labels it supports, or the institutional knowledge encoded in those attributes will be lost, putting the data at risk. The PDS4 metadata solution is based on a three-step approach. First, it is built on two key ISO standards: ISO 11179 "Information Technology - Metadata Registries", which provides a common framework and vocabulary for defining metadata attributes; and ISO 14721 "Space Data and Information Transfer Systems - Open Archival Information System (OAIS) Reference Model", which provides the framework for the information architecture that enforces the object-oriented paradigm for metadata modeling. Second, PDS has defined a hierarchical system that allows it to divide its metadata universe into namespaces ("data dictionaries", conceptually) and, more importantly, to delegate stewardship of a single namespace to a local authority. This means that a mission can develop its own data model with a high degree of autonomy and effectively extend the PDS model to accommodate its own metadata needs within the common ISO 11179 framework. Finally, within a single namespace, even the core PDS namespace, existing metadata structures can be extended and new structures added to the model as new needs are identified. This poster illustrates the PDS4 approach to metadata management and highlights the expected return on the development investment for PDS, users, and data preparers.
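
    The namespace-delegation idea in the second step can be sketched with a toy registry. This is a hypothetical illustration, not PDS software: the class, the namespace and attribute names, and the steward strings are all invented to show how a mission dictionary can extend the core model without modifying it.

```python
class MetadataRegistry:
    """Toy ISO 11179-style registry: attributes live in namespaces
    ("data dictionaries"), each with its own steward."""

    def __init__(self):
        # namespace -> {"steward": str, "attributes": {name: definition}}
        self.namespaces = {}

    def register_namespace(self, name, steward):
        self.namespaces[name] = {"steward": steward, "attributes": {}}

    def define_attribute(self, namespace, attribute, definition):
        # In a real registry, the namespace's steward would authorize this.
        self.namespaces[namespace]["attributes"][attribute] = definition

    def resolve(self, qualified_name):
        """Look up 'namespace:attribute' across all registered dictionaries."""
        namespace, attribute = qualified_name.split(":", 1)
        return self.namespaces[namespace]["attributes"][attribute]


registry = MetadataRegistry()
registry.register_namespace("pds", steward="PDS core")
registry.define_attribute("pds", "target_name", "Name of the observed body")

# A mission extends the model in its own namespace; the core stays untouched.
registry.register_namespace("insight", steward="InSight mission")
registry.define_attribute("insight", "seis_mode", "Seismometer operating mode")
print(registry.resolve("insight:seis_mode"))
```

    Because each namespace carries its own steward, lookups like `pds:target_name` and `insight:seis_mode` coexist in one registry while responsibility for their definitions remains delegated.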
